HTTPie: [Enhancement] Provide support for streaming uploads and progress bar

Created on 24 Mar 2016  ·  11 Comments  ·  Source: httpie/httpie

Uploading files using input redirection fails for large files.
I presume the whole file is getting read into memory first.

Last time I checked requests library can perform streaming uploads.
http://docs.python-requests.org/en/master/user/advanced/#streaming-uploads
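The idea from that Requests doc can be sketched without hitting the network by looking at the request-preparation step. The URL and payload below are purely illustrative:

```python
import io

import requests

# A minimal sketch of a streaming upload with Requests: passing a
# file-like object as `data` lets Requests read it incrementally
# instead of buffering the whole body as bytes.
body = io.BytesIO(b"x" * 1024)
req = requests.Request("PUT", "https://example.com/upload", data=body).prepare()

# Requests measured the stream (seek/tell) and set Content-Length;
# the prepared body is the stream itself, not an in-memory copy.
print(req.headers["Content-Length"])  # 1024
print(req.body is body)               # True
```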

It would be nice to integrate this logic into the HTTPie client, maybe via a command-line switch.
The upload should also show a progress bar with an ETA.

Please let me know if this functionality already exists. I can have a look at the code if required.

feature

Most helpful comment

@darshanime streamed uploads have already been implemented in master, so you should have no issues with large files (it will be released with v2.3.0). Upload progress bar coming soon.

All 11 comments

Please define how the upload fails. What is the exact message?

Very easy to reproduce. This test is on a 512MB machine with 1GB swap.

[Fri Mar 25 10:50:35] sandeep@stream:~⟫ http --version
0.9.3
[Fri Mar 25 10:41:39] sandeep@stream:~⟫ sudo fallocate -l 1G test.log
[Fri Mar 25 10:41:51] sandeep@stream:~⟫ ls -alh test.log
-rw-r--r-- 1 root root 1.0G Mar 25 10:41 test.log
[Fri Mar 25 10:41:55] sandeep@stream:~⟫
[Fri Mar 25 10:42:46] sandeep@stream:~⟫ http PUT https://transfer.sh/test.log < test.log

http: error: MemoryError:

@sandeep048

I'm actually working on this (streamed uploads for redirected input) these days. Stay tuned :sunglasses:

(Relevant kevin1024/pytest-httpbin#33 & kennethreitz/requests#3035)

Any chance for an update?

@macnibblet yes, I'm looking into it these days.

Considering what the behaviour should be. I believe curl enables chunked uploads when the Transfer-Encoding: chunked header is specified.

HTTPie could do the same: keep the current default behaviour (buffered uploads) and switch to chunked when the user sets Transfer-Encoding: chunked.

Perhaps for piped stdin it could switch to streaming automatically. That would probably be sensible. On the other hand, it'd break backwards compatibility. Also, it would add another mode which makes the behaviour harder to understand.
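The automatic mode being weighed here could, hypothetically, key off whether stdin is an interactive terminal. This is only a sketch of the trade-off under discussion, not HTTPie's actual behaviour; `should_stream` is a made-up name:

```python
import io
import sys


def should_stream(stdin=sys.stdin):
    # Hypothetical heuristic: stream automatically when stdin is a
    # pipe or redirect rather than an interactive terminal.
    return not stdin.isatty()


# A pipe-like object (BytesIO.isatty() returns False) would be streamed:
print(should_stream(io.BytesIO(b"data")))  # True
```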

The same error occurs with a form POST (without input redirection).
Sending a 200 MB binary file causes a MemoryError on a Debian VirtualBox VM with 250 MB of RAM.

Using (as in the docs):
Item type: form file field, field@/dir/file
Description: only available with --form, -f. For example screenshot@~/Pictures/img.png. The presence of a file field results in a multipart/form-data request.

Command line:
$ http -f POST http://10.0.2.2:8000/uploads/ [email protected]

Error:
http: error: MemoryError:

I wrote streaming and chunked upload support for Requester, an HTTP client I built for Sublime Text. It's also built on top of Requests.

In Requests, streaming uploads are really simple. You just pass a file handle to the data arg of requests.

Chunked uploads happen automatically if you pass a generator to the data arg. Requests will set the Transfer-Encoding: chunked header on the request if you do so. Passing a generator provides the same memory benefits as a streaming upload (you don't have to read the entire file into memory), and it also lets you run code each time the generator yields another chunk of data, which is what makes it possible to display a progress bar.
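That behaviour can be verified with Requests' own request-preparation step, no server needed (the URL and chunk contents are illustrative):

```python
import requests


def body_chunks():
    # Hypothetical chunk generator; yields the request body piece by piece.
    yield b"part1"
    yield b"part2"


req = requests.Request("POST", "https://example.com/upload", data=body_chunks()).prepare()

# The body length is unknown, so Requests marks the request as chunked
# instead of setting Content-Length.
print(req.headers["Transfer-Encoding"])  # chunked
```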

This is how I solved this problem in Requester. I wrote a function called read_in_chunks that accepts a handle_read callback, and each time the function reads another chunk, it passes the chunk count and chunk size to handle_read. handle_read goes ahead and displays a status bar.
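A minimal version of that helper might look like the following; the chunk size and the exact callback signature are assumptions beyond what the comment describes:

```python
import io


def read_in_chunks(handle, chunk_size=8192, handle_read=None):
    # Yield the file in chunks, notifying a callback after each read
    # with the chunk count and chunk size so the caller can render a
    # progress bar.
    count = 0
    while True:
        chunk = handle.read(chunk_size)
        if not chunk:
            break
        count += 1
        if handle_read is not None:
            handle_read(count, chunk_size)
        yield chunk


# Usage: track cumulative bytes requested while re-assembling the body.
progress = []
data = io.BytesIO(b"a" * 20000)
body = b"".join(
    read_in_chunks(data, handle_read=lambda count, size: progress.append(count * size))
)
print(progress)  # [8192, 16384, 24576]
```

Passing the resulting generator as `data` then gives a chunked upload with per-chunk progress callbacks for free.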

Not all servers accept chunked uploads. From what I understand, requests-toolbelt lets you invoke a function on each iteration of streaming uploads as well, but I didn't want to pull in another dependency to make this work.

What's the latest here? I'm also seeing "http: error: Request timed out (30s)." for large files

Any news?

@darshanime streamed uploads have already been implemented in master, so you should have no issues with large files (it will be released with v2.3.0). Upload progress bar coming soon.

@dausruddin 🔝

