aws-sdk-js: S3.putObject only accepts streams that it can determine the length of

Created on 16 Nov 2019 · 9 comments · Source: aws/aws-sdk-js

Is your feature request related to a problem? Please describe.

According to https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property the Body element can be a ReadableStream; in practice, however, the call only succeeds if the SDK can determine the stream's length (see #2661 or https://github.com/aws/aws-sdk-js/blob/master/lib/event_listeners.js#L167).
Looking at https://github.com/aws/aws-sdk-js/blob/master/lib/util.js#L198, a stream only works if it has a path property. This means that only streams like those from fs.createReadStream will work. If the stream is transformed in any way, it no longer has a path and the call fails.

e.g.

Body = fs.createReadStream('./someFile').pipe(someTransform)
s3.putObject({ Bucket, Key, Body }).promise().then(console.log)

Error: Cannot determine length of [object Object]
  at Object.byteLength (aws-sdk/lib/util.js:200:26)
  at Request.SET_CONTENT_LENGTH (aws-sdk/lib/event_listeners.js:163:40)
  at Request.callListeners (aws-sdk/lib/sequential_executor.js:106:20)
  at Request.emit (aws-sdk/lib/sequential_executor.js:78:10)
  at Request.emit (aws-sdk/lib/request.js:683:14)
  at Request.transition (aws-sdk/lib/request.js:22:10)
  at AcceptorStateMachine.runTo (aws-sdk/lib/state_machine.js:14:12)
  at aws-sdk/lib/state_machine.js:26:10
  at Request.<anonymous> (aws-sdk/lib/request.js:38:9)
  at Request.<anonymous> (aws-sdk/lib/request.js:685:12)

Describe the solution you'd like

Update the documentation to more clearly identify which streams will work,
and point users to S3.upload

Describe alternatives you've considered

A caller could include the content length explicitly, but I think that S3.upload is just a better answer.

Labels: documentation, service-api

Most helpful comment

Thanks for info @seebees & @gbataille!!!

I fixed it to use the s3 upload method

import fetch from 'node-fetch';

const res = await fetch(url)
const stream = res.body
const data = await s3.upload(
    { ACL: 'public-read', Body: stream, Bucket: 'test', Key: 'fileName' }
  ).promise();

Is the new code :)

All 9 comments

@seebees I reached out to the respective service teams, will update here once I hear back from them.

Agreed. I have a case where I'm getting the stream from a Request body (from a GraphQL API).
I have to first read the stream into a Buffer to then be able to invoke putObject.

This is quite disturbing, and as it is undocumented, it actually threw me off for a few hours before I understood what was wrong.

By the way, I think it is the same as #2442

This is still an issue...

EDIT - I POSTED UPDATED CODE IN A MESSAGE BELOW

I was having this issue using node-fetch

I got it to work by reading the stream into a Buffer like what @gbataille said.

const res = await fetch(url)
const buffer = await res.buffer()
// wrapped in a Promise so the callback's resolve/reject are defined
await new Promise((resolve, reject) =>
  s3.putObject(
    { ACL: 'public-read', Body: buffer, Bucket: 'test', Key: 'fileName' },
    (err, data) => (err ? reject(err) : resolve(data))
  )
);

EDIT - I POSTED UPDATED CODE IN A MESSAGE BELOW

@amouly @RusseII @gbataille https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#upload-property should do what you want.

Under it all S3 must know the size of the object,
but upload will intelligently chunk the message into S3 for you.

@RusseII I quickly hit memory issues with reading everything in a Buffer :D

But then indeed, as @seebees mentions, the upload method is higher level and seems to work with any kind of stream.

I don't quite know why those are different. I think putObject simply exposes the Web API while the S3 Service adds some helper methods...

upload wraps the multi part upload.
putObject is just an S3 put, so it requires knowing the exact size.

You can use upload to manipulate the partSize per the documentation linked above.
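As a sketch, an options object for tuning the managed upload might look like this (partSize and queueSize are documented options of s3.upload; the values below are illustrative, not defaults I'd vouch for):

```javascript
// Illustrative options for the second argument of s3.upload().
function uploadOptions() {
  return {
    partSize: 10 * 1024 * 1024, // bytes per part; S3's minimum part size is 5 MiB
    queueSize: 4,               // number of parts uploaded concurrently
  };
}

// Usage: s3.upload({ Bucket, Key, Body: stream }, uploadOptions()).promise()
```

Larger partSize values mean fewer multipart requests at the cost of more memory buffered per part.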

Thanks for info @seebees & @gbataille!!!

I fixed it to use the s3 upload method

import fetch from 'node-fetch';

const res = await fetch(url)
const stream = res.body
const data = await s3.upload(
    { ACL: 'public-read', Body: stream, Bucket: 'test', Key: 'fileName' }
  ).promise();

Is the new code :)
