Sharp: Handling sharp's output to upload into cloud storage buckets?

Created on 16 Mar 2017 · 11 comments · Source: lovell/sharp

Hi,

Not sure what the best approach would be to handle and upload the result of sharp's transformations. To clarify, I have a Google Cloud Storage bucket that I can upload to (using google-cloud-node), either by pointing at a local file location or by streaming from that local file. Example:

bucket.upload('/var/folders/8p/image.png', function(err, file, apiResponse) {
  // Your bucket now contains:
  // - "image.png" (with the contents of '/var/folders/8p/image.png')
  // `file` is an instance of a File object that refers to your new file.
});

or

fs.createReadStream('/var/folders/8p/...')
  .pipe(file.createWriteStream())
  .on('error', function(err) {})
  .on('finish', function() {
    // The file upload is complete.
  });

So my question is: is the best approach to write sharp's result to a local file and then upload it via its path, or is it possible to stream the result and upload it directly?

Sorry in advance for the novice question. I saw some discussions that were somewhat related, but they didn't really provide a direct solution.

question


All 11 comments

Hello, you can stream the result.

Using the example at https://github.com/GoogleCloudPlatform/google-cloud-node#preview-2

const remoteWriteStream = gcs
  .bucket('my-existing-bucket')
  .file('zebra.jpg')
  .createWriteStream();

sharp(input)
  .resize(100, 100)
  .pipe(remoteWriteStream);
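
A slightly fuller version of the same idea, with error handlers on both streams (a sketch; the handler bodies are placeholders, not part of the linked example):

sharp(input)
  .resize(100, 100)
  .on('error', (err) => {
    // sharp could not read or process the input
    console.error('processing error:', err);
  })
  .pipe(remoteWriteStream);

remoteWriteStream
  .on('error', (err) => {
    // the upload to the bucket failed
    console.error('upload error:', err);
  })
  .on('finish', () => {
    // the file upload is complete
  });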

@lovell Wow! Such a simple and elegant solution that works! Thank you very much!

Sorry to reopen this issue, but I'm running into a strange problem where the image is being cut off in sharp. I initially thought there was an issue with the way I was cropping it, but I removed that method and now suspect the stream/sharp processing is ending prematurely? image link (also tried with a variety of images)

Could it be a memory issue?

Are you able to share more of the code you're using, especially regarding the input image?

Sure thing: original image

// requires assumed above this handler: an Express `app`, plus
// multiparty, shortid, @google-cloud/storage (as `storage`) and sharp
app.post('/', function (req, res, next) {
    var form = new multiparty.Form();
    let userProfileId = shortid.generate();

    let gcs = storage({
        projectId: '...',
        keyFilename: '...'
    });

    form.on('error', (err) => {
        console.log('Error parsing form: ' + err.stack);
        res.send('error parsing form');
        return next();
    });

    form.parse(req, (err, fields, files) => {
        // {userId: userId, profilePic: file} (incoming form data)
        const metadata = {
            contentType: files.profilePic[0].headers['content-type'],
        };
        var remoteWriteStreamProfile = gcs
            .bucket('bucketName')
            .file('profile/' + userProfileId)
            .createWriteStream();

        sharp(files.profilePic[0].path)
            .pipe(remoteWriteStreamProfile)
            .on('finish', () => {
                gcs.bucket('bucketName')
                    .file('profile/' + userProfileId)
                    .setMetadata(metadata).then((response) => {
                        console.log(response);
                    });
            });
    });
});

Some extra info:
I'm using multiparty to parse the incoming form body. There aren't any constraints in terms of size/fields for requests. There is a 'finish' event handler on the bucket to set the correct metadata for the image type (I removed it thinking it was the issue, but that doesn't appear to be the case). I'm also using Postman to send the requests (not sure if that helps)

The finish listener needs to be attached to the Stream responsible for uploading, namely remoteWriteStreamProfile:

sharp(files.profilePic[0].path)
  .pipe(remoteWriteStreamProfile);

remoteWriteStreamProfile.on('finish', ...
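
Put together with the code posted above (same bucket name, userProfileId and metadata), that would look something like:

sharp(files.profilePic[0].path)
    .pipe(remoteWriteStreamProfile);

remoteWriteStreamProfile.on('finish', () => {
    // runs once the upload itself has completed
    gcs.bucket('bucketName')
        .file('profile/' + userProfileId)
        .setMetadata(metadata)
        .then((response) => console.log(response));
});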

Ah, agreed. Thanks for pointing that out.

The issue still exists, though. One pattern I do notice in the final corrupted upload files is that the file sizes are similar:

Name        Size       Type         Storage class     Uploaded
Bklh3dtse   55.15 KB   image/jpeg   Multi-Regional    3/17/17, 11:12 AM
H1FAEW1oe   55.54 KB   image/jpeg   Multi-Regional    3/17/17, 11:06 AM
H1FsrKKsx   51.5 KB    —            Multi-Regional    3/17/17, 11:49 AM
Hyazh_tjg   55.15 KB   image/jpeg   Multi-Regional    3/17/17, 11:09 AM
ryj-pOKox   59.24 KB   image/jpeg   Multi-Regional    3/17/17, 11:13 AM
S17meYKie   51.73 KB   image/jpeg   Multi-Regional    3/17/17, 11:26 AM
SJq_2_Yox   55.62 KB   image/jpeg   Multi-Regional    3/17/17, 11:11 AM
SyhZRuKsg   51.48 KB   image/jpeg   Multi-Regional    3/17/17, 11:18 AM

Not sure if that could be a clue...

I'm starting to realize it's not an issue with sharp at all, but with the file stream itself. I did a direct upload without sharp and noticed the same issue, but with an alpha channel instead of the grey cutoff. Meaning the image/file resolution is preserved, but nothing is written after roughly the first ~50 KB of the upload.

Thanks again for the help! I'll close out this issue since it isn't related to sharp.

Just for anyone who stumbles onto this thread:

A workaround solution seems to be:

instead of using

 form.parse(req, (err, fields, files)=>{...})

use:

form.on('part', (part)=>{})

Not sure why this works, but using the latter exposes the stream of the object/file being uploaded, whereas trying to access it via the parse method and then the file location creates an issue. Perhaps it has something to do with the file not being fully stored in memory before the upload is executed. A minimal sketch of the part-based approach follows below.
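
A sketch of the part-based approach (the bucket name, resize dimensions and filename check are assumptions, not from the code above):

const form = new multiparty.Form();

form.on('part', (part) => {
    // `part` is itself a readable stream of the uploaded file,
    // so it can be piped straight through sharp into the bucket
    // without waiting for the file to be stored first.
    if (!part.filename) {
        // not a file field; drain and ignore it
        part.resume();
        return;
    }

    const remoteWriteStream = gcs
        .bucket('bucketName') // assumed, as in the earlier snippet
        .file('profile/' + shortid.generate())
        .createWriteStream();

    part
        .pipe(sharp().resize(100, 100)) // sharp used as a transform stream
        .pipe(remoteWriteStream);

    remoteWriteStream.on('finish', () => {
        // the upload is complete
    });
});

form.parse(req);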

Thanks again for the help @lovell

Hello, you can stream the result.

Using the example at https://github.com/GoogleCloudPlatform/google-cloud-node#preview-2

const remoteWriteStream = gcs
  .bucket('my-existing-bucket')
  .file('zebra.jpg')
  .createWriteStream();

sharp(input)
  .resize(100, 100)
  .pipe(remoteWriteStream);

@lovell
I'm afraid the link is outdated. That code doesn't seem to work, does it? Thanks for the awesome work, btw!

@taruyar The two and a half year old example code is from https://github.com/googleapis/google-cloud-node so you'll need to check/ask in that repo.

