Sharp: Resize using size (bytes) constraints

Created on 23 Apr 2019 · 3 comments · Source: lovell/sharp

How do I enforce maximum constraints on pictures? I'd like to enforce the following constraints:

1) the maximum resolution is 1000x1000
2) the maximum size is 2 megabytes.

If either of these constraints is violated, I'd like sharp to do some magic and ensure the constraints are still met.

I've written code to handle the "maximum resolution of 1000x1000" constraint, and it works fine, but how do I enforce the 2-megabyte limit? In other words, how do I know how much I need to downscale the image to get under 2 megabytes?

async function resizePhoto({photo, maxWidth, maxHeight, maxSizeInBytes}) {

    // Strip the data-URL prefix and decode the base64 payload
    const base64 = photo.split('base64,')[1].trim();

    const bufferBase64 = Buffer.from(base64, 'base64');
    const photoSharp = sharp(bufferBase64);

    let metadata = await photoSharp.metadata();

    if (metadata.width > maxWidth || metadata.height > maxHeight) {

        const ratio = metadata.width / metadata.height;

        // sharp requires integer dimensions, hence Math.round
        if (metadata.width > metadata.height) {
            photoSharp.resize(maxWidth, Math.round(maxWidth / ratio));
        } else {
            photoSharp.resize(Math.round(maxHeight * ratio), maxHeight);
        }
    }

    // Note: metadata() describes the *input* image, so metadata.size is
    // the input byte count - it does not reflect the pending resize
    metadata = await photoSharp.metadata();

    if (metadata.size > maxSizeInBytes) {
        // TODO
        // what now?
    }

    return photoSharp
        .toBuffer();
}
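As a side note on the snippet above: the aspect-ratio arithmetic can be factored into a small pure helper, which keeps the rounding in one place and is easy to unit-test. The helper name `fitWithin` is mine, not part of sharp's API; sharp's own `resize({ width, height, fit: 'inside' })` achieves the same effect in a single call, as a comment below uses.

```javascript
// Sketch of the resolution-constraint step as a pure function.
// The name fitWithin is illustrative, not a sharp API.
function fitWithin(width, height, maxWidth, maxHeight) {
    // Already within bounds: no scaling needed
    if (width <= maxWidth && height <= maxHeight) {
        return { width, height };
    }
    // Scale by whichever dimension overflows the most
    const scale = Math.min(maxWidth / width, maxHeight / height);
    // sharp requires integer dimensions, so round (and keep at least 1 px)
    return {
        width: Math.max(1, Math.round(width * scale)),
        height: Math.max(1, Math.round(height * scale))
    };
}
```

For example, `fitWithin(2000, 1000, 1000, 1000)` yields `{ width: 1000, height: 500 }`.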
question


All 3 comments

Are you looking to limit input bytes or output bytes? In the example above the former could be based on the size of the input Buffer but the latter would be almost impossible to know without performing the decompress-process-compress pipeline.

Is there an example of something else that provides a similar feature to what you're looking for? What does "do some magic" mean?

I am looking to limit the output bytes.

My use case is the following: I have mobile app users who can upload pictures. These pictures need to satisfy specific constraints because they will be shown to other users, and I want to ensure no large images get through that consume a lot of bandwidth and load slowly.

Instead of just throwing an error at them, "Hey, the uploaded file is too big", I was thinking I could do some auto-processing that compresses the image enough to pass the constraint.

I am not sure about the "magic" either, but I understand the problem. I guess I could run a divide-and-conquer-style algorithm with the decompress-process-compress pipeline until I find an image resolution that satisfies (I assume the picture's resolution is directly related to the output byte count):

compressedOutputSize <= maxOutputSizeInBytes
compressedOutputSize >= maxOutputSizeInBytes * 0.9

unless there is a more common way to approach this problem.
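The divide-and-conquer idea sketched above can be written generically against any monotone encode step. Here `encode(quality)` is a hypothetical async function returning the compressed byte length; it could, for instance, wrap sharp's `.jpeg({ quality }).toBuffer()`, but nothing in the search itself depends on sharp:

```javascript
// Binary search for the highest quality whose output fits the byte budget.
// `encode` stands in for the decompress-process-compress pipeline and is
// assumed monotone: higher quality => more bytes.
async function findMaxQuality(encode, maxBytes, lo = 1, hi = 100) {
    let best = null;
    while (lo <= hi) {
        const mid = Math.floor((lo + hi) / 2);
        const size = await encode(mid);
        if (size <= maxBytes) {
            best = mid;   // fits: try a higher quality
            lo = mid + 1;
        } else {
            hi = mid - 1; // too big: try a lower quality
        }
    }
    return best;          // null if even the lowest quality is too large
}
```

With sharp, `encode` could be `quality => sharp(buffer).jpeg({ quality }).toBuffer().then(b => b.byteLength)`; the search then needs only about seven encodes over the 1-100 range instead of dozens of fixed-step retries.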

This should do the trick. It retries the resize on the original buffer, lowering the quality each time, until the required limit is reached. Obviously it would be nicer if that were done internally by sharp, without the need to rerun the resize each time.

async function constraintImage(buffer, quality = 82, drop = 2) {

    const done = await sharp(buffer).resize({
        width: 1000,
        height: 1000,
        fit: sharp.fit.inside
    }).jpeg({
        quality
    }).toBuffer();

    // Re-encode at a lower quality while the output exceeds 2 MB;
    // stop before quality drops below 1, which sharp rejects
    if (done.byteLength > 2000000 && quality - drop >= 1) {
        return constraintImage(buffer, quality - drop);
    }

    return done;
}
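A caveat on the comment above (my observation, not the commenter's): if even the lowest usable JPEG quality cannot reach the 2 MB budget, quality reduction alone is not enough, and the natural fallback is to shrink the dimensions. Since output bytes grow roughly linearly with pixel count, scaling each side by the square root of the budget-to-actual ratio is a reasonable first guess. The helper name `shrinkForBudget` is illustrative, not a sharp API:

```javascript
// Estimate a new bounding-box side when quality reduction alone cannot
// meet the byte budget. Bytes scale roughly with pixel count, so scale
// each dimension by sqrt(budget / actual); keep at least 1 px.
function shrinkForBudget(side, actualBytes, maxBytes) {
    const scale = Math.sqrt(maxBytes / actualBytes);
    return Math.max(1, Math.floor(side * scale));
}
```

For example, if the smallest acceptable-quality output is 4 MB against a 2 MB budget, `shrinkForBudget(1000, 4000000, 2000000)` suggests retrying with a bounding box of about 707 px.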