Phantomjs: Canvas toDataURL encode is different

Created on 25 Mar 2012 · 8 comments · Source: ariya/phantomjs

_[email protected] commented:_

Which version of PhantomJS are you using? 1.5

What steps will reproduce the problem?

  1. I created a simple HTML page with a 200x200 canvas element
  2. In the console: document.getElementById('canvas').toDataURL();
  3. In PhantomJS I opened a page to the HTML file, then after the page loaded I ran an evaluate function to get the toDataURL data

What is the expected output? What do you see instead?

See Files attached

Which operating system are you using? Ubuntu 11.04

Did you use binary PhantomJS or did you compile it from source? Latest binary

Please provide any additional information below.

Disclaimer:
This issue was migrated on 2013-03-15 from the project's former issue tracker on Google Code, Issue #455.
4 people had starred this issue at the time of migration.

All 8 comments

_[email protected] commented:_

I'm seeing this issue on Ubuntu 12.04. When I call .toDataURL() on a canvas, I get a much larger base64 string than expected. For example, the actual size of my base64 image is 55,360 characters, and I get exactly that size when I load my image into the canvas in a browser. But going through PhantomJS with page.evaluate, .toDataURL() returns 7,003,334 characters. The image is 1280x1024.

_alessandro.[email protected] commented:_

I decoded your dataurl-from-*.txt files to png and attached them below. Both are apparently valid files, with the same visual result in an image viewer. But the PhantomJS version has an insane amount of zeroes in (or after?) the IDAT chunks.

If anyone with good knowledge about the png structure could tell us what is so fishy about the PhantomJS version, I could investigate in the PhantomJS/Qt/QtWebkit code.

If anyone is interested, I have a workaround for generating PNG files at a reasonable size.

I use context.getImageData to pull out the rgba values for each pixel, send them back to node, interpret them via ndarray, and export a PNG via save-pixels.

My implementation can be found here (I convert the values to strings to reduce padding):

https://github.com/twolfson/phantomjssmith/blob/38ffa678bc895016da6667091aa7a4b90d2d8520/lib/exporters.js#L82-L94

https://github.com/twolfson/phantomjssmith/blob/38ffa678bc895016da6667091aa7a4b90d2d8520/lib/scripts/compose.html#L54-L66

Links of interest:

https://github.com/mikolalysenko/ndarray

https://github.com/mikolalysenko/get-pixels/blob/2ac98645119244d6e52afcef5fe52cc9300fb27b/dom-pixels.js

https://github.com/mikolalysenko/save-pixels

pngcrush seems to work also.

I'm able to get a PNG of proper size if I use toDataURL with an explicit quality argument:

var base64 = canvas.toDataURL("image/png", 0);

It's when I don't provide an explicit quality argument that I end up with a very large PNG. I traced the problem to JSHTMLCanvasElement::toDataURL.

There, we have the following few lines of code:

double quality;
double* qualityPtr = 0;
if (exec->argumentCount() > 1) {
    JSValue v = exec->argument(1);
    if (v.isNumber()) {
        quality = v.toNumber(exec);
        qualityPtr = &quality;
    }
}

If an explicit quality argument is not provided, qualityPtr remains null. Later, this pointer is passed to encodeImage in ImageBufferQt.cpp. There, we have the following lines:

int compressionQuality = 100;
if (quality && *quality >= 0.0 && *quality <= 1.0)
    compressionQuality = static_cast<int>(*quality * 100 + 0.5);

Since quality (which is qualityPtr) is null, the body of the if statement never executes, and compressionQuality stays at 100. When the image is finally saved to the buffer using QPixmap::save, you end up with a large, uncompressed file; from the Qt documentation:

The quality factor must be in the range [0,100] or -1. Specify 0 to obtain small compressed files, 100 for large uncompressed files, and -1 to use the default settings.

According to the HTML5 spec, §4.12.4.4 says that the quality argument only applies to images of type image/jpeg. It makes sense to default to 100 (although I think Firefox and Chrome use 92) in the case of JPEG, because you still want to render the image with decent quality. But in PhantomJS, this number ends up affecting the level of _compression_ that is performed for PNGs. The compression doesn't affect the quality of the image, since PNG is lossless; it only affects the size. So a quality of 100 results in a PNG image without any compression, and 0 results in a PNG image with maximum compression.

I think encodeImage should take the image type into account before setting the quality value. It should only honor that argument if the type is JPEG; otherwise it should use -1, so that QPixmap::save uses the default settings for that type.

I can get started on a pull-request for this.

Fixed here.

Is this going to be fixed?

The fix is upstream in Qt-webkit.

Yup. Closing. Thank you, @vivin!
