Three.js: Gamma correct treatment of sRGB textures in THREE.js

Created on 11 May 2017 · 24 comments · Source: mrdoob/three.js

Description of the problem

As far as I can tell, my sRGB texture is not getting handled correctly within the THREE.js framework.

Essentially, the checkerboard area of my image gets too dark due to naive, seemingly gamma-unaware decoding, sampling, filtering, or encoding.

The image is mostly a black and white checkerboard pattern, with three stripes of (188,188,188) (which is 50% grey). You can visually confirm that (188,188,188) is 50% intensity by looking at this image on any reasonably calibrated display at native 1:1 resolution with no scaling: the solid bars should be the same intensity as the checkerboard.

Explanation: see the Wikipedia article on sRGB, which says that a normalized 50% intensity should get an sRGB value of (1.055*0.5^(1/2.4))-0.055 = 0.735358, which is about 187.516 in 8-bit sRGB, hence the logic of encoding it as 188.

(image: gamma_srgb — the checkerboard test pattern described above)

When scaling down this image, most naive software will simply average the sRGB encodings and arrive at bogus intensities that are too dark. The black and white pixels will be naively averaged as (0+255)/2 to get a color of (128,128,128), which is only about 22% intensity: (((128/255)+0.055)/1.055)^2.4 =~ 0.215861. This is considerably darker than the desired intensity of 50%, which is the average intensity of equal contributions of 0% and 100%. And intuitively, "half brightness" is 50%.
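To make the arithmetic concrete, here's a small sketch (the helper name is mine, not three.js API) of the naive average and the linear intensity that the averaged value actually represents:

```javascript
// Reference sRGB -> linear transfer function, normalized to [0, 1].
// (Piecewise formula from the sRGB standard; not three.js code.)
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Naive average of black (0) and white (255) in encoded space:
const naive = (0 + 255) / 2; // 127.5, which rounds to 128

// Linear intensity that the encoded value 128 actually represents:
const intensity = srgbToLinear(128 / 255); // ~0.2159, i.e. about 22%, not 50%
console.log(Math.round(naive), intensity.toFixed(4));
```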

But here's what it looks like when rendered with THREE.js. Notice the bars are easily visible but they're not supposed to be. The checkerboard parts of the image are too dark.

(image: qms2l — screenshot of the three.js render, with the bars clearly visible against a darker checkerboard)

In OpenGL, this is easy. You simply treat the texture as sRGB and use an sRGB frame buffer. Texture samples are converted from the sRGB encoding to linear when sampled, and linear values are encoded as sRGB when written to the frame buffer and all is well.

I'm at a complete loss for how to achieve this within the three.js framework though. Can someone help?

Additional Info

See my post at Stack Overflow for slightly more detailed information about the math behind my expectation.

Live Repro

Here is a live repro. https://jsfiddle.net/fvusxp1g/

Three.js version
  • master / r85
Browser
  • [x] All of them
OS
  • [x] All of them
Hardware Requirements (graphics card, VR Device, ...)

Nothing special required.

Question

Most helpful comment

Try this fiddle which is the same as my previous fiddle, except the box is being slowly translated.

All 24 comments

Try this fiddle, which uses MeshBasicMaterial and avoids the complications of the lighting.

Thanks, @WestLangley, for distilling the repro down to a more fundamental level by eliminating the lighting (which, as it happens, is irrelevant). It's too bad that you (perhaps accidentally) avoided the issue entirely, though, by rendering the image at a 1:1 scale.

Here's my fork of your fiddle that demonstrates the problem. Simply reduce the size of the geometry a little bit, which avoids the 1:1 transfer of texels to framebuffer pixels and induces some amount of linear filtering.

The problem comes when a sampler reads multiple pixels. When a black pixel (0% intensity), encoded as (0,0,0) and a white pixel (100% intensity), encoded as (255,255,255) both give equal (50-50) contributions to a final sample, the naive implementation uses an incorrect, but computationally inexpensive, technique of linearly combining the encoded values. In a nutshell, (0 + 255) / 2, gives 128 (after rounding). But 128 represents an intensity of only about 22%. The result is that everything gets too dark. I concede that unless you're rendering content that people care about, the results are often unobjectionable. But it is incorrect.

(((128/255)+0.055)/1.055)^2.4 =~ 0.215861 =~ 22%

One correct approach is to convert the pixel values to their linear normalized values according to the sRGB to Linear formula, then compute an equal weighted linear combination (0 + 1) / 2 = 0.5. Then convert that linear intensity to an appropriate sRGB encoding 0.5 => (188,188,188).
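A sketch of that correct approach (helper names are mine), using the standard sRGB transfer functions:

```javascript
// Gamma-correct averaging: decode to linear, average, re-encode.
// (Standard sRGB formulas; a sketch, not three.js code.)
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}
function linearToSrgb(l) {
  return l <= 0.0031308 ? 12.92 * l : 1.055 * Math.pow(l, 1 / 2.4) - 0.055;
}

// Average a black texel and a white texel in linear space:
const avgLinear = (srgbToLinear(0 / 255) + srgbToLinear(255 / 255)) / 2; // 0.5

// Re-encode the 50% linear intensity as 8-bit sRGB:
const avgSrgb = Math.round(linearToSrgb(avgLinear) * 255); // 188
```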

This extra computation can be costly. The sRGB formulae require exponentiation: one exponentiation is required to decode each texture pixel that is read, and one more is required to encode the linear values into their final sRGB format in the output framebuffer.

Those extra operations are simply omitted by most render engines to gain the speed boost. Render engines instead do arithmetic directly on the encoded sRGB values.

When the naive approach gives results that are close enough for the given application and the results are greatly improved frame-rates, then the trade-off is made. It is made almost ubiquitously. But any application that wants accurate color and intensity should be making the correct calculations. In my example, I've tried to opt into the extra computation by asking three.js to respect the sRGB encodings and do the encoding-aware calculations to give me the correct colors and intensities, and I'm willing to trade off some speed to make sure the colors are correct and the quality is high.

That's what I assume would happen when you declare that a texture uses an sRGB encoding and then enable "gamma correction" -- to get the correct results rather than the fast approximation.

So how do I get correct results, as opposed to these "fast but much too dark" results?

Again, this is a piece of cake in OpenGL. Just use SRGB textures and SRGB frame buffers and enable SRGB in the context and all is well. But how do I get there from three.js?

Try this fiddle which is the same as my previous fiddle, except the box is being slowly translated.

What's happening in the last fiddle? If I understand it correctly, you're saying "yes, this is an issue"? I'm a bit confused by the 188 value, but the issue in the last fiddle is not how dark the checker pattern is, but how much its intensity changes when the plane is moving?

Whatever the case may be, https://github.com/mrdoob/three.js/pull/10791 would allow you to inject your own glsl logic into existing material templates, and do your own correction the way you see fit without hacking the library.

OK, @WestLangley, that last fiddle also demonstrates the problem. By translating the image with a sub-pixel offset, you induced linear filtering, resulting in combinations of black and white pixels. That also successfully reproduces the problem I'm complaining about. Thanks for the additional repro.

This issue has nothing to do with Three.js or Gamma.
It's about image pixels interpolation.
Just rescale that 512x512 image to 256x256 using any image editor you want (including Photoshop).
You'll get a "wrong" (darker) result.

"Just rescale" is ambiguous: there are a few different interpolation modes, which alter how "wrong" (dark) the result is.

@pailhead, allow me to try to explain where 188 comes from. And @RemusMar, it absolutely has everything to do with gamma.

TL;DR: 188 is the sRGB value that represents a 50% gray color. And by 50% gray, I mean the color that is just as bright (on average) as an area that is half black and half white, like a checkerboard with tiny squares.

With sRGB (and just about any typical 8-bit color format, actually), the values 0 through 255 do not represent the linear intensity ("brightness") of the pixels. In other words, 128 is not "half" as bright as 255 (and not just because of rounding; it's not even close). The relationship between coded pixel values and brightness follows a "gamma curve": it's roughly exponential, in that the ratio of the intensities of adjacent codes is approximately constant. The exact relationship between linear intensity and the sRGB-encoded value is given by a curve specified in two segments, one linear and one exponential:

if linear <= 0.0031308 then srgb := 12.92 linear
if linear > 0.0031308 then srgb := 1.055 * linear^(1/2.4) - 0.055

If you plug in a linear intensity of 0.5 (half) you get (1.055*0.5^(1/2.4))-0.055 = 0.735358, which is about 187.516 in 8-bit sRGB, and when you round it off, you should be using 188 as a representation for 50% intensity.

It works, as the sample image I provided uses this encoding and the intensity of the solid color patch of (188,188,188) is the same average intensity as an area that is half black (0,0,0) and half white (255,255,255): the checkered pixels area of my example image.

If you instead incorrectly tried to represent 50% grey with (128,128,128), you'd find that the apparent color is much too dark compared to a checkerboard that is half white and half black. That's because, if you invert the equations, you'll find that 128 actually is the encoding for a linear intensity of only about 22%.

The reverse equations are as follows:

if srgb <= 0.04045 then linear := srgb / 12.92
if srgb > 0.04045 then linear := ((srgb + 0.055)/1.055)^2.4
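As a sanity check on the two piecewise formulas, here's a small script (mine, not part of the thread) that round-trips every 8-bit code through decode and re-encode, and confirms the two segments meet at the 0.04045 / 0.0031308 crossover:

```javascript
// Forward and reverse sRGB transfer functions, as given above.
function srgbToLinear(s) {
  return s <= 0.04045 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}
function linearToSrgb(l) {
  return l <= 0.0031308 ? 12.92 * l : 1.055 * Math.pow(l, 1 / 2.4) - 0.055;
}

// Every 8-bit sRGB code should survive decode + re-encode exactly
// (after re-quantizing back to 8 bits):
for (let v = 0; v < 256; v++) {
  const back = Math.round(linearToSrgb(srgbToLinear(v / 255)) * 255);
  if (back !== v) throw new Error(`round trip failed at ${v}`);
}

// The linear segment ends where the exponential segment begins:
// 12.92 * 0.0031308 ~= 0.04045.
console.log((12.92 * 0.0031308).toFixed(5));
```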

And therefore, (128,128,128) is only approximately 22% linear intensity (after rounding).

(((128/255)+0.055)/1.055)^2.4 =~ 0.215861.

Does that help explain it?

I suggest reading the Wikipedia articles about sRGB or gamma correction.

@RemusMar

This issue has nothing to do with Three.js or Gamma.
It's about image pixels interpolation.
Just rescale that 512x512 image to 256x256 using any image editor you want (including Photoshop).
You'll get a "wrong" (darker) result.

This is not strictly true. If you use Photoshop and convert your image to 32 bits/channel first (Image > Mode > 32 Bits/channel), then when you resize the image, you'll see the "correct" linear treatment of color.

Beware of the many factors that can lead to these artifacts: the resolution of the image, the resolution of the drawing buffer, the texture filter modes, the use (or not) of mipmaps, the uv-alignment, the device pixel ratio, the resolution of the canvas, and, in general, how the drawing buffer is composited with the canvas by the browser. These artifacts can occur regardless of the color space of the texture.

Try this fiddle, which is the same as my last fiddle, except gamma corrections have been commented out.

@WestLangley, _of course_ they occur when the gamma correction code is commented out. The issue I'm reporting is that gamma correction is the mechanism that is supposed to fix it, but it currently doesn't. So your last fiddle would not qualify as a repro of my issue.

In OpenGL, to get a gamma-correct pipeline, we would utilize GL_EXT_texture_sRGB for correct samples from textures (conversion from sRGB to linear upon sampling), and GL_ARB_framebuffer_sRGB for correct writes to the frame buffer (conversions from linear to sRGB upon writing and blending).

Not 100% sure how to take advantage of this within the three.js framework. I'm assuming tagging the texture as THREE.sRGBEncoding is an essential part of it... what else needs to be done?

Any chance you could post the same example done with openGL and the extensions?

Here's another fiddle with a procedural texture.

@wyckster

If you use Photoshop and convert your image to 32 bits/channel first (Image > Mode > 32 Bits/channel),
then when you resize the image, you'll see the "correct" linear treatment of color.

1) why should I do that if the image has 8 bits per channel (24 bits per pixel; alpha is not even required)?
2) three.js is about 3D and runtime. Do you expect Photoshop results in Three.js???

  1. Yeah, if you program three.js to do the same thing as PS. What is the point you are trying to make, exactly?

@pailhead

Yeah, if you program three.js to do the same thing as PS.
What is the point you are trying to make, exactly?

I thought it was obvious ...
Photoshop is not about runtime.

I still don't understand. How about using slightly more complex sentences, and a few more of them?

@RemusMar, thanks for your reply. Your question is "why should I do that if the image has 8 bits per channel (24 bits per pixel; alpha is not even required)?"

I wish I could have a conversation with you to try to get more familiar with what you do or do not understand about gamma curves. I'm not sure if you want me to explain it all here or not. I'm happy to explain everything, but I suspect you want a shorter response. So here's the simplest way I can describe it.

The side effect of converting your image to 32 bits per channel is that now it has enough headroom to perform linear arithmetic on the pixels without a loss of precision -- so it does.

There's not enough fidelity to store linear data at 8 bits per channel. The blacks would show highly visible banding, and most of your code points would be spent representing nearly indistinguishable bright levels near white. So, perhaps counter-intuitively, when operating at 8 bits per channel, to maintain fidelity you have to convert to linear internally at high precision, perform any arithmetic, then convert back to 8 bits for storage in the 8-bits-per-channel framebuffer. That's remarkably slow because of the exponentiation required, so when dealing with 8-bit data, most graphics engines just do naive arithmetic on the 8 bits, reinterpreting them as if they were linear (they're not), and the results are, for most cases, "good enough".
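To put a number on the banding claim, here's a quick check (mine; helper name is an assumption, not a library call): the smallest non-zero value in a linear 8-bit buffer already corresponds to sRGB code roughly 13, so the darkest dozen or so sRGB shades cannot be distinguished in linear 8-bit storage.

```javascript
// Standard linear -> sRGB encode curve (not three.js code).
function linearToSrgb(l) {
  return l <= 0.0031308 ? 12.92 * l : 1.055 * Math.pow(l, 1 / 2.4) - 0.055;
}

// The first non-zero linear 8-bit value (1/255) lands at sRGB code ~13,
// so linear 8-bit collapses the darkest sRGB shades together.
const firstLinearStep = Math.round(linearToSrgb(1 / 255) * 255);
console.log(firstLinearStep);
```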

If you want high precision to do linear arithmetic then you just sacrifice speed and use more bits per channel.

The exception to all this is, of course, when dealing with GPUs that have hardware dedicated to performing the conversion between sRGB and Linear. In that case, it's easy to store 8-bit sRGB data in the buffers, but convert them to a high-precision linear format internally when performing arithmetic on the pixel values. We rely on hardware to speed up the slow stuff. Sadly, the "good enough" pipeline is also faster in hardware, and people still prefer it to get high fps rather than more accurate color, depending on the application.

Your second question was, "Do you expect Photoshop results in Three.js ???".

Not exactly. But sort of. I expect that I should get a reasonable high-level and convenient abstraction of the WebGL API. WebGL accommodates sRGB and I expect that I should be able to make use of it somehow. Presumably that's the direction that things are going when we allow specifying that a texture is sRGB encoded, and that we want gamma correction enabled.

All I was mentioning with Photoshop is that both pipelines are available, but you have to choose between them by selecting your buffer format. 1) In 8-bits-per-channel formats, you get the high-speed, "good enough" operations that sacrifice correct treatment of color to gain speed. 2) In 32-bits-per-channel format you get the low-speed but high precision pipeline that performs linear operations. In Photoshop, you simply convert once to an 8-bits-per-channel format when saving your final results.

This would be the same approach as using a floating-point render target in GL. You can't rasterize a floating point format to the video generator of your display, you have to convert to 8-bits-per-channel to display the information in your floating-point format.

@wyckster

I expect that I should get a reasonable high-level and convenient abstraction of the WebGL API.

I'm not sure if it fits with the purpose of the Three.js library:
lightweight, easy to use, and running as fast as possible

Should we ditch SRGBEncoding then? And simply say, "not supported - use raw WebGL or some other library instead?"

/ping @bhouston

I would be very interested to hear how sRGB is currently being handled and how THREE wants to handle it. Getting renders to look higher quality requires correct sRGB handling. I've been trying to figure out how THREE handles the framebuffer specifically: what the correct way to enable the EXT_sRGB WebGL extension is, whether the standard shader supports it, etc. The documentation is completely silent (as far as I can tell) on the subject.

@wyckster

http://jsfiddle.net/uz0sdzLn/32/
is this one correct?

Please correct me if what's below isn't right:

for speed/storage reasons, images are usually saved in an 8-bit format, but in this format the color is not linear; that's where gamma correction comes in.
an engine must process color in linear space (for mixing, lighting, etc.) in order to keep color values accurate, so it usually needs to decode the encoded space into linear first,
and for the final output framebuffer, we have to correct the color with a gamma curve for the variety of different displays, as they have different contrast/brightness settings that make the grey look different to the eye (although the default factor is normally 2.2).

in three.js
the gammaInput / gammaOutput flags are only for gamma-premultiplied textures, which means you cannot specify an encoding for your texture, otherwise the engine won't do gamma correction with the gammaFactor value.
when you turn on renderer.gammaInput = true (and texture.encoding is not set, or is set to LinearEncoding), the engine will consider your textures gamma-premultiplied; otherwise the engine will use whatever encoding you set for your texture.
so, in these two situations:

1, gammaInput = true
in PS create a pic as you described, and finally Ctrl+L to open the gamma correction panel, enter 0.735358 into the middle input box, then save your pic; then in three.js set gammaInput/gammaOutput = true with gammaFactor = 2.2 (don't set an encoding for the texture);
actually, this way the shader performs these steps:
a> gammatolinear -> pow(pixel.rgb, gammaFactor)
b> lineartogamma -> pow(pixel.rgb, 1 / gammaFactor)

2, texture.encoding = sRGBEncoding
create the pic in PS as you described, without gamma correction, and save it;
then in three.js set gammaInput = false (default) and gammaOutput = false (default)
this way the shader does:
a> srgbtolinear ->
mix(
pow(pixel.rgb * 0.9478672986 + 0.0521327014, 2.4),
pixel.rgb * 0.0773993808,
pixel.rgb <= 0.04045 ? 1 : 0
);
b> lineartosrgb ->
mix(
pow(pixel.rgb, 0.41666) * 1.055 - 0.055,
pixel.rgb * 12.92,
pixel.rgb <= 0.0031308 ? 1 : 0
);
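For what it's worth, the constants in that decode snippet check out numerically: 0.9478672986 is 1/1.055 and 0.0521327014 is 0.055/1.055, so the fused form matches the reference sRGB formula. A quick check (mine, in plain JS rather than GLSL):

```javascript
// The fused constants from the GLSL above are just 1/1.055 and 0.055/1.055:
// pow(c * (1/1.055) + (0.055/1.055), 2.4) == pow((c + 0.055) / 1.055, 2.4).
function decodeGlslStyle(c) {
  return Math.pow(c * 0.9478672986 + 0.0521327014, 2.4);
}
function decodeReference(c) {
  return Math.pow((c + 0.055) / 1.055, 2.4);
}

console.log(1 / 1.055, 0.055 / 1.055); // ~0.9478672986, ~0.0521327014
```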

so, conclusions:
renderer.gammaInput / gammaOutput / texture.encoding are only for decoding an encoded space into linear space, doing lighting, mixing, and whatever else, and then finally encoding back to the encoding you specified on output.
if you want to do myMap.texture.encoding = sRGBEncoding and output things with gamma correction, I think there is no way in three.js for now (as of r91), other than writing your own shader (a post effect, maybe) to add gamma correction with a factor parameter.
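To quantify the difference between the two decode paths described above, here's a small comparison (mine, not three.js code): the pure power curve with gammaFactor = 2.2 is only an approximation of the piecewise sRGB curve.

```javascript
// Gamma-premultiplied decode: a pure power function with gammaFactor = 2.2.
const gammaFactor = 2.2;
function gammaToLinear(c) {
  return Math.pow(c, gammaFactor);
}

// Piecewise sRGB decode, per the standard formula.
function srgbToLinear(c) {
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

const c = 128 / 255;
const viaGamma = gammaToLinear(c); // ~0.2195
const viaSrgb = srgbToLinear(c);   // ~0.2159 -- close, but not equal
```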

just found an issue, it seems that @WestLangley wants to add these new features. see #11337.
