Hi there,
I'm working on an application that renders the same scene twice in a single frame, once without post-processing and once using EffectComposer. The use case is basically an editor with a thumbnail preview of the scene with postfx.
This seems to trigger a shader recompilation, as has been discovered in a few other issues*. The advice so far has been to only use EffectComposer and apply gamma correction as a pass in the effects chain, but this is pretty problematic for our use case: we use line rendering quite a bit (e.g. for reference grids), so losing hardware AA when rendering to a texture results in unacceptable image quality.
A potential workaround is setting the encoding of all render targets in the EffectComposer chain to sRGB, but I'm not certain what problems this might produce, and it feels brittle since there is no way to _force_ every pass to use the correct encoding as far as I know. But results should still be correct, since my understanding of using sRGB encoding for a texture is that hardware then performs the appropriate conversions on reads/writes, so shaders can still run code as if the data was linear.
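For concreteness, the workaround I have in mind looks roughly like this (just a sketch; it assumes an existing `renderer` with `outputEncoding = sRGBEncoding`, and the 256×256 thumbnail size is only illustrative):

```js
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';

// Hand the composer a render target whose texture is flagged as sRGB so its
// encoding matches renderer.outputEncoding. Passes that allocate their own
// internal render targets are NOT covered by this, which is why it feels brittle.
const target = new THREE.WebGLRenderTarget( 256, 256 );
target.texture.encoding = THREE.sRGBEncoding;

const composer = new EffectComposer( renderer, target );
```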
It seems that the necessary conditions to reproduce are:

- renderer.outputEncoding set to sRGBEncoding
- the same scene rendered both directly to the canvas and through the EffectComposer (whose render targets default to LinearEncoding)

Stepping through the code, it seems to come down to the renderContext.state.lights version being different for each camera, which then triggers a check of the various program properties against those recorded when the program was created; that check fails due to the different renderTargets and their encodings.
https://jsfiddle.net/BruOps/qy6o74pn/9/
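The sketch below is roughly what the fiddle does (illustrative only; it assumes an existing `renderer`, a shared `scene`, a `mainCamera` and a `thumbnailCamera`):

```js
import * as THREE from 'three';
import { EffectComposer } from 'three/examples/jsm/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/examples/jsm/postprocessing/RenderPass.js';

renderer.outputEncoding = THREE.sRGBEncoding;

const composer = new EffectComposer( renderer ); // internal targets use LinearEncoding
composer.addPass( new RenderPass( scene, thumbnailCamera ) );
// ...the postfx passes for the thumbnail would follow here...

function animate() {

	requestAnimationFrame( animate );

	renderer.render( scene, mainCamera ); // beauty pass straight to the sRGB-encoded canvas
	composer.render(); // same scene into linear targets -> the programs get rebuilt

}

animate();
```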
Introduced in r112
> But results should still be correct, since my understanding of using sRGB encoding for a texture is that hardware then performs the appropriate conversions on reads/writes, so shaders can still run code as if the data was linear.
I'm afraid that's not correct, since the decoding does not happen automatically (it only happens automatically when the WebGL extension EXT_sRGB is used, but three.js does not support that so far).
Notice that post-processing in sRGB is unusual but not necessarily wrong. However, _many_ passes assume linear color space. So when you encode the beauty pass in sRGB right from the beginning, the post-processing might work in the wrong color space and thus produce incorrect results. However, there is no general rule since it really depends on the specific use case.
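To make the "wrong color space" point a bit more concrete, here is a small standalone sketch (plain JS with the standard sRGB transfer functions, not three.js code) showing how even a simple average, which is the core of any blur pass, differs between the two spaces:

```js
// Standard piecewise sRGB <-> linear transfer functions.
const sRGBToLinear = ( c ) => c <= 0.04045 ? c / 12.92 : Math.pow( ( c + 0.055 ) / 1.055, 2.4 );
const linearTosRGB = ( c ) => c <= 0.0031308 ? c * 12.92 : 1.055 * Math.pow( c, 1 / 2.4 ) - 0.055;

const a = 0.0, b = 1.0; // black and white, identical in both encodings

const averagedInSRGB = ( a + b ) / 2; // 0.5
const averagedInLinear = linearTosRGB( ( sRGBToLinear( a ) + sRGBToLinear( b ) ) / 2 ); // ~0.735

console.log( averagedInSRGB, averagedInLinear ); // a blur run in sRGB space comes out too dark
```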
That makes it hard for the engine to automate the encoding process so it fits for all possible use cases. Notice that before WebGLRenderer.outputEncoding was introduced, wrong FX setups were silently ignored. Inconsistent setups are now more visible because of the performance issue that pops up when the encodings are different.
BTW: I don't think #14121 is related to this issue.
I guess we need to find a way to pass the output encoding more flexibly into the shaders. Probably as a uniform...
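Purely as an illustration of the idea (nothing like this exists in the current shaders, the names are made up), it could look something like:

```js
// Hypothetical only: branch on a uniform instead of baking the conversion into
// the program, so one compiled program could serve both output encodings.
const hypotheticalOutputChunk = /* glsl */ `
	uniform int outputEncoding; // hypothetical: 0 = LinearEncoding, 1 = sRGBEncoding

	vec4 linearToOutput( vec4 value ) {

		// LinearTosRGB() is the conversion the existing shader chunks already define
		return ( outputEncoding == 1 ) ? LinearTosRGB( value ) : value;

	}
`;
```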
Ahhh I thought it _was_ using EXT_sRGB! :confounded: So just to check my understanding, texture.encoding and renderer.outputEncoding are actually used during shader compilation to modify how textures are read from and how the fragment color is saved? So we never actually properly interpolate sRGB textures on read?
Could a ShaderPass use the input texture encoding to also perform the conversion on read?
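Something along these lines is what I mean (rough sketch, names are illustrative; the decode is inlined because the texture read happens in user shader code, where three.js can't apply texture.encoding automatically):

```js
import * as THREE from 'three';
import { ShaderPass } from 'three/examples/jsm/postprocessing/ShaderPass.js';

// A pass that manually decodes its (assumed sRGB-encoded) input on read, so the
// rest of the effect can work in linear space.
const sRGBInputPass = new ShaderPass( new THREE.ShaderMaterial( {

	uniforms: {
		tDiffuse: { value: null } // filled in by ShaderPass with the composer's read buffer
	},

	vertexShader: /* glsl */ `
		varying vec2 vUv;
		void main() {
			vUv = uv;
			gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
		}
	`,

	fragmentShader: /* glsl */ `
		uniform sampler2D tDiffuse;
		varying vec2 vUv;

		// standard piecewise sRGB -> linear conversion
		vec3 sRGBToLinear3( vec3 c ) {
			return mix( c / 12.92, pow( ( c + 0.055 ) / 1.055, vec3( 2.4 ) ), step( 0.04045, c ) );
		}

		void main() {
			vec4 texel = texture2D( tDiffuse, vUv );
			vec3 linearColor = sRGBToLinear3( texel.rgb );
			// ...do the actual effect in linear space here...
			gl_FragColor = vec4( linearColor, texel.a );
		}
	`

} ) );
```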
I'm a little confused by the remark that "inconsistent setups are now more visible", since I don't think it's actually wrong to have a back buffer that's sRGB encoded and then to have an internal render chain that isn't, as long as the final pass renders to the back buffer in the proper fashion. Right now, the way shader programs are identified for recompilation means this setup is effectively impossible in certain conditions without recompiling every frame... But I guess that's the cost of doing texture reads "automagically" without supporting EXT_sRGB?
And yeah regarding #14121 you're right, I've edited the OP.
Edit: For now I may just modify the shaders in the relevant beauty passes we use to ignore renderTarget "encoding" and treat everything as linear, while having .encoding = sRGBEncoding in the application layer to prevent recompilation... it's kind of a shitty hack but at least it's correct?
> So just to check my understanding, texture.encoding and renderer.outputEncoding are actually used during shader compilation to modify how textures are read from and how the fragment color is saved?
Correct.
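Roughly speaking, the generated fragment shader ends up with something like the following baked in (paraphrased for a material with an sRGB map and an sRGB output; not the exact generated code):

```js
const generatedFragmentPrefix = /* glsl */ `
	// chosen from map.encoding when the program is built
	vec4 mapTexelToLinear( vec4 value ) { return sRGBToLinear( value ); }

	// chosen from the current render target's encoding when the program is built
	vec4 linearToOutputTexel( vec4 value ) { return LinearTosRGB( value ); }
`;

// ...and at the very end of main() the encodings_fragment chunk does:
// gl_FragColor = linearToOutputTexel( gl_FragColor );
```

Since these conversions are emitted as fixed functions at compile time, a change in either encoding requires a different program.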
> I don't think it's actually wrong to have a back buffer that's sRGB encoded and then to have an internal render chain that isn't
You are of course right, I was not 100% clear. I was referring to use cases we had earlier where performing the beauty passes in a consistent color space actually makes more sense (e.g. a scene with a mirror).
Yes, if you want to perform two beauty passes in different encodings, you currently hit the mentioned issue.