Three.js: Share WebGLRenderTarget across renderers

Created on 31 Mar 2019 · 5 Comments · Source: mrdoob/three.js

As documented in #13745, it is not currently possible to use a texture generated by rendering to a WebGLRenderTarget in another renderer. Rendering to the same canvas is fine for the use case mentioned in that issue, but I would like to generate some complex textures off-thread using workers and OffscreenCanvas, then use them as textures in the scene displayed to the user.
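A minimal sketch of that kind of off-thread setup (the element selector, worker filename, and scene contents are placeholders, not code from this issue):

// main.js: hand an OffscreenCanvas to a worker
const canvas = document.querySelector('#texture-canvas');
const offscreen = canvas.transferControlToOffscreen();
const worker = new Worker('texture-worker.js');
worker.postMessage({ canvas: offscreen }, [offscreen]);

// texture-worker.js: render the complex texture off the main thread
importScripts('three.min.js');
self.onmessage = (event) => {
  const renderer = new THREE.WebGLRenderer({ canvas: event.data.canvas });
  const scene = new THREE.Scene();
  const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
  // ...build the texture-generating scene here...
  renderer.render(scene, camera);
};

The open question is how to get that rendered result into the main thread's renderer, which is what this issue is about.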

Right now it appears that the only way to do this would be something like https://stackoverflow.com/questions/29325906/can-you-use-raw-webgl-textures-with-three-js, but it strikes me that this is a legitimate use case that might warrant a feature request.

All 5 comments

Are you saying it is possible to share textures between two contexts?

I tried doing it, shall we say, the naive way: creating a WebGLRenderTarget, rendering into it with renderer A, then attaching its texture to a mesh in a scene rendered by renderer B. This didn't appear to work. If it should work, then I will definitely go and build a minimal example to see what I'm doing wrong.
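A minimal sketch of that naive attempt (canvasA/canvasB, the scenes, and the 256×256 size are illustrative; the exact render-target API differs slightly between three.js releases):

// Renderer A draws a scene into a render target.
const rendererA = new THREE.WebGLRenderer({ canvas: canvasA });
const target = new THREE.WebGLRenderTarget(256, 256);
rendererA.setRenderTarget(target);
rendererA.render(sceneA, cameraA);
rendererA.setRenderTarget(null);

// Renderer B uses a different canvas, and therefore a different WebGL
// context. Mapping target.texture here appears to render black, because
// the underlying WebGLTexture only exists in renderer A's context.
const rendererB = new THREE.WebGLRenderer({ canvas: canvasB });
const material = new THREE.MeshBasicMaterial({ map: target.texture });
sceneB.add(new THREE.Mesh(new THREE.PlaneBufferGeometry(1, 1), material));
rendererB.render(sceneB, cameraB);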

However, I don't think it is supposed to work (probably because renderer B doesn't know about that render target's GL texture id?). I made this issue because, if the only way to do off-thread texture rendering is something like

// from https://stackoverflow.com/questions/29325906/can-you-use-raw-webgl-textures-with-three-js
// glTex is a raw WebGLTexture created elsewhere on the same WebGL context
const texture = new THREE.Texture();
renderer.setTexture2D(texture, 0);  // force three.js to initialize the texture
const texProps = renderer.properties.get(texture);
texProps.__webglTexture = glTex;    // swap in the raw WebGL texture handle

I think that this is something worth supporting in the future.

Though I now realize that even this example is using the same canvas context. Hmm.

If it should work, then I will definitely go and build a minimal example to see what I'm doing wrong.

I'm afraid this won't work since you can't share any WebGL resource (e.g. buffers, textures, shader programs) across different contexts. So sharing a WebGLRenderTarget across multiple renderers is just not doable.

You need to read the pixels back from the first context and upload them to the second context as a data texture (not a good idea if the texture changes often).
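A minimal sketch of that workaround, assuming a render target named target rendered by rendererA and a mesh in the second renderer's scene (the names are illustrative; readRenderTargetPixels and DataTexture are standard three.js APIs):

// Read the rendered pixels back to the CPU from renderer A's context.
const width = target.width;
const height = target.height;
const pixels = new Uint8Array(width * height * 4);
rendererA.readRenderTargetPixels(target, 0, 0, width, height, pixels);

// Upload the same pixel data as a DataTexture in renderer B's context.
const dataTexture = new THREE.DataTexture(pixels, width, height, THREE.RGBAFormat);
dataTexture.needsUpdate = true;
meshInSceneB.material.map = dataTexture;
meshInSceneB.material.needsUpdate = true;

// The readback and upload have to be repeated whenever the source changes,
// which is why this is a poor fit for textures that update every frame.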

Ok, thank you for explaining!
