Hi.
My web app loads WebGL textures. I want to pass these textures into the C++ code and use them normally.
Is there any workaround for this? A WebGLTexture doesn't seem to have an ID.
A possible solution would be to load the image's binary data and pass it through the virtual file system, but I'm interested in keeping the current app workflow and just using its textures in C++.
You may be able to add an existing texture to the ones managed by IDs. Look at GL.textures in the generated code, and at how glGenTextures adds textures there.
Great, thank you!
Adding to the GL.textures array should work, although it is somewhat of an internal mechanism, as these APIs are not exposed as any kind of public interface. Be sure to assign an ID number to the WebGLTexture when you put it in the texture array.
I'm doing the following, taken from the code generated by emscripten
function RegisterNativeTextureId(textureResource)
{
    var id = GL.getNewId(GL.textures); // getNewId is already included in the code generated by Emscripten
    textureResource.name = id;         // Emscripten stores the integer name on the texture object
    GL.textures[id] = textureResource;
    return id;
}
function UnregisterNativeTextureId(nativeTextureId)
{
    var tex = GL.textures[nativeTextureId];
    tex.name = 0;                      // clear the name so the object no longer claims an ID
    GL.textures[nativeTextureId] = null;
    return tex;
}
var GLTexture = gl.createTexture();
var GLTextureNativeId = RegisterNativeTextureId(GLTexture);
// You can pass GLTextureNativeId to the native side for usage with gl* functions now
// Once you are done with it, you can do the following
var tex = UnregisterNativeTextureId(GLTextureNativeId);
gl.deleteTexture(tex);
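For reference, the bookkeeping above can be exercised outside the browser with a stub GL object standing in for Emscripten's. The stub's getNewId and the plain object standing in for a WebGLTexture are simplified assumptions for illustration, not the generated implementation:

```javascript
// Minimal stand-in for Emscripten's GL object: an ID-indexed texture table
// plus a simplified getNewId, mirroring (but not copying) library_webgl.js.
var GL = {
  textures: [null], // slot 0 is reserved; a name of 0 means "no texture"
  counter: 1,
  getNewId: function (table) {
    var id = this.counter++;
    // Pad the table so the new slot exists before it is assigned.
    for (var i = table.length; i < id; i++) table[i] = null;
    return id;
  }
};

function RegisterNativeTextureId(textureResource) {
  var id = GL.getNewId(GL.textures);
  textureResource.name = id; // the integer name the native side will use
  GL.textures[id] = textureResource;
  return id;
}

function UnregisterNativeTextureId(nativeTextureId) {
  var tex = GL.textures[nativeTextureId];
  tex.name = 0;
  GL.textures[nativeTextureId] = null;
  return tex;
}

// A plain object stands in for a real WebGLTexture here.
var fakeTexture = {};
var id = RegisterNativeTextureId(fakeTexture);
console.log(id);                              // first allocated name: 1
console.log(GL.textures[id] === fakeTexture); // true

var released = UnregisterNativeTextureId(id);
console.log(released === fakeTexture);        // true
console.log(GL.textures[id]);                 // null: the slot is free again
```

In the real app you would pass a WebGLTexture from gl.createTexture() (or one your framework created) instead of the plain object, and the returned integer is what the C++ side hands to gl* functions such as glBindTexture.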