Three.js: THREE.UnsignedShortType

Created on 25 Mar 2012 · 20 Comments · Source: mrdoob/three.js

Hi,
I'm making a terrain renderer, and I'm using 16bit grayscale PNGs as displacement maps. When I load them using THREE.ImageUtils.loadTexture, it works, but I get noticeable rounding errors. So I tried to modify it like this:

function loadDEM(path, mapping, callback) {
    // Wrap the image in a texture and request 16bit uploads.
    var image = new Image(), texture = new THREE.Texture(image, mapping);
    texture.type = THREE.UnsignedShortType;
    image.onload = function() {
        // Re-upload the texture data once the image has actually loaded.
        texture.needsUpdate = true;
        if (callback) {
            callback(this);
        }
    };
    image.crossOrigin = "anonymous";
    image.src = path;
    return texture;
}

I just added texture.type = THREE.UnsignedShortType, and now I get these errors in the console and the terrain is flat.

WebGL: INVALID_ENUM: texImage2D: invalid texture type index.html:1
WebGL: INVALID_OPERATION: generateMipmap: level 0 not power of 2 or not all the same size

So how can I use 16bit images?

This is the actual terrain tile:
[image: terrain tile]

Question

All 20 comments

Bump. I'd really like to use 16bit pngs.

I just managed to set the texture to UnsignedShortType, but now the displacement is not working (the terrain is flat). :(

Code: https://github.com/Pitel/DIP/blob/c4121c7bad5dea454e31c99b39d6e9d97ff7fa82/ChunkedLOD.js#L23

Was this issue resolved, or is it simply not supported?

Uhm, not sure why I closed it. I guess a jsfiddle would be helpful.

Any updates on this? None of the texture.type enum values work.

I've no idea if anyone got 16-bit grayscale PNGs to work. It looks like it would be possible to write a quick special-case parser as a subclass of BinaryTextureLoader; I might have to do that myself in the next day or two for a test, though it would not be general enough to merit a PR (imo, since I only care about one specific kind of PNG, while a proper PR-worthy change would probably need to handle more varied types of PNG).

But given the age of this thread, is it possible that someone has already made such a parser, or read-in 16-bit grayscales via some other means?

I finally decided to write a 16bit parser for PNG images and then use a DataTexture to load the array. Here's the 16bit image parser: https://www.npmjs.com/package/png-coder
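
In rough outline the DataTexture route looks like this. This is only a sketch: decodePNG16, its input and its return shape are hypothetical placeholders (not the actual png-coder API), the material settings are made up, and the samples are converted to normalized floats because, as discussed further down in the thread, uploading raw 16-bit integer data is not generally supported.

// Assumes THREE is in scope, e.g. import * as THREE from 'three'.
// decodePNG16() is a hypothetical stand-in for the 16bit PNG parser;
// it is assumed to return { width, height, data: Uint16Array }.
const { width, height, data } = decodePNG16( pngArrayBuffer ); // placeholder input

// Convert the 16-bit samples to normalized floats in [0, 1].
const floatData = new Float32Array( data.length );
for ( let i = 0; i < data.length; i ++ ) {
    floatData[ i ] = data[ i ] / 65535;
}

// One float per texel; RedFormat keeps the texture single-channel, and
// DataTexture defaults to NearestFilter, so no filtering extensions are needed.
const texture = new THREE.DataTexture( floatData, width, height, THREE.RedFormat, THREE.FloatType );
texture.needsUpdate = true;

// The texture can then drive displacement like any other map.
const material = new THREE.MeshStandardMaterial( { displacementMap: texture, displacementScale: 100 } );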

Nice! Though it require()'s a few extra bits; the THREE.BinaryTextureLoader class does much the same thing (loads some data into a DataTexture), but via a buffer from THREE.XHRLoader.

I am curious how you got 16 bit textures working. AFAIK, un/signed integers are not well (if at all) supported for textures in most browsers: http://jsfiddle.net/62nu1z6r/1/

Using three.js to take the texture array as input and then process it to a more usable case. We need to maintain the data stored in the 16bit pixels.

-Sisil

I'm going to hazard a guess that this issue can be closed.

@DefinitelyMaybe Why do you think so?

Well, if we're going by what OP said

So how can I use 16bit images?

@achillessin created a parser of sorts which could be used.

I'd have thought that if the OP was still worried about the issue they'd have posted. Let's check with @Pitel.

@Mugen87 why do you think not?

No, I'm not interested in this anymore.

I used this in my master's thesis almost 10 years ago (wow, time flies). Now I have a master's degree in CS and my career has moved from webdev to Android.

So yeah, if you ask me, feel free to close it. Or do whatever you want with this, I don't care anymore.

I quickly created a 16bit grayscale PNG, loaded it via TextureLoader, applied THREE.UnsignedShortType as the type and used it as a displacement map (which seems like a valid use case).

https://jsfiddle.net/6c0Lm3xh/1/
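
The relevant part of the setup boils down to something like this (a sketch, not the literal fiddle code; the file name, plane size and displacement scale are placeholders):

// Load a 16bit grayscale PNG with TextureLoader, force the type to
// UnsignedShortType and use it as a displacement map.
const texture = new THREE.TextureLoader().load( 'heightmap_16bit.png' ); // placeholder path
texture.type = THREE.UnsignedShortType;

const geometry = new THREE.PlaneGeometry( 10, 10, 128, 128 );
const material = new THREE.MeshStandardMaterial( { displacementMap: texture, displacementScale: 2 } );
const mesh = new THREE.Mesh( geometry, material );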

The fiddle produces the following WebGL warning:

three.module.js:21153 WebGL: INVALID_VALUE: texImage2D: packImage error

It was never clarified why this error happens. Also, sharing a workaround as a live example would be great so interested readers can quickly see a solution.

Well, it's fine if the OP does not care about this anymore. But there might be other users who try this and are looking for an answer.

To solve this conundrum: at least in WebGL 1, it's only possible to use 8 bits per color channel; 16 or even 32 bits are not supported. As outlined by this post, you have to convert your data to a different representation, e.g. (half) floating point, as a workaround.
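
To make the workaround concrete, here is a sketch of the half-float variant. It assumes a recent three.js build that ships THREE.DataUtils; the source Uint16Array and its dimensions are placeholders.

// Placeholder input: your decoded 16bit samples and tile dimensions.
const width = 256, height = 256;
const data = new Uint16Array( width * height );

// Pack normalized samples into half floats (stored in a Uint16Array).
const half = new Uint16Array( data.length );
for ( let i = 0; i < data.length; i ++ ) {
    half[ i ] = THREE.DataUtils.toHalfFloat( data[ i ] / 65535 );
}

const texture = new THREE.DataTexture( half, width, height, THREE.RedFormat, THREE.HalfFloatType );
texture.needsUpdate = true;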

BTW: THREE.UnsignedShortType can still be used in WebGL 1 in the context of THREE.DepthTexture.
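
For completeness, a minimal sketch of that depth texture use case (the render target size is a placeholder):

// Render depth into a render target using a 16bit depth texture
// (in WebGL 1 this relies on the WEBGL_depth_texture extension).
const target = new THREE.WebGLRenderTarget( 1024, 1024 );
target.depthTexture = new THREE.DepthTexture( 1024, 1024 );
target.depthTexture.format = THREE.DepthFormat;
target.depthTexture.type = THREE.UnsignedShortType;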

@greggman Unfortunately, this opens up a new question: is it possible to use 16 bit images with WebGL 2? I've prepared a fiddle, but it seems the setup is invalid.

https://jsfiddle.net/2ng16fo4/1/

The idea is to configure the texture like so:

texture.type = THREE.UnsignedShortType;
texture.format = THREE.RGBAIntegerFormat;
texture.internalFormat = 'RGBA16UI';
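
Even leaving the image-upload question aside, RGBA16UI is an unsigned integer format, so as far as I know the texture cannot be linearly filtered and has to be sampled with a usampler2D in a custom shader. A rough sketch of the extra settings that would be required:

texture.magFilter = THREE.NearestFilter; // integer textures are not filterable
texture.minFilter = THREE.NearestFilter;
texture.generateMipmaps = false;

// And in a custom ShaderMaterial (GLSL ES 3.00) the sampler would be declared as:
//   uniform usampler2D heightMap;
//   uint h = texture( heightMap, vUv ).r;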

Congrats on the master's!

@Mugen87 is there a doc page somewhere where this information might be better placed?

Like all image formats in browsers, it's up to the browser what image formats are supported and what the browser does with them. 😭 For example, Chrome and Firefox support WebP but Safari does not.

Also, Safari supports (or used to support) TIFF files as <img> tags and it would load and display floating point TIFFs, but I'm 99% sure it did not support uploading those floating point TIFFs as floating point textures, as there is no conformance test for it (not that Apple has ever cared about passing the conformance tests 😂)

Currently though, the WebGL2 spec explicitly disallows 16 int formats from images 🙄

When the data source is a DOM element (HTMLImageElement, HTMLCanvasElement, or HTMLVideoElement), or is an ImageBitmap, ImageData, or OffscreenCanvas object, commonly each channel's representation is an unsigned integer type of at least 8 bits. Converting such representation to signed integers or unsigned integers with more bits is not clearly defined. For example, when converting RGBA8 to RGBA16UI, it is unclear whether or not the intention is to scale up values to the full range of a 16-bit unsigned integer. Therefore, only converting to unsigned integer of at most 8 bits, half float, or float is allowed.

Currently though the WebGL2 spec explicitly disallows 16 int formats from images.

@greggman Thank you for clarifying! In this case, the issue can safely be closed.
