Pixi.js: Uint8Array to Texture conversion seems to be off by one.

Created on 29 Mar 2020 · 7 Comments · Source: pixijs/pixi.js

I'm working on a custom shader that takes some state as a texture, with one item per pixel. The state is stored in a Uint8Array which is converted to a one dimensional texture like this:

const state = new Uint8Array(items * 4);
const texture = PIXI.Texture.fromBuffer(state, items, 1); // static factory method, no `new`

When the texture was passed to the shader (as a uniform), the values all seemed slightly off. I tried multiplying each value by 255.0 and then dividing the result by 256.0, and everything worked correctly.

It appears that PIXI is dividing the unsigned 8-bit integers in the Uint8Array by 255 to convert them to floating point fractions of one, when it should be dividing by 256.

All 7 comments

https://github.com/pixijs/pixi.js/blob/dev/packages/core/src/textures/resources/BufferResource.ts

Nothing special, no extra divisions.

It's not pixi; it's the general rule for converting normalized values in WebGL. Usually this rule is applied to colors: 0 means 0.0, 255 means 1.0.

Uint8 -> float: x / 255.0. Back: round(x * 255.0)
Uint16 -> float: x / 65535.0
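The rules above can be sketched as plain JavaScript helpers (my own illustration, not pixi.js code): the full unsigned range is mapped onto [0, 1] so that the maximum value lands exactly on 1.0, and rounding makes the round trip exact.

```javascript
// Normalization as WebGL does it: divide by 2^N - 1, not 2^N.
function uint8ToFloat(x) {
  return x / 255.0;
}

// Inverse conversion; rounding absorbs floating-point error,
// so uint8ToFloat/floatToUint8 round-trip every value 0..255.
function floatToUint8(f) {
  return Math.round(f * 255.0);
}

function uint16ToFloat(x) {
  return x / 65535.0;
}

console.log(uint8ToFloat(255)); // 1
console.log(floatToUint8(uint8ToFloat(200))); // 200
```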

You can find some of the conversion functions here: https://chromium.googlesource.com/angle/angle/+/chromium/3611/src/compiler/translator/BuiltInFunctionEmulatorGLSL.cpp

I know that people sometimes make an off-by-one mistake, using 256 instead of 255 and vice versa, when packing two or more uints into a float and back. @eXponenta can confirm that (wink wink)

As I understand it, if we begin with an array of ten items and normalize the offsets, the first item will be at 0.0, the second at 0.1 and so on, with the last item at 0.9. In general:

normalizedIndex = integerIndex * (1.0 / array.length)
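A quick numeric check of that formula (my own illustration, using the ten-item array mentioned above):

```javascript
// Normalized offsets for a 10-item array: each integer index maps
// to index * (1 / length), so the first item sits at 0.0 and the
// last at 0.9 (up to floating-point noise).
const length = 10;
const offsets = [];
for (let i = 0; i < length; i++) {
  offsets.push(i * (1.0 / length));
}
console.log(offsets[0], offsets[9]); // 0 and approximately 0.9
```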

I'm implementing a bitmapped text mode (like the Linux virtual terminal), and passing character ordinals (as indexes between 0 and 255 inclusive) inside the RGBA channels of the texture that stores the terminal state.

A second texture stores the bitmap data for 256 glyphs.

PIXI converts each ordinal to a normalized float, which I should be able to use in the shader with texture2D(glyphs, vec2(normalizedIndex, 0.0)) to access the corresponding glyph data.

Currently, I have to multiply the normalized ordinal by 255 to get my original integer value back (though represented as a float), then divide that by 256 to get the normalized offset into the glyphs texture.
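That correction can be written out in plain JavaScript for illustration (in the actual shader this arithmetic would be GLSL; the `ordinal` value here is a hypothetical character code):

```javascript
// Recovering the glyph lookup offset from a normalized channel value.
const ordinal = 65;                             // hypothetical character code
const normalized = ordinal / 255.0;             // what the sampler hands the shader
const integer = Math.round(normalized * 255.0); // back to the original 0..255 value
const glyphOffset = integer / 256.0;            // offset into the 256-glyph texture
console.log(integer === ordinal); // true
```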

Are you saying that's how it should work? That seems wrong.

Yes, as a mathematician I also felt dirty the first time I used that. After several experiments and after writing down some formulas, I saw that it works fine. Dividing by 2^N - 1 is stable enough in binary floating point.

Yes, 0.5 in that case falls between two integer values. Now that I think about it, displacementFilter isn't stable because we subtract 0.5 there; it needs 128/255 instead.
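A quick check of why 0.5 is not exactly representable under 255-normalization (my own illustration, not pixi code):

```javascript
// 127/255 and 128/255 straddle 0.5, so no byte value normalizes
// to exactly 0.5; 128/255 is the nearest value above it, which is
// why 128/255 is the stable choice for a "center" constant.
console.log(127 / 255); // slightly below 0.5
console.log(128 / 255); // slightly above 0.5
console.log(Math.round(0.5 * 255)); // 128 (JS rounds halves up)
```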

When an 8-bit unsigned int with the value 255 is normalized, does it become 1 or 0.99609375? Sorry, I'm new to writing shaders.

255 is 1.0

I'm happy to share that experience with you. It's not often people stumble across it, and yes, it can be confusing.

OK, that makes "sense". Sorry. My misunderstanding. Thank you for your patience.

