I'm using WebGL to render a binary-encoded grid file. The binary file is written in big-endian format (I can verify this by opening the file in a hex editor or by viewing the network traffic in Fiddler). When I try to read the binary response using a Float32Array or Int32Array, the binary is interpreted as little-endian and my values are wrong:
// Interpret first 32 bits in buffer as an int
var wrongValue = new Int32Array(binaryArrayBuffer)[0];
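To show the effect concretely, here is a minimal, self-contained repro (the buffer here is a stand-in I constructed for illustration, not my actual grid file):

// Write the value 1 as a big-endian 32-bit int into a scratch buffer
var buffer = new ArrayBuffer(4);
new DataView(buffer).setInt32(0, 1, false); // false = big-endian write
// On a little-endian machine this logs 16777216 (0x01000000) instead of 1,
// because Int32Array reads using the platform's native byte order
console.log(new Int32Array(buffer)[0]);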
I can't find any reference to the default endianness of typed arrays in the spec at http://www.khronos.org/registry/typedarray/specs/latest/ , so I'm wondering what the deal is. Should I assume that all binary data is little-endian when reading it with typed arrays?
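For what it's worth, a small sketch like the following can check the platform's byte order at runtime (the helper name isLittleEndian is just mine):

// Probe the host byte order by viewing the same buffer as 32-bit and 8-bit
function isLittleEndian() {
  var probe = new Uint32Array([0x12345678]);
  var bytes = new Uint8Array(probe.buffer);
  return bytes[0] === 0x78; // least-significant byte first on little-endian hosts
}
console.log(isLittleEndian()); // true on most current desktop and mobile hardware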
To work around the problem, I can use the DataView object (discussed in the previous link) and call:
// Interpret first 32 bits in buffer as an int
var correctValue = new DataView(binaryArrayBuffer).getInt32(0);
The DataView functions, such as "getInt32", read big-endian values by default (the littleEndian argument defaults to false).
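So for now I read the whole file through DataView in a loop, roughly like this sketch (the helper name bigEndianFloat32Array is mine, and it assumes the buffer length is a multiple of 4):

// Copy a big-endian buffer into a native-order Float32Array via DataView
function bigEndianFloat32Array(buffer) {
  var view = new DataView(buffer);
  var result = new Float32Array(buffer.byteLength / 4);
  for (var i = 0; i < result.length; i++) {
    // littleEndian argument omitted => big-endian read
    result[i] = view.getFloat32(i * 4);
  }
  return result;
}
var values = bigEndianFloat32Array(binaryArrayBuffer);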
(Note: I tested in Google Chrome 15 and Firefox 8, and they both behave the same way.)
javascript endianness webgl arraybuffer typed-arrays
Bob