I need some quick advice.
I would like to simulate a cellular automaton (from the paper "A Simple, Efficient Method for Realistic Cloud Animations") on the GPU. However, I am limited to OpenGL ES 2.0 shaders (in WebGL), which do not support bitwise operations.
Since each cell in this cellular automaton is a logical value, storing 1 bit per cell would be ideal. So what is the most efficient way to represent this data in OpenGL texture formats? Are there any tricks, or should I just stick with a plain RGBA texture?
EDIT: Here are my thoughts so far ...
At the moment, I am planning to go with one of the usual formats: GL_RGBA8, GL_RGBA4 or GL_RGB5_A1.
Perhaps I could select GL_RGBA8 and try to extract the original bits using floating-point operations. For instance, x*255.0 gives an approximate integer value. However, extracting individual bits is a bit of a pain (i.e. repeatedly dividing by 2 and rounding). I also fear precision problems.
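For reference, here is a minimal fragment-shader sketch of that extraction, assuming hypothetical u_cells and u_bitIndex uniforms (the names are mine, not from any existing code). It relies only on floor, mod and exp2, all of which GLSL ES 1.00 provides:

```glsl
#ifdef GL_FRAGMENT_PRECISION_HIGH
precision highp float;   // prefer full precision for the integer arithmetic below
#else
precision mediump float;
#endif

uniform sampler2D u_cells;   // hypothetical RGBA8 texture, 8 cells per channel
uniform float u_bitIndex;    // which bit of the red channel to read (0.0 .. 7.0)
varying vec2 v_texCoord;

// Recover bit `bitIndex` (0 = least significant) from a normalized 8-bit
// channel value in [0, 1], using only floating-point operations.
float extractBit(float channel, float bitIndex) {
    float byteValue = floor(channel * 255.0 + 0.5);      // back to an integer 0..255
    float shifted   = floor(byteValue / exp2(bitIndex)); // "shift right" by bitIndex
    return mod(shifted, 2.0);                            // parity of the shifted value
}

void main() {
    float channel = texture2D(u_cells, v_texCoord).r;
    float cell    = extractBit(channel, u_bitIndex);
    gl_FragColor  = vec4(vec3(cell), 1.0);
}
```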
If I select GL_RGBA4, I could store just 1.0 or 0.0 per component, or try the same trick as with GL_RGBA8, only with x*15.0 this time. I am not sure whether this would be faster or not: there should be fewer operations to extract the bits, but also less information per texture read.
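A sketch of the simpler "one cell per component" reading side, again with illustrative names (the x*15.0 variant would reuse the extractBit idea above with 15.0 in place of 255.0):

```glsl
precision mediump float;

uniform sampler2D u_cells;   // hypothetical RGBA4 texture, one cell per channel
varying vec2 v_texCoord;

void main() {
    vec4 texel = texture2D(u_cells, v_texCoord);
    // Each component was written as 0.0 or 1.0; step(0.5, x) guards against
    // the 4-bit quantization not returning those values exactly.
    vec4 cells = step(0.5, texel);
    // For illustration, output only the first of the four cells.
    gl_FragColor = vec4(vec3(cells.r), 1.0);
}
```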
Using GL_RGB5_A1, I could try to combine my cells with additional information, such as the color of the voxel, with the alpha channel keeping the 1-bit cell state.
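A rough sketch of the reading side of that layout, with the same illustrative names; a single fetch yields both the voxel color and the cell state:

```glsl
precision mediump float;

uniform sampler2D u_cells;   // hypothetical RGB5_A1 texture: color in RGB, state in A
varying vec2 v_texCoord;

void main() {
    vec4 texel = texture2D(u_cells, v_texCoord);
    vec3 voxelColor = texel.rgb;          // 5-bit-per-channel color
    float cellAlive = step(0.5, texel.a); // 1-bit alpha is either 0.0 or 1.0
    // For illustration, show the color only where the cell is alive.
    gl_FragColor = vec4(voxelColor * cellAlive, 1.0);
}
```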