My question came up while experimenting with a bunch of different techniques, none of which I have much experience with. Unfortunately, I don't even know whether I'm making a silly logic error, whether I'm misusing the glium crate, whether I got something wrong in GLSL, etc. Despite that, I managed to whittle a fresh Rust project down to a minimal example that shows my problem, and the problem at least reproduces on my machine.
The minimal example ends up being hard to explain on its own, so I will first give an even more minimal example that does what I want, albeit in a clunkier way and restricted to 128 elements (four times 32 bits, a GLSL uvec4). From there, the step to the version in which my problem shows up is quite small.
Working version with simple uniform and bit offset
The program draws a single rectangle on the screen, with texture coordinates running from 0.0 to 128.0 horizontally. It contains one vertex shader for the rectangle and a fragment shader that uses the texture coordinates to draw vertical stripes on it: if the texture coordinate (converted to a uint) is odd, it draws one color; if it is even, it draws a different color.
// GLIUM, the crate I'll use to do "everything OpenGL"
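The full listing is not reproduced here, but the interesting part is roughly the following. This is only a sketch, not my verbatim program: the glium window and draw-call boilerplate is left out, and names such as `stripes` and `v_tex_coords` are placeholders.

```rust
// Fragment shader: 128 stripe flags packed into a single uvec4 uniform.
// The vertex shader only passes texture coordinates (0.0 .. 128.0) through.
const FRAGMENT_SHADER_SRC: &str = r#"
    #version 140

    in vec2 v_tex_coords;
    out vec4 color;

    uniform uvec4 stripes;   // four 32-bit words = 128 one-bit flags

    void main() {
        uint x    = uint(v_tex_coords.x);  // stripe index, 0..127
        uint word = x / 32u;               // which of the four words
        uint bit  = x % 32u;               // which bit within that word
        if (((stripes[word] >> bit) & 1u) == 1u) {
            color = vec4(1.0, 0.0, 0.0, 1.0);   // bit set: bright red
        } else {
            color = vec4(0.3, 0.0, 0.0, 1.0);   // bit unset: darker red
        }
    }
"#;

// On the Rust side the uniform is filled so that every other bit is set,
// which is what produces the odd/even striping, along these lines:
//
//     let uniforms = uniform! { stripes: [0xAAAA_AAAAu32; 4] };
//     target.draw(&vertex_buffer, &indices, &program, &uniforms,
//                 &Default::default()).unwrap();
```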
But this is not good enough ...
This program works and shows a rectangle with alternating stripes, but it has an obvious limitation: it is restricted to 128 stripes (or 64 stripes, I suppose, since the other 64 form the "background" of the rectangle). To allow arbitrarily many stripes (or, more generally, to pass arbitrarily much data to a fragment shader), uniform buffer objects can be used, which glium provides. Unfortunately, the most relevant example in the glium repo does not compile on my machine: the GLSL version is not supported, the buffer keyword is a syntax error in the versions that are supported, compute shaders are not supported at all (with glium on my machine), and neither are headless rendering contexts.
Non-working version, with a uniform buffer
Since I could not get started from that example, I had to start from the documentation instead. Modifying the example above, I came up with the following:
// Nothing changed here...
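In outline, the new part looks something like the following. Again this is only a sketch: my real struct uses a dynamically sized `values: [u32]`, and whether `implement_uniform_block!` accepts a fixed `[u32; 128]` field depends on the glium version; the GLSL and the buffer-binding calls are shown as comments.

```rust
#[macro_use]
extern crate glium;

// The Data struct that goes into the uniform buffer (fixed-size variant
// shown here; the real program uses a dynamically sized `values: [u32]`).
#[derive(Copy, Clone)]
struct Data {
    values: [u32; 128],
}
implement_uniform_block!(Data, values);

// GLSL side: the plain `uniform uvec4` is replaced by a uniform block,
// roughly:
//
//     uniform Data {
//         uint values[128];
//     };
//
//     ...
//     uint x    = uint(v_tex_coords.x);
//     uint word = x / 32u;
//     uint bit  = x % 32u;
//     if (((values[word] >> bit) & 1u) == 1u) { ... }
//
// Creating and binding the buffer (sketch):
//
//     let buffer = glium::uniforms::UniformBuffer::new(&display, data).unwrap();
//     let uniforms = uniform! { Data: &buffer };
```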
I would expect this to produce the same striped rectangle (or to give some error, or crash, if I did something wrong). Instead, it displays a rectangle whose rightmost quarter is solid bright red (i.e., "the bit appeared to be set when the fragment shader read it"), while the remaining three quarters are the darker red (i.e., "the bit appeared to be unset when the fragment shader read it").
Update since publication
I am really poking in the dark here, so I thought it could be some low-level mistake with memory ordering, layout, buffer over- or underruns, etc. I tried various ways of filling the "neighbouring" memory with easily distinguishable bit patterns (for example, one bit set in every three, every fourth, two set followed by two unset, etc.). This did not change the output.
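For concreteness, the kind of "easily distinguishable bit pattern" filling I mean is along these lines. This is a throwaway sketch: `bit_should_be_set_at` is the predicate from my program, while the `pack_bits` helper is made up for illustration.

```rust
// Pack one flag per stripe into 32-bit words, least significant bit first.
fn pack_bits(len: usize, bit_should_be_set_at: impl Fn(usize) -> bool) -> Vec<u32> {
    let mut words = vec![0u32; (len + 31) / 32];
    for i in 0..len {
        if bit_should_be_set_at(i) {
            words[i / 32] |= 1u32 << (i % 32);
        }
    }
    words
}

// e.g. "one bit set in every three":
// let values = pack_bits(128, |i| i % 3 == 0);
```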
One of the obvious ways to get memory "next to" uint values[128] is to put it in the Data struct, just before values (after values is not allowed, since values: [u32] in Data is dynamically sized and therefore has to be the last field). As stated above, this did not change the output either. However, placing a correctly filled uvec4 inside the uniform_data buffer, and using a main function similar to the first example, does give the original result. This shows that glium::uniforms::UniformBuffer<Data> as such does work.
I have therefore updated the title to reflect that the problem seems to lie somewhere else.
After Eli's answer
@Eli Friedman's answer helped me move towards a solution, but I'm not quite there yet.
Allocating and filling a buffer four times as large changed the output: instead of a quarter of the rectangle being filled, the whole rectangle is now filled. Alas, that is not what I wanted. My shader is now reading from the correct memory words, and all of those words should have been filled with the correct bit pattern, yet no part of the rectangle has become striped. Since bit_should_be_set_at sets every other bit, I came up with the hypothesis that the following is happening:
    Bits:  1010101010101010101010101010101010101
    Seen:  ^   ^   ^   ^   ^   ^   ^   ^   ^   ^
    What it looks like: all bits set
To test this hypothesis, I changed bit_should_be_set_at to return true for multiples of 3, 4, 5, 6, 7 and 8. The results are consistent with my hypothesis:
    Bits:  1001001001001001001001001001001001001
    Seen:  ^   ^   ^   ^   ^   ^   ^   ^   ^   ^
    What it looks like: first bit set, then repeating two unset, one set.

    Bits:  1000100010001000100010001000100010001
    Seen:  ^   ^   ^   ^   ^   ^   ^   ^   ^   ^
    What it looks like: all bits set

    Bits:  1000010000100001000010000100001000010
    Seen:  ^   ^   ^   ^   ^   ^   ^   ^   ^   ^
    What it looks like: first bit set, then repeating four unset, one set.

    Bits:  1000001000001000001000001000001000001
    Seen:  ^   ^   ^   ^   ^   ^   ^   ^   ^   ^
    What it looks like: first bit set, then repeating two unset, one set.

    Bits:  1000000100000010000001000000100000010
    Seen:  ^   ^   ^   ^   ^   ^   ^   ^   ^   ^
    What it looks like: first bit set, then repeating six unset, one set.

    Bits:  1000000010000000100000001000000010000
    Seen:  ^   ^   ^   ^   ^   ^   ^   ^   ^   ^
    What it looks like: first bit set, then every other bit set.
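The "Seen" rows above are easy to recompute with a quick simulation of the hypothesis (a throwaway sketch, not part of the real program): if the shader only ever samples bits 0, 4, 8, ..., the apparent patterns follow directly.

```rust
// Simulate the hypothesis: only every fourth bit is ever looked at.
fn main() {
    for period in [2usize, 3, 4, 5, 6, 7, 8] {
        let seen: Vec<u8> = (0..40)
            .step_by(4)                        // bits the shader "sees"
            .map(|i| (i % period == 0) as u8)  // is that bit set?
            .collect();
        println!("period {}: {:?}", period, seen);
    }
}
```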
Does this hypothesis make sense? And either way: does the problem appear to be in how the data is set up (on the Rust side) or in how it is read back (on the GLSL side)?