Using glMultiDrawElements in a 64-bit OS

I recently switched from a 32-bit environment to a 64-bit one, and the move went smoothly except for one problem: glMultiDrawElements takes some arrays that no longer work as-is under a 64-bit OS.

 glMultiDrawElements( GL_LINE_LOOP,
                      fCount_,
                      GL_UNSIGNED_INT,
                      reinterpret_cast< const GLvoid** >( iOffset_ ),
                      mesh().faces().size() );

I use VBOs for both vertices and vertex indices. fCount_ and iOffset_ are arrays of GLsizei. Since a buffer is bound to GL_ELEMENT_ARRAY_BUFFER, the elements of iOffset_ are used as byte offsets from the start of the VBO. This works fine under a 32-bit OS.
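For context, the two arrays are built along these lines (a sketch; the container choice and the fill loop are my assumptions, inferred from the glDrawElements loop further down):

 // Hypothetical setup code, inferred from the glDrawElements loop below.
 std::vector< GLsizei > fCount_;   // index count per face
 std::vector< GLsizei > iOffset_;  // byte offset of each face's first index
 GLsizei offset = 0;
 for ( Sy_meshData::Faces::ConstIterator i = mesh().faces().constBegin();
       i != mesh().faces().constEnd(); ++i )
 {
     fCount_.push_back( i->vertexIndices.size() );
     iOffset_.push_back( sizeof( GLsizei ) * offset );
     offset += i->vertexIndices.size();
 }
 // With vectors, the draw call above would receive &fCount_.front()
 // and &iOffset_.front() rather than the bare names.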

If I change glMultiDrawElements to glDrawElements and put it in a loop, it works fine on both platforms:

 int offset = 0;
 for ( Sy_meshData::Faces::ConstIterator i = mesh().faces().constBegin();
       i != mesh().faces().constEnd(); ++i )
 {
     glDrawElements( GL_LINE_LOOP,
                     i->vertexIndices.size(),
                     GL_UNSIGNED_INT,
                     reinterpret_cast< const GLvoid* >( sizeof( GLsizei ) * offset ) );
     offset += i->vertexIndices.size();
 }

I think I can see what is happening: OpenGL reads iOffset_ in 64-bit chunks, producing massive bogus offsets, but glMultiDrawElements does not support any index type wider than 32 bits (GL_UNSIGNED_INT), so I'm not sure how to fix it.

Has anyone else run into this and solved it? Or am I approaching this completely wrong and was just lucky on a 32-bit OS?

Update

Switching my existing code to:

 typedef void ( *testPtr )( GLenum mode, const GLsizei* count,
                            GLenum type, const GLuint* indices,
                            GLsizei primcount );
 testPtr ptr = (testPtr)glMultiDrawElements;
 ptr( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT, iOffset_, mesh().faces().size() );

produces exactly the same result.

2 answers

The simple reason is that glMultiDrawElements does not expect an array of integer offsets (32 bits on your platform) but an array of pointers (64 bits on your platform), which are in turn interpreted as buffer offsets.

But you are just casting an array of integers (or rather, a pointer to one) to an array of pointers (a pointer to pointers), which will not work: the function now simply reinterprets your n consecutive 32-bit values as n consecutive 64-bit values. It works for glDrawElements, of course, because there you cast a single integer to a single pointer, which genuinely converts your 32-bit value into a 64-bit one.
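To see the reinterpretation concretely, here is a small self-contained sketch (my own illustration, not part of the original answer) of the misread on a little-endian 64-bit platform:

 // Formally this aliasing is undefined behavior, but it mirrors the memory
 // access the driver performs when handed the casted array.
 #include <cstdint>
 #include <cstdio>

 int main()
 {
     int32_t offsets[4] = { 0, 12, 24, 36 };          // as stored in iOffset_
     void**  asPointers = reinterpret_cast< void** >( offsets );

     // On a little-endian 64-bit platform this prints 0xc00000000:
     // offsets 0 and 12 fused into one bogus 64-bit "pointer".
     std::printf( "%p\n", asPointers[0] );
     return 0;
 }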

What you need to cast is not the pointer/array itself, but each individual value in this offset array:

 std::vector< void* > pointers( mesh().faces().size() );
 for ( size_t i = 0; i < pointers.size(); ++i )
     pointers[i] = reinterpret_cast< void* >(
         static_cast< std::uintptr_t >( iOffset_[i] ) );  // widen, then reinterpret

 glMultiDrawElements( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                      &pointers.front(), mesh().faces().size() );

Or better, just store the offsets as pointers instead of integers from the start.
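For example, a sketch of that approach (reusing the question's names; the fill loop itself is my assumption, and it needs <vector> and <cstdint>):

 std::vector< const GLvoid* > iOffset_;  // pointer-sized from the start
 std::vector< GLsizei >       fCount_;
 GLsizei offset = 0;
 for ( Sy_meshData::Faces::ConstIterator i = mesh().faces().constBegin();
       i != mesh().faces().constEnd(); ++i )
 {
     fCount_.push_back( i->vertexIndices.size() );
     iOffset_.push_back( reinterpret_cast< const GLvoid* >(
         static_cast< std::uintptr_t >( sizeof( GLsizei ) * offset ) ) );
     offset += i->vertexIndices.size();
 }

 glMultiDrawElements( GL_LINE_LOOP, &fCount_.front(), GL_UNSIGNED_INT,
                      &iOffset_.front(), fCount_.size() );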


You are running into the problem I have examined in detail at fooobar.com/questions/370794/...

I suggest you follow the advice at the very end of that answer: do not try to cast your numbers into something the compiler treats as a pointer; instead, cast the function itself to a signature that takes numbers.

Note that in the case of glMultiDrawElements the first indirection does not go into the VBO but into client memory. So the signature to cast to would be, for example:

 void myglMultiDrawElementsOffset( GLenum mode, const GLsizei* count,
                                   GLenum type, const uintptr_t* indices,
                                   GLsizei primcount );
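A sketch of how that could be wired up (the cast and the widening copy are my additions, not from the original answer):

 typedef void ( *MultiDrawElementsOffsetPtr )( GLenum mode, const GLsizei* count,
                                               GLenum type, const uintptr_t* indices,
                                               GLsizei primcount );

 MultiDrawElementsOffsetPtr myglMultiDrawElementsOffset =
     reinterpret_cast< MultiDrawElementsOffsetPtr >( glMultiDrawElements );

 // The offsets must now be pointer-sized integers; a 32-bit GLsizei array
 // such as the question's iOffset_ would still be misread, so widen it first.
 std::vector< uintptr_t > offsets( iOffset_, iOffset_ + mesh().faces().size() );
 myglMultiDrawElementsOffset( GL_LINE_LOOP, fCount_, GL_UNSIGNED_INT,
                              &offsets.front(), mesh().faces().size() );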
