I have a program in which I need to apply a two-dimensional texture (a simple image) to a surface generated with the marching cubes algorithm. I have access to the geometry and can add texture coordinates with relative ease, but the best way to generate those coordinates eludes me.
Each point in the volume represents a single unit of data, and each unit of data can have different properties. To simplify things, I'm considering sorting them into "types" and assigning each type a texture (or a region of one large texture atlas).
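For concreteness, here is a minimal sketch of what such a type-to-atlas mapping could look like, assuming the atlas is addressed in normalized [0,1] coordinates. The names (`VoxelType`, `AtlasRect`, `atlasUV`) are hypothetical, just for illustration:

```cpp
// Hypothetical sketch: each voxel "type" owns a sub-rectangle of a texture
// atlas. None of these names come from a real API.
struct AtlasRect {
    float u0, v0;   // lower-left corner of the region, in [0,1] atlas space
    float u1, v1;   // upper-right corner
};

struct VoxelType {
    int       id;
    AtlasRect region;   // where this type's texture lives in the atlas
};

// Map a local coordinate (s,t) in [0,1]^2 into the type's atlas region.
inline void atlasUV(const VoxelType& type, float s, float t,
                    float& u, float& v)
{
    u = type.region.u0 + s * (type.region.u1 - type.region.u0);
    v = type.region.v0 + t * (type.region.v1 - type.region.v0);
}
```

Of course, the (s,t) pair still has to come from somewhere, which is exactly my problem below.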
My problem is that I have no idea how to generate the appropriate coordinates. I could store the location of each type's texture region in the type class and use that, but then the seams would be horribly stretched wherever two adjacent points use different parts of the atlas. Ideally I would blend the textures at the seams, but I'm not sure that's the best approach. Blending is optional, but I need to texture the vertices somehow. Breaking the geometry into per-type pieces, or duplicating vertices for texturing purposes, is possible but undesirable.
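Since marching-cubes output has no natural parameterization, one idea I've seen is to project each vertex position onto the axis plane most perpendicular to its normal (the single-plane core of triplanar mapping). This is only a sketch of that idea under those assumptions, not something I have working:

```cpp
#include <cmath>

// Sketch: derive a local (s,t) for a marching-cubes vertex by projecting its
// position onto the axis plane most perpendicular to its normal. This is the
// simplest single-plane variant of triplanar mapping; names are illustrative.
void projectUV(const float pos[3], const float normal[3],
               float tileScale, float& s, float& t)
{
    float ax = std::fabs(normal[0]);
    float ay = std::fabs(normal[1]);
    float az = std::fabs(normal[2]);

    if (ax >= ay && ax >= az) {        // normal mostly +/-X: project onto YZ
        s = pos[1] * tileScale;
        t = pos[2] * tileScale;
    } else if (ay >= ax && ay >= az) { // normal mostly +/-Y: project onto XZ
        s = pos[0] * tileScale;
        t = pos[2] * tileScale;
    } else {                           // normal mostly +/-Z: project onto XY
        s = pos[0] * tileScale;
        t = pos[1] * tileScale;
    }
    s -= std::floor(s);                // wrap into [0,1) so the region can tile
    t -= std::floor(t);
}
```

A full triplanar scheme would sample all three axis projections and blend them by the normal's components, which incidentally gives the same kind of soft blending I'm asking about at the type seams.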
I would like to avoid shaders if possible, but if necessary I can use a vertex and/or fragment shader to blend the textures. If I do use shaders, what would be the most effective way to tell them which texture, or which part of the atlas, to use? Passing the type as a per-vertex attribute seems like the easiest way, but possibly slow.
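To illustrate what I mean, here is a hypothetical GLSL fragment shader (wrapped in a C++ string for loading with glShaderSource()) that blends two atlas regions by an interpolated per-vertex weight. Every attribute and uniform name here is made up for the sketch:

```cpp
// Hypothetical fragment shader: blends two atlas regions using a weight that
// is interpolated across the triangle, which is one way to soften the seam
// between two adjacent types. All names are illustrative.
static const char* kBlendFragmentShader = R"(
#version 330 core

in vec2 localUV;           // per-vertex (s,t), interpolated
flat in vec4 regionA;      // atlas region of type A: (u0, v0, width, height)
flat in vec4 regionB;      // atlas region of type B
in float blendWeight;      // 0 = pure A, 1 = pure B; varies across the seam

uniform sampler2D atlas;

out vec4 fragColor;

void main()
{
    // fract() keeps the coordinate inside each region so it can tile.
    vec2 uvA = regionA.xy + fract(localUV) * regionA.zw;
    vec2 uvB = regionB.xy + fract(localUV) * regionB.zw;
    fragColor = mix(texture(atlas, uvA), texture(atlas, uvB), blendWeight);
}
)";
```

The regions are declared `flat` so the rectangle corners are not interpolated across the triangle; only the blend weight varies, which is what produces the soft transition.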
My volumes are relatively small, 8–16 points in each dimension (I keep them small to speed up generation, but there are a lot of them on screen at any given moment). I briefly considered generating the isosurface at twice the resolution of the volume, so that each data point gets more vertices (theoretically eight), which might simplify texturing. It doesn't look like it would make blending any easier, though.
I'm using OpenGL, if that matters.