WebGL, texture coordinates and OBJ

I'm having trouble understanding the relationship between vertex coordinates and texture coordinates when rendering. I have a cube drawn with drawElements using data parsed from an OBJ file. I got textures close to working with a simple plane, where the number of positions matches the number of texture coordinates, but as soon as I use a more complex model, or even just a more complicated UV unwrap, the texture comes out all wrong.

From what I've read, there is no way to index texture coordinates the same way as vertex positions, which is unfortunate because the OBJ file contains that information. The closest I got to something working was building an array of texture coordinates from the indices given in the OBJ file. But since the vertex and texture coordinate arrays have different lengths (for example, an OBJ cube has 8 vertices but up to 36 texture coordinates, depending on how the mesh was unwrapped), they don't line up.

What is the correct workflow for using drawElements and matching each vertex with its correct texture coordinates?

javascript webgl
Feb 24 '14 at 12:40
1 answer

You are right, you cannot easily use different indices for different attributes (in your case, positions and texture coordinates).

A common example is a cube. If you want to display a lit cube, you need normals. A cube has only 8 positions, but each position is shared by 3 faces, and each of those faces needs a different normal at that position. That means you need 24 vertices: 4 for each of the 6 faces of the cube.

If you have a file format with separate indices for different attributes, you need to expand the data so that each unique combination of attributes (position, normal, texture coordinate, etc.) is in your buffers.

Most game engines do this offline. In other words, they write a tool that reads the OBJ file, expands the various attributes, and writes the expanded data back out ahead of time. That's because generating the expanded data at runtime can take a long time for a large model if you're trying to optimize the data and keep only unique vertices.

If you don't need optimal data, just expand everything by index. The number of indices for each attribute type must then be the same.
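As a minimal sketch of that expansion (plain JavaScript; `expandIndices` and its argument layout are hypothetical, not part of any library), here is one way to turn OBJ-style face corners, each carrying its own position index and texture coordinate index, into flat arrays that share a single index per unique combination:

```javascript
// Expand OBJ-style per-corner (positionIndex, uvIndex) pairs into
// flat arrays driven by one shared index per unique combination.
// `positions` is a flat [x, y, z, ...] array, `uvs` is flat [u, v, ...],
// `corners` is one [positionIndex, uvIndex] pair per face corner.
function expandIndices(positions, uvs, corners) {
  const outPositions = [];
  const outUVs = [];
  const outIndices = [];
  const seen = new Map();  // "posIdx/uvIdx" -> new vertex index
  for (const [pi, ti] of corners) {
    const key = pi + "/" + ti;
    let index = seen.get(key);
    if (index === undefined) {
      // New unique combination: emit a new vertex.
      index = outPositions.length / 3;
      seen.set(key, index);
      outPositions.push(positions[pi * 3], positions[pi * 3 + 1], positions[pi * 3 + 2]);
      outUVs.push(uvs[ti * 2], uvs[ti * 2 + 1]);
    }
    outIndices.push(index);
  }
  return { positions: outPositions, uvs: outUVs, indices: outIndices };
}

// One quad split into two triangles: 4 positions, 4 UVs, 6 corners.
const quad = expandIndices(
  [0, 0, 0,  1, 0, 0,  1, 1, 0,  0, 1, 0],   // 4 positions
  [0, 0,  1, 0,  1, 1,  0, 1],               // 4 texture coordinates
  [[0, 0], [1, 1], [2, 2],  [0, 0], [2, 2], [3, 3]]
);
console.log(quad.indices);  // [0, 1, 2, 0, 2, 3]
```

Because repeated combinations are deduplicated via the map, only the 4 unique vertices survive; dropping the map and emitting a vertex per corner is the simpler, non-optimal version described above.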

Note: positions are not special. I bring this up because you said there is no way to use texture coordinate indices the same way as vertex position indices. WebGL has no concept of "position". It just has attributes that describe how to pull data out of buffers. What those attributes hold (positions, normals, random data, whatever) is up to you. gl.drawElements indexes the entire combination of attributes that you supply: if you pass an index of 7, it gives you element 7 of each attribute.
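To make that concrete without any actual GL calls, here is a toy model (plain JavaScript, illustrative names only) of what one index does during drawElements: it selects the same element from every attribute's buffer at once.

```javascript
// Toy model of per-index attribute fetching in drawElements:
// one index value pulls the matching element out of every attribute.
function fetchVertex(attributes, index) {
  const vertex = {};
  for (const [name, { data, size }] of Object.entries(attributes)) {
    vertex[name] = data.slice(index * size, index * size + size);
  }
  return vertex;
}

const attributes = {
  a_position: { data: [0, 0, 0,  1, 0, 0,  0, 1, 0], size: 3 },
  a_texcoord: { data: [0, 0,  1, 0,  0, 1],          size: 2 },
};

// Index 2 pulls element 2 from both attributes together.
console.log(fetchVertex(attributes, 2));
// { a_position: [0, 1, 0], a_texcoord: [0, 1] }
```

This is why separate per-attribute indices from an OBJ file can't be used directly: there is only one index stream, shared by all attributes.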

Note that the above describes how virtually all 3D engines written in WebGL work. However, you can get creative if you want.

Here is a program that stores positions and normals in textures, and puts the indices in buffers. Since textures are random access, it can use different indices for positions and normals.

    var canvas = document.getElementById("c");
    var gl = canvas.getContext("webgl");
    var ext = gl.getExtension("OES_texture_float");
    if (!ext) {
      alert("need OES_texture_float extension cause I'm lazy");
      //return;
    }
    if (gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS) < 2) {
      alert("need to be able to access textures from vertex shaders");
      //return;
    }
    var m4 = twgl.m4;
    var v3 = twgl.v3;
    var programInfo = twgl.createProgramInfo(gl, ["vshader", "fshader"]);

    // Cube data
    var positions = [
      -1, -1, -1,  // 0 lbb
      +1, -1, -1,  // 1 rbb      2---3
      -1, +1, -1,  // 2 ltb     /|  /|
      +1, +1, -1,  // 3 rtb    6---7 |
      -1, -1, +1,  // 4 lbf    | | | |
      +1, -1, +1,  // 5 rbf    | 0-|-1
      -1, +1, +1,  // 6 ltf    |/  |/
      +1, +1, +1,  // 7 rtf    4---5
    ];
    var positionIndices = [
      3, 7, 5, 3, 5, 1,  // right
      6, 2, 0, 6, 0, 4,  // left
      6, 7, 3, 6, 3, 2,  // top
      0, 1, 5, 0, 5, 4,  // bottom
      7, 6, 4, 7, 4, 5,  // front
      2, 3, 1, 2, 1, 0,  // back
    ];
    var normals = [
      +1,  0,  0,
      -1,  0,  0,
       0, +1,  0,
       0, -1,  0,
       0,  0, +1,
       0,  0, -1,
    ];
    var normalIndices = [
      0, 0, 0, 0, 0, 0,  // right
      1, 1, 1, 1, 1, 1,  // left
      2, 2, 2, 2, 2, 2,  // top
      3, 3, 3, 3, 3, 3,  // bottom
      4, 4, 4, 4, 4, 4,  // front
      5, 5, 5, 5, 5, 5,  // back
    ];

    function degToRad(deg) {
      return deg * Math.PI / 180;
    }

    var bufferInfo = twgl.createBufferInfoFromArrays(gl, {
      a_positionIndex: { size: 1, data: positionIndices },
      a_normalIndex:   { size: 1, data: normalIndices },
    });

    var textures = twgl.createTextures(gl, {
      positions: {
        format: gl.RGB, type: gl.FLOAT, height: 1, src: positions,
        min: gl.NEAREST, mag: gl.NEAREST, wrap: gl.CLAMP_TO_EDGE,
      },
      normals: {
        format: gl.RGB, type: gl.FLOAT, height: 1, src: normals,
        min: gl.NEAREST, mag: gl.NEAREST, wrap: gl.CLAMP_TO_EDGE,
      },
    });

    var xRot = degToRad(30);
    var yRot = degToRad(20);
    var lightDir = v3.normalize([-0.2, -0.1, 0.5]);

    function draw(time) {
      time *= 0.001;  // convert to seconds
      twgl.resizeCanvasToDisplaySize(gl.canvas);
      gl.viewport(0, 0, gl.canvas.width, gl.canvas.height);
      yRot = time;
      gl.enable(gl.DEPTH_TEST);
      gl.enable(gl.CULL_FACE);

      gl.useProgram(programInfo.program);

      var persp = m4.perspective(
          degToRad(45),
          gl.canvas.clientWidth / gl.canvas.clientHeight,
          0.1, 100.0);
      var mat = m4.identity();
      mat = m4.translate(mat, [0.0, 0.0, -5.0]);
      mat = m4.rotateX(mat, xRot);
      mat = m4.rotateY(mat, yRot);

      var uniforms = {
        u_positions: textures.positions,
        u_positionsSize: [positions.length / 3, 1],
        u_normals: textures.normals,
        u_normalsSize: [normals.length / 3, 1],
        u_mvpMatrix: m4.multiply(persp, mat),
        u_mvMatrix: mat,
        u_color: [0.5, 0.8, 1, 1],
        u_lightDirection: lightDir,
      };

      twgl.setBuffersAndAttributes(gl, programInfo, bufferInfo);
      twgl.setUniforms(programInfo, uniforms);
      twgl.drawBufferInfo(gl, bufferInfo);

      requestAnimationFrame(draw);
    }
    requestAnimationFrame(draw);
    body { margin: 0; }
    canvas { width: 100vw; height: 100vh; display: block; }
    <script src="//twgljs.org/dist/2.x/twgl-full.min.js"></script>
    <script id="vshader" type="whatever">
    attribute float a_positionIndex;
    attribute float a_normalIndex;

    uniform sampler2D u_positions;
    uniform vec2 u_positionsSize;
    uniform sampler2D u_normals;
    uniform vec2 u_normalsSize;
    uniform mat4 u_mvpMatrix;
    uniform mat4 u_mvMatrix;

    varying vec3 v_normal;

    // To index a value in the texture we need to
    // compute a texture coordinate that will access
    // the correct texel. To do that we need to access from
    // the middle of the first texel to the middle of the
    // last texel.
    //
    // In other words if we had 3 values (and therefore
    // 3 texels) we'd have something like this
    //
    // ------ 3x1 texels ------
    //  [     ][     ][     ]
    // 0.0 |<--------------->| 1.0
    //
    // If we just did index / numValues we'd get
    //
    //  [     ][     ][     ]
    //  |      |      |
    // 0.0   0.333  0.666
    //
    // which is right between texels, so we add a
    // half texel to get this
    //
    //  [     ][     ][     ]
    //     |      |      |
    //   0.167   0.5   0.833
    //
    // note: in WebGL2 we could just use `texelFetch`,
    // which takes integer pixel locations.
    vec2 texCoordFromIndex(const float index, const vec2 textureSize) {
      vec2 colRow = vec2(
          mod(index, textureSize.x),     // column
          floor(index / textureSize.x)); // row
      return vec2((colRow + 0.5) / textureSize);
    }

    void main() {
      vec2 ptc = texCoordFromIndex(a_positionIndex, u_positionsSize);
      vec3 position = texture2D(u_positions, ptc).rgb;

      vec2 ntc = texCoordFromIndex(a_normalIndex, u_normalsSize);
      vec3 normal = texture2D(u_normals, ntc).rgb;

      gl_Position = u_mvpMatrix * vec4(position, 1);
      v_normal = (u_mvMatrix * vec4(normal, 0)).xyz;
    }
    </script>
    <script id="fshader" type="whatever">
    precision mediump float;

    uniform vec4 u_color;
    uniform vec3 u_lightDirection;

    varying vec3 v_normal;

    void main() {
      float light = dot(
          normalize(v_normal), u_lightDirection) * 0.5 + 0.5;
      gl_FragColor = vec4(u_color.rgb * light, u_color.a);
    }
    </script>
    <canvas id="c"></canvas>
Feb 25 '14 at 9:10


