WebGL / quad projection-mapping warp / corner pinning

I want to display a rectangular texture inside a quad whose four corners sit at arbitrary screen-pixel positions, something like drawQuad( x1,y1, x2,y2, x3,y3, x4,y4 ). Are there any WebGL demos that do this?

Everything I have done with WebGL so far was through Three.js, which handles geometry, lighting, and cameras for me. But my case is simpler: no lighting or camera is needed, just placing the four corners at absolute positions. I am not against using Three.js if it lets me get down to this lower level.

I found an explanation that uses a GLSL vertex shader in Quartz Composer.


Update

In my project I ended up using CSS for now, because it lets me stretch any element (canvas, img, div, video) without the seams that appear with non-planar shapes ... I may eventually return to the WebGL option. If anyone has an example of dragging the four corners to arbitrary positions, I will add it here.

meemoo mapper

1 answer

I highly recommend reading the tutorials at Learning WebGL.

For each vertex of the quad, you also send its "UV coordinates". UV coordinates are a vec2 that says which point of the texture is attached to that vertex.

So, for example, UV (0.0, 0.0) represents the upper-left corner of the texture, while (0.3, 0.4) represents the position on the texture at 30% of its width and 40% of its height.
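To make those percentages concrete, here is a tiny sketch (the helper name `uvToTexel` is mine, not part of any API) that converts a UV pair into pixel coordinates for a texture of a given size:

```javascript
// Convert a UV coordinate (0..1 range) into pixel coordinates
// on a texture of the given size. Illustrative helper only.
function uvToTexel(u, v, texWidth, texHeight) {
  return { x: u * texWidth, y: v * texHeight };
}

// UV (0.3, 0.4) on a 512x256 texture lands 30% across and 40% down:
const p = uvToTexel(0.3, 0.4, 512, 256);
// p.x ≈ 153.6, p.y ≈ 102.4
```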

GLSL has a special function for reading a texture, texture2D; here it is used in the fragment shader:

    precision mediump float;

    uniform sampler2D uTexture;
    varying vec2 vUV;

    void main() {
        vec4 color_from_texture = texture2D( uTexture, vUV );
        gl_FragColor = color_from_texture;
    }

This is called sampling the texture: you read the texture's data at some position by calling the texture2D function.

Thus, for a quad, the vertices would have positions (x1, y1), (x2, y1), (x1, y2), (x2, y2), and the corresponding UV coordinates would be (0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0). Notice how we stretch the texture over the whole quad by letting the UV coordinates run the full range from one corner to the other.
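The drawQuad-style call the question asks for can be sketched as a plain data builder: given four arbitrary corner positions, produce the position and UV arrays you would then upload with gl.bufferData. The helper name, corner ordering, and index layout below are my assumptions, not a standard API:

```javascript
// Build vertex data for a quad with four arbitrary corners.
// Positions are in clip space (or pre-transformed screen space);
// the UVs pin the full texture to the four corners.
// Hypothetical helper, not part of WebGL itself.
function quadData(x1, y1, x2, y2, x3, y3, x4, y4) {
  return {
    // corner order: top-left, top-right, bottom-left, bottom-right
    positions: new Float32Array([x1, y1, x2, y2, x3, y3, x4, y4]),
    uvs:       new Float32Array([0, 0,  1, 0,  0, 1,  1, 1]),
    // two triangles sharing the diagonal
    indices:   new Uint16Array([0, 1, 2,  2, 1, 3]),
  };
}

// A quad covering the full clip-space square:
const quad = quadData(-1, 1, 1, 1, -1, -1, 1, -1);
```

Each array would be uploaded to its own buffer and bound to the aVertexPos / aUV attributes, then drawn with gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0). One caveat: with plain 2D UVs split over two triangles, a non-parallelogram quad shows a seam along the diagonal (the "breaks" mentioned in the question's update); a true projective warp needs homogeneous texture coordinates.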

The vertex shader will look something like this:

    attribute vec2 aVertexPos;
    attribute vec2 aUV;

    varying vec2 vUV;

    void main() {
        vUV = aUV;
        gl_Position = vec4( aVertexPos, 0.0, 1.0 );
        // or: projMatrix * modelViewMatrix * vec4( aVertexPos, 0.0, 1.0 )
    }

The rasterizer interpolates the UV coordinates between the vertices that form the same triangle, so each fragment gets its own interpolated UV value and thus samples a different part of the texture.
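That interpolation can be modeled in plain JavaScript: inside a triangle, a fragment's UV is the barycentric-weighted mix of the three vertex UVs. This is a simplified sketch of what the GPU does per fragment (it ignores perspective correction), with hypothetical helper names:

```javascript
// Barycentric coordinates of point p inside triangle (a, b, c),
// each given as [x, y]. Simplified model of GPU interpolation.
function barycentric(p, a, b, c) {
  const d  = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1]);
  const w0 = ((b[1] - c[1]) * (p[0] - c[0]) + (c[0] - b[0]) * (p[1] - c[1])) / d;
  const w1 = ((c[1] - a[1]) * (p[0] - c[0]) + (a[0] - c[0]) * (p[1] - c[1])) / d;
  return [w0, w1, 1 - w0 - w1];
}

// Interpolate per-vertex UVs at the fragment's position.
function interpUV(p, verts, uvs) {
  const [w0, w1, w2] = barycentric(p, verts[0], verts[1], verts[2]);
  return [
    w0 * uvs[0][0] + w1 * uvs[1][0] + w2 * uvs[2][0],
    w0 * uvs[0][1] + w1 * uvs[1][1] + w2 * uvs[2][1],
  ];
}

// One triangle of the quad, with full-range UVs at its corners:
const verts = [[0, 0], [1, 0], [0, 1]];
const uvs   = [[0, 0], [1, 0], [0, 1]];
// The centroid gets the average of the three corner UVs:
const uv = interpUV([1 / 3, 1 / 3], verts, uvs);
// uv ≈ [0.333..., 0.333...]
```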

Read this carefully, and after that creating the quad should be simple: http://learningwebgl.com/blog/?p=507

Hope this helps.

Real-time example: http://abstract-algorithm.com/quad.html


Source: https://habr.com/ru/post/944726/
