I highly recommend reading the tutorials in the "Learning WebGL" section.
For each vertex of the square, you send its "UV coordinates". UV coordinates are a vec2 that tells which point of the texture is associated with that vertex.
So, for example, a UV of (0.0, 0.0) represents the upper-left corner of the texture, while (0.3, 0.4) represents the position on the texture at 30% of its width and 40% of its height.
There is a special function for this in GLSL fragment shaders, texture2D, and here is how it is used:
    precision mediump float;     // WebGL fragment shaders need a default float precision

    uniform sampler2D uTexture;
    varying vec2 vUV;

    void main() {
        vec4 color_from_texture = texture2D(uTexture, vUV);
        gl_FragColor = color_from_texture;
    }
This is called sampling a texture: you read the data from the texture at some position by calling the texture2D function.
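The uTexture sampler gets its value from the JavaScript side. It looks roughly like this (just a sketch, not your exact code: gl is your WebGL context, program is your linked shader program, and image is an already loaded Image element):

    // Create the texture object and upload the image into it
    const texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    // Bind it to texture unit 0 and point the uTexture sampler at that unit
    // (assumes gl.useProgram(program) has already been called)
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.uniform1i(gl.getUniformLocation(program, 'uTexture'), 0);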
Thus, for a quad, the vertices would have positions (x1, y1), (x2, y1), (x1, y2), (x2, y2) and the corresponding UV coordinates would be (0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0). Notice how the texture gets stretched across the whole quad because the UV coordinates cover the full range from one corner of the texture to the other.
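On the JavaScript side that data goes into vertex buffers, something along these lines (again only a sketch; gl is the WebGL context and the concrete x/y values are placeholders):

    // 4 vertices of the quad as a triangle strip, 2 floats per vertex
    const positions = new Float32Array([
        -0.5, -0.5,   // (x1, y1)
         0.5, -0.5,   // (x2, y1)
        -0.5,  0.5,   // (x1, y2)
         0.5,  0.5,   // (x2, y2)
    ]);
    // Matching UV coordinates, one pair per vertex
    const uvs = new Float32Array([
        0.0, 0.0,
        1.0, 0.0,
        0.0, 1.0,
        1.0, 1.0,
    ]);

    const positionBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, positions, gl.STATIC_DRAW);

    const uvBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, uvBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, uvs, gl.STATIC_DRAW);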
The vertex shader will look something like this:
    attribute vec2 aVertexPos;
    attribute vec2 aUV;
    varying vec2 vUV;

    void main() {
        vUV = aUV;                                  // pass the UV through to the fragment shader
        gl_Position = vec4(aVertexPos, 0.0, 1.0);   // 2D position, z = 0, w = 1
    }
The pipeline interpolates the UV coordinates between the vertices that make up the same triangle, so each fragment gets its own interpolated UV value and therefore samples a different part of the texture.
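To wire those buffers to the aVertexPos and aUV attributes and actually draw, you would do something like this (a sketch reusing the program, positionBuffer and uvBuffer variables from above):

    const aVertexPosLoc = gl.getAttribLocation(program, 'aVertexPos');
    gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
    gl.enableVertexAttribArray(aVertexPosLoc);
    gl.vertexAttribPointer(aVertexPosLoc, 2, gl.FLOAT, false, 0, 0);

    const aUVLoc = gl.getAttribLocation(program, 'aUV');
    gl.bindBuffer(gl.ARRAY_BUFFER, uvBuffer);
    gl.enableVertexAttribArray(aUVLoc);
    gl.vertexAttribPointer(aUVLoc, 2, gl.FLOAT, false, 0, 0);

    // The 4 vertices form two triangles; UVs are interpolated across each of them
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);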
Read this carefully, and after that putting together your own textured quad should be simple: http://learningwebgl.com/blog/?p=507
Hope this helps.
Live example: http://abstract-algorithm.com/quad.html