WebGL Lesson 5 – introducing textures

<< Lesson 4 | Lesson 6 >>

Welcome to number five in my series of WebGL tutorials, based on number 6 in the NeHe OpenGL tutorials. This time we’re going to add a texture to a 3D object — that is, we will cover it with an image that we load from a separate file. This is a really useful way to add detail to your 3D scene without having to make the objects you’re drawing incredibly complex. Imagine a stone wall in a maze-type game: you probably don’t want to model each block in the wall as a separate object, so instead you create an image of masonry and cover the wall with it; a whole wall can then be just one object.

Here’s what the lesson looks like when run on a browser that supports WebGL:

Click here and you’ll see the live WebGL version, if you’ve got a browser that supports it; here’s how to get one if you don’t.

More on how it all works below…

The usual warning: these lessons are targeted at people with a reasonable amount of programming knowledge, but no real experience in 3D graphics; the aim is to get you up and running, with a good understanding of what’s going on in the code, so that you can start producing your own 3D Web pages as quickly as possible. If you haven’t read the previous tutorials already, you should probably do so before reading this one — here I will only explain the differences between the code for lesson 4 and the new code.

There may be bugs and misconceptions in this tutorial. If you spot anything wrong, let me know in the comments and I’ll correct it ASAP.

There are two ways you can get the code for this example: just “View Source” while you’re looking at the live version, or, if you use GitHub, you can clone it (and the other lessons) from the repository there. Either way, once you have the code, load it up in your favourite text editor and take a look.

The trick to understanding how textures work is that they are a special way of setting the colour of a point on a 3D object. As you will remember from lesson 2, colours are specified by fragment shaders, so what we need to do is load the image and send it over to the fragment shader. The fragment shader also needs to know which bit of the image to use for the fragment it’s working on, so we need to send that information over to it too.

Let’s start off by looking at the code that loads the texture. We call it right at the start of the execution of our page’s JavaScript, in webGLStart at the bottom of the page; the new line here is the call to initTexture:

  function webGLStart() {
    var canvas = document.getElementById("lesson05-canvas");
    initGL(canvas);
    initShaders();
    initBuffers();
    initTexture();

    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    gl.enable(gl.DEPTH_TEST);

    tick();
  }

Let’s look at initTexture — it’s about a third of the way from the top of the file, and is all new code:

  var neheTexture;
  function initTexture() {
    neheTexture = gl.createTexture();
    neheTexture.image = new Image();
    neheTexture.image.onload = function() {
      handleLoadedTexture(neheTexture);
    };

    neheTexture.image.src = "nehe.gif";
  }

So, we’re creating a global variable to hold the texture; obviously in a real-world example you’d have multiple textures and wouldn’t use globals, but we’re keeping things simple for now. We use gl.createTexture to create a texture reference to put into the global, then we create a JavaScript Image object and put it into a new attribute that we attach to the texture, yet again taking advantage of JavaScript’s willingness to set any field on any object; texture objects don’t have an image field by default, but it’s convenient for us to have one, so we create one. The obvious next step is to get the Image object to load up the actual image it will contain, but before we do that we attach a callback function to it; this will be called when the image has been fully loaded, so it’s safest to set it first. Once that’s set up, we set the image’s src property, and we’re done. The image will load asynchronously — that is, the code that sets the src of the image will return immediately, and a background thread will load the image from the web server. Once it’s done, our callback gets called, and it calls handleLoadedTexture:

  function handleLoadedTexture(texture) {
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture.image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.bindTexture(gl.TEXTURE_2D, null);
  }

The first thing we do is tell WebGL that our texture is the “current” texture. WebGL texture functions all operate on this “current” texture instead of taking a texture as a parameter, and bindTexture is how we set the current one; it’s similar to the gl.bindBuffer pattern that we’ve looked at before.

Next, we tell WebGL that all images we load into textures need to be flipped vertically. We do this because of a difference in coordinates; for our texture coordinates, we use coordinates that, like the ones you would normally use in mathematics, increase as you move upwards along the vertical axis; this is consistent with the X, Y, Z coordinates we’re using to specify our vertex positions. By contrast, most other computer graphics systems — for example, the GIF format we use for the texture image — use coordinates that increase as you move downwards on the vertical axis. The horizontal axis is the same in both coordinate systems. This difference on the vertical axis means that from the WebGL perspective, the GIF image we’re using for our texture is already flipped vertically, and we need to “unflip” it. (Thanks to Ilmari Heikkinen for clarifying that in the comments.)

The next step is to upload our freshly-loaded image to the texture’s space in the graphics card using texImage2D. The parameters are, in order, what kind of image we’re using, the level of detail (which is something we’ll look at in a later lesson), the format in which we want it to be stored on the graphics card (repeated twice for reasons we’ll also look at later), the size of each “channel” of the image (that is, the datatype used to store red, green, or blue), and finally the image itself.
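
To keep the arguments straight, here is the same call again with each parameter labelled; this is just an annotated repeat of the line above, not new code:

    gl.texImage2D(
      gl.TEXTURE_2D,      // the kind of texture we're loading the image into
      0,                  // level of detail (0 is the base image; more on this later)
      gl.RGBA,            // format to store it in on the graphics card...
      gl.RGBA,            // ...repeated, for reasons we'll also look at later
      gl.UNSIGNED_BYTE,   // datatype used for each channel (red, green, blue, alpha)
      texture.image       // the image itself
    );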

On to the next two lines: these specify special scaling parameters for the texture. The first tells WebGL what to do when the texture is filling up a large amount of the screen relative to the image size; in other words, it gives it hints on how to scale it up. The second is the equivalent hint for how to scale it down. There are various kinds of scaling hints you can specify; NEAREST is the least attractive of these, as it just says you should use the original image as-is, which means that it will look very blocky when close-up. It has the advantage, however, of being really fast, even on slow machines. In the next lesson we’ll look at using different scaling hints, so you can compare the performance and appearance of each.
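
Just as a preview (we’ll do this properly next time), smoother-looking settings might look something like this; a sketch, not part of this lesson’s code:

    // Bilinear filtering (instead of gl.NEAREST) when the texture is scaled up...
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    // ...and mipmaps when it's scaled down. The mipmaps have to be generated after
    // texImage2D, and in WebGL that requires a power-of-two image like our nehe.gif.
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
    gl.generateMipmap(gl.TEXTURE_2D);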

Once this is done, we set the current texture to null. This is not strictly necessary, but it’s good practice, a kind of tidying up after yourself.
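
Incidentally, if you find yourself loading more than one texture, the pattern in initTexture generalises neatly into a little helper that takes a URL and returns a texture; here’s a sketch (the loadTexture name is my own, not something from the lesson’s code):

  function loadTexture(url) {
    // Same pattern as initTexture, but parameterised so it can be reused.
    var texture = gl.createTexture();
    texture.image = new Image();
    texture.image.onload = function() {
      handleLoadedTexture(texture);
    };
    texture.image.src = url;
    return texture;  // returned immediately; the pixels only arrive once onload fires
  }

  // e.g. neheTexture = loadTexture("nehe.gif");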

So, that’s all the code required to load the texture. Next, let’s move on to initBuffers. This has, of course, lost all of the code relating to the pyramid that we had in lesson 4 but have now removed, but a more interesting change is the replacement of the cube’s vertex colour buffer with a new one — the texture coordinate buffer. It looks like this:

    cubeVertexTextureCoordBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
    var textureCoords = [
      // Front face
      0.0, 0.0,
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,

      // Back face
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,
      0.0, 0.0,

      // Top face
      0.0, 1.0,
      0.0, 0.0,
      1.0, 0.0,
      1.0, 1.0,

      // Bottom face
      1.0, 1.0,
      0.0, 1.0,
      0.0, 0.0,
      1.0, 0.0,

      // Right face
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,
      0.0, 0.0,

      // Left face
      0.0, 0.0,
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,
    ];
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(textureCoords), gl.STATIC_DRAW);
    cubeVertexTextureCoordBuffer.itemSize = 2;
    cubeVertexTextureCoordBuffer.numItems = 24;

You should be pretty comfortable with this kind of code now, and see that all we’re doing is specifying a new per-vertex attribute in an array buffer, and that this attribute has two values per vertex. What these texture coordinates specify is where, in Cartesian x, y coordinates, the vertex lies in the texture. For the purposes of these coordinates, we treat the texture as being 1.0 wide by 1.0 high, so (0, 0) is at the bottom left and (1, 1) at the top right; the conversion from this to the real resolution of the texture image is handled for us by WebGL.
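
If it helps to think in image pixels, the conversion looks something like this; a purely illustrative helper (not part of the lesson’s code), remembering that t runs upwards so the pixel row has to be inverted:

  // Hypothetical helper: convert a pixel position in the image (origin top-left,
  // y increasing downwards) into texture coordinates (origin bottom-left, range 0..1).
  function pixelToTextureCoord(x, y, imageWidth, imageHeight) {
    return [x / imageWidth, 1.0 - y / imageHeight];
  }

  // e.g. for a 256x256 image, the pixel at (128, 64) maps to (0.5, 0.75)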

That’s the only change in initBuffers, so let’s move on to drawScene. The most interesting changes in this function are, of course, the ones that make it use the texture. However, before we go through those, there are a number of changes related to really simple stuff, like the removal of the pyramid and the fact that the cube is now spinning around in a different way. I won’t describe these in detail, as they should be pretty easy to work out; you can see them in this snippet from the top of the drawScene function:

  var xRot = 0;
  var yRot = 0;
  var zRot = 0;
  function drawScene() {
    gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);

    mat4.identity(mvMatrix);

    mat4.translate(mvMatrix, [0.0, 0.0, -5.0]);

    mat4.rotate(mvMatrix, degToRad(xRot), [1, 0, 0]);
    mat4.rotate(mvMatrix, degToRad(yRot), [0, 1, 0]);
    mat4.rotate(mvMatrix, degToRad(zRot), [0, 0, 1]);

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexPositionBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, cubeVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);

There are also matching changes in the animate function to update xRot, yRot and zRot, which I won’t go over.
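
For completeness, the update amounts to something like this (a sketch; timeNow and lastTime are the variables already used in animate in the earlier lessons, and the exact rotation speeds don’t matter):

    var elapsed = timeNow - lastTime;
    xRot += (90 * elapsed) / 1000.0;   // 90 degrees per second around each axis
    yRot += (90 * elapsed) / 1000.0;
    zRot += (90 * elapsed) / 1000.0;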

So, with those out of the way, let’s look at the texture code. In initBuffers we set up a buffer containing the texture coordinates, so here we need to bind it to the appropriate attribute so that the shaders can see it:

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
    gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, cubeVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

…and now that WebGL knows which bit of the texture each vertex uses, we need to tell it to use the texture that we loaded earlier, then draw the cube:

    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, neheTexture);
    gl.uniform1i(shaderProgram.samplerUniform, 0);

    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cubeVertexIndexBuffer);
    setMatrixUniforms();
    gl.drawElements(gl.TRIANGLES, cubeVertexIndexBuffer.numItems, gl.UNSIGNED_SHORT, 0);

What’s happening here is somewhat complex. WebGL defines identifiers for up to 32 texture units, numbered from TEXTURE0 to TEXTURE31, that can be used during any given call to functions like gl.drawElements (how many you can actually use at once depends on the hardware). What we’re doing is saying in the first two lines that texture unit zero holds the texture we loaded earlier, and then in the third line we’re passing the value zero up to a shader uniform (which, like the other uniforms that we use for the matrices, we extract from the shader program in initShaders); this tells the shader that we’re using texture unit zero. We’ll see how that’s used later.
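
One thing the text doesn’t show explicitly is the matching change in initShaders: for shaderProgram.textureCoordAttribute and shaderProgram.samplerUniform to exist, it needs a few extra lines, following exactly the same pattern as the other attributes and uniforms. A sketch based on that pattern:

    // In initShaders, replacing the old vertex colour attribute:
    shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
    gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);

    // ...and alongside the matrix uniforms:
    shaderProgram.samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler");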

Anyway, once those three lines are executed, we’re ready to go, so we just use the same code as before to draw the triangles that make up the cube.

The only remaining new code to explain is the changes to the shaders. Let’s look at the vertex shader first:

  attribute vec3 aVertexPosition;
  attribute vec2 aTextureCoord;

  uniform mat4 uMVMatrix;
  uniform mat4 uPMatrix;

  varying vec2 vTextureCoord;

  void main(void) {
    gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
    vTextureCoord = aTextureCoord;
  }

This is very similar to the colour-related stuff we put into our vertex shader in lesson 2; all we’re doing is accepting the texture coordinates (again, instead of the colour) as a per-vertex attribute, and passing it straight out in a varying variable.

Once this has been called for each vertex, WebGL will work out values for the fragments (which, remember, are basically just pixels) between vertices by using linear interpolation between the vertices — just as it did with the colours in lesson 2. So, a fragment half-way between vertices with texture coordinates (1, 0) and (0, 0) will get the texture coordinates (0.5, 0), and one halfway between (0, 0) and (1, 1) will get (0.5, 0.5). Next stop, the fragment shader:

  precision mediump float;

  varying vec2 vTextureCoord;

  uniform sampler2D uSampler;

  void main(void) {
    gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
  }

So, we pick up the interpolated texture coordinates, and we have a variable of type sampler2D, which is the shader’s way of representing the texture. In drawScene, our texture was bound to gl.TEXTURE0, and the uniform uSampler was set to the value zero, so this sampler represents our texture. All the shader does is use the function texture2D to get the appropriate colour from the texture using the coordinates. Textures traditionally use s and t for their coordinates rather than x and y, and the shader language supports these as aliases; we could just as easily have used vTextureCoord.x and vTextureCoord.y, or simply passed the whole vector with texture2D(uSampler, vTextureCoord).

Once we have the colour for the fragment, we’re done! We have a textured object on the screen.

So, that’s it for this time. You now know everything this lesson has to teach: how to add textures to 3D objects in WebGL by loading an image, telling WebGL to use it for a texture, giving your object texture coordinates, and using the coordinates and the texture in the shaders.

If you have any questions, comments, or corrections, please do leave a comment below!

Otherwise, check out the next lesson, in which I show how you can get basic key-based input into the JavaScript that animates your 3D scene, so that we can start making it interact with the person viewing the web page. We’ll use that to allow the viewer to change the spin of the cube, to zoom in and out, and to adjust the hints given to WebGL to control the scaling of textures.

<< Lesson 4 | Lesson 6 >>

Acknowledgments: Chris Marrin’s spinning box was a great help when writing this, as was an extension of that demo by Jacob Seidelin. As always, I’m deeply indebted to NeHe for his OpenGL tutorials, on which the script for this lesson is based.


142 Responses to “WebGL Lesson 5 – introducing textures”

  1. giles says:

    Hey Dan, thanks for the clarification. I need to update the tutorials to talk about this (lots of people run into the problem) so I’ll make sure I reference the Khronos page when I do that.

  2. Tony says:

    I’m not sure why this isn’t working for me.

    I’m testing it locally, is that a problem?

  3. LazyBitStream says:

    Hmm, any reason you wrote:
    gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));

    instead of:

    gl_FragColor = texture2D(uSampler, vTextureCoord);

    ?

  4. Rich Conlan says:

    Somebody mentioned wanting to do a different texture per face. I got this working as follows:

    —————–

    ...
    varying vec3 vTextureCoord;
    
    uniform sampler2D Sampler0;
    uniform sampler2D Sampler1;
    uniform sampler2D Sampler2;
    uniform sampler2D Sampler3;
    uniform sampler2D Sampler4;
    uniform sampler2D Sampler5;
    
    void main(void) {
      if (5.05 < vTextureCoord.z) {
        gl_FragColor = texture2D(Sampler5, vec2(vTextureCoord.s, vTextureCoord.t));
      } else if (4.05 < vTextureCoord.z) {
        gl_FragColor = texture2D(Sampler4, vec2(vTextureCoord.s, vTextureCoord.t));
      } else if (3.05 < vTextureCoord.z) {
        gl_FragColor = texture2D(Sampler3, vec2(vTextureCoord.s, vTextureCoord.t));
      } else if (2.05 < vTextureCoord.z) {
        gl_FragColor = texture2D(Sampler2, vec2(vTextureCoord.s, vTextureCoord.t));
      } else if (1.05 < vTextureCoord.z) {
        gl_FragColor = texture2D(Sampler1, vec2(vTextureCoord.s, vTextureCoord.t));
      } else {
        gl_FragColor = texture2D(Sampler0, vec2(vTextureCoord.s, vTextureCoord.t));
      }
    }
    
    attribute vec3 aVertexPosition;
    ...
    varying vec3 vTextureCoord;
    ...
    
    var textures = [];
    function getTexture(imgPath) {
      var texture = gl.createTexture();
      texture.image = new Image();
      texture.image.onload = function () {
        handleLoadedTexture(texture)
      }
      texture.image.src = imgPath;
      return texture;
    }
    
    function handleLoadedTexture(texture) {
    ...
      // Allows non-power-of-two textures.
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    ...
    }
    
    function initTexture() {
      textures.push(getTexture("face0.gif"));
      textures.push(getTexture("face1.gif"));
      textures.push(getTexture("face2.gif"));
      textures.push(getTexture("face3.gif"));
      textures.push(getTexture("face4.gif"));
      textures.push(getTexture("face5.gif"));
    }
    
    function initBuffers() {
    ...
      var textureCoords = [
        // Front face
        0.0, 0.0, 1.0,
        1.0, 0.0, 1.0,
        1.0, 1.0, 1.0,
        0.0, 1.0, 1.0,
    
        // Back face
        1.0, 0.0, 2.0,
        1.0, 1.0, 2.0,
        0.0, 1.0, 2.0,
        0.0, 0.0, 2.0,
    
        // Top face
        0.0, 1.0, 3.0,
        0.0, 0.0, 3.0,
        1.0, 0.0, 3.0,
        1.0, 1.0, 3.0,
    
        // Bottom face
        1.0, 1.0, 4.0,
        0.0, 1.0, 4.0,
        0.0, 0.0, 4.0,
        1.0, 0.0, 4.0,
    
        // Right face
        1.0, 0.0, 5.0,
        1.0, 1.0, 5.0,
        0.0, 1.0, 5.0,
        0.0, 0.0, 5.0,
    
        // Left face
        0.0, 0.0, 6.0,
        1.0, 0.0, 6.0,
        1.0, 1.0, 6.0,
        0.0, 1.0, 6.0,
      ];
    ...
    }
    
    function drawScene()
    ...
      for (var i = 0; i < 6; ++i) {
        gl.activeTexture(gl['TEXTURE' + i]);
        gl.bindTexture(gl.TEXTURE_2D, textures[i]);
        gl.uniform1i(gl.getUniformLocation(shaderProgram, 'Sampler' + i), i);
      }
    ...
    }
    

    —————–

    The vTextureCoord.z checks are admittedly odd, but if I check against 5.0, etc., I get strange flickery blending of faces. If you could explain this or how to do it better, that'd be appreciated.

  5. Rich Conlan says:

    (The comment formatting took out the script tags at the top and all my indents, but it’s otherwise what I intended.)

  6. David says:

    Great tutorials once again, thank you so much.

    In this lesson, I think the discussion of changes to initShaders is missing

    1. Add lines for texture coord (and remove the old colour lines)
    shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
    gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);

    2. Add line for get uniform location for uSampler

    shaderProgram.samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler");

  7. giles says:

    @Tony — some people have had problems with using their browsers’ “Save as” functions to save the JavaScript files, like glmatrix.js. Might that be it?

    @LazyBitStream — I did that to show that you can use .s and .t to access vector components, but you’re not the first person to ask. Perhaps I should change it.

    @Rich — clever! So you’re basically adding a new texture “coordinate” which says which sampler to use. I like it. The flickering might be some kind of issue resulting from interpolation of the coordinates, not sure how you’d be able to work around that. (BTW I’ve added “pre” tags around your code, so the formatting’s back.)

    @David — nice point, I’ll look at that.

  8. Nam says:

    I changed the image in neheTexture.image.src = "";
    why didn’t it appear?
    What kind of image should I use?

  9. asterisk11 says:

    I can’t seem to make sense of the mapping between the texture coords and the vertices. How do the following vertices

    vertices = [
    // Back face
    -1.0, -1.0, -1.0,
    -1.0, 1.0, -1.0,
    1.0, 1.0, -1.0,
    1.0, -1.0, -1.0,
    // Top face
    -1.0, 1.0, -1.0,
    -1.0, 1.0, 1.0,
    1.0, 1.0, 1.0,
    1.0, 1.0, -1.0,...]

    map to

    var textureCoords = [
    // Back face
    1.0, 0.0,
    1.0, 1.0,
    0.0, 1.0,
    0.0, 0.0,
    // Top face
    0.0, 1.0,
    0.0, 0.0,
    1.0, 0.0,
    1.0, 1.0, ...]

    ?

  10. Martin says:

    These are really well presented tutorials – though I’d like to get this demo working with six different textures so if someone has a complete example or has successfully implemented Rich Conlan’s suggestions* I would really appreciate any help (*I can get it to load after replacing “attribute vec2 aTextureCoord;” with “attribute vec3 aTextureCoord;”, which may have been skipped, but the textures don’t display correctly)

  11. giles says:

    @Nam — anything that’s a power-of-two size (e.g. 512×512, 512×256, 128×1024) should work.

    @asterisk11 — the number of values each vertex has is defined in the buffer’s “itemSize” attribute. Does that make things clearer?

    @Martin — thanks! If you post a link to your code, I could take a look.

  12. Tatiana says:

    You mentioned that any image that’s a power-of-two size should work as a texture but for some reason it’s not the case. Some images work and some don’t and I can’t figure out what determines that. Any ideas?

  13. giles says:

    @Tatiana — what sizes worked, and what didn’t? Hard to say without knowing that.

  14. Martin says:

    Thanks Giles, I’ve uploaded my attempt here:
    http://www.martin.byethost13.com/multitexturecube.html

  15. Arthur says:

    Hey,
    in the first code snippet, in function webGLStart(), I’m missing the initBuffers() call between initShaders() and initTexture().
    Did you delete it on purpose?

  16. Soohyun says:

    Hi Martin,
    I just tried your example and got an idea of how to make it work.
    cubeVertexTextureCoordBuffer.itemSize = 3; // not 2

  17. giles says:

    @Soohyun — that’s a good point, I think you’re right there.

    @Arthur — no, that was an error — thanks for letting me know! I’ve fixed it now.

  18. Martin says:

    Thanks for your help Soohyun and giles. That got it working!

  19. Taar says:

    Hi,
    As I was planning to experiment with the code of this lesson, I copied the files to my local disk and opened the local HTML file in Firefox 5.0. This didn’t work (WebGL initialized but the image didn’t appear) and the JavaScript console gave me an error like this:

    uncaught exception: … location … handleLoadedTexture … data: no

    After a bit of browsing, I think that the issue is with the new cross domain policy that has been implemented in Firefox 5.0. The local copy of the texture file moon.gif is considered “domain-less” and Firefox declines to load it.

    More details here (german), with a first solution :

    http://www.die-informatiker.net/topic/Computergrafik_SS11/Texturen_in_Firefox_5/15008

    I tried Dominikus Baur’s solution and it works. Another solution is to serve the files locally (e.g. using an HTTP server), so that Firefox sees them as coming from the same site.

    I suspect that other WebGL-compatible-browsers will soon make the same drastic move concerning cross-domain security, so I thought it would be a good idea to tell you about what I found.

    Sorry if that was already covered somewhere else.

  20. Aaron says:

    First off, these are great tutorials and I’m really looking forward to finishing them! However, I’ve run into a small problem on number 5 here that has me completely stumped… I’m using the nehe.gif image from the example and everything works fine except that the texture shows up as solid blue instead of the actual image. I’m not getting any errors in the console. I’m using Mac OS X and Firefox 5, but had the same problem in Firefox 4…

    Thanks in advance to anyone that has some insight on this.

  21. Chpill says:

    Wow, great tutorial! You really got me into WebGL!
    I have a question about the texture mapping.
    I’ve noticed that when I put values greater than 1 in textureCoords, the image starts repeating itself to cover the surface.
    You can check with 2 for example:
    var textureCoords = [
    // Front face
    0.0, 0.0,
    2.0, 0.0,
    2.0, 2.0,
    0.0, 2.0,
    // Back face

    Is it the usual trick to repeat a texture, or are there smarter ways to do it?

  22. Chpill says:

    I found my answer in lesson 10, sorry for the useless post ^^

  23. Samwise says:

    Hello! First off, awesome tutorials; these have allowed me to advance my knowledge of WebGL by leaps and bounds in a short period of time.

    That being said, I seem to have run into an odd problem. I am using Firefox 5.0 and am unable to display textures; after some googling I discovered this:
    http://hacks.mozilla.org/2011/06/cross-domain-webgl-textures-disabled-in-firefox-5/

    As the reason behind my inability to see the textures for this lesson. Testing with chrome confirmed that the code is fine and it is indeed a Firefox issue. However what’s weird is I’m able to view your live pages perfectly fine on both FF 5.0 and Chrome.

    I’m going to keep poking around to see if I can find a solution to this, and any insight you can provide would be awesome, but this was mostly just a heads-up. Maybe put a notice in your lessons about this?

  24. Mihai Damian says:

    @Samwise

    I saw this solution on the link Taar posted:

    - open about:config in Firefox
    - change security.fileuri.strict_origin_policy to false

    You are able to view the live pages because there is no cross domain request for textures in that case.

  25. Andreas says:

    Hello,
    I really like your lessons and so far everything has worked.

    But now I am unable to run the lesson locally :(

    Perhaps you can help me? I saved all the necessary files: nehe.gif, glMatrix-0.9.5.min.js, ga.js and webgl-utils.js.

    What do I have to modify to make it run? Here is what I saved locally:
    https://rapidshare.com/files/434925410/Lesson5.zip

  26. Andreas says:

    Oh ok ;)

    Should have read the comment above me first ;)

    This worked for me:

    “- open about:config in Firefox
    - change security.fileuri.strict_origin_policy to false

    You are able to view the live pages because there is no cross domain request for textures in that case.”

  27. Thai says:

    Thanks for Mihai Damian’s help!

  28. VengantMjolnir says:

    I discovered that Chrome doesn’t like to load the files from the local system and was throwing an uncaught error exception. This was fixed by setting up a development shortcut for Chrome that added the command line parameter “--allow-file-access-from-files”.

    Hope this helps anyone else who runs into this!

  29. Jarav says:

    Are comments closed? Tried to submit a question twice and haven’t been successful.

  30. Jarav says:

    Hi,

    I have a non-animated, 2D (no transformations) version of your lesson05, displaying a texture. Instead of a cube I have only a square. For some reason, simply calling ‘drawScene()’ does not display anything. On the other hand, if I use your ‘tick’ function, which calls ‘drawScene’ repeatedly, I get the texture displayed. This is only for the texture display. If I un-comment the commented line in the fragment shader, which displays a white-filled square, it works without the ‘tick’ function. The changes I have made are in the fragment shader, initBuffers, drawScene and webGLStart. Here are the changes:

    #ifdef GL_ES
    precision highp float;
    #endif

    varying vec2 vTextureCoord;

    uniform sampler2D uSampler;

    void main(void) {
    gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
    //gl_FragColor = vec4(1.0,1.0,1.0,1.0);
    }

    function initBuffers() {
    cubeVertexPositionBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexPositionBuffer);
    vertices = [
    // Front face, only a square
    -0.5, -0.5, 0.0,
    0.5, -0.5, 0.0,
    -0.5, 0.5, 0.0,
    0.5, 0.5, 0.0,

    ];
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(vertices), gl.STATIC_DRAW);
    cubeVertexPositionBuffer.itemSize = 3;
    cubeVertexPositionBuffer.numItems = 4;

    cubeVertexTextureCoordBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
    var textureCoords = [
    // Front face
    0.0, 0.0,
    1.0, 0.0,
    0.0, 1.0,
    1.0, 1.0
    ];
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(textureCoords), gl.STATIC_DRAW);
    cubeVertexTextureCoordBuffer.itemSize = 2;
    cubeVertexTextureCoordBuffer.numItems = 4;
    }

    function drawScene() {
    gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    // No transformations
    //mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.0, 100.0, pMatrix);
    mat4.identity(pMatrix);
    // No transformations
    mat4.identity(mvMatrix);

    //mat4.translate(mvMatrix, [0.0, 0.0, -5.0]);

    // No animation
    //mat4.rotate(mvMatrix, degToRad(zRot), [0, 0, 1]);

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexPositionBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, cubeVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
    gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, cubeVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, neheTexture);
    gl.uniform1i(shaderProgram.samplerUniform, 0);

    setMatrixUniforms();
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
    }

    function webGLStart() {
    var canvas = document.getElementById("lesson05-canvas");
    initGL(canvas);
    initShaders();
    initBuffers();
    initTexture();

    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    gl.disable(gl.DEPTH_TEST);
    drawScene();// this doesn’t work
    //tick();// but this works
    }

    Hope you can help. Thanks.

  31. pschlupnzl says:

    It looks like you can avoid the cross-domain restriction for loading textures by directly embedding the base64-encoded image data within the source file. There are on-line converters and Javascript samples. I turned off the Firefox security.fileuri.strict_origin_policy, converted the image (to png), copied out the base64 text, and turned the security setting back on.

    Maybe you could link to a text file with this encoding within your tutorial?

    See:
    http://stackoverflow.com/questions/934012/get-image-data-in-javascript
    http://stackoverflow.com/questions/2704929/uncaught-error-security-err-dom-exception-18

    Thanks for the awesome tutorials, by the way!

  32. Paul C says:

    First, thanks for these great lessons. I’m working on my own simple file format for meshes and am writing an exporter from Blender. In Blender, it seems that the UV mapping is stored per-face. This makes sense to me since the UV mapping for a vertex could differ depending on the face it’s currently being rendered as a part of.

    I’m wondering if there’s a way to specify the UV mapping per-face rather than per-vertex. Any ideas?

    Thanks!

  33. Amir says:

    Thank you,
    but I am unable to run the code on a local server. Your code runs online, but when I save the code to my local server it doesn’t run, and I don’t know why. Please, can anyone help? I had the same problem with the lesson 4 examples too, but I left it and moved on, thinking it was something I wasn’t yet in a position to understand.
    Please reply soon.

  34. Amir says:

    It was actually my browser’s problem. I don’t know what happened, but on saving the page from View Source it ended up with requestAnimFrame by default instead of the requestAnimationFrame function.

  35. Amir says:

    @Rich
    The lines:
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    that you added in handleLoadedTexture()
    are not working for any image. Please let me know if you resolved this issue.

    Thanks

  36. Ben says:

    I’m using the stable Windows XP Chrome and I can run all your live lesson demos. However, when running the code locally from git, it works up to lesson 4; from lesson 5 onward, I get only a black canvas.

    The texImage2D() call is raising a DOM exception 18.

    Another data point: the stable Linux Chrome does not raise that error and I can run all lesson examples locally.

    Any ideas?

  37. Jason Slemons says:

    @VengantMjolnir: thanks for the tip! However, for me, when trying to get the demo to load the nehe.gif in Chrome, I found that ‘chrome.exe --allow-file-access-from-files --allow-file-access’ was required.

  38. giles says:

    @Taar, @Samwise — you’re right, they’ve made the requirements for loading textures stricter. I’ll update the tutorial to make that clear. The best thing is normally to use a local web server.

    @Aaron — sounds like you might need to double-check the texture coordinates; you might be getting one of the corners of the image spread over the whole of each face.

    @Chpill — no problems ;-)

    @Jarav — put the drawScene call in the function that’s called when the texture is loaded, and you should be fine. I think the problem is that the when you’re calling it in webGLStart, the texture has not been loaded yet.

    @pschlupnzl — that’s a really clever idea, but I don’t think I should be teaching people to do it at this stage in the tutorials — they might get into bad habits.

    @Paul C — remember that a vertex, as far as WebGL is concerned, is a bundle of attributes, some of which might be its location. So the corner of a cube where the texture coordinates differ for each of the three faces that the corner joins is not one vertex but three, which all have the same location but different UV coordinates. This means that when you’re generating a model from a Blender file like the one you describe, you’ll generate a number of vertices for each Blender vertex. (This way of representing the data is a little inefficient in terms of memory, but less so than you’d think — and it’s *much* faster for the graphics card to process.)

    @Ben — sounds like the same-origin problem that was reported above. Try it with a local webserver.

  39. Kel Murphy says:

    Thanks Mihai.

    Verified in Firefox 7.0.1 –
    To fix running images locally:
    In URL bar type–
    about:config
    Filter by this name–
    security.fileuri.strict_origin_policy
    Change to FALSE.

    Restart browser and local images in WebGL will now work. :)

  40. Ecky says:

    Hi, I don’t know what went wrong, but I’m unable to run this on my machine. I’m using Chrome and I have disabled the domain policy.
    I have also used the example code (just copy and paste), and it’s still not working. I can see the black background but I can’t see the cube!

    Does anyone have this problem?

  41. [...] tutorial in the Aprende webGL series. It is a non-literal translation of the corresponding tutorial on Learning webGL, which in turn is based on chapter 6 of NeHe’s OpenGL tutorial. This time we are going to [...]

  42. Alex Bunting says:

    Thanks for this, fantastic tutorial!

    Aside from the missing shaders bit (I write the code out by hand as I go so I can see what I’m writing, so this was a bit puzzling for a while!): if people are having trouble getting this working, it literally doesn’t work unless the image size is a power of two.

    Ecky, check the image you are using and resize it to 128×128, 256×256 etc.

    If this still doesn’t work, stick a ‘webgl.verbose = true;’ at the top of your code and see if it spits out any issues.

  43. Ash Cairo says:

    If you have problems loading non-power of 2 textures, you can set the ST wrapping to clamp to the edges and non-power of 2 textures will be supported.

    // Required if using non power of 2 textures
    gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE );
    gl.texParameteri( gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE );

  44. jas says:

    Hi!

    Thanks for creating this nice tutorial!

    If I’m not wrong, you might have forgotten to explain the changes in initShaders(). There should be some mapping between the texture coordinate buffer and the aTextureCoord attribute:

    shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
    gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);

  45. Sylvain says:

    Hello !

    Firstly, thank you very much for this website, which is very clear and is letting me learn WebGL quickly.

    My problem is as follows:
    I executed the HTML file with the JavaScript tools and the image, but I saw nothing on the web page, just the black window. I use Google Chrome and my project is offline. The code is the same as yours, so I don’t understand.

    I really need help! Thanks!

  46. Sylvain says:

    Sorry, I discovered why it didn’t work. With JavaScript, we can’t load data straight from the client’s machine; that’s logical…

    Now, with Wamp, everything is solved!

  47. Lord Ashes says:

    An alternative way to get 6 different sides (textures) onto the cube, especially when they are textures for a single object, is to use texture tiles. Basically, create one GIF file that has 6 equal squares, each representing a side of the cube.

    Then all you have to do is adjust the Texture Coordinates to select the desired tile from the texture for each face without having to do any other changes (i.e. you can use the Lesson 5 code directly with just modifications to the Texture Coordinates and a swap of the GIF file).

    The Texture Coordinates are a unit representation of the texture. So a value of 0 to 1 horizontally and 0 to 1 vertically will fit the texture on the face exactly once. If the Texture Coordinates are 0 to 2 horizontally and vertically then the texture will fit 2 times in each direction (for a total of 4 times) on the face.

    The same is true when the Texture Coordinates are a fraction. If the Texture Coordinates are 0 to 0.5 horizontally and vertically then only the corner half in each direction (and thus a 4th of the texture) will be displayed.

    This concept can be used to switch between tiles in a texture. Say that we have a 3×2 tile texture (i.e. a texture that contains 3 equal faces horizontally and two vertically or 6 faces in total). The Texture Coordinates would then be:

    0.00 0.00 // First Tile Lower Row (Remember GIF Is Flipped)
    0.00 0.50
    0.33 0.50
    0.33 0.00

    0.34 0.00 // Second Tile Lower Row
    0.34 0.50
    0.66 0.50
    0.66 0.00

    0.67 0.00 // Third Tile Lower Row
    0.67 0.50
    1.00 0.50
    1.00 0.00

    0.00 0.51 // First Tile First Row
    0.00 1.00
    0.33 1.00
    0.33 0.51

    0.34 0.51 // Second Tile First Row
    0.34 1.00
    0.66 1.00
    0.66 0.51

    0.67 0.51 // Third Tile First Row
    0.67 1.00
    1.00 1.00
    1.00 0.51

    This concept lets you define the faces for the same object in a single texture file, as opposed to having many texture files for the same object.

    I tried it with a 3×3 tile matrix (not using 3 of the tiles) and ran into some minor problems until I realized that the GIF is flipped (i.e. my face was displaying blank until I realized I was displaying one of the unused tiles).

  48. Lord Ashes says:

    You may actually have to use a 3×3 grid, as I did in my initial test, to ensure that the texture size is a power of 2 (assuming that this applies to both horizontal and vertical dimensions)

  49. Lord Ashes says:

    Anim8or is a free, tiny 3D program that allows the creation of 3D objects. It imports 3DS objects (but not 3DS Max). It can be used to make 3D objects, create bones and joints to animate the objects, make object sequences and then make a scene complete with camera and lighting.

    All this for FREE, and the alpha version is 2MB.
    No, this is not a typo. Not 20MB. Not 200MB. Not 2GB! 2MB!
    The previous release version (I currently use the alpha version) fit on a 1.4MB floppy (for those of you who remember what that is).

    Anyway, the Object Editor in this program has a few Export options (VTX, OBJ, C, 3DS and AN8). If you use the C export option, a C file will be generated (for use in C programs) but the values of the arrays can be copied directly into Lesson 5 and it will work without needing to do any re-formatting. Just don’t copy the variable names, copy only the content and paste it into the appropriate place in Lesson 5. You can extract the coordinates, texture coordinates and the indices (all three items needed for the lesson 5 code). Ignore the few other arrays that the file contains (such as normals).

    http://www.anim8tor.com

    I am not the author of this program but I have found it very useful.

  50. giogts says:

    Lord Ashes
    I copied all the values you mention from an Anim8or .c file, but
    what numbers do I put in
    cubeVertexPositionBuffer.numItems,
    cubeVertexTextureCoordBuffer.numItems and
    cubeVertexIndexBuffer.numItems?

