WebGL Lesson 5 – introducing textures

<< Lesson 4 | Lesson 6 >>

Welcome to number five in my series of WebGL tutorials, based on number 6 in the NeHe OpenGL tutorials. This time we’re going to add a texture to a 3D object — that is, we will cover it with an image that we load from a separate file. This is a really useful way to add detail to your 3D scene without having to make the objects you’re drawing incredibly complex. Imagine a stone wall in a maze-type game; you probably don’t want to model each block in the wall as a separate object, so instead you create an image of masonry and cover the wall with it; a whole wall can now be just one object.

Here’s what the lesson looks like when run in a browser that supports WebGL:

Click here and you’ll see the live WebGL version, if you’ve got a browser that supports it; here’s how to get one if you don’t.

More on how it all works below…

The usual warning: these lessons are targeted at people with a reasonable amount of programming knowledge, but no real experience in 3D graphics; the aim is to get you up and running, with a good understanding of what’s going on in the code, so that you can start producing your own 3D Web pages as quickly as possible. If you haven’t read the previous tutorials already, you should probably do so before reading this one — here I will only explain the differences between the code for lesson 4 and the new code.

There may be bugs and misconceptions in this tutorial. If you spot anything wrong, let me know in the comments and I’ll correct it ASAP.

There are two ways you can get the code for this example; just “View Source” while you’re looking at the live version, or if you use GitHub, you can clone it (and the other lessons) from the repository there. Either way, once you have the code, load it up in your favourite text editor and take a look.

The trick to understanding how textures work is that they are a special way of setting the colour of a point on a 3D object. As you will remember from lesson 2, colours are specified by fragment shaders, so what we need to do is load the image and send it over to the fragment shader. The fragment shader also needs to know which bit of the image to use for the fragment it’s working on, so we need to send that information over to it too.

Let’s start off by looking at the code that loads the texture. We call it right at the start of the execution of our page’s JavaScript, in webGLStart at the bottom of the page (the new line is the call to initTexture):

  function webGLStart() {
    var canvas = document.getElementById("lesson05-canvas");
    initGL(canvas);
    initShaders();
    initBuffers();
    initTexture();

    gl.clearColor(0.0, 0.0, 0.0, 1.0);

Let’s look at initTexture — it’s about a third of the way from the top of the file, and is all new code:

  var neheTexture;
  function initTexture() {
    neheTexture = gl.createTexture();
    neheTexture.image = new Image();
    neheTexture.image.onload = function() {
      handleLoadedTexture(neheTexture);
    };

    neheTexture.image.src = "nehe.gif";
  }

So, we’re creating a global variable to hold the texture; obviously in a real-world example you’d have multiple textures and wouldn’t use globals, but we’re keeping things simple for now. We use gl.createTexture to create a texture reference to put into the global, then we create a JavaScript Image object and put it into a new attribute that we attach to the texture, yet again taking advantage of JavaScript’s willingness to set any field on any object; texture objects don’t have an image field by default, but it’s convenient for us to have one, so we create one. The obvious next step is to get the Image object to load up the actual image it will contain, but before we do that we attach a callback function to it; this will be called when the image has been fully loaded, and so it’s safest to set it first. Once that’s set up, we set the image’s src property, and we’re done. The image will load asynchronously — that is, the code that sets the src of the image will return immediately, and a background thread will load the image from the web server. Once it’s done, our callback gets called, and it calls handleLoadedTexture:

  function handleLoadedTexture(texture) {
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture.image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.bindTexture(gl.TEXTURE_2D, null);
  }

The first thing we do is tell WebGL that our texture is the “current” texture. WebGL texture functions all operate on this “current” texture instead of taking a texture as a parameter, and bindTexture is how we set the current one; it’s similar to the gl.bindBuffer pattern that we’ve looked at before.

Next, we tell WebGL that all images we load into textures need to be flipped vertically. We do this because of a difference in coordinates; for our texture coordinates, we use coordinates that, like the ones you would normally use in mathematics, increase as you move upwards along the vertical axis; this is consistent with the X, Y, Z coordinates we’re using to specify our vertex positions. By contrast, most other computer graphics systems — for example, the GIF format we use for the texture image — use coordinates that increase as you move downwards on the vertical axis. The horizontal axis is the same in both coordinate systems. This difference on the vertical axis means that from the WebGL perspective, the GIF image we’re using for our texture is already flipped vertically, and we need to “unflip” it. (Thanks to Ilmari Heikkinen for clarifying that in the comments.)

The next step is to upload our freshly-loaded image to the texture’s space in the graphics card using texImage2D. The parameters are, in order, what kind of image we’re using, the level of detail (which is something we’ll look at in a later lesson), the format in which we want it to be stored on the graphics card (repeated twice for reasons we’ll also look at later), the size of each “channel” of the image (that is, the datatype used to store red, green, or blue), and finally the image itself.

On to the next two lines: these specify special scaling parameters for the texture. The first tells WebGL what to do when the texture is filling up a large amount of the screen relative to the image size; in other words, it gives it hints on how to scale it up. The second is the equivalent hint for how to scale it down. There are various kinds of scaling hints you can specify; NEAREST is the least attractive of these, as it just says you should use the original image as-is, which means that it will look very blocky when close-up. It has the advantage, however, of being really fast, even on slow machines. In the next lesson we’ll look at using different scaling hints, so you can compare the performance and appearance of each.

Once this is done, we set the current texture to null; this is not strictly necessary, but is good practice; a kind of tidying up after yourself.
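One caveat worth knowing about (it comes up repeatedly in the comments below): in WebGL, a texture whose width or height is not a power of two cannot use mipmaps or REPEAT wrapping, and with incompatible parameters it will sample as black. Here’s a sketch of a variant of handleLoadedTexture that checks for this and falls back to settings that are legal for any image size; the helper names are my own, not part of the lesson’s code:

```javascript
// Is n a power of two? (Standard bit trick: a power of two has
// exactly one bit set, so n & (n - 1) is zero.)
function isPowerOfTwo(n) {
  return n > 0 && (n & (n - 1)) === 0;
}

// Sketch: upload an image to a texture, using mipmaps when the image
// dimensions allow it and NPOT-safe parameters otherwise.
// "gl" is a WebGL rendering context; the function name is illustrative.
function uploadTexture(gl, texture, image) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
  if (isPowerOfTwo(image.width) && isPowerOfTwo(image.height)) {
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
    gl.generateMipmap(gl.TEXTURE_2D);
  } else {
    // Non-power-of-two: no mipmaps, and wrapping must be clamped.
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
  }
  gl.bindTexture(gl.TEXTURE_2D, null);
}
```

For this lesson none of that is needed, because nehe.gif is 256×256, a power of two in both directions.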

So, that’s all the code required to load the texture. Next, let’s move on to initBuffers. This has, of course, lost all of the code relating to the pyramid from lesson 4, which we’ve now removed; a more interesting change is the replacement of the cube’s vertex colour buffer with a new one — the texture coordinate buffer. It looks like this:

    cubeVertexTextureCoordBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
    var textureCoords = [
      // Front face
      0.0, 0.0,
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,

      // Back face
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,
      0.0, 0.0,

      // Top face
      0.0, 1.0,
      0.0, 0.0,
      1.0, 0.0,
      1.0, 1.0,

      // Bottom face
      1.0, 1.0,
      0.0, 1.0,
      0.0, 0.0,
      1.0, 0.0,

      // Right face
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,
      0.0, 0.0,

      // Left face
      0.0, 0.0,
      1.0, 0.0,
      1.0, 1.0,
      0.0, 1.0,
    ];
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(textureCoords), gl.STATIC_DRAW);
    cubeVertexTextureCoordBuffer.itemSize = 2;
    cubeVertexTextureCoordBuffer.numItems = 24;

You should be pretty comfortable with this kind of code now, and see that all we’re doing is specifying a new per-vertex attribute in an array buffer, and that this attribute has two values per vertex. What these texture coordinates specify is where, in cartesian x, y coordinates, the vertex lies in the texture. For the purposes of these coordinates, we treat the texture as being 1.0 wide by 1.0 high, so (0, 0) is at the bottom left, (1, 1) the top right. The conversion from this to the real resolution of the texture image is handled for us by WebGL.
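To make the unit-square convention concrete, here’s an illustrative helper (not part of the lesson’s code) showing how a normalized coordinate pair maps onto an actual pixel of the image; WebGL performs this conversion, plus filtering, on the GPU for us:

```javascript
// Map normalized texture coordinates (each in 0..1, origin at the
// bottom-left) to a pixel position in an image of the given size.
// Illustrative only: this is the scaling WebGL does internally.
function texelFromUV(u, v, width, height) {
  return {
    x: Math.min(width - 1, Math.floor(u * width)),
    y: Math.min(height - 1, Math.floor(v * height))
  };
}
```

So on our 256×256 nehe.gif, the coordinate (0.5, 0.5) lands in the middle of the image, and (1, 1) on its last pixel.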

That’s the only change in initBuffers, so let’s move on to drawScene. The most interesting changes in this function are, of course, the ones that make it use the texture. However, before we go through these, there are a number of changes related to really simple stuff like the removal of the pyramid and the fact that the cube is now spinning around in a different way. I won’t describe these in detail, as they should be pretty easy to work out; you can see them in this snippet from the top of the drawScene function:

  var xRot = 0;
  var yRot = 0;
  var zRot = 0;
  function drawScene() {
    gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);

    mat4.identity(mvMatrix);

    mat4.translate(mvMatrix, [0.0, 0.0, -5.0]);

    mat4.rotate(mvMatrix, degToRad(xRot), [1, 0, 0]);
    mat4.rotate(mvMatrix, degToRad(yRot), [0, 1, 0]);
    mat4.rotate(mvMatrix, degToRad(zRot), [0, 0, 1]);

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexPositionBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, cubeVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);

There are also matching changes in the animate function to update xRot, yRot and zRot, which I won’t go over.

So, with those out of the way, let’s look at the texture code. In initBuffers we set up a buffer containing the texture coordinates, so here we need to bind it to the appropriate attribute so that the shaders can see it:

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
    gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, cubeVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

…and now that WebGL knows which bit of the texture each vertex uses, we need to tell it to use the texture that we loaded earlier, then draw the cube:

    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, neheTexture);
    gl.uniform1i(shaderProgram.samplerUniform, 0);

    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cubeVertexIndexBuffer);
    setMatrixUniforms();
    gl.drawElements(gl.TRIANGLES, cubeVertexIndexBuffer.numItems, gl.UNSIGNED_SHORT, 0);

What’s happening here is somewhat complex. WebGL can deal with up to 32 textures during any given call to functions like gl.drawElements, and they’re numbered from TEXTURE0 to TEXTURE31. What we’re doing is saying in the first two lines that texture zero is the one we loaded earlier, and then in the third line we’re passing the value zero up to a shader uniform (which, like the other uniforms that we use for the matrices, we extract from the shader program in initShaders); this tells the shader that we’re using texture zero. We’ll see how that’s used later.
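If you wanted a second texture in the same draw call, you would just repeat that three-line pattern with the next texture unit. Here’s a minimal sketch; the second texture, the sampler2Uniform field, and the function wrapper are hypothetical names of my own, not part of this lesson’s code:

```javascript
// Sketch: bind two textures to texture units 0 and 1 before a draw
// call. "program" is assumed to carry two sampler uniform locations,
// samplerUniform and the hypothetical sampler2Uniform.
function bindTwoTextures(gl, program, textureA, textureB) {
  gl.activeTexture(gl.TEXTURE0);
  gl.bindTexture(gl.TEXTURE_2D, textureA);
  gl.uniform1i(program.samplerUniform, 0);

  gl.activeTexture(gl.TEXTURE1);
  gl.bindTexture(gl.TEXTURE_2D, textureB);
  gl.uniform1i(program.sampler2Uniform, 1);
}
```

The fragment shader would then declare a second sampler2D uniform to go with the second unit.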

Anyway, once those three lines are executed, we’re ready to go, so we just use the same code as before to draw the triangles that make up the cube.

The only remaining new code to explain is the changes to the shaders. Let’s look at the vertex shader first:

  attribute vec3 aVertexPosition;
  attribute vec2 aTextureCoord;

  uniform mat4 uMVMatrix;
  uniform mat4 uPMatrix;

  varying vec2 vTextureCoord;

  void main(void) {
    gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);
    vTextureCoord = aTextureCoord;
  }

This is very similar to the colour-related stuff we put into our vertex shader in lesson 2; all we’re doing is accepting the texture coordinates (again, instead of the colour) as a per-vertex attribute, and passing them straight out in a varying variable.

Once this has been called for each vertex, WebGL will work out values for the fragments (which, remember, are basically just pixels) between vertices by using linear interpolation between the vertices — just as it did with the colours in lesson 2. So, a fragment half-way between vertices with texture coordinates (1, 0) and (0, 0) will get the texture coordinates (0.5, 0), and one halfway between (0, 0) and (1, 1) will get (0.5, 0.5). Next stop, the fragment shader:

  precision mediump float;

  varying vec2 vTextureCoord;

  uniform sampler2D uSampler;

  void main(void) {
    gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
  }

So, we pick up the interpolated texture coordinates, and we have a variable of type sampler, which is the shader’s way of representing the texture. In drawScene, our texture was bound to gl.TEXTURE0, and the uniform uSampler was set to the value zero, so this sampler represents our texture. All the shader does is use the function texture2D to get the appropriate colour from the texture using the coordinates. Textures traditionally use s and t for their coordinates rather than x and y, and the shader language supports these as aliases; we could just as easily have used vTextureCoord.x and vTextureCoord.y.
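Going back to the interpolation for a moment: the per-fragment blending that WebGL does between the vertices can be written out directly. This illustrative helper (in JavaScript rather than shader code, and not part of the lesson) reproduces the worked examples above:

```javascript
// Linear interpolation between two texture-coordinate pairs, as the
// rasteriser does per-fragment between two vertices. t runs from 0
// (at coordinate pair a) to 1 (at coordinate pair b).
function lerpTexCoord(a, b, t) {
  return [
    a[0] + (b[0] - a[0]) * t,
    a[1] + (b[1] - a[1]) * t
  ];
}
```

At t = 0.5, a fragment between (1, 0) and (0, 0) gets (0.5, 0), and one between (0, 0) and (1, 1) gets (0.5, 0.5), just as described earlier.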

Once we have the colour for the fragment, we’re done! We have a textured object on the screen.

So, that’s it for this time. Now you know all there is to learn from this lesson: how to add textures to 3D objects in WebGL by loading an image, telling WebGL to use it for a texture, giving your object texture coordinates, and using the coordinates and the texture in the shaders.
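One practical wrinkle before you move on (several readers ran into it in the comments below): because images load asynchronously, a scene with several textures shouldn’t start its render loop until every image has arrived, or the not-yet-uploaded textures will render black. A simple way to arrange that, sketched with illustrative names of my own, is a shared counter that fires a callback once the last image loads:

```javascript
// Sketch: return a function to be used as each image's onload handler;
// it invokes onAllLoaded once it has been called expectedCount times.
// makeLoadTracker is an illustrative name, not part of this lesson.
function makeLoadTracker(expectedCount, onAllLoaded) {
  var loaded = 0;
  return function() {
    loaded++;
    if (loaded === expectedCount) {
      onAllLoaded();
    }
  };
}

// Usage sketch: each image's onload calls the tracker once, and the
// render loop (tick, in this lesson's code) only starts afterwards.
//   var oneLoaded = makeLoadTracker(3, tick);
//   imageA.onload = oneLoaded;
//   imageB.onload = oneLoaded;
//   imageC.onload = oneLoaded;
```

With just one texture, as here, attaching the start of the render loop to that single image’s onload works just as well.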

If you have any questions, comments, or corrections, please do leave a comment below!

Otherwise, check out the next lesson, in which I show how you can get basic key-based input into the JavaScript that animates your 3D scene, so that we can start making it interact with the person viewing the web page. We’ll use that to allow the viewer to change the spin of the cube, to zoom in and out, and to adjust the hints given to WebGL to control the scaling of textures.

<< Lesson 4 | Lesson 6 >>

Acknowledgments: Chris Marrin’s spinning box was a great help when writing this, as was an extension of that demo by Jacob Seidelin. As always, I’m deeply in debt to NeHe for his OpenGL tutorial for the script for this lesson.


142 Responses to “WebGL Lesson 5 – introducing textures”

  1. Lord Ashes says:

    ****************************************
    *** WebGL Joint And Object Framework ***
    ****************************************

    I have created a WebGL-based Framework which allows users to easily create object parts that are joined to each other and can be moved and rotated with respect to each other.

    The framework is designed to be easy to use by hiding all lower level WebGL implementation. Don’t know what a WebGL buffer is or what shaders or binding means? No problem. When you use the JOA Framework you don’t need to know about any of that because the JOA Framework will handle all that for you.

    All you need is basic knowledge of Vertices, Texture Coordinates and Indices. Since you are on Lesson 4 of the WebGL Tutorial, I’m guessing you do. If not go through the first 5 Tutorial lessons first.

    The JOA Framework implements an object structure that allows objects to be related to each other (such as the upper arm is connected to the lower arm), allows the user to define the 3D appearance of the objects, allows the objects to be textured, allows the objects to be moved and rotated while keeping their connections to their parents and children and even allows objects to be cloned so that you can create multiple objects from one object definition.

    A sample project of a bendable human body is provided.

    Instructions are also provided how to extract data from a freely available program called Anim8tor so that the 3D objects can be created using a program GUI as opposed to hand-coding.

    My JOA Framework, available for free, is downloadable from Media Fire at the following link:

    http://www.mediafire.com/?7ihdm8itdyec08w

    The template file, support files and the Framework files are all coded in HTML or Javascript, so you can easily verify that there is no malicious code inside.

    The Framework includes everything you need to start using the framework. No additional files are required (except for a WebGL compatible browser).

    Merry Christmas!

  2. Lord Ashes says:

    WebGL Joint And Object (JOA) Framework Version 1.1 Update

    Revision Log:

    * No changes to the JOA Framework itself.
    * Only changes to the Body Project Example as follows:

    * Added Hands
    * Added Boots
    * Added Boots Texture
    * Added Walking Animation
    * Added Key To Toggle Animation On/Off

    Link: http://www.mediafire.com/?knpw896bqsjttby

    Enjoy!

  3. ReaperUnreal says:

    Just a note, you mentioned changes in initShaders() but didn’t explicitly cover them. Not sure if this was intended or not. I mean, they’re pretty simple changes, but might be worth going over.

  4. Tux says:

    FYI:
    If you wish to save this code and run it locally (to play around with it), the texture image will not load because it is considered a cross-domain image in this context. To fix this you need to add the line:
    neheTexture.image.crossOrigin = "anonymous";
    just below:
    neheTexture.image = new Image();

    Happy New Year….

  5. zproxy says:

    IDL says:

    void pixelStorei(GLenum pname, GLint param);

    Example could state:

    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, 1);

  6. l5 says:

    Hi,

    Locally, this example doesn’t work in Chrome…only works in FF. Whats wrong?

  7. [...] http://learningwebgl.com/blog/?p=507 Because I am an amateur translator, the translation may not be very good; if you are interested, you can refer to the original article as you study, and I look forward to making progress together with you. Author: cocoa | Category: html5+css | Tags: [...]

  8. Erik says:

    To I5:
    I was having the same problem. Kept getting:
    Uncaught Error: SECURITY_ERR: DOM Exception 18
    Turns out it’s somehow a security risk to load files locally, even though you’re running your JavaScript in the same directory.
    I found two ways to test locally, but here’s the one I went with. You run Chrome from the command line with the flag "--allow-file-access-from-files". You have to make sure you close all Chrome windows before you do that or else it won’t work.

  9. hdam says:

    Hi,

    Thank you very much for this fantastic series.

    I want to play with different images, simply by loading different image files, but most of them don’t display (see only black background). What have I done wrong? any restrictions on the image size, style, format, etc.?

    Thanks – H

  10. hdam says:

    My bad, found previous posts about the image size should be a power of 2.

    -H

  11. Dheepan Raju says:

    Hi,
    When i tried to run it with different image, It simply shows black background. Nothing is happening. Could you tell me what should i do for that?

    Thanks in advance.

  12. [...] worked with Learning WebGL. Have done almost 5 lessons, but can’t get the textures to work in lesson 5. ~7h This entry was posted in Uncategorized by Jonas Petersson. Bookmark the [...]

  13. I see this has already been mentioned, but I think it’s worth emphasizing the fact that the texture size MUST BE A POWER OF 2! This was the case in OpenGL 2.0 and earlier, and it is still the case in WebGL.

  14. casi says:

    I have the same problem... it’s not loading when I try with a different image of the same dimensions, 256x256... any solutions??

  15. Danny says:

    @Casi Make sure the color depth of the image is 256 :P .

  16. Erik says:

    Hello all. Upon completing this tutorial, my object is being rendered with colour, except the colour is uniform. Basically, the whole thing is exactly the same shade of green as opposed to having varying colours based on my texture.

    I thought that this may be a lighting issue (since I have not added any) in that everything defaults to the same shade; but the more I think about that, the less likely it seems.

    The other option is that my textureCoords (i.e. 0.0 to 1.0) aren’t being set correctly, but I’ve done some digging there and it does seem to be setting correctly. In that I mean: my bottom-left vertex has UV (0, 0), my top-right vertex has UV (1, 1) and the inbetweeners have some range between that based on their position.

    Any thoughts on something I may be missing?

    Appreciate any feedback.

  17. Erik says:

    If I change the very first pixel (i.e. top-left) in the image, everything gets set to that colour. So it must be something to do with my textureCoords.

  18. Erik says:

    Got it. I had missed adding this to my initShaders method:

    shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
    gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);

    Which I see now that ReaperUnreal mentioned above.

  19. Olle says:

    Hi!

    When I look at the live Webgl demo it sometimes freezes a bit. Just for very short time but it happens frequently and not reguarly. I look at it in Firefox.

    Does anyone know why this happens?

  20. Chris Lord says:

    Nice tutorial, but it really ought to cover deleting the texture too…

  21. [...] The following is a Japanese translation of WebGL Lesson 5 – introducing textures. [...]

  22. Robert says:

    Hi,

    First I want to say that your tutorials are awesome. I already knew a lot of Open GL when I started them, but I feel like you have really deepened my understanding.

    However, when downloading the source code and running this lesson from my computer I am only seeing a black box where the canvas is. All previous 4 tutorials worked so I believe it is something to do with the loading of the texture. I downloaded your nehe.gif to my local directory and it seems as if it’s being used but somehow not loading correctly. I’m not sure if this is the right approach or what I should try but any help would be really great.

    Thanks

  23. Robert says:

    Also, I’ve traced it to this line in the code that it doesn’t go past.

    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture.image);

    In the function
    function handleLoadedTexture(texture) {
    alert(“Entered handleLoadedTexture”);
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
    alert(“Got here”); <—- It gets here
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture.image);
    alert("Got here2"); <—- But it doesn't get here
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.bindTexture(gl.TEXTURE_2D, null);
    alert("Exited handleLoadedTexture");
    }

  24. Robert says:

    For anyone else with the same issue, chrome doesn’t normally allow access to local files (as someone nicely mentioned above). Running from the command line (on a mac) with

    open /Applications/Google\ Chrome.app --args --allow-file-access-from-files

    will fix this problem. (but make sure you close all open windows first)

    Thanks again for the awesome tutorials, I’m really excited to continue them :)

  25. John Wright says:

    I am trying to make a texture scroll down, the Y-axis, on a 2D square via the shaders. My plan is to simply update the uv coordinates to make the texture appear to move instead of translating the piece of geometry with the texture on it. I can get the texture to move, but it never updates. Can anyone please give me an example of how this can be accomplished?

  26. AJ says:

    I spend the longest time comparing code trying to figure out why the texture wasn’t rendering. Turns out my camera was pulled back farther than in the example and as a result caused problems with the scaling used in the example. After digging around I ended up generating a mipmap and all worked.

    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, texture.image);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);
    gl.generateMipmap(gl.TEXTURE_2D);
    gl.bindTexture(gl.TEXTURE_2D, null);

  27. Dima says:

    For chrome users.

    You can install chrome canary and run it with the flag
    --allow-file-access-from-files

    And you can have your normal chrome running with many windows you need.

  28. [...] WebGL game can modify the UV-Coordinates of a vertex to reference a subsection of a larger image using normalized floating point [...]

  29. Anastasia says:

    Hello!

    I would like to texture the pyramid from the previous lessons. Does texturing a pyramid differ much from texturing a cube? I’d like to know the size of the texture and the position of the vertices. I’d appreciate examples.

    Thanks!

  30. Andy says:

    Using the switch –allow-file-access-from-files I can get the texture to load locally but it is scaled incorrectly and not rendered onto a cube face (just flat across the canvas). I have the RENDER WARNING: texture bound to texture unit 0 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering or is not ‘texture complete’

    At the moment I’m running the code from my glassfish server on localhost which seems to get things working, although the render warning is still displayed so maybe something else is happening?

  31. Christine says:

    Hello,
    I would like to texture the spinning cube with two different images. I would like an image for front, top, right faces and a separate image for the rest. Does anyone know how to do this?

  32. Nick says:

    Hello,
    Big thanks. I’ve been using NeHe’s tutorials for years, and it makes it so much easier to learn a new binding when someone builds tutorials with good explanations like yours.

    My problem is somewhat strange: this is all working on my box, using both localhost/ and /, but the textures aren’t loading on other machines. I can point to the images and see them, and I added an <img> that showed up fine, but the textures don’t load in the WebGL. Any ideas?

    Thanks,
    Nick

    Christine:
    You basically have 3 options, I arrange here from simplest but slowest to most complicated but fastest:

    1) render the different faces entirely separately. Make one set of buffers for all of the faces that uses the same image. Render the different sets in order.

    2) Use one set of buffers, but break it up into different calls to gl.drawElements with different offsets:

    gl.bindBuffer(gl.ARRAY_BUFFER, imgA);
    gl.drawElements(gl.TRIANGLES, numItemsWithFirstImg , gl.UNSIGNED_SHORT, 0);
    gl.bindBuffer(gl.ARRAY_BUFFER, imgB);
    gl.drawElements(gl.TRIANGLES, numItemsWithSecondImg , gl.UNSIGNED_SHORT, numItemsWithSecondImg * 3, 0);

    3) Finally, you can load both textures into the shader using different gl.TEXTUREN numbers. Then, use a separate aVectorTextureNum to pass the texture number per vertex

    Hope that helps,
    Nick

  33. Nick says:

    oops, left an extra arg in the last gl.drawElements. There’s no last 0 arg.

  34. DrBearhands says:

    I’d like to mention textures may not have loaded before the first draw. I had disabled animations/multiple draws which caused ‘black’ textures. Took a while to figure out.

  35. Mavaj says:

    Hi ,

    Thanks for all your wonderful tutorial , I am learning a lot .

    I need one more help from you . I search in google but I could not find any good example or references .

    I want to use a live video stream coming from an IP camera as the texture for each face of the cube, or use motion JPEG to grab images from the IP camera and refresh the image at least 10 frames per second. Can you guide me in this task?

    Many Thanks

  36. hugues says:

    Hello
    On chrome, everything worked but I had a warning like :

    RENDER WARNING: texture bound to texture unit 0 is not renderable. It maybe non-power-of-2 and have incompatible texture filtering or is not ‘texture complete’

    I had noticed on other projects that chrome and firefox sometimes miss the image set up when one is using the DOM to load them, which is the case in this program.

    So I ended up doing a webGLStart(0) which runs up to and including initTextures; then at the end of the last handleLoadedTexture, I launch a timeout on webGLStart(1) which runs the rest.

    I had no more warning after this.

  37. Grüse says:

    Hi all,

    just expanding on what hugues said above concerning the texture RENDER WARNING:

    The warning is caused by activating and binding the texture in drawScene() before the corresponding image is loaded. In other words, this warning is triggered every time you render until the image has completed loading.

    While you could theoretically do the quick and dirty trick of calling
    window.setTimeout(tick, 1000);
    at the end of webGLStart(), you’ll still get the warning for slow connections and/or more textures.

    My solution works as follows: I removed the call of tick() in webGLStart() and changed initTexture() to this:

    var numTextures = 1;
    var textureImagesLoaded = 0;

    var initTexture = function() {
      cubeTexture = gl.createTexture();
      cubeTexture.image = new Image();
      cubeTexture.image.onload = function() {
        handleLoadedTexture(cubeTexture);
        textureImagesLoaded++;
        if (textureImagesLoaded >= numTextures) {
          tick();
        }
      };
      cubeTexture.image.src = "./img/nehe.gif";
    };

    This calls tick() for the first time AFTER all the textures (in this case just one) have loaded. Voila!

    numTextures and textureImagesLoaded are global variables, but of course they could be stored differently. You could also use some sort of signalling instead of calling tick() directly, but the concept remains the same.

  38. roman21 says:

    Hey,

    I have been trying to follow your tutorials, but when I download the source code it doesn’t work and I have no idea why. All the lessons’ source shows the same thing: a blank canvas with the two “back to lesson” links. Anyone else have this problem, and does anyone have a fix?

    I tried the neheTexture.image.crossOrigin = "anonymous"; but that didn’t help.

    Thanks

  39. Evren says:

    Hi,

    thanks for all the these demos. They are really helpful.

    Questions:

    1) Why doesn’t WebGL use quaternions? Is it just to stick to the OpenGL way?

    2)
    What if I wanted to map multiple images on different faces of the cube.

    say nehe.gif on one side (front face) and crate.gif on all the others?

    Say I want front face to be nehe and rest crate.

    Would I remove the front face coordinates from textureCoords
    and reduce the num items to 20 instead of 24. (so front face is gone)

    and create a separate array say “frontface” coordinates.
    and a separate “faceVertexTextureCoordBuffer” and map that to frontface and pass that to the shader?

    Also what if I want my texture to wrap around the edge of the cube?

    Say I want to map a texture of a cube (unwrapped); think about one texture that would go around the cube.

    I have always worked with engines which wrap OpenGL or DirectX. This is kind of new to me.

  40. kevin says:

    good stuff.
    I made a cube orbiting in a circle with a different texture rendered on each face.
    Basically the procedure is to add 3 different arrays of indices for 3 different pairs of faces, and bind each array to a different texture.
    See the examples in the link below:

    http://echofromfuture.com/webgltexture1.html

  41. xsailor says:

    Thank you for your lessons! I learned a lot from them.
    I tested the lesson05 source code at f://html/webgl/lesson05.html, but it doesn’t work.
    The info:
    Uncaught SecurityError: Failed to execute 'texImage2D' on 'WebGLRenderingContext': the cross-origin image at file:///F:/html/webGL/nehe.gif may not be loaded.
    Then I opened an http server on localhost, and then it works!
    The URL is like this:
    http://localhost:8083/shopping/webgl/lesson5.html
    There may be something wrong with the http protocol or the file protocol!
    WebGL fails to load the gif when working with the file protocol.
    Can you find a way to solve it? Many thanks!

  42. Dethraid says:

    When you use the texture coordinates in your fragment shader, you use them like

    gl_FragColor = texture2D( uSampler, vec2( vTexcoord.s, vTexcoord.t ) );

    When you could just put

    gl_FragColor = texture2D( uSampler, vTexcoord );

    Since vTexcoord is already a vec2, there’s no need to split it up and recombine it.
