WebGL Lesson 16 – rendering to textures

<< Lesson 15

Welcome to number sixteen in my series of WebGL tutorials! In it, we’ll get started with an extremely useful technique: rendering a 3D scene to a texture, which we can then use as an input when rendering a different scene. This is a neat trick not just because it makes it possible to have scenes within scenes, as in the demo page for this tutorial, but also because it is the foundation required for adding picking (selection of 3D objects with the mouse), shadows, reflections, and many other 3D effects.

Here’s what the lesson looks like when run on a browser that supports WebGL:

Click here and you’ll see the live WebGL version, if you’ve got a browser that supports it; here’s how to get one if you don’t. You’ll see a model of a white laptop, with all of the various lighting effects you’ll have seen in the previous lessons (including a specular gleam on its screen). But, more interestingly, on the screen of the laptop you’ll see another 3D scene being displayed — the orbiting moon and crate that made up the demo for lesson 13. I’m sure it’s clear that what’s happening in this page is that we’re rendering the scene from lesson 13 to a texture, and then using that texture on the screen of the laptop.

So, how does it work? Read on to find out.

Before we wade into the code, the usual warning: these lessons are targeted at people with a reasonable amount of programming knowledge, but no real experience in 3D graphics; the aim is to get you up and running, with a good understanding of what’s going on, so that you can start producing your own 3D Web pages as quickly as possible. If you haven’t read the previous tutorials already, you should probably do so before reading this one — here I will only explain the new stuff. The lesson is based on lessons 13 and 14, so you should make sure that you understand those.

There may be bugs and misconceptions in this tutorial. However, thanks to the kind help of Marco Di Benedetto, the creator of SpiderGL, and Paul Brunt of GLGE fame, and a legion of testers, particularly Stephen White, this tutorial’s more correct than it would otherwise have been. Of course, any errors are entirely my own fault, so please don’t hesitate to let me know what I got wrong :-)

There are two ways you can get the code for this example: just “View Source” while you’re looking at the live version, or, if you use GitHub, you can clone it (and the other lessons) from the repository there.

Once you have a copy of the code, load up index.html in a text editor and have a look. This tutorial’s file has quite a few changes from previous lessons, so let’s start at the bottom and work our way up, beginning with webGLStart:

  function webGLStart() {
    var canvas = document.getElementById("lesson16-canvas");
    initGL(canvas);
    initTextureFramebuffer();
    initShaders();
    initBuffers();
    initTextures();
    loadLaptop();

    gl.clearColor(0.0, 0.0, 0.0, 1.0);
    gl.enable(gl.DEPTH_TEST);

    tick();
  }

So, we’re doing our usual setup, initialising WebGL, loading our shaders, creating buffers of vertices to draw, loading the textures we’ll use (the moon and the crate), and kicking off a request to load the JSON model of the laptop, just like we did to load the teapot model in lesson 14. The exciting new bit is that we’re creating a framebuffer for the texture. Before I show you the code that does this, let’s look at what a framebuffer is.

When you render something with WebGL, you obviously need some kind of memory on the graphics card to receive the results of the rendering. You have really fine-grained control over what kind of memory is allocated for this. You need, at the very least, space to store the colours of the various pixels that make up the results of your rendering; it’s also pretty important (though occasionally not essential) to have a depth buffer, so that your rendering can take account of the way objects close to the viewer hide more distant ones (as discussed in lesson 8), and that needs a bit of memory too. And there are other kinds of buffers that can also be useful, like a stencil buffer — which is something we’ll take a look at in a future lesson.

A framebuffer is a thing to which you can render a scene, and it’s made up of these various bits of memory. There’s a “default” framebuffer, which is the one we’ve always been rendering to in the past, and is displayed in the web page — but you can create your own framebuffers and render to them instead. In this tutorial, we’ll create a framebuffer and we’ll tell it to use a texture as the bit of memory where it should store the colours when it’s rendering; we’ll also have to allocate it a bit of memory to use for its depth calculations.

So, having explained all that, let’s take a look at some code that does it all. The function is initTextureFramebuffer, and it’s about a third of the way from the top of the file.

  var rttFramebuffer;
  var rttTexture;

  function initTextureFramebuffer() {

Before the function starts, we define some global variables to hold the framebuffer to which we’re going to render the stuff that will appear on the laptop’s screen, and the texture that will hold the result of that rendering (which we’ll need to access when we’re drawing the laptop itself). On to the function:

    rttFramebuffer = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, rttFramebuffer);
    rttFramebuffer.width = 512;
    rttFramebuffer.height = 512;

Our first step is to create the framebuffer itself, and, following the normal pattern (as with textures, vertex attribute buffers, and so on), we make it our current one — that is, the one the next function calls will operate on. We also store away the width and height of the scene we’re going to be rendering to it; these attributes aren’t normally part of a framebuffer, so I’ve just used the usual JavaScript trick of attaching them as extra properties, because they’ll be needed later on when we’re doing stuff with the framebuffer. I’ve picked 512×512 pixels as a size — remember, textures need widths and heights that are powers of two (strictly speaking, WebGL allows non-power-of-two textures if you don’t use mipmaps and you clamp to the edges, but our texture will use mipmaps, so powers of two it is). I found that 256×256 was too blocky, while 1024×1024 didn’t make things noticeably better.

Next, we create a texture object, and set up the same parameters as usual:

    rttTexture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, rttTexture);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_NEAREST);

But there’s one small difference; the call to gl.texImage2D has rather different parameters, and because our minification filter uses mipmaps, we generate them once the texture’s storage has been allocated:

    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, rttFramebuffer.width, rttFramebuffer.height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.generateMipmap(gl.TEXTURE_2D);

Normally when we’re creating textures to show images that we’ve loaded into JavaScript, we call gl.texImage2D to bind the two together. Now, of course, there’s no loaded image; what we need to do is call a different version of gl.texImage2D, telling it that we don’t have any image data and we’d just like it to allocate a particular amount of empty space on the graphics card for our texture. Strictly speaking, the last parameter to the function is an array which is to be copied into the freshly-allocated memory as a starting point, and by specifying null we’re telling it that we don’t have anything to copy. (Early versions of Minefield required you to pass an appropriately-sized empty array in for this, but that seems to have been fixed now.)

OK, so we now have an empty texture which can store the colour values for our rendered scene. Next, we create a depth buffer to store the depth information:

    var renderbuffer = gl.createRenderbuffer();
    gl.bindRenderbuffer(gl.RENDERBUFFER, renderbuffer);
    gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, rttFramebuffer.width, rttFramebuffer.height);

What we’ve done here is create a renderbuffer object; this is a generic kind of object that stores some lump of memory that we’re intending to associate with a framebuffer. We bind it — just as with textures, framebuffers, and everything else, WebGL has a current renderbuffer — and then call gl.renderbufferStorage to tell WebGL that the currently-bound renderbuffer needs enough storage for 16-bit depth values across a buffer with the given width and height.

Next:

    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, rttTexture, 0);
    gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, renderbuffer);

We attach everything to the current framebuffer (remember, we bound our new one to be the current one just after creating it at the top of the function). We tell it that the framebuffer’s space for rendering colours (gl.COLOR_ATTACHMENT0) is our texture, and that the memory it should use for depth information (gl.DEPTH_ATTACHMENT) is the depth buffer we just created.

Now we have all of the memory set up for our framebuffer; WebGL knows what to render to when we’re using it. So now, we tidy up, setting the current texture, renderbuffer, and framebuffer back to their defaults:

    gl.bindTexture(gl.TEXTURE_2D, null);
    gl.bindRenderbuffer(gl.RENDERBUFFER, null);
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  }
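
One optional addition, not in the original code but handy when debugging: while the new framebuffer is still bound (that is, just before the unbinding calls above), you can ask WebGL whether the attachments form a complete, renderable framebuffer. A minimal sketch:

    // Optional sanity check: confirm the colour and depth attachments are usable together
    var status = gl.checkFramebufferStatus(gl.FRAMEBUFFER);
    if (status != gl.FRAMEBUFFER_COMPLETE) {
      alert("Render-to-texture framebuffer is incomplete: " + status);
    }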

…and we’re done. Our framebuffer is properly set up. So now that we’ve got it, how do we use it? The place to start looking is drawScene, near the bottom of the file. Right at the start of the function, before the normal code to set the viewport and clear the canvas, you’ll see something new:

  var laptopAngle = 0;

  function drawScene() {
    gl.bindFramebuffer(gl.FRAMEBUFFER, rttFramebuffer);
    drawSceneOnLaptopScreen();

    gl.bindFramebuffer(gl.FRAMEBUFFER, null);

    gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);

In the light of the description above, it should be pretty obvious what’s happening there: we switch away from the default framebuffer, which renders to the canvas in the HTML page, to the render-to-texture framebuffer that we created in initTextureFramebuffer; we call a function called drawSceneOnLaptopScreen to render the scene that we want displayed on the laptop’s screen (implicitly, rendering it to the RTT framebuffer); and when that’s done, we switch back to the default framebuffer. Before moving on with drawScene, it’s worth taking a look at the drawSceneOnLaptopScreen function. I won’t copy it in full here, because it’s really very simple — it’s just a stripped-down version of the drawScene function from lesson 13! This is because our rendering code until now hasn’t made any assumptions about where it’s rendering to; it’s just rendered to the current framebuffer. The only changes made for this lesson were the simplifications made possible by removing the movable light source and other things lesson 13 had that weren’t necessary for this tutorial.
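
To give a rough idea of its shape, here’s a minimal sketch; the lighting and drawing calls, which come from lesson 13, are elided, and the exact details may differ slightly from the real code, so check the actual source for the full body:

  function drawSceneOnLaptopScreen() {
    // Render at the size of the texture we allocated, not the size of the canvas
    gl.viewport(0, 0, rttFramebuffer.width, rttFramebuffer.height);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    mat4.perspective(45, rttFramebuffer.width / rttFramebuffer.height, 0.1, 100.0, pMatrix);
    mat4.identity(mvMatrix);

    // ...set the lighting uniforms and draw the orbiting moon and crate,
    // just as lesson 13's drawScene did...

    // The texture's contents have just changed, and its MIN_FILTER uses mipmaps,
    // so regenerate them before the texture is sampled
    gl.bindTexture(gl.TEXTURE_2D, rttTexture);
    gl.generateMipmap(gl.TEXTURE_2D);
    gl.bindTexture(gl.TEXTURE_2D, null);
  }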

So, once those first three lines of drawScene have been executed, we have a frame from lesson 13 rendered to a texture. The remainder of drawScene simply draws the laptop, and uses this texture for its screen. We start off with some normal code to set up the model-view matrix and to rotate the laptop by an amount determined by laptopAngle (which, as in the other tutorials, is updated in an animate function that’s called every time we draw the scene, to keep the laptop rotating):

    mat4.identity(mvMatrix);

    mvPushMatrix();

    mat4.translate(mvMatrix, [0, -0.4, -2.2]);
    mat4.rotate(mvMatrix, degToRad(laptopAngle), [0, 1, 0]);
    mat4.rotate(mvMatrix, degToRad(-90), [1, 0, 0]);

We send the values defining the colours and locations of our light sources to the graphics card as normal:

    gl.uniform1i(shaderProgram.showSpecularHighlightsUniform, true);
    gl.uniform3f(shaderProgram.pointLightingLocationUniform, -1, 2, -1);

    gl.uniform3f(shaderProgram.ambientLightingColorUniform, 0.2, 0.2, 0.2);
    gl.uniform3f(shaderProgram.pointLightingDiffuseColorUniform, 0.8, 0.8, 0.8);
    gl.uniform3f(shaderProgram.pointLightingSpecularColorUniform, 0.8, 0.8, 0.8);

Next, we pass the graphics card information about the lighting-related parameters of the laptop’s body, which is the first thing we’re going to draw. There’s something new here that’s not directly related to rendering to textures. You may remember that way back in lesson 7, when I described the Phong lighting model, I mentioned that materials have different colours for each kind of light — an ambient colour, a diffuse colour, and a specular colour. At that time, and in all of the lessons since, we’ve been making the simplifying assumption that these colours were always white, or the colour of the texture, depending on whether textures were switched off or on. For reasons we’ll look at in a moment, that’s not quite enough for this tutorial — we’ll need to specify colours in a bit more detail for the laptop screen, and we’ll have to use a new kind of colour, the emissive colour. However, for the laptop’s body, we don’t need to worry too much about this: the material colour parameters are simple, because the laptop is just white.

    // The laptop body is quite shiny and has no texture.  It reflects lots of specular light
    gl.uniform3f(shaderProgram.materialAmbientColorUniform, 1.0, 1.0, 1.0);
    gl.uniform3f(shaderProgram.materialDiffuseColorUniform, 1.0, 1.0, 1.0);
    gl.uniform3f(shaderProgram.materialSpecularColorUniform, 1.5, 1.5, 1.5);
    gl.uniform1f(shaderProgram.materialShininessUniform, 5);
    gl.uniform3f(shaderProgram.materialEmissiveColorUniform, 0.0, 0.0, 0.0);
    gl.uniform1i(shaderProgram.useTexturesUniform, false);

The next step is to draw the laptop, provided its various vertex coordinates have been loaded. This code should be pretty familiar by now, especially after lesson 14 (from which it’s largely copied):

    if (laptopVertexPositionBuffer) {
      gl.bindBuffer(gl.ARRAY_BUFFER, laptopVertexPositionBuffer);
      gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, laptopVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);

      gl.bindBuffer(gl.ARRAY_BUFFER, laptopVertexTextureCoordBuffer);
      gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, laptopVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

      gl.bindBuffer(gl.ARRAY_BUFFER, laptopVertexNormalBuffer);
      gl.vertexAttribPointer(shaderProgram.vertexNormalAttribute, laptopVertexNormalBuffer.itemSize, gl.FLOAT, false, 0, 0);

      gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, laptopVertexIndexBuffer);
      setMatrixUniforms();
      gl.drawElements(gl.TRIANGLES, laptopVertexIndexBuffer.numItems, gl.UNSIGNED_SHORT, 0);
    }

Once we’ve done all that, the laptop body has been drawn. Next, we need to draw the screen. Its lighting settings are done first, and this time we set an emissive colour:

    gl.uniform3f(shaderProgram.materialAmbientColorUniform, 0.0, 0.0, 0.0);
    gl.uniform3f(shaderProgram.materialDiffuseColorUniform, 0.0, 0.0, 0.0);
    gl.uniform3f(shaderProgram.materialSpecularColorUniform, 0.5, 0.5, 0.5);
    gl.uniform1f(shaderProgram.materialShininessUniform, 20);
    gl.uniform3f(shaderProgram.materialEmissiveColorUniform, 1.5, 1.5, 1.5);
    gl.uniform1i(shaderProgram.useTexturesUniform, true);

So, what’s the emissive colour? Well, screens on things like laptops don’t just reflect light — they emit it. We want the colour of the screen to be determined by the colour of the texture much more than by the lighting effects. We could do that by changing the uniforms that govern the lighting, switching off point lighting and bumping ambient lighting up to 100% before drawing the screen, and then restoring the old values afterwards, but that would be a bit of a hack — after all, the screen’s emissivity is a property of the screen, not of the light. In this particular example, we could also do it just by using the ambient lighting: because the ambient light is white, setting the screen’s ambient colour to 1.5, 1.5, 1.5 would have the right effect. But if someone then changed the ambient lighting, the screen’s colour would change, which would be odd. After all, if you put your laptop in a red-lit room, the screen doesn’t go red. So we use a new emissive colour uniform, which is handled by the shader using some simple code we’ll come to later.

(A side note: it’s worth remembering that an object’s emissive colour in this sense doesn’t affect any other objects around it — that is, it doesn’t turn the object into a light source that lights other things up. It’s just a way of giving an object a colour that is independent of the scene’s lighting.)

The requirement for the emissive colour also explains why we needed to separate out the other material colour parameters for this tutorial; our laptop screen has an emissive colour determined by its texture, but its specular colour should be fixed and unaffected by it — after all, whatever is showing on your laptop’s screen doesn’t change the colour of the window behind you that’s reflected in it. So that colour is still white.

Right, moving on, we bind the buffers that specify the laptop screen’s vertex attributes:

    gl.bindBuffer(gl.ARRAY_BUFFER, laptopScreenVertexPositionBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, laptopScreenVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, laptopScreenVertexNormalBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexNormalAttribute, laptopScreenVertexNormalBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, laptopScreenVertexTextureCoordBuffer);
    gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, laptopScreenVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

Next, we specify that we want to use the texture to which we rendered earlier:

    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, rttTexture);
    gl.uniform1i(shaderProgram.samplerUniform, 0);

Then we draw the screen, and we’re done:

    setMatrixUniforms();
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, laptopScreenVertexPositionBuffer.numItems);

    mvPopMatrix();
  }

Almost an anti-climax, isn’t it ;-) That was all of the code required to render a scene to a texture, and then to use that texture in another scene.

That’s pretty much it for this tutorial, but let’s just quickly run through the other changes from the previous lessons. There’s a pair of functions called loadLaptop and handleLoadedLaptop to load up the JSON data that makes up the laptop; they’re basically the same as the code that loaded the teapot in lesson 14. There’s also a bit of code at the end of initBuffers to initialise the vertex buffers for the laptop screen; this is a bit ugly and will be improved in a later version of this tutorial (the values should be loaded from JSON like the laptop data, but are currently sitting there in the code).
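
For reference, the loading code follows the same XMLHttpRequest-plus-JSON.parse pattern as lesson 14. Here’s a minimal sketch; the model’s filename and the JSON field names are assumptions based on that pattern, so check the real source:

  function loadLaptop() {
    var request = new XMLHttpRequest();
    request.open("GET", "macbook.json");   // the filename is an assumption -- see the real index.html
    request.onreadystatechange = function () {
      if (request.readyState == 4) {
        handleLoadedLaptop(JSON.parse(request.responseText));
      }
    };
    request.send();
  }

  function handleLoadedLaptop(laptopData) {
    // Each field of the JSON becomes a WebGL buffer, just as with the teapot in lesson 14
    laptopVertexPositionBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, laptopVertexPositionBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(laptopData.vertexPositions), gl.STATIC_DRAW);
    laptopVertexPositionBuffer.itemSize = 3;

    // ...the normal and texture-coordinate buffers are filled in the same way...

    laptopVertexIndexBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, laptopVertexIndexBuffer);
    gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, new Uint16Array(laptopData.indices), gl.STATIC_DRAW);
    laptopVertexIndexBuffer.numItems = laptopData.indices.length;
  }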

Finally, there’s the new fragment shader, which needs to handle the per-lighting-type material colours, combined with the texture colour where textures are in use. All of it should be pretty easy to understand in the light of the earlier shaders; the only thing that’s really new is the emissive colour, which is simply added to the final fragment colour right at the end. Here’s the code:

  precision mediump float;

  varying vec2 vTextureCoord;
  varying vec3 vTransformedNormal;
  varying vec4 vPosition;

  uniform vec3 uMaterialAmbientColor;
  uniform vec3 uMaterialDiffuseColor;
  uniform vec3 uMaterialSpecularColor;
  uniform float uMaterialShininess;
  uniform vec3 uMaterialEmissiveColor;

  uniform bool uShowSpecularHighlights;
  uniform bool uUseTextures;

  uniform vec3 uAmbientLightingColor;

  uniform vec3 uPointLightingLocation;
  uniform vec3 uPointLightingDiffuseColor;
  uniform vec3 uPointLightingSpecularColor;

  uniform sampler2D uSampler;

  void main(void) {
    vec3 ambientLightWeighting = uAmbientLightingColor;

    vec3 lightDirection = normalize(uPointLightingLocation - vPosition.xyz);
    vec3 normal = normalize(vTransformedNormal);

    vec3 specularLightWeighting = vec3(0.0, 0.0, 0.0);
    if (uShowSpecularHighlights) {
      vec3 eyeDirection = normalize(-vPosition.xyz);
      vec3 reflectionDirection = reflect(-lightDirection, normal);

      float specularLightBrightness = pow(max(dot(reflectionDirection, eyeDirection), 0.0), uMaterialShininess);
      specularLightWeighting = uPointLightingSpecularColor * specularLightBrightness;
    }

    float diffuseLightBrightness = max(dot(normal, lightDirection), 0.0);
    vec3 diffuseLightWeighting = uPointLightingDiffuseColor * diffuseLightBrightness;

    vec3 materialAmbientColor = uMaterialAmbientColor;
    vec3 materialDiffuseColor = uMaterialDiffuseColor;
    vec3 materialSpecularColor = uMaterialSpecularColor;
    vec3 materialEmissiveColor = uMaterialEmissiveColor;
    float alpha = 1.0;
    if (uUseTextures) {
      vec4 textureColor = texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t));
      materialAmbientColor = materialAmbientColor * textureColor.rgb;
      materialDiffuseColor = materialDiffuseColor * textureColor.rgb;
      materialEmissiveColor = materialEmissiveColor * textureColor.rgb;
      alpha = textureColor.a;
    }
    gl_FragColor = vec4(
      materialAmbientColor * ambientLightWeighting
      + materialDiffuseColor * diffuseLightWeighting
      + materialSpecularColor * specularLightWeighting
      + materialEmissiveColor,
      alpha
    );
  }

And that truly is it! In this tutorial, we’ve gone over how to render a scene to a texture and use it in another scene, and on the way touched on material colours and how they work. In the next tutorial, I’ll show how to do something really useful with this: GPU picking, so that you can write 3D scenes that people can interact with by clicking on objects.

<< Lesson 15


Acknowledgments: I needed a lot of help to get this one running, in particular because the first version had bugs that didn’t show up when I ran it on my own laptop. I’d particularly like to thank Marco Di Benedetto, the creator of SpiderGL, and Paul Brunt, of GLGE fame, for telling me what I’d got wrong and how to fix it. But I owe a lot of gratitude to the people who tested version after version of the demo until we finally got one that should work pretty much anywhere — Stephen White (who also made it clear to me that RTT was a necessity for sensible picking, which was what made it the topic for this lesson), Denny (creator of EnergizeGL), blinblin, nameless, Jacob Seidelin, Pyro Technick, ewgl, Peter, Springer, Christofer, Thormme, and titan.

Other places where I took much-needed inspiration were the OpenGL ES 2.0 Programming Guide, Paul Brunt’s GLGE library, and a variety of iPhone development forum posts and queries: here, here, and here. Obviously, the WebGL specification helped too…

The 3D model of the laptop was made freely available by Xedium, and the Moon texture is courtesy of the Jet Propulsion Laboratory.

Phew. That was beginning to sound like an Oscar acceptance speech…

47 Responses to “WebGL Lesson 16 – rendering to textures”

  1. hider says:

    Thank you for your lessons, they’re very useful.

  2. giles says:

    @hider — thanks! Glad you find them useful.

  3. Alvaro says:

    Good tutorial.

    One interesting issue: Unlike Firefox, Chrome applies nice antialiasing to WebGL graphics, but in this demo, antialiasing is gone.

    I don’t really understand why, but if you comment out the line calling “drawSceneOnLaptopScreen();” (disabling the actual render to texture) then the view (i.e. the laptop) becomes nicely antialiased.

    A chrome bug, I guess…

  4. steve says:

    @Alvaro; I think the antialiasing is coming from GL_LINEAR, which is set once. The rebinding to another framebuffer may be wiping out this setting. You could test this by putting the GL_LINEAR call into the render loop?

  5. Wiz says:

    You can use NPOT with webgl under some conditions (no mipmaps, and use clamp to edges). Just enough to do some post processing.
    Ref: http://khronos.org/webgl/wiki/WebGL_and_OpenGL_Differences

  6. giles says:

    @Wiz — you’re right, and somewhere else on the blog I explain that. I should put something appropriate in this post too.

  7. giles says:

    @Alvaro — if you get a chance to run this demo with steve’s suggested modification I’d be really pleased to hear what results you get.

  8. szimek says:

    Does call to gl.generateMipmap(gl.TEXTURE_2D) in initTextureFramebuffer function actually does anything if the texture has just been created and is empty at that time?

  9. giles says:

    @szimek — good question! I suspect it doesn’t and I just left it there out of habit. I’ll double-check and remove it if necessary.

  10. Nicolas says:

    Hi, very useful tutorial, thank you very much!

    I have one question: What does gl.STREAM_DRAW (as opposed to gl.STATIC_DRAW) do for the moon indices? The box indices have gl.STATIC_DRAW. I don’t understand which one should be used in this case.

    Thanks again!

  11. giles says:

    Hi Nicolas, glad you found the tutorial useful!

    Re: the STREAM_DRAW vs STATIC_DRAW — you know, I’d never realised that I was using a different constant! The last parameter to the gl.bufferData call is actually a hint to the runtime saying how you expect the values in the buffer to change over time; more details here and here. Given the nature of my demos, I should really be using STATIC_DRAW in every case. I’ll update the tutorials to remove any chance of confusion, thanks for pointing it out!

  12. Jian says:

    Great tutorial! I’ve done quite some 3D programming but the problem has always been how to deliver the content to the client-end in a hassle-free way. Now, with WebGL, no more need to require the user to install this and that dll or ActiveX. The only sad thing is that the stubborn Microsoft still refuses to embrace open standards such as WebGL — so IE users would still have to install something… Very sad situation for a developer….

  13. giles says:

    Thanks, Jian. Re: IE — very true. Hopefully Chrome Frame will make it less of a problem, though.

  14. rajasekhar says:

    Hi,
    I am getting gl.FRAMEBUFFER_INCOMPLETE_ATTACHMENT when checking for checkFramebufferStatus after using bindFramebuffer. Any idea on this.

  15. giles says:

    Hmm, no idea, sorry. Have you tried asking in the WebGL forums?

  16. Nobita says:

    Hi giles,
    That’s very a cool demo.
    I walk on internet several days and try to convert some model type .3ds .blend .max … to Json Model. The only way, i can find, is using the blend converter using the file WebGLExport.py.
    When I view it on browser, it’s not the model I converted. It lose some planes, it don’t have color …
    So, Please Help me how to work with model in WebGL. My Final exercise is building a Web site with WebGL and I just have 4 month to do it.
    My mail is [email protected]. Please contact with me.

    Thank’s for all and sorry about my poor english.

  17. giles says:

    Hi Nobita — try importing your model into Blender and then exporting it using the instructions here.

  18. Nobita says:

    That’s really good link, thank giles.

  19. Nobita says:

    Hi giles,
    I have converted an animated knight model with your help. The knight walks throught a maze that is just a tiny program but it really make me happy.

    I continued with skybox, a simple effect. It worked well, but I had a trouble with the function mat4.LookAt(). This function work strangely,so I can’t understand how to set up this function.
    this is the way I set up the function :

    mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);
    mat4.lookAt([0, 0, 0], [0, 0, -100], [0, 1, 0], pMatrix);
    mat4.identity(mvMatrix);

    In that function,the eye postion is [0,0,0], look at the point [0,0,-100] (Deep inside the monitor), and UP vector is [0,1,0]. but it didn’t work as I Expected. If you see any mistake, please let me know.
    Sorry about my annoy. I’m just a beginer ^^.

  20. Juriy says:

    I believe, you’re applying your lookAt method to the wrong matrix. Try the same code with the modelview matrix.

  21. dragon8xa2 says:

    i have the same problem with him, i also used with mvMatrix, but it’s also wrong. Could you give us an example of lookAt function?
    Thank you so much.

  22. giles says:

    Hi all — I don’t tend to use lookAt, but what Juriy says sounds right to me. So what Nobita should have done was this:

    mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);
    mat4.lookAt([0, 0, 0], [0, 0, -100], [0, 1, 0], mvMatrix);
    

    – note that the mat4.identity is gone, and the last parameter of lookAt has changed.

    Does that help?

  23. hamid says:

    hello
    tanks for this learning.
    we study your educations in our class.
    we are engineer Computer that we like learn your educations.
    Now,we translate your educations for our friends.
    good luck

  24. giles says:

    @hamid — that’s great, glad it helps! If you put the translation online, let me know — I’d be happy to add a link to it.

  25. [...] Once the particles are draw in the framebuffer we only have to get the final texture to the next pass, one good example of how to make this appears in this tutorial from LearningWebGL. [...]

  26. _Nobita_ says:

    Hi all,
    When I’m trying to do picking on webgl, I use the framebuffer and read the pixel color to do this. And it thow the (SECURITY_ERR : Dom Eception 18) while the function gl.readpixels run.
    http://imageshack.us/f/94/readpxielerror.jpg/
    I don’t know how to solve this problem.
    So help me please !!!

  27. dj says:

    can you write a tutorial to demonstrating picking using mouse click on object

  28. [...] Once the particles are draw in the framebuffer we only have to get the final texture to the next pass, one good example of how to make this appears in this tutorial from LearningWebGL. [...]

  29. Chris Dew says:

    Thanks for this excellent series.

    Are you planning the stencil lesson some time soon?

  30. Tobias says:

    Hi, thanks for this great tutorial!
    I’m using Three.js right now, do you have any idea how to realize this using Three.js?

  31. Tobias says:

    Solved my own question right there with: http://mrdoob.github.com/three.js/examples/webgl_rtt.html

  32. [...] tutorial in the Aprende webGL series. It’s a loose translation of the corresponding tutorial on Learning webGL. In it we’re going to show a technique you’ll find very useful: rendering a scene [...]

  33. Dagstjerna says:

    I need a function to return the position of my matrix, is this right way to do it?

    var xyz = mat4.getPosition(pMatrix);

    I´m missing this function in gl-matrix.js

  34. Corey Clark says:

    so once you have rendered to the texture, is there a way to save this texture so you don’t have to keep rendering to it on every draw call. So basically you would render the scene on the laptop screen once, then just keep reusing that texture, so you are basically only rendering a screen quad, with the saved RTT.

    Currently if I don’t keep rendering the scene every update, the texture is blanked out.

    Any help on this would be much appreciated

  35. Corey Clark says:

    Looks like the issue may have been, I was drawing the RTT first, and the model had not fully loaded, and therefore was not shown on the texture. Once I put the RTT on keypress, it works just fine. Now I only render the scene once and then just render texture to quad for all other frame updates.

    Another question I had, was looking at the output of the WebGL Inspector, and since the demo calls gl.generateMipmap(gl.TEXTURE_2D); on every draw call, it is being called several times… Is there a reason the mimpas have to be generated every frame?

    With all of the other objects I draw, I only had to create the mipmaps when I loaded them, I did not call it on every draw.

  36. Nikola says:

    I wanna put two json included obj but i got error:
    XMLHttprequest is not defined
    var request1 = new XMLHttprequest();
    I do everything similar but program doesnt work?SomeBody !

  37. Hernan says:

    Great series of articles!
    Are you planning on adding new lessons?

  38. [...] there's a way to render to a texture as well in webgl, please check the super tutorial here, however i'm still struggling with it, so the result of javascript version is less interesting then [...]

  39. [...] WebGL Lesson 16: render-to-texture shows how to render a WebGL scene into a texture that can then be used in another scene — a neat trick in itself, and a useful foundation for other techiques. « Presentation 7 [...]

  40. AbstractAlgorithm says:

    I want to create post-processing glow effect.
    So I need object rendered normally -> first texture.
    Then same object rendered only with glow maps -> second texture.
    And then combining first and blurred second texture to blend them and display it on some quad as finished pp effect.

    What are changes required to render scenes to multiple textures?

  41. AbstractAlgorithm says:

    I created yet another framebuffer, renderbuffer and texture and that did the job. Will try to minify number of changes.

    Great tutorials. :)

  42. Zhilong says:

    Great Job! I learned a lot from this lesson.

  43. [...] does. I’ll assume that you’re familiar with the basics of render-to-texture. If not, read up here. Before this extension was available you could render to a color texture, but the depth component [...]

  44. Filo says:

    Hi, this example (lesson16) works amazingly on google chrome, firefox and opera mobile, but it doesn’t work on opera desktop, do you know why?

  45. Michael says:

    Please add a lesson about displaying text with WebGL.

  46. Stuart says:

    Is this sort of how portal (the game) works? E.g. each ‘portal’ is just another rendered view into the same physical scenery from a different position, and you walk through the textures?
