WebGL Lesson 12 – point lighting

<< Lesson 11 | Lesson 13 >>

Welcome to number twelve in my series of WebGL tutorials, the second one that isn’t based on the NeHe OpenGL tutorials. In it, we’ll go through point lighting, which is pretty simple but important, and will lead on to interesting things later. Point lighting, as you might expect, is lighting that comes from a particular point within a scene — unlike the directional lighting we’ve been using so far, which comes from some point outside the scene.

Here’s what the lesson looks like when run in a browser that supports WebGL:

Click here and you’ll see the live WebGL version, if you’ve got a browser that supports it; here’s how to get one if you don’t. You’ll see a sphere and cube orbiting; both will probably be white for a few moments while the textures load, but once that’s done you should see that the sphere is the moon and the cube a (not-to-scale) wooden crate. Both are illuminated by a point light source that is in between them. If you want to change the light’s position, colour, etc., there are fields beneath the WebGL canvas.

More on how it all works below…

The usual warning: these lessons are targeted at people with a reasonable amount of programming knowledge, but no real experience in 3D graphics; the aim is to get you up and running, with a good understanding of what’s going on in the code, so that you can start producing your own 3D Web pages as quickly as possible. If you haven’t read the previous tutorials already, you should probably do so before reading this one — here I will only explain the new stuff. The lesson is based on lesson 11, so you should make sure that you understand that one (and please do post a comment on that post if anything’s unclear about it!)

There may be bugs and misconceptions in this tutorial. If you spot anything wrong, let me know in the comments and I’ll correct it ASAP.

There are two ways you can get the code for this example: just “View Source” while you’re looking at the live version, or, if you use GitHub, you can clone it (and the other lessons) from the repository there.

Let’s kick off by describing exactly what we’re trying to do with point lighting; the difference between it and directional lighting is that the light comes from a point within the scene. A moment’s thought should make it clear that this means that the angle from which it comes is different at every point in the scene. So, the obvious way to model it is to calculate the direction toward the light’s location for each vertex and then to just do exactly the same calculations as we did for directional lighting. And that’s what we do!
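
To make that concrete, here’s the core of the idea in plain JavaScript — a sketch for illustration only, not part of the lesson’s code (directionToLight is a hypothetical helper). We subtract the vertex position from the light position and normalise, and vertices on opposite sides of the light end up with opposite light directions:

```javascript
// Per-vertex direction to a point light: subtract, then normalise.
function directionToLight(lightPos, vertexPos) {
  // Vector from the vertex towards the light...
  var dx = lightPos[0] - vertexPos[0];
  var dy = lightPos[1] - vertexPos[1];
  var dz = lightPos[2] - vertexPos[2];
  // ...scaled so that its length is one unit.
  var len = Math.sqrt(dx * dx + dy * dy + dz * dz);
  return [dx / len, dy / len, dz / len];
}

var light = [0, 0, 0];
// Two vertices on opposite sides of the light get opposite directions —
// something a single directional-light vector could never express:
directionToLight(light, [-5, 0, 0]);  // [1, 0, 0]
directionToLight(light, [5, 0, 0]);   // [-1, 0, 0]
```

With a directional light we’d use the same direction vector for every vertex; here it has to be recomputed per vertex, which is exactly what the shader below does.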

(You might be thinking, at this point, that perhaps it would be even better to calculate the direction to the light not just for every vertex, but for the points between vertices — that is, for the fragments. And you’d be quite right in thinking that; lighting like that is harder work for the graphics card, but it looks much better. And it’s what we’ll move on to in the next lesson :-)

Now that we’ve determined what to do, it’s worth looking once again at this lesson’s demo page and noting one more thing: there’s no actual object in the scene at the point the light is coming from. If you want an object that appears to be casting light (say, the sun in the centre of the solar system) then you need to define the light source and the object separately. Drawing the object should be pretty easy based on the previous lessons, so in this walkthrough I’ll only explain how the point light source works. As you might expect from the description above, it’s actually really simple; most of the differences between this page and lesson 11’s are simply to draw the cube and make it and the sphere orbit…

As usual, we’ll start at the bottom of the source HTML file and work our way up through the differences between this file and lesson 11’s. The first set of changes is in the HTML body, where the fields that previously let you enter a light direction now specify the light’s position. This is simple enough that there’s no point in showing them here, so let’s move on up to webGLStart. Once again, the changes are simple: this lesson has no mouse-based controls, so the mouse-handling code is gone, and the function formerly known as initTexture is now called initTextures because it loads two of them. Not very exciting…

Moving a little further up, the tick function has gained a new call, to animate, so that our scene updates over time:

  function tick() {
    requestAnimFrame(tick);
    drawScene();
    animate();
  }

Above that is the animate function itself, which simply updates two global variables that track how far around their orbits the moon and the cube are, so that they orbit at 50°/second:

  var lastTime = 0;
  function animate() {
    var timeNow = new Date().getTime();
    if (lastTime != 0) {
      var elapsed = timeNow - lastTime;

      moonAngle += 0.05 * elapsed;
      cubeAngle += 0.05 * elapsed;
    }
    lastTime = timeNow;
  }
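
To see why the 0.05 above works out to 50°/second: getTime() returns milliseconds, so elapsed is a millisecond count. A quick sanity check in plain JavaScript (degreesTravelled is a hypothetical helper restating the arithmetic, not part of the lesson’s code):

```javascript
// How many degrees of orbit a given elapsed time corresponds to,
// at 0.05 degrees per millisecond.
function degreesTravelled(elapsedMs) {
  return 0.05 * elapsedMs;
}

degreesTravelled(1000);  // one second of elapsed time -> 50 degrees
degreesTravelled(16.7);  // one ~60fps frame -> roughly 0.8 degrees
```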

The next function up is drawScene, which has a few interesting changes. It starts off with the normal boilerplate code to clear the canvas and set up our perspective, and then has code identical to lesson 11’s to check whether the lighting checkbox is checked and to send the ambient lighting colour to the graphics card:

  function drawScene() {
    gl.viewport(0, 0, gl.viewportWidth, gl.viewportHeight);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);

    mat4.perspective(45, gl.viewportWidth / gl.viewportHeight, 0.1, 100.0, pMatrix);

    var lighting = document.getElementById("lighting").checked;
    gl.uniform1i(shaderProgram.useLightingUniform, lighting);
    if (lighting) {
      gl.uniform3f(
        shaderProgram.ambientColorUniform,
        parseFloat(document.getElementById("ambientR").value),
        parseFloat(document.getElementById("ambientG").value),
        parseFloat(document.getElementById("ambientB").value)
      );

Next, we push the position of our point light up to the graphics card in a uniform. This is equivalent to the code that pushed the lighting direction up in lesson 11; the difference is in something that was taken away rather than something added. When we sent the lighting direction to the graphics card, we needed to turn it into a unit vector (that is, scale it so that its length was one unit) and reverse its direction. No need for anything like that here: we just push the coordinates of the light directly up:

      gl.uniform3f(
        shaderProgram.pointLightingLocationUniform,
        parseFloat(document.getElementById("lightPositionX").value),
        parseFloat(document.getElementById("lightPositionY").value),
        parseFloat(document.getElementById("lightPositionZ").value)
      );

Next, we do the same for the point light’s colour, and then we’re done with the lighting code in drawScene:

      gl.uniform3f(
        shaderProgram.pointLightingColorUniform,
        parseFloat(document.getElementById("pointR").value),
        parseFloat(document.getElementById("pointG").value),
        parseFloat(document.getElementById("pointB").value)
      );
    }

Next, we actually draw the sphere and the cube in the appropriate positions:

    mat4.identity(mvMatrix);

    mat4.translate(mvMatrix, [0, 0, -20]);

    mvPushMatrix();
    mat4.rotate(mvMatrix, degToRad(moonAngle), [0, 1, 0]);
    mat4.translate(mvMatrix, [5, 0, 0]);
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, moonTexture);
    gl.uniform1i(shaderProgram.samplerUniform, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, moonVertexPositionBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, moonVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, moonVertexTextureCoordBuffer);
    gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, moonVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, moonVertexNormalBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexNormalAttribute, moonVertexNormalBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, moonVertexIndexBuffer);
    setMatrixUniforms();
    gl.drawElements(gl.TRIANGLES, moonVertexIndexBuffer.numItems, gl.UNSIGNED_SHORT, 0);
    mvPopMatrix();

    mvPushMatrix();
    mat4.rotate(mvMatrix, degToRad(cubeAngle), [0, 1, 0]);
    mat4.translate(mvMatrix, [5, 0, 0]);
    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexPositionBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexPositionAttribute, cubeVertexPositionBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexNormalBuffer);
    gl.vertexAttribPointer(shaderProgram.vertexNormalAttribute, cubeVertexNormalBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.bindBuffer(gl.ARRAY_BUFFER, cubeVertexTextureCoordBuffer);
    gl.vertexAttribPointer(shaderProgram.textureCoordAttribute, cubeVertexTextureCoordBuffer.itemSize, gl.FLOAT, false, 0, 0);

    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, crateTexture);
    gl.uniform1i(shaderProgram.samplerUniform, 0);

    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, cubeVertexIndexBuffer);
    setMatrixUniforms();
    gl.drawElements(gl.TRIANGLES, cubeVertexIndexBuffer.numItems, gl.UNSIGNED_SHORT, 0);
    mvPopMatrix();
  }

So, that’s drawScene. Moving further up the code, you will see that initBuffers has gained our standard code for generating buffers for a cube as well as the code for a sphere, and even further up that initTextures is now loading two textures instead of just one.

The next, and in fact the final, change in the file is the most important one. If you scroll up to the top, where the vertex shader is, you’ll see that it has a few small changes, and it’s these that make the difference for this lesson. Working through from the top:

  attribute vec3 aVertexPosition;
  attribute vec3 aVertexNormal;
  attribute vec2 aTextureCoord;

  uniform mat4 uMVMatrix;
  uniform mat4 uPMatrix;
  uniform mat3 uNMatrix;

  uniform vec3 uAmbientColor;

  uniform vec3 uPointLightingLocation;
  uniform vec3 uPointLightingColor;

So, we have uniforms for the lighting location and colour to replace the old lighting direction and colour. Next:

  uniform bool uUseLighting;

  varying vec2 vTextureCoord;
  varying vec3 vLightWeighting;

  void main(void) {
    vec4 mvPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
    gl_Position = uPMatrix * mvPosition;

What we’ve done here is split our old code in two. In all of our vertex shaders so far, we’ve applied the model-view matrix and the projection matrix to the vertex position in one go, like this:

    // Code from lesson 11
    gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);

Now, we’re storing the intermediate value, the position of the vertex with the current model-view matrix applied but before it has been adjusted to allow for perspective. This is used in the next bit:

    vTextureCoord = aTextureCoord;

    if (!uUseLighting) {
      vLightWeighting = vec3(1.0, 1.0, 1.0);
    } else {
      vec3 lightDirection = normalize(uPointLightingLocation - mvPosition.xyz);

The light’s position is in terms of the world coordinates, and the vertex position, once it’s been multiplied by the model-view matrix, is also in terms of world coordinates (strictly speaking it’s in eye coordinates, but because this lesson never moves the camera, the two coordinate systems coincide — see the comments below for more on this). We need to work out the direction of the point light from our current vertex in terms of these coordinates, and to get the direction from one point to another, we just subtract one from the other. Once that’s done, we normalise the resulting vector so that, just like our old lighting direction vector, it has a length of one. With that in place, all of the pieces are ready for a calculation that’s identical to the one we were doing for directional lighting, with just a few variable names changed:

      vec3 transformedNormal = uNMatrix * aVertexNormal;
      float directionalLightWeighting = max(dot(transformedNormal, lightDirection), 0.0);
      vLightWeighting = uAmbientColor + uPointLightingColor * directionalLightWeighting;
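
To make the weighting arithmetic concrete, here it is restated in plain JavaScript — a sketch for illustration only, with hypothetical helper names that aren’t in the lesson’s code — showing the two extreme cases:

```javascript
function dot(a, b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

// Per-channel light weighting, mirroring the GLSL above:
// ambient + pointColor * max(dot(normal, lightDirection), 0)
function lightWeighting(ambient, pointColor, normal, lightDirection) {
  var w = Math.max(dot(normal, lightDirection), 0.0);
  return [
    ambient[0] + pointColor[0] * w,
    ambient[1] + pointColor[1] * w,
    ambient[2] + pointColor[2] * w
  ];
}

// A surface facing the light head-on gets the ambient term plus the
// full point-light colour:
lightWeighting([0.2, 0.2, 0.2], [0.8, 0.8, 0.8], [0, 0, 1], [0, 0, 1]);
// -> [1, 1, 1]

// A surface facing away gets ambient only; max() clamps the negative
// dot product to zero:
lightWeighting([0.2, 0.2, 0.2], [0.8, 0.8, 0.8], [0, 0, -1], [0, 0, 1]);
// -> [0.2, 0.2, 0.2]
```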

And that’s it! You now know how to write shaders to provide point lighting.

That’s it for now; next time we’ll look at lighting again, improving the realism of our scene by making the lighting work per-fragment instead of per-vertex.

<< Lesson 11 | Lesson 13 >>

Acknowledgments: As before, the texture-map for the moon comes from NASA’s JPL website, and the code to generate a sphere is based on this demo, which was originally by the WebKit team. Many thanks to both!


38 Responses to “WebGL Lesson 12 – point lighting”

  1. aa says:

    Runs slow here :(

  2. aa says:

    Chromium profiler says gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT) takes about 30% of cpu cycles!

  3. giles says:

    Wow, that’s really weird! Have you tried it on any other browsers? And which OS are you on?

  4. aa says:

    Yeah Linux, also on Firefox. May be a driver issue (fglrx). Advanced GLGE demo works smooth though as most other demos do.

  5. giles says:

    Hmm, there was a silly bug in the sphere code which was causing problems on Windows Chrome, and I’ve just fixed that — could you take another look and see if it helped performance for you too? It would be odd if it did (the bug in question stopped it from displaying under Chrome at all!) but you never know…

  6. Running the tutorial scene with the latest MineField on a rev1 Intel MacBook with the RadeonX1600 is smooth.

  7. giles says:

    Excellent, thanks for letting me know, Trevor!

  8. aa says:

    No, about as choppy, maybe a bit less.
    I’m sorry: clear is only taking 0.30%!

    This is output firebug
    drawScene 392 25.28% 309.101ms 1211.98ms 3.092ms 2.379ms 15.624ms index.html (line 531)
    (?)() 10192 11.2% 136.924ms 136.924ms 0.013ms 0.007ms 5.775ms 1 (line 2)
    setMatrixUniforms 784 8.18% 99.972ms 459.569ms 0.586ms 0.445ms 6.504ms index.html (line 285)
    (?)() 10192 6.91% 84.48ms 221.404ms 0.022ms 0.012ms 5.782ms 1 (line 2)
    (?)() 1960 6.17% 75.386ms 122.591ms 0.063ms 0.048ms 0.954ms 1 (line 2)
    (?)() 784 5.99% 73.203ms 296.45ms 0.378ms 0.283ms 6.164ms 1 (line 2)
    (?)() 1568 5.33% 65.128ms 114.127ms 0.073ms 0.05ms 0.332ms
    (?)() 5876 5.42% 29.868ms 102.334ms 0.017ms 0.012ms 0.108ms

    first (?) () is function

    function (a) {
    var i, elements = a.elements || a;
    if (typeof elements[0][0] != "undefined") {
    var b = elements.length, ki = b, nj, kj, j;
    this.elements = [];
    do {
    i = ki - b;
    nj = elements[i].length;
    kj = nj;
    this.elements[i] = [];
    do {
    j = kj - nj;
    this.elements[i][j] = elements[i][j];
    } while (--nj);
    } while (--b);
    return this;
    }
    var n = elements.length, k = n;
    this.elements = [];
    do {
    i = k - n;
    this.elements.push([elements[i]]);
    } while (--n);
    return this;
    }

    second is

    function (a) {
    if (!a.elements) {
    return this.map(function (x) {return x * a;});
    }
    var b = a.modulus ? true : false;
    var M = a.elements || a;
    if (typeof M[0][0] == "undefined") {
    M = Matrix.create(M).elements;
    }
    if (!this.canMultiplyFromLeft(M)) {
    return null;
    }
    var d = this.elements.length, ki = d, i, nj, kj = M[0].length, j;
    var e = this.elements[0].length, elements = [], sum, nc, c;
    do {
    i = ki - d;
    elements[i] = [];
    nj = kj;
    do {
    j = kj - nj;
    sum = 0;
    nc = e;
    do {
    c = e - nc;
    sum += this.elements[i][c] * M[c][j];
    } while (--nc);
    elements[i][j] = sum;
    } while (--nj);
    } while (--d);
    var M = Matrix.create(elements);
    return b ? M.col(1) : M;
    }

    third is

    function () {
    if (!this.isSquare() || this.isSingular()) {
    return null;
    }
    var a = this.elements.length, ki = a, i, j;
    var M = this.augment(Matrix.I(a)).toRightTriangular();
    var b, kp = M.elements[0].length, p, els, divisor;
    var c = [], new_element;
    do {
    i = a - 1;
    els = [];
    b = kp;
    c[i] = [];
    divisor = M.elements[i][i];
    do {
    p = kp - b;
    new_element = M.elements[i][p] / divisor;
    els.push(new_element);
    if (p >= ki) {
    c[i].push(new_element);
    }
    } while (--b);
    M.elements[i] = els;
    for (j = 0; j < i; j++) {
    els = [];
    b = kp;
    do {
    p = kp - b;
    els.push(M.elements[j][p] - M.elements[i][p] * M.elements[j][i]);
    } while (--b);
    M.elements[j] = els;
    }
    } while (--a);
    return Matrix.create(c);
    }

    fifth is

    function (a) {
    var M = new Matrix;
    return M.setElements(a);
    }

  9. giles says:

    Thanks! I’ll take a look. BTW how do you get the source for unknown functions like that in Firebug?

  10. aa says:

    Right mouse button, copy source. Or hovering just to see them.

  11. giles says:

    Odd, I don’t see those. I’m using http://getfirebug.com/releases/firebug/1.6X/firebug-1.6X.0a1.xpi, you?

    Anyway, those functions are presumably matrix multiplication stuff from Sylvester. I suppose they might be particularly slow (I know that there’s an optimisation I need to apply to the calculation of the normal matrix) but it really does seem weird that they’re particularly slow in your browser but not in mine. Still, if Paul’s GLGE demo is fast then it must be something specific about this one.

    When you say that other demos are OK, are you including the other ones I’ve done? In particular, is the last one (the moon that can be spun around with the mouse) usable?

  12. aa says:

    The moon is also not usable. The most others, however, are. It’s both in Firefox as in Chromium (a little bit less in Chromium).

    I installed that version (was using 1.4) but those still show up.

  13. aa says:

    More clear: this demo is more usable in Chromium.
    The moon demo is more usable in Firefox.

    But, there seems to be a problem at my computer! First the most examples were usable, but now only example 10 is. But that’s still strange, because the advanced glge demo is running smooth steadily at 30/40 fps!

  14. giles says:

    OK, that’s really weird! When you say “first the most examples were usable”, was this all when you checked them today, or do you mean that they perform worse now then they did when you tried them several days ago? I’m wondering if one of my retrospective changes might have broken something.

  15. aa says:

    Several weeks ago.

  16. aa says:

    Ah found the problem. Probably a driver, X, or Firefox issue. When I start certain examples in Firefox, the performance of WebGL in most examples (except some like GLGE) decreases very much after that. After computer restart performance is normal again. Maybe video memory somehow isn’t released in Firefox? In Chromium it’s no problem. It must be something like that.

  17. giles says:

    That’s odd! I wonder why GLGE isn’t affected?

  18. Glut says:

    as far as i can see, you did some changes in the initShaders method, which are not mentioned in the tutorial. maybe you’d like to add this :)

  19. giles says:

    Hi Glut — not sure which changes you mean. I’ve only changed the names of a couple of uniforms… unless you mean these ones? http://learningwebgl.com/blog/?p=1606 — if so, I’ve also retrospectively changed all of the lessons back to #1 :-)

  20. Dane Lindblad says:

    If you’re going through all the trouble of recalculating the reduced light intensity based on the normals, couldn’t you also (pretty painlessly) recalculate the reduced light intensity based on the reduced radiation with distance from the light source?

    If you store the vector

    uPointLightingDistance = uPointLightingLocation - mvPosition.xyz

    instead of normalizing it right away, you could then calculate the distance light weighting with the formula:

    distanceLightWeighting = alpha/(uPointLightingDistance*uPointLightingDistance)

    where alpha is the intensity of the light at distance d=0.

    This would help ensure that objects at the same orientation, but with different distances from the light source would have different degrees of shading. I may not have the syntax right as I’m just learning this through your tutorials, but I KNOW I have the physics right.

  21. giles says:

    Hi Dane,

    Interesting. The algorithm I’m using is pretty much a standard, but you’re quite right that the physics aren’t great and just scaling the weighting down by the distance squared (though I think you mean alpha is intensity at distance 1, right?) would make it more accurate.

    I’ll give it a go and see whether it works.

    Cheers,

    Giles

  22. masterx says:

    hi giles,

    i have tried this tutorial and added walking to it.
    But i notice the light will move along with player how is this possible?

    Should the light position be minus the player position?

  23. giles says:

    Yup, that would work.

  24. xema25 says:

    The problem with the walking is that if you “rotate” the camera it would be wrong to just calculate de camera position using what you use.

    I think I should do this, but i doesn’t work right… (the light doesn’t move with the camera, but it’s bad located) Any Idea?

    void main(void) {
    vec4 mvPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
    vec4 lightPos = uMVMatrix * vec4(uPointLightingLocation, 1.0);
    gl_Position = uPMatrix * mvPosition;
    vTextureCoord = aTextureCoord;

    if (!uUseLighting) {
    vLightWeighting = vec3(1.0, 1.0, 1.0);
    } else {
    vec3 lightDirection = normalize(lightPos.xyz - mvPosition.xyz);

    vec3 transformedNormal = uNMatrix * aVertexNormal;

    float directionalLightWeighting = max(dot(transformedNormal, lightDirection), 0.0);
    vLightWeighting = uAmbientColor + uPointLightingColor * directionalLightWeighting;
    }
    }

  25. giles says:

    The problem is that you’re applying the model-view matrix (which contains the current rotation that should be applied to the object) to the light location as well.

    One way to do this neatly is to have separate view and model matrices instead of having one combined model-view matrix. You then apply the inverse of the view matrix to the camera, and both the model and the view matrices to the objects.

  26. Mari says:

    Hi giles,

    Thank you for your excellent tutorial!
    Could you help me to understand the following part?

    > void main(void) {
    > vec4 mvPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
    > gl_Position = uPMatrix * mvPosition;
    … snip

    > if (!uUseLighting) {
    > vLightWeighting = vec3(1.0, 1.0, 1.0);
    > } else {
    > vec3 lightDirection = normalize(uPointLightingLocation - mvPosition.xyz);

    > The light’s position is in terms of the world coordinates, and the
    > vertex position, once it’s been multiplied by the model-view matrix,
    > is also in terms of world coordinates.

    ————-
    > vec3 lightDirection = normalize(uPointLightingLocation - mvPosition.xyz);

    In this line, “uPointLightingLocation” is in terms of the world
    coordinates, but “mvPosition.xyz” is in terms of the view coordinates,
    not “world coordinates”. Because it has been multiplied by the model-view matrix.

    In this case, “uPointLightingLocation” should be multiplied by
    model-view matrix? such as,

    vec4 mvPositionLightingLocation = uMVMatrix * vec4(uPointLightingLocation, 1.0);

    and then the lightDirection should be

    vec3 lightDirection = normalize(mvPointLightingLocation - mvPosition.xyz);

    Am I missing something?

  27. giles says:

    Hi Mari, glad you like the tutorial! The difference is that the lighting’s position is fixed. The mvMatrix combines the various translations and rotations that determine where the currently-drawn object is — that is, the position of the crate or the moon as related to the viewpoint. The lighting therefore shouldn’t be adjusted by it, as that would make the light move around with the objects.

    Is that any clearer?

  28. Mari says:

    Hi giles,
    Thank you for your quick reply!

    > The mvMatrix combines the various translations and rotations that
    > determine where the currently-drawn object is — that is, the position
    > of the crate or the moon as related to the viewpoint. The lighting
    > therefore shouldn’t be adjusted by it, as that would make the light
    > move around with the objects.

    Sorry, I made a mistake.

    My question is as follows
    “position” is multiplied by the model-view matrix to transform into
    a “view” coordinate.
    However, uPointLightingLocation is still in a “world” coordinate.
    As both coordinates are different, you cannot subtract them.

    If you calculate the lightDirection, you need to transform
    the uPointLightingLocation into “view” coordinate by multiplying
    “view” matrix (not model-view matrix) before subtraction.

    I rewrote my reply as follows:

    >> In this case, “uPointLightingLocation” should be multiplied by
    “view” matrix? such as,
    >> vec4 vPositionLightingLocation = uVMatrix * vec4(uPointLightingLocation, 1.0);

    In your example, uVMatrix (view matrix) is not available in your example.
    It need to be added.

    >> and then the lightDirection should be
    >> vec3 lightDirection = normalize(vPointLightingLocation.xyz - mvPosition.xyz);

  29. giles says:

    Hi Mari — right, I understand now. You’re right, in this case the camera position is locked so the “model-view” matrix is strictly speaking only a model matrix. If they were separated then you’d have a view matrix that contained the camera position and then a model matrix for positioning the items in the scene. The light position would, as you say, be multiplied only by the view matrix.

  30. Damien says:

    Hi Giles,
    First, very nice”s” tutorial”s” ! I follow all of them, but I wait for the nexts =P (Sorry for my English, I’m French)

    But I have one question, in all tutorials I don’t find the anwser :
    It’s about light effects. I try to make some static light-points in a land (by “land”, I mean a big floor ^^).
    With your tutorials, I succeed to make this “land”, and a first-person-game-like with moves by pressing keys.

    My problem : no lesson I followed learn me how to do that..
    Light effects move when the character move, it’s not a indoor-light-effect, like a lamp in the middle of a room with 4 walls..

    Do you know what I mean ? I can explain in another words, if I’m not clear :)

    Thanks if you can help me, otherwise thanks for all of your job !

  31. [...] tutorial de la serie Aprende webGL. Es una traducción no literal de su respectivo tutorial en Learning webGL. Ésta lección, como la anterior, no va a estar basada en ningún tutorial de OpenGL de NeHe. En [...]

  32. Jeff says:

    Hi Giles.
    this tutorial is amazing!

    but i have some question.

    how can i build a world that camera cannot penetrated?
    and function requestAnimFrame(tick), how does it work?

    thanks a lot.

  33. Tobias says:

    Thank you for that great Tutorial,
    unfortunatelry as some other people here, I’m fighting with the transformation. When moving the camera the light followed.
    I was solving this problem as Mari said by calculating

    vec4 lightPos = uMVMatrix * vec4(uPointLightingLocation, 1.0);
    and
    vec3 lightDirection = normalize(lightPos.xyz - mvPosition.xyz);

    Unfortunately I still have a problem. Maybe someone might help me with that.

    I made a “kind of” scene graph, which gives the opportunity to draw a cube and addChild(cube2) on it.
    When drawing, I first draw the cube itself and at the end the draw() method calls for eachChildren.draw()

    for(var i in this.children)
    {
    mvPushMatrix();
    mat4.translate(mvMatrix, this.children[i].positionsVector.getVektorArray());
    mat4.rotate(mvMatrix, degToRad(this.children[i].rotation), this.children[i].rotationsVector.getVektorArray());
    this.children[i].draw();
    mvPopMatrix();
    }

    So far everything goes fine. Except for one thing: when rotating the child cube, the coordinates are rotated as well – but for the light as well.
    So the light, that comes from the top of the scene is making a side brighter. That is, because the cube (better the mvMatrix) got rotated and now the top is the side. But still the top gets brighter.
    That means, the rotation isn’t included in that position calculation.

    Has anyone an idea how to manage this?

    Thank you very much for everything!
    Tobias

  34. Tobias says:

    Well, in other words, I need to rotate the lightposition “inverse” so when lighting the top, after rotating to the left, I need to light the right side.
    Any ideas how to manage this? And should I do this in the shader script or on the “client side” before pushing the coordinates of the light source?
    Maybe thats easier to understand.
    I really hope someone may help me with that!
    Thank you very much anyway,
    Tobias

  35. Tobias says:

    Ah, okay, finally I found my mistake.
    Using the lightposition with the MVMatrix goes wrong when making changes to the Matrix before.
    Of course I was rotating the whole matrix with this object. So The solution was simple. I was creating a Matrix only for the Camera but not to include the whole Matrix into the light position. (sorry for my bad english)
    Anyway, If anyone else hast the same problem, try doing something like this:

    vec4 mvPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
    vec4 lightPos = uMVCameraMatrix * vec4(uPointLightingLocation, 1.0);
    [...]
    vec3 lightDirection = normalize(lightPos.xyz - mvPosition.xyz);

    Tadaa! It works!

  36. Lord Ashes says:

    To make the light intensity drop off with distance, I made the following modifications to the shader:

    attribute vec3 aVertexPosition;
    attribute vec3 aVertexNormal;
    attribute vec2 aTextureCoord;

    uniform mat4 uMVMatrix;
    uniform mat4 uPMatrix;
    uniform mat3 uNMatrix;

    uniform vec3 uAmbientColor;

    uniform vec3 uPointLightingLocation;
    uniform vec3 uPointLightingColor;
    uniform float uPointLightingIntensity;

    uniform bool uUseLighting;

    varying vec2 vTextureCoord;
    varying vec3 vLightWeighting;

    void main(void)
    {

    vec4 mvPosition = uMVMatrix * vec4(aVertexPosition, 1.0);
    gl_Position = uPMatrix * mvPosition;

    vTextureCoord = aTextureCoord;

    if (!uUseLighting)
    {
    vLightWeighting = vec3(1.0, 1.0, 1.0);
    }
    else
    {
    vec3 lightDirection = normalize(uPointLightingLocation - mvPosition.xyz);
    float lightDistance = length(uPointLightingLocation - mvPosition.xyz);
    if(lightDistance < 0.0){lightDistance = lightDistance * -1.0;}
    lightDistance = (uPointLightingIntensity - lightDistance) / uPointLightingIntensity;

    vec3 transformedNormal = uNMatrix * aVertexNormal;
    float directionalLightWeighting = max(dot(transformedNormal, lightDirection), 0.0);
    vLightWeighting = uAmbientColor + uPointLightingColor * directionalLightWeighting * lightDistance;
    }
    }

    Obviously the shader init also needs to be modified to load uPointLightingIntensity with a value. This value is basically the distance at which the point light source contributions become zero. So a smaller value would mean the light source is less intense to the rest of the world (will light less of it) whereas larger values indicate a more intense light (that will light more of the surrounding world).

  37. [...] Lesson 12: point lighting shows how implement lighting that seems to come from points within your 3D scene. [...]

  38. [...] texture courtesy of the Jet Propulsion Laboratory. Based on Lesson 12 April 27, 2013 1 [...]
