WebGL: Frequently Asked Questions
This is a list of frequently-asked questions about WebGL. It is not a tutorial -- if that's what you want, you can check out the Learning WebGL lessons.
Background and Getting Started
What is WebGL?
WebGL is a standard for displaying hardware-accelerated 3D graphics in web browsers: a low-level JavaScript API, based on OpenGL ES 2.0, that renders into an HTML canvas element. It is managed by Khronos, an organisation which is responsible for a number of open standards, including OpenGL itself.
How do I get WebGL running on my machine?
Right now, you need to install a special version of a web browser to use WebGL. You can get appropriate versions of Firefox, Chrome and Safari.
- There are official instructions on the WebGL public wiki (http://www.khronos.org/webgl/wiki/Main_Page).
- There are more detailed instructions, organised by operating system, on the Learning WebGL website.
When will WebGL be ready for production use?
This is really three questions:
When will the WebGL specification get to version 1.0?
The WebGL Working Group have not announced a date, though they are very positive about how rapidly they are approaching a final version. Usefully, some of their deliberations happen on a public email list, so a good way to get a feel for the current state is to read the archives.
When will WebGL be available in the standard versions of web browsers?
Hopefully soon after the specification reaches 1.0! There are "pre-alpha" testing implementations in three of the major browsers, and these implementations are following the specification as it evolves; so while the author of this FAQ can't make promises on behalf of the browser development teams, it doesn't look as if there will be much of a delay once the specification is ready.
When will enough people have WebGL in their browsers to make it worth using on a website?
Naturally, this depends on the kinds of people who visit the site -- which browsers they use and how frequently they update them.
People who use Microsoft Internet Explorer will have specific problems with WebGL: as of this writing, Microsoft have not announced any intention of supporting it, so IE users will be reliant on a plugin. More about this in the "What about Microsoft?" section below.
For users of Chrome and Firefox, there is interesting information about how rapidly they have upgraded to new versions in the past in this blog post. The short version: Chrome users will be automatically upgraded within a month or so of a new version's release; the Firefox upgrade will be slower to a greater or lesser extent, depending on whether they package it as a major (e.g. 4.0) or minor (e.g. 3.7) release. Safari will be somewhere in between.
What about Microsoft?
As of this writing, Microsoft have not announced any intention of supporting WebGL, and their press announcements surrounding their new version of Internet Explorer, version 9, have said a lot about its use of computers' graphics hardware as a way of speeding up existing web pages, instead of doing new stuff like WebGL. So while they've not said explicitly that they're going to avoid WebGL, it seems unlikely that they're going to support it in the short term.
(It should be said that while it would be consistent with their popular image for them to launch their own competing non-open system for hardware-accelerated 3D graphics, they have shown no signs of intending to do that either.)
Why is my CPU at 100% when I look at WebGL content?
A lot of WebGL demos repaint the canvas constantly even when nothing is moving, so if you're wondering why the CPU is busy even when the image is still, this is probably why.
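If you're writing your own pages, one way to avoid this is to redraw only when something has actually changed. A minimal sketch of the idea (the names invalidate, tick and drawScene are illustrative, not from any particular library):

```javascript
// Redraw-on-demand: keep a "dirty" flag and only repaint when it is set.
let needsRedraw = true;

// Call this whenever the scene changes (user input, an animation step, etc.).
function invalidate() {
  needsRedraw = true;
}

// Called once per frame -- e.g. from a requestAnimationFrame loop in the
// browser -- but skips the expensive draw when nothing has changed.
function tick(drawScene) {
  if (needsRedraw) {
    needsRedraw = false;
    drawScene();
  }
}
```

In a browser you would call tick from your animation loop and call invalidate from your event handlers; a still scene then costs almost nothing per frame.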
If you're wondering why the CPU can get up to 100% when WebGL is animating a simple scene, and all the work is meant to be happening on the graphics card anyway, the answer is a little more complex.
Because WebGL is displayed on an HTML canvas, it has to work like any other stuff that goes on that canvas. In particular, it needs to be possible to have other HTML elements overlay the rendered WebGL image, and for the WebGL image to have transparent bits that allow HTML stuff beneath it to show through. The process of blending two images -- the HTML and the WebGL -- together is known as compositing, and currently browsers handle this in a slightly roundabout way. They render the WebGL content using the graphics card, as they're meant to, getting one image. Then they render the HTML content (in a manner that differs from browser to browser), getting one or more images (the stuff below the WebGL and the stuff above, for example). Then they do the compositing to combine these images using the CPU.
What all this means is that if you have a large WebGL canvas, there's an awful lot of copying and compositing of images going on, and this is all being done by your CPU -- hence the 100%.
The good news is that the browser writers are very much aware of the problem, so hopefully it will be addressed in time.
Why is coding WebGL so hard?
In particular, WebGL inherits from OpenGL ES 2.0 (the cut-down graphics API for mobile devices on which it is based) a purely programmable pipeline, with no fixed-function support. To put that in a less technical way -- graphics libraries like older versions of OpenGL have a plethora of useful functions which beginners can use to get started, but which experts rarely (if ever) use. Getting rid of the beginner-friendly functions makes the system smaller and "simpler" in a sense, which is a good thing when you need to use it on a cut-down device like a smartphone, but it does of course mean that it's a bit tricky to get started with.
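To make "no fixed function" concrete: even drawing a single flat-coloured triangle requires you to write a vertex and a fragment shader and compile them yourself. A minimal sketch -- the compileShader helper is an illustrative name, but the gl.* calls are the standard WebGL shader API:

```javascript
// The smallest useful shader pair: position vertices, paint them white.
const vertexShaderSource =
  "attribute vec3 aPosition;\n" +
  "void main() {\n" +
  "  gl_Position = vec4(aPosition, 1.0);\n" +
  "}\n";

const fragmentShaderSource =
  "precision mediump float;\n" +
  "void main() {\n" +
  "  gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);\n" + // plain white
  "}\n";

// Compile one shader of the given type (gl.VERTEX_SHADER or
// gl.FRAGMENT_SHADER), throwing with the driver's log on failure.
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error("Shader failed to compile: " + gl.getShaderInfoLog(shader));
  }
  return shader;
}
```

Every WebGL program starts with boilerplate like this, which is exactly the kind of thing the high-level frameworks hide from you.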
However, there's good news. If you don't want to go to all the effort of learning the low-level API, you can choose a high-level one instead. The fact that WebGL provides a solid framework that can be guaranteed to run on any compatible device means that many developers have worked hard to produce frameworks that make programming it easy.
If you want your WebGL pages to run quickly, you do have options:
- Try to push as much as possible into shaders. Shader code is executed on the graphics card in parallel, and sometimes it can be surprising what you can do in it -- for example, picking objects in a scene and collision detection can both be done with appropriately clever shaders.
- If latency isn't an issue, move work back to the server.
What's the best WebGL book?
There are no WebGL-specific books right now. However, WebGL is based on OpenGL ES 2.0, and so books about that standard can be useful as guides so long as you're willing to do a bit of translation yourself:
- Aaftab Munshi, Dan Ginsburg and Dave Shreiner: OpenGL ES 2.0 Programming Guide
- Philip Rideout: iPhone 3D Programming -- Developing Graphical Applications with OpenGL ES
Why won't my textures work?
A common problem is that you're using a non-power-of-two (NPOT) texture -- that is, a texture whose width, height, or both is not a power of two. These often won't work as expected with WebGL.
(For example, 100x100 pixel textures will not generally work; 128x128 will work, as will 128x512. In theory 4096x8192-pixel images would work as textures, though your viewers might run out of memory on their graphics cards, particularly if they're using smartphones!)
Kenneth Russell has written a more detailed explanation of the situation with NPOT textures on the WebGL Wiki.
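A quick way to check image dimensions before uploading them as textures -- the helper names below are made up for illustration, but the power-of-two test is the standard bit trick:

```javascript
// A power of two has exactly one bit set, so n & (n - 1) is zero.
function isPowerOfTwo(n) {
  return n > 0 && (n & (n - 1)) === 0;
}

// True when an image of these dimensions is safe to use with WebGL's
// default texture settings (mipmapping, REPEAT wrapping).
function isPotTexture(width, height) {
  return isPowerOfTwo(width) && isPowerOfTwo(height);
}
```

If you do need an NPOT image, WebGL will accept it provided you set the wrap mode to CLAMP_TO_EDGE and avoid mipmapped filtering; otherwise, scale the image to power-of-two dimensions before uploading it.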
If your textures look upside-down, they probably are. In general, people tend to specify texture coordinates in their 3D models with the "t" axis -- that is, their equivalent of the "y" axis -- pointing upwards. This matches the "y" axis used when specifying the positions of the vertices in the model. However, for historical reasons, most image formats have "y" axes that point downwards.
This means that you need to flip your images around a horizontal axis before using them.
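Alternatively, rather than editing the image files themselves, you can ask WebGL to flip image data as it is uploaded, using the UNPACK_FLIP_Y_WEBGL pixel-store flag. A sketch -- uploadTexture is an illustrative name, but the gl.* calls are standard WebGL:

```javascript
// Upload an image as a texture, flipping it vertically on the way so that
// image data with y pointing down matches texture coordinates with t
// pointing up.
function uploadTexture(gl, image) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
  return texture;
}
```

Note that the flag is sticky: once set, it affects every subsequent texture upload on that context until you set it back to false.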
Old ATI graphics drivers
The code is actually invalid WebGL: gl.enable should not accept gl.TEXTURE_2D. However, there is a bug in some graphics drivers that makes it necessary to include the gl.enable(gl.TEXTURE_2D) call if you want WebGL to work.
Your best bet is to update your drivers if you can.
How should I get my mega-vertex mesh up to the browser?
What are the "units" used to position vertices in WebGL? Pixels?
They're whatever you want them to be. Think of it this way — something of (say) one unit in size that is at distance z might be ten pixels across, but the same object at distance 2z will be five pixels across because of perspective. We use non-specific units so that we have a uniform measurement of size.
What that means in practice is that you can decide when planning a scene what you want a unit to represent. In a spaceflight game, you might decide that one unit would be equivalent to a kilometer, so you would make the Earth 6,371 units in radius, the Sun 1,392,000 units, and place them 149,597,887 units apart, then add a spaceship of length 0.2 units, and place it 18,000 units from the Earth. Because all of the scales were consistent, it would look right.
But if you were writing a first-person shooter like Quake, you might treat units as being meters -- so 2-unit-tall monsters might come shooting at the player from 100 units away.
The point is, as long as everything in your model uses the same kind of units, it’ll all work out.
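The perspective behaviour described above is easy to check with a line of arithmetic: under a simple pinhole-style projection, on-screen size is proportional to world size divided by distance. A sketch, where focalLength is an arbitrary illustrative scale factor:

```javascript
// Pinhole-projection model: apparent size scales as 1 / distance.
// Only the ratios matter, so the units themselves never appear.
function projectedSize(worldSize, distance, focalLength) {
  return (worldSize * focalLength) / distance;
}
```

For example, with focalLength 100, a 1-unit object at distance 10 covers 10 pixels, and the same object at distance 20 covers 5 pixels -- doubling the distance halves the apparent size, as in the example above.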
How can I make sure that my WebGL code runs everywhere?
Sadly, the only real option is to test it on as many devices as possible, from smartphones to desktop PCs.
What about security?
How do I protect my content from pirates?