WebGL: Frequently Asked Questions

From webglcookbook

Revision as of 18:54, 28 June 2010

This is a list of frequently-asked questions about WebGL. It is not a tutorial -- if that's what you want, you can check out the Learning WebGL lessons. It's also not a set of WebGL how-to "recipes" -- take a look at the Main Page of this Wiki to see some of them.


Getting started and viewing WebGL

What is WebGL?

WebGL is a low-level JavaScript API giving you access to the power of a computer's graphics hardware from within scripts on web pages. It makes it possible to create 3D graphics that update in realtime, running in the browser. It's a web standard, so it doesn't need a plugin and will soon be available on many smartphones. Some people regard it as being part of the new 'HTML5' standard.

WebGL is managed by Khronos, an organisation which is responsible for a number of open standards.

How do I get WebGL running on my machine?

Right now, you need to install a special version of a web browser to use WebGL. You can get appropriate versions of Firefox, Chrome and Safari.

What about Microsoft and Internet Explorer?

As of this writing, Microsoft have not announced any intention of supporting WebGL, and their press announcements surrounding their new version of Internet Explorer, version 9, have said a lot about its use of computers' graphics hardware as a way of speeding up existing web pages, instead of doing new stuff like WebGL. So while they've not said explicitly that they're going to avoid WebGL, it seems unlikely that they're going to support it in the short term.

(It should be said that while it would be consistent with their popular image for them to launch their own competing non-open system for hardware-accelerated 3D graphics, they have shown no signs of intending to do that either.)

What this means for now is that Internet Explorer users who want to see WebGL content will not be able to in the standard version of their browser. However, there may well be a way to work around this; Chrome Frame is a plugin for Internet Explorer which takes over the rendering of a particular tab, doing it using Chrome instead of the normal IE rendering engine. With an appropriate bit of JavaScript trickery, you can set up a web page so that when IE loads it, it will tell the user that the Chrome Frame plugin is needed to view it. That's not an ideal solution, but it's better than nothing.

When will WebGL be ready for production use?

This is really three questions:

When will the WebGL specification get to version 1.0?

The WebGL Working Group have not announced a date, though they are very positive about how rapidly they are approaching a final version. Usefully, some of their deliberations happen on a public email list, so if you want a good feel for the current state, read the archives.

When will WebGL be available in the standard versions of web browsers?

Hopefully soon after the specification reaches 1.0! There are "pre-alpha" testing implementations in three of the major browsers, and these implementations are tracking the specification as it evolves; so while the author of this FAQ can't make promises on behalf of the browser development teams, it doesn't look as if there will be much of a delay once the specification is ready.

When will enough people have WebGL in their browsers to make it worth using on a website?

Naturally, this depends on the kinds of people who visit the site -- which browsers they use and how frequently they update them.

People who use Microsoft Internet Explorer will have specific problems with WebGL: as of this writing, Microsoft have not announced any intention of supporting it, so IE users will be reliant on a plugin. More about this under "What about Microsoft and Internet Explorer?" above.

For users of Chrome and Firefox, there is interesting information about how rapidly they have upgraded to new versions in the past in this blog post. The short version: Chrome users will be automatically upgraded within a month or so of a new version's release; the Firefox upgrade will be slower to a greater or lesser extent depending on whether they package it as a major (e.g. 4.0) or minor (e.g. 3.7) release. Safari will be somewhere in between.

Why is my CPU at 100% when I look at WebGL content?

A lot of WebGL demos repaint the canvas constantly even when nothing is moving, so if you're wondering why the CPU is busy even when the image is still, this is probably why.
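One way to avoid the constant repainting is to track whether the scene has actually changed, and skip the redraw when it hasn't. A minimal sketch of the idea, with all names our own (no particular framework is assumed):

```javascript
// Minimal sketch of a "draw only when something changed" loop, instead
// of repainting the canvas on every tick. All names are illustrative.
function makeRenderLoop(drawFrame) {
  var dirty = true;        // true whenever the scene needs repainting
  var framesDrawn = 0;
  return {
    invalidate: function () { dirty = true; },  // call when the scene changes
    tick: function () {                         // call from your timer
      if (dirty) {
        drawFrame();
        framesDrawn++;
        dirty = false;     // skip redraws until something changes again
      }
    },
    framesDrawn: function () { return framesDrawn; }
  };
}
```

You would call `tick` from your regular timer (e.g. `setInterval`) and `invalidate` from your event handlers; a still scene then costs almost nothing per tick.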

If you're wondering why the CPU can get up to 100% when WebGL is animating a simple scene, and all the work is meant to be happening on the graphics card anyway, the answer is a little more complex.

Because WebGL is displayed on an HTML canvas, it has to work like any other stuff that goes on that canvas. In particular, it needs to be possible to have other HTML elements overlay the rendered WebGL image, and for the WebGL image to have transparent bits that allow HTML stuff beneath it to show through. The process of blending two images -- the HTML and the WebGL -- together is known as compositing, and currently browsers handle this in a slightly roundabout way. They render the WebGL content using the graphics card, as they're meant to, getting one image. Then they render the HTML content (in a manner that differs from browser to browser), getting one or more images (the stuff below the WebGL and the stuff above, for example). Then they do the compositing to combine these images using the CPU.

What all this means is that if you have a large WebGL canvas, there's an awful lot of copying and compositing of images going on, and this is all being done by your CPU -- hence the 100%.

The good news is that the browser writers are very much aware of the problem, so hopefully it will be addressed in time.

What about security?

When told that it "gives web pages access to your graphics hardware", some people worry that WebGL could be a serious security threat. It's not as dangerous as it might sound, though! While any new extension to JavaScript could potentially open up security holes that could be used by a hacker, there's nothing particularly risky about WebGL as compared to, say, HTML5's audio extensions -- apart from one thing. With current versions of WebGL, a hacker could potentially write a WebGL page that made your graphics card stop responding to other applications. Under Microsoft Windows Vista and Windows 7, this would be annoying but not disastrous -- the operating system would notice that something was wrong and reset the graphics card. But under Apple OS X on a Macintosh, the hacker could potentially freeze the screen of your computer. (Notes about what would happen on Linux/Maemo/iOS welcome, just edit the page :-)

It is possible that updates to the browsers will fix this, though for technical reasons it's a tough problem to solve without severely limiting the kinds of 3D graphics you can do in WebGL.

Writing WebGL

Why is coding WebGL so hard?

WebGL is a low-level API. That means that it is designed to give you, from JavaScript, access to as much of the power of the user's graphics hardware as possible while still being able to run on a broad spectrum of devices -- from PCs with Nvidia or ATI graphics all the way to smartphones -- and to do so securely, with the minimum number of functions and data types.

In particular, it inherits from OpenGL ES 2.0 (the cut-down graphics library for mobile devices on which it is based) a purely programmable pipeline, with no fixed-function support. To put that in a less technical way -- graphics libraries like older versions of OpenGL have a plethora of useful functions which beginners can use to get started, but which experts rarely (if ever) use. Getting rid of the beginner-friendly functions makes the system smaller and "simpler" in a sense, which is a good thing when you need to run it on a cut-down device like a smartphone, but it does of course mean that WebGL is a bit tricky to get started with.

However, there's good news. If you don't want to go to all the effort of learning the low-level API, you can choose a high-level one instead. The fact that WebGL provides a solid framework that can be guaranteed to run on any compatible device means that many developers have worked hard to produce frameworks that make programming it easy.

But isn't JavaScript too slow?

The JavaScript engines built into browsers are getting ever faster, but yes -- for some kinds of calculations, particularly numerically intensive stuff, JavaScript is too slow right now. For example, you probably wouldn't want to write the physics engine of a modern computer game in it!

If you want your WebGL pages to run quickly, you do have options:

* Try to push as much as possible into shaders.  Shader code is executed on the graphics card in parallel, and sometimes it can be surprising what you can do in it -- for example, picking objects in a scene and collision detection can both be done with appropriately-clever shaders.
* If latency isn't an issue, move stuff back to the server.

Why can't I code my shaders in JavaScript?

WebGL code can look a bit funny, with the mixture of C-like shader code and JavaScript in one file. And understandably, it can feel a bit irritating to have to learn another programming language just to be able to draw 3D graphics.

However, at least in the short term, it's unlikely that you'll be able to write shaders in any language other than the specialised shader language, GLSL. This is because the processors on graphics cards, while extremely fast, are extremely simple. They can only operate on programs expressed in simple languages, and getting one to run a dynamic language like JavaScript would be well-nigh impossible -- and would probably wind up running much slower than it would on a regular CPU anyway.
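To give a flavour of what GLSL looks like: here is a minimal vertex/fragment shader pair, embedded as JavaScript strings the way WebGL pages typically carry them. The attribute and uniform names are our own convention, not anything required by WebGL:

```javascript
// A minimal GLSL shader pair as JavaScript strings. The names
// aVertexPosition, uMVMatrix and uPMatrix are our own convention.
var vertexShaderSource =
  "attribute vec3 aVertexPosition;\n" +
  "uniform mat4 uMVMatrix;\n" +   // model-view matrix
  "uniform mat4 uPMatrix;\n" +    // projection matrix
  "void main(void) {\n" +
  "  gl_Position = uPMatrix * uMVMatrix * vec4(aVertexPosition, 1.0);\n" +
  "}\n";

var fragmentShaderSource =
  "precision mediump float;\n" +
  "void main(void) {\n" +
  "  gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);\n" +  // plain white
  "}\n";
```

The vertex shader runs once per vertex and the fragment shader once per pixel -- both on the graphics card, which is where the speed comes from.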

What's the best WebGL book?

There are no WebGL-specific books right now. However, WebGL is based on OpenGL ES 2.0, and so books about that standard can be useful as guides, so long as you're willing to do a bit of translation yourself.

Why won't my textures work?

NPOT problems

A common problem is that you're using a non-power-of-two (NPOT) texture -- that is, a texture whose width, height or both are not powers of two. These often won't work as expected with WebGL.

(For example, 100x100 pixel textures will not generally work; 128x128 will work, as will 128x512. In theory 4096x8192-pixel images would work as textures, though your viewers might run out of memory on their graphics cards, particularly if they're using smartphones!)
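A pair of small helper functions, written as a sketch, can check a dimension and find the nearest safe size to resize an image to (for example on a temporary canvas) before uploading it as a texture:

```javascript
// Helpers for the power-of-two texture restriction. The bit trick:
// a power of two has exactly one bit set, so n & (n - 1) clears that
// bit and leaves zero.
function isPowerOfTwo(n) {
  return n > 0 && (n & (n - 1)) === 0;
}

// Smallest power of two >= n -- useful when resizing an image
// before uploading it as a texture.
function nextPowerOfTwo(n) {
  var p = 1;
  while (p < n) p *= 2;
  return p;
}
```

So a 100x100 image would be resized up to 128x128 before being handed to WebGL.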

Kenneth Russell has written a more detailed explanation of the situation with NPOT textures on the WebGL Wiki.


Upside-down textures

If your textures look upside-down, they probably are... in general, people tend to specify texture coordinates in their 3D models with the "t" axis -- that is, their equivalent of the "y" axis -- pointing upwards. This fits with the "y" axis used when specifying the positions of the vertices in the model. However, for historical reasons most image formats have "y" axes that point downwards.

This means that you need to flip your images around a horizontal axis before using them.
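WebGL can do the flip for you at upload time via gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true); as a sketch of what that flip amounts to, here it is done by hand on a raw RGBA pixel buffer:

```javascript
// Flip an RGBA pixel buffer top-to-bottom (around a horizontal axis).
// Shown on raw data for illustration; in real pages you would normally
// let gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true) do this for you.
function flipPixelsY(pixels, width, height) {
  var bytesPerRow = width * 4;  // 4 bytes per RGBA pixel
  var flipped = new Uint8Array(pixels.length);
  for (var row = 0; row < height; row++) {
    var src = row * bytesPerRow;
    var dst = (height - 1 - row) * bytesPerRow;
    for (var i = 0; i < bytesPerRow; i++) {
      flipped[dst + i] = pixels[src + i];
    }
  }
  return flipped;
}
```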

Old ATI graphics drivers

For certain old ATI graphics drivers, in particular those shipped with HP computers, you may find that textures do not appear when you display WebGL pages unless those pages include this JavaScript code:

 gl.enable(gl.TEXTURE_2D);

The code is actually invalid WebGL -- gl.TEXTURE_2D should not be accepted by gl.enable -- but there is a bug in some graphics drivers that makes it necessary to include if you want WebGL textures to work.

Your best bet is to update your drivers if you can.

How should I get my mega-vertex mesh up to the browser?

If you have a very large 3D model and want to transfer it from your web server to your WebGL page, a good option is to encode it in JSON format. After all, internally inside your WebGL page you are going to be processing it as JavaScript -- so why not convert it to that format on the server and then send it up that way using an XMLHttpRequest?

An alternative would be to use an XML-based format such as COLLADA, but parsing XML and converting it to your JavaScript code's internal format will be slower than using JSON. Some frameworks, like O3D, support server-side proxies that convert COLLADA on the server side into JSON for use on the client.
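The client side of such a scheme can be very small. In this sketch the field names ("vertices", "indices") are our own convention, not a standard format; the JSON text would arrive via an XMLHttpRequest and the resulting typed arrays are what you would hand to gl.bufferData:

```javascript
// Parse a JSON-encoded model into typed arrays ready for WebGL buffers.
// The "vertices"/"indices" field names are an assumed convention.
function parseModelJSON(jsonText) {
  var model = JSON.parse(jsonText);
  return {
    vertices: new Float32Array(model.vertices),  // x,y,z triples
    indices: new Uint16Array(model.indices)      // triangle indices
  };
}
```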

What are the "units" used to position vertices in WebGL? Pixels?

They're whatever you want them to be. Think of it this way — something of (say) one unit in size that is at distance z might be ten pixels across, but the same object at distance 2z will be five pixels across because of perspective. We use non-specific units so that we have a uniform measurement of size.
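The "half the size at twice the distance" behaviour can be sketched in one line; "focalLength" here is a made-up constant standing in for the projection parameters, not part of the WebGL API:

```javascript
// Under perspective projection, on-screen size scales as 1/distance.
// focalLength is an illustrative stand-in for the projection parameters.
function projectedSizePixels(worldSize, distance, focalLength) {
  return focalLength * worldSize / distance;
}
```

With a focal length of 50, a one-unit object at distance 5 covers 10 pixels, and the same object at distance 10 covers 5 pixels -- the ratio is all that matters, not what a "unit" is.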

What that means in practice is that you can decide when planning a scene what you want a unit to represent. In a spaceflight game, you might decide that one unit would be equivalent to a kilometer, so you would make the Earth 6,371 units in radius, the Sun 1,392,000 units, and place them 149,597,887 units apart, then add a spaceship of length 0.2 units, and place it 18,000 units from the Earth. Because all of the scales were consistent, it would look right.

But if you were writing a first-person shooter like Quake, you might treat units as being meters -- so 2-unit-tall monsters might come shooting at the player from 100 units away.

The point is, as long as everything in your model uses the same kind of units, it’ll all work out.

Deploying WebGL apps

How can I make sure that my WebGL code runs everywhere?

Sadly, the only real option is to test it on as many devices as possible, from smartphones to desktop PCs.

How do I protect my content from pirates?

The "modern" open-source-based answer to this is, of course, that you shouldn't: instead, you should architect your application or business so that the value is generated by the service you offer, not in the content delivered to the browser. However, in some cases that may well be unduly idealistic. Your best bet for under those circumstances is to look into JavaScript obfuscators.
