WebGL: Frequently Asked Questions


This is a list of frequently asked questions about WebGL. It is not a tutorial -- if that's what you want, check out the Learning WebGL lessons. It's also not a set of WebGL how-to "recipes" -- take a look at the Main Page of this wiki to see some of those.


Getting started and viewing WebGL

What is WebGL?

WebGL is a low-level JavaScript API giving you access to the power of a computer's graphics hardware from within scripts on web pages. It makes it possible to create 3D graphics that update in real time, running in the browser. It's a web standard, so it doesn't need a plugin, and it will soon be available on many smartphones. Some people regard it as one of the new 'HTML5' technologies.

WebGL is managed by Khronos, an organisation which is responsible for a number of other open standards, including the well-known OpenGL desktop graphics library and its lesser-known version for "embedded" devices like smartphones, OpenGL ES. WebGL is based on OpenGL ES.

How do I get WebGL running on my machine?

You need a browser that supports WebGL. Recent versions of Firefox and Chrome support it out of the box, and Safari does too, though you currently need to enable it manually (see "When will WebGL be available in the standard versions of web browsers?" below).

What about Microsoft and Internet Explorer?

As of this writing, Microsoft have not announced any intention of supporting WebGL, and their press announcements about Internet Explorer 9 have focused on using the computer's graphics hardware to speed up existing web pages rather than on enabling new capabilities like WebGL. So while they've not said explicitly that they're going to avoid WebGL, it seems unlikely that they'll support it in the short term.

(It should be said that while it would be consistent with their popular image for them to launch their own competing non-open system for hardware-accelerated 3D graphics, they have shown no signs of intending to do that either.)

What this means for now is that Internet Explorer users who want to see WebGL content will not be able to in the standard version of their browser. Some may be able and willing to switch to other browsers; for those who cannot or will not, there may soon be a workaround. Chrome Frame is a plugin for Internet Explorer which takes over the rendering of a particular tab, using Chrome's rendering engine instead of IE's. With an appropriate bit of JavaScript trickery, you can set up a web page so that when IE loads it, it will be rendered using Chrome Frame if the plugin is installed, and the user will be told that Chrome Frame is needed to view it if it's not. Non-IE browsers are unaffected and just deal with the page the normal way.
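
For the curious, here's a minimal sketch of that trickery, assuming the standard Chrome Frame setup: the meta tag asks IE to render the page with Chrome Frame when the plugin is installed, and Google's CFInstall script prompts the user to install it when it isn't.

<!-- Ask IE to render this page with Chrome Frame if it's installed. -->
<meta http-equiv="X-UA-Compatible" content="chrome=1">

<script src="http://ajax.googleapis.com/ajax/libs/chrome-frame/1/CFInstall.min.js"></script>
<script>
  // In IE without Chrome Frame, this shows Google's installation prompt.
  CFInstall.check({ mode: "overlay" });
</script>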

Although Chrome Frame does not currently list WebGL in its feature list, if Chrome supports it then it seems likely that Chrome Frame will follow shortly thereafter. It's not an ideal solution, but it's much better than nothing.

When will WebGL be ready for production use?

This is really three questions:

When will the WebGL specification get to version 1.0?

It's there! The WebGL Working Group ratified the 1.0 specification on 10 February 2011. Usefully, some of their deliberations happen on a public email list, so if you want a good feel for the current state of future spec updates, read the recent posts in the archives.

When will WebGL be available in the standard versions of web browsers?

Recent versions of Chrome, Firefox and Safari all have support for WebGL, as does the Opera 12 alpha. In Safari's case it is currently disabled by default, but can be enabled manually. This page has more details about which versions are supported.

When will enough people have WebGL in their browsers to make it worth using on a website?

Naturally, this depends on the kinds of people who visit the site -- which browsers they use and how frequently they update them.

People who use Microsoft Internet Explorer will have specific problems with WebGL: as of this writing, Microsoft have not announced any intention of supporting it, so IE users will be reliant on a plugin. There's more about this in "What about Microsoft and Internet Explorer?" above.

For users of Chrome and Firefox, there is interesting information in this blog post about how rapidly they have upgraded to new versions in the past. The short version: pretty much all Chrome users will be automatically upgraded within a month or so of a new version's release, while Firefox upgrades will be slower to a greater or lesser extent depending on whether it's packaged as a major (e.g. 4.0) or minor (e.g. 3.7) release. Safari will be somewhere in between.

Why is my CPU at 100% when I look at WebGL content?

A lot of WebGL demos repaint the canvas constantly even when nothing is moving, so if you're wondering why the CPU is busy even when the image is still, this is probably why.
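
If you're writing your own pages, a simple fix is to repaint only when something has changed. A sketch, where drawScene and sceneIsDirty are hypothetical names for your own rendering function and change flag:

var sceneIsDirty = true;  // set this to true whenever the scene changes

function tick() {
  // requestAnimationFrame may need a vendor prefix in current browsers
  // (mozRequestAnimationFrame, webkitRequestAnimationFrame).
  window.requestAnimationFrame(tick);
  if (sceneIsDirty) {
    drawScene();
    sceneIsDirty = false;
  }
}
tick();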

If you're wondering why the CPU can get up to 100% when WebGL is animating a simple scene, and all the work is meant to be happening on the graphics card anyway, the answer is a little more complex.

Because WebGL is displayed in an HTML canvas, it has to behave like anything else on the page. In particular, it needs to be possible for other HTML elements to overlay the rendered WebGL image, and for the WebGL image to have transparent parts that allow the HTML beneath it to show through. The process of blending the two kinds of image -- the HTML and the WebGL -- is known as compositing, and currently browsers handle it in a slightly roundabout way. They render the WebGL content using the graphics card, as they're meant to, producing one image. Then they render the HTML content (in a manner that differs from browser to browser), producing one or more further images (the stuff below the WebGL and the stuff above, for example). Finally, they composite these images together using the CPU.

What all this means is that if you have a large WebGL canvas, there's an awful lot of copying and compositing of images going on, and this is all being done by your CPU -- hence the 100%.

The good news is that the browser writers are very much aware of the problem, so hopefully it will be addressed in time.

What about security?

When told that it "gives web pages access to your graphics hardware", some people worry that WebGL could be a serious security threat. It's not as dangerous as it might sound, though! While any new extension to JavaScript could potentially open up security holes that could be used by a hacker, there's nothing particularly risky about WebGL as compared to, say, HTML5's audio extensions -- apart from one thing. With current versions of WebGL, a hacker could potentially write a WebGL page that made your graphics card stop responding to other applications. Under Microsoft Windows Vista and Windows 7, this would be annoying but not disastrous -- the operating system would notice that something was wrong and reset the graphics card. But under Apple OS X on a Macintosh, the hacker could potentially freeze the screen of your computer. On Linux, it depends on your GPU and drivers: Intel's drivers are supposed to detect GPU lockups and reset the card (if your kernel is new enough to include that code), the free ATI drivers do lockup detection for some older GPUs, and nouveau (the free NVIDIA driver) doesn't currently do hang detection at all. In short, on Linux your GPU may or may not be reset today, but the situation should improve over time. (Notes about what would happen on Maemo/iOS welcome -- just edit the page :-)

It is possible that updates to the browsers will fix this, though for technical reasons it's a tough problem to solve without severely limiting the kinds of 3D graphics you can do in WebGL.

Writing WebGL

Why is coding WebGL so hard?

WebGL is a low-level API. That means it is designed to give JavaScript access to as much of the power of the user's graphics hardware as possible, using a minimal set of functions and data types, while remaining secure and able to run on a broad spectrum of devices -- from PCs with Nvidia or ATI graphics all the way down to smartphones.

In particular, it inherits from OpenGL ES 2.0 (the cut-down version of OpenGL for mobile devices on which it is based) a purely programmable pipeline, with no fixed-function support. To put that in a less technical way: graphics libraries like older versions of OpenGL have a plethora of useful functions that beginners can use to get started, but that experts rarely (if ever) use. Getting rid of the beginner-friendly functions makes the system smaller and "simpler" in a sense, which is a good thing when you need to run it on a cut-down device like a smartphone, but does of course mean that it's a bit tricky to get started with.

However, there's good news. If you don't want to go to all the effort of learning the low-level API, you can choose a high-level one instead. The fact that WebGL provides a solid framework that can be guaranteed to run on any compatible device means that many developers have worked hard, independently of the browser writers, to produce frameworks that make programming it easy and which work on all WebGL browsers.

What WebGL frameworks exist?

There's a list on the WebGL public Wiki.

But isn't JavaScript too slow?

The JavaScript engines built into browsers are getting ever faster, but yes -- for some kinds of calculations, particularly numerically intensive stuff, JavaScript is too slow right now. For example, you probably wouldn't want to write the physics engine of a modern computer game in it!

If you want your WebGL pages to run quickly, you do have options:

  • Try to push as much as possible into shaders. Shader code is executed on the graphics card in parallel, and sometimes it can be surprising what you can do in it -- for example, picking objects in a scene and collision detection can both be done with appropriately clever shaders (see the sketch after this list).
  • If latency isn't an issue, move work back to the server.
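
To give a flavour of the first option, here's a rough sketch of colour-based picking; pickingFramebuffer, drawSceneWithPickingColours and objectsByColour are hypothetical names you'd define yourself. Each object is drawn in a flat, unique colour into an offscreen framebuffer, and the pixel under the mouse tells you which object was hit:

gl.bindFramebuffer(gl.FRAMEBUFFER, pickingFramebuffer);
drawSceneWithPickingColours();  // shader gives each object one flat colour

// readPixels uses a bottom-left origin, so flip the mouse's y coordinate.
var pixel = new Uint8Array(4);
gl.readPixels(mouseX, canvasHeight - mouseY, 1, 1,
              gl.RGBA, gl.UNSIGNED_BYTE, pixel);

var pickedObject = objectsByColour[pixel[0] + "," + pixel[1] + "," + pixel[2]];
gl.bindFramebuffer(gl.FRAMEBUFFER, null);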

Why can't I code my shaders in JavaScript?

WebGL code can look a bit funny, with the mixture of C-like shader code and JavaScript in one file. And understandably, it can feel a bit irritating to have to learn another programming language just to be able to draw 3D graphics.

However, at least for the next few years, it's unlikely that you'll be able to write shaders in any language other than the specialised shader language, GLSL. This is because the processors on graphics cards, while extremely fast, are extremely simple. They can only operate on programs expressed in simple languages, and getting one to run a dynamic language like JavaScript would be well-nigh impossible -- and would probably wind up running much slower than it would on a regular CPU anyway.
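
For illustration, here's roughly what that mixture looks like: a trivial GLSL fragment shader held in a JavaScript string and compiled through the WebGL API (gl is assumed to be your WebGL context).

var fragmentSource =
  "precision mediump float;\n" +
  "void main(void) {\n" +
  "  gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n" +  // every pixel red
  "}\n";

var shader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(shader, fragmentSource);
gl.compileShader(shader);
if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
  alert(gl.getShaderInfoLog(shader));  // GLSL compile errors appear here
}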

What's the best WebGL book?

There are no WebGL-specific books right now. However, WebGL is based on OpenGL ES 2.0, and so books about that standard can be useful as guides so long as you're willing to do a bit of translation yourself.

Why won't my textures work?

NPOT problems

A common problem is that you're using a non-power-of-two (NPOT) texture -- that is, a texture whose width, height, or both are not powers of two. These often won't work as expected with WebGL. For example, 100x100-pixel textures will not generally work; 128x128 will, as will 128x512. In theory 4096x8192-pixel images would work as textures, though your viewers might run out of memory on their graphics cards, particularly if they're using smartphones!
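
If you can't resize the images themselves, you can usually make NPOT textures work by avoiding mipmaps and repeat-style wrapping. A sketch, assuming texture is your texture object:

gl.bindTexture(gl.TEXTURE_2D, texture);
// No mipmaps (so no gl.generateMipmap), and clamp rather than repeat:
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);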

Kenneth Russell has written a more detailed explanation of the situation with NPOT textures on the WebGL Wiki.

Flipping

If your textures look upside-down, they probably are... in general, people tend to specify texture coordinates in their 3D models with the "t" axis -- that is, their equivalent of the "y" axis -- pointing upwards. This fits with the "y" axis used when specifying the positions of the vertices in the model. However, for historical reasons most image formats have "y" axes that point downwards.

This means that you need to flip your images around a horizontal axis before using them. Learning WebGL lesson 5 has more on this.
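
Alternatively, rather than editing the image files, you can ask WebGL to flip each image as it's uploaded (image here is assumed to be a loaded Image element):

gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);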

Old ATI graphics drivers

For certain old ATI graphics drivers, in particular those shipped with HP computers, you may find that textures do not appear when you display WebGL pages unless those pages include this JavaScript code:

gl.enable(gl.TEXTURE_2D);

This code is actually invalid WebGL: gl.TEXTURE_2D should not be accepted by gl.enable. However, a bug in some graphics drivers makes it necessary if you want WebGL to work on them.

Your best bet is to update your drivers if you can.


How should I get my mega-vertex mesh up to the browser?

If you have a very large 3D model and want to transfer it from your web server to your WebGL page, a good option is to encode it in JSON format. After all, inside your WebGL page you are going to be processing it in JavaScript -- so why not convert it to a JavaScript-friendly format on the server and then fetch it with an XMLHttpRequest?
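
A sketch of the client side -- the URL and the vertexPositions field are made-up names, so use whatever your server actually produces:

var request = new XMLHttpRequest();
request.open("GET", "/models/teapot.json");
request.onreadystatechange = function () {
  if (request.readyState == 4) {
    var model = JSON.parse(request.responseText);
    var buffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER,
                  new Float32Array(model.vertexPositions), gl.STATIC_DRAW);
  }
};
request.send();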

An alternative would be to use an XML-based format such as COLLADA, but parsing XML and converting it to your JavaScript code's internal format will be slower than using JSON. Some frameworks, like O3D, support server-side proxies that convert COLLADA on the server side into JSON for use on the client.

What are the "units" used to position vertices in WebGL? Pixels?

They're whatever you want them to be. Think of it this way -- something of (say) one unit in size that is at distance z might be ten pixels across, but the same object at distance 2z will be five pixels across because of perspective. We use non-specific units so that we have a uniform measurement of size.

What that means in practice is that you can decide when planning a scene what you want a unit to represent. In a spaceflight game, you might decide that one unit is equivalent to a kilometer, so you would make the Earth 6,371 units in radius and the Sun 696,000 units, place them 149,597,887 units apart, then add a spaceship of length 0.2 units and place it 18,000 units from the Earth. Because all the scales are consistent, it would look right.

But if you were writing a first-person shooter like Quake, you might treat units as being meters -- so 2-unit-tall monsters might come shooting at the player from 100 units away.

The point is, as long as everything in your model uses the same kind of units, it’ll all work out.
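
For instance, the spaceflight example above might boil down to a handful of constants like these (the names are made up; one unit is one kilometer):

var EARTH_RADIUS       = 6371.0;        // km
var SUN_RADIUS         = 696000.0;      // km
var EARTH_SUN_DISTANCE = 149597887.0;   // km
var SHIP_LENGTH        = 0.2;           // 200 meters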


Deploying WebGL apps

How can I make sure that my WebGL code runs everywhere?

Sadly, the only real option is to test it on as many devices as possible, from smartphones to desktop PCs.
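
One thing your code can do, though, is detect whether WebGL is available at all and fail gracefully when it isn't. A sketch, where getWebGLContext is a helper you'd write yourself:

function getWebGLContext(canvas) {
  var names = ["webgl", "experimental-webgl"];
  for (var i = 0; i < names.length; i++) {
    try {
      var gl = canvas.getContext(names[i]);
      if (gl) { return gl; }
    } catch (e) {
      // Some browsers throw rather than returning null.
    }
  }
  return null;
}

var gl = getWebGLContext(document.getElementById("webgl-canvas"));
if (!gl) {
  alert("Sorry, your browser doesn't seem to support WebGL.");
}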

How do I protect my content from pirates?

The "modern" open-source-based answer to this is, of course, that you shouldn't: instead, you should architect your application or business so that the value is generated by the service you offer, not in the content delivered to the browser. However, in some cases that may well be unduly idealistic. Your best bet for under those circumstances is to look into JavaScript obfuscators.


Contributors

This FAQ is based on a talk given by Giles Thomas at the WebGL Camp in June 2010. Thanks to Vladimir Vukićević, initiator of the WebGL project, for guidance on the timeline towards the 1.0 WebGL spec, and to Miguel Angel Garcia for suggesting the iPhone OpenGL ES 2.0 book.
