John Smith's Blog

Ramblings (mostly) about technical stuff

Parallax starfield and texture mask effect in WebGL

Posted by John Smith

I've been pottering around a bit with WebGL lately, really just playing with 2D stuff, and this is probably the most notable thing I've hacked up so far.

Screengrab from a WebGL demo, showing the text Hello World using a starfield effect

It's vaguely inspired by stuff like the Sid Sutton Doctor Who titles from the early '80s and Numb Res (especially the bit with letters forming around 2'30" in) - but it's incredibly crude compared to either of those.

If you care to view source on that page and/or the JavaScript that drives it, it should hopefully be fairly easy to follow, but in summary:

  • There are no true 3D objects or particles; instead, the code sets up 6 triangles that make up 3 rectangles, each of which fills the full WebGL canvas area. These act as 3 parallax layers, which together make up the starfield(s) - with only minor tweaks, the number of layers could be increased, which would probably improve the believability of the effect a fair bit. (A sketch of this setup follows the list.)
  • The stars are rendered entirely in the fragment shader, using a pseudo-random algorithm based on the X and Y pixel position and the frame number. The increments to the frame number are what give the scrolling effect, and a speed value supplied via the vertex shader is what causes each layer to scroll at a different rate (see the second sketch below).
  • The "Hello world" text is rendered into a hidden 2D canvas object and converted to a WebGL texture at startup. The fragment shader reads the texture, and if that particular pixel was set on the texture, increases the probability of a star being shown.
  • Things would probably look better if I added a bit of extra pseudo-randomness to make the stars twinkle. Unfortunately I was getting a bit bored with the whole thing by the time it came to do this part ;-)
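As a rough sketch of the geometry setup (not the demo's actual code - `gl`, `positionLocation` and `speedLocation` are hypothetical names, assumed to be a WebGL context and already-looked-up attribute locations), each vertex carries (x, y, speed), with the vertex shader assumed to pass the position straight through and forward the speed as a varying. Drawing itself happens in the animation loop, shown further down:

    // One full-canvas quad (2 triangles) per layer, in clip space (-1..1).
    function layerVertices(speed) {
      return [
        -1, -1, speed,   1, -1, speed,  -1,  1, speed,  // triangle 1
        -1,  1, speed,   1, -1, speed,   1,  1, speed,  // triangle 2
      ];
    }

    // 3 layers = 3 rectangles = 6 triangles; slower layers read as "further away".
    const vertices = new Float32Array([0.25, 0.5, 1.0].flatMap(layerVertices));

    const buffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
    gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);

    // Interleaved layout: 3 floats (12 bytes) per vertex - x, y at offset 0, speed at 8.
    gl.enableVertexAttribArray(positionLocation);
    gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 12, 0);
    gl.enableVertexAttribArray(speedLocation);
    gl.vertexAttribPointer(speedLocation, 1, gl.FLOAT, false, 12, 8);

    // Additive blending, so each layer's black background doesn't obscure the
    // stars of the layers drawn before it (an assumption about the compositing).
    gl.enable(gl.BLEND);
    gl.blendFunc(gl.ONE, gl.ONE);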
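The star logic in the fragment shader might then look something like this - again a sketch rather than the demo's actual shader, with `u_frame`, `u_resolution`, `u_mask` and `v_speed` all being hypothetical names:

    const fragmentShaderSource = `
      precision mediump float;

      uniform float u_frame;        // counter incremented by the CPU each frame
      uniform vec2 u_resolution;    // canvas size in real pixels
      uniform sampler2D u_mask;     // the "Hello world" text texture
      varying float v_speed;        // per-layer scroll speed from the vertex shader

      // Cheap hash: maps a 2D position to a pseudo-random value in 0.0..1.0.
      float hash(vec2 p) {
        return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
      }

      void main() {
        // Scrolling: offset the position fed to the hash by frame * speed.
        float r = hash(floor(gl_FragCoord.xy + vec2(0.0, u_frame * v_speed)));

        // Sample the mask at the (unscrolled) screen position, flipping Y
        // because the 2D canvas origin is top-left but WebGL's is bottom-left.
        vec2 uv = gl_FragCoord.xy / u_resolution;
        uv.y = 1.0 - uv.y;

        // A lit mask pixel raises the probability of a star appearing here.
        float threshold = texture2D(u_mask, uv).r > 0.5 ? 0.95 : 0.999;
        float star = r > threshold ? 1.0 : 0.0;

        gl_FragColor = vec4(vec3(star), 1.0);
      }
    `;

Because the mask is sampled at the fixed screen position while the hash input scrolls, the text stays put while the stars drift through it.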
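And the text-to-texture step, once more with hypothetical names: the text is drawn once into an off-screen 2D canvas, which WebGL accepts directly as a texture source. The clamped wrap modes and lack of mipmaps are because the canvas is (presumably) not a power-of-two size; the Y flip between the two canvas types is handled in the shader sketch above.

    const maskCanvas = document.createElement('canvas');
    maskCanvas.width = gl.canvas.width;
    maskCanvas.height = gl.canvas.height;
    const ctx = maskCanvas.getContext('2d');
    ctx.fillStyle = '#fff';
    ctx.font = 'bold 96px sans-serif';
    ctx.textAlign = 'center';
    ctx.textBaseline = 'middle';
    ctx.fillText('Hello world', maskCanvas.width / 2, maskCanvas.height / 2);

    const maskTexture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, maskTexture);
    // A canvas element is a valid pixel source for texImage2D.
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, maskCanvas);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);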

Some observations from my ill-informed stumbling around in a technology I don't really understand:

  • Performance seems fine on two of the three machines I've tested it on so far - a dual-boot Linux/Win7 box with an AMD hexacore and an nVidia GTS 450, and a 2008 white MacBook, are both quite happy; a Celeron netbook with integrated graphics is understandably less so, although it still churns out an acceptable framerate.
  • Curiously, the dual-boot box reports consuming far more CPU when running Firefox or Chromium under Linux than under Windows 7. I'm not quite sure why, as CPU usage should be pretty minimal - all the "clever" stuff should be happening on the graphics card, and all the CPU should be doing each frame is updating a counter and getting the graphics card to re-render based on that updated counter. (Both operating systems have fairly up-to-date nVidia drivers, with the browsers configured to use "proper" OpenGL as opposed to ANGLE or suchlike.) I haven't yet investigated the cause - it could be some quirky difference in the way CPU usage is reported.
  • I reduced Chromium CPU usage from the mid-30s to the mid-20s (as measured on the Linux box) by moving code out of the main animation loop that didn't need to be there - stuff that defined the geometry, pushed the texture through, etc. (a sketch of the slimmed-down loop follows this list).
  • I still need to find a way to mentally keep track of the various coordinate systems in use - the vertex shader uses -1.0 to 1.0, the fragment shader uses 0.0 to 1.0, and then there are the real pixel coordinates on top of that. (Not to mention that the 2D canvas Y axis is inverted compared to the WebGL canvas! A cheat-sheet of the conversions follows this list.)
  • It feels a bit odd to me that *GL makes it easier to use floating-point rather than integer values. I guess I'm still stuck in an '80s mentality of writing stuff for the 6502 (which didn't really have 16-bit integers, never mind floating point) and stuff like Bresenham's algorithm. (Ironically enough, Bresenham's algorithm was a topic of discussion just last night, in the talk about Raspberry Pi at this month's Hacker News London event.)
  • In a similar vein, I was a tad surprised to find minimal support in shaders for doing bit manipulation stuff like you see in "pure CPU" Satori demos. The same goes for the lack of a proper random number generator, although in the context of this experiment, my controllable pseudo-random numbers were probably a better fit. (I get the impression that this functionality is available in newer versions of regular OpenGL, just not the variant supported by WebGL?)
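For what it's worth, the slimmed-down loop ended up shaped roughly like this (a sketch with hypothetical names; all the buffer, shader and texture setup from the earlier sketches happens once, before the first frame):

    let frame = 0;
    function render() {
      gl.uniform1f(frameLocation, frame); // the only per-frame state change
      gl.clear(gl.COLOR_BUFFER_BIT);
      gl.drawArrays(gl.TRIANGLES, 0, 18); // all 3 layers (6 triangles) in one call
      frame += 1;
      requestAnimationFrame(render);
    }
    requestAnimationFrame(render);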
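And the coordinate cheat-sheet, as GLSL one-liners (with u_resolution assumed to be a uniform holding the canvas size in pixels):

    vec2 uv   = gl_FragCoord.xy / u_resolution; // real pixels -> 0.0..1.0 (fragment/texture space)
    vec2 clip = uv * 2.0 - 1.0;                 // 0.0..1.0    -> -1.0..1.0 (vertex/clip space)
    uv.y      = 1.0 - uv.y;                     // flip Y when sampling a 2D-canvas-sourced texture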

About this blog

This blog (mostly) covers technology and software development.

Note: I've recently ported the content from my old blog, which was hosted on Google App Engine using some custom code I wrote, to a static site built with Pelican. I've put URL rewrite rules in place in the webserver config to try to support the old URLs, but it's likely that I've missed some (probably meta ones related to pagination or tagging), so apologies for any 404 errors you get served.

RSS feed for this blog

About the author

I'm a software developer who's worked with a variety of platforms and technologies over the past couple of decades, but for the past 7 or so years I've focussed on web development. Whilst I've always nominally been a "full-stack" developer, I feel more attachment to the back-end side of things.

I'm a web developer for a London-based equities exchange. I've worked at organisations such as News Corporation, Google and BATS Global Markets. Projects I've been involved in have been covered in outlets such as The Guardian, The Telegraph, the Financial Times, The Register and TechCrunch.

Twitter | LinkedIn | GitHub | My CV | Mail
