Vaughan Knight – Technology as Artforms

Unity3D with Oculus Rift (2013-05-03)

My Oculus Rift recently arrived, and the first thing I did after spending some time in the Tuscany demo level was jump into Unity and load up the test track demo project that's on the Unity store.

It took all of two minutes to grab the Oculus Rift Unity plugin and plonk the camera on top of the car. I played around with camera positioning and found that mounting it on the roof worked best. Putting the camera in the driver's seat works well and replicates real life, but there's a reason driving at 100 km/h with your head out the sunroof is more fun. It also means you can look around and get a feel for everything without staring at the headrest.

The video ended up cropping strangely. I’ll re-upload it at some stage. Also, link to executable and tutorial coming at some point.

Perception Blow Out

You can test the Oculus immediately in the editor player, but I recommend building a binary version running at the highest quality settings. The reason is that when certain things don't match perfectly in each eye, it blows up your depth perception. You simply can't look at items that don't make sense; your brain tries to look away.

Another thing that breaks is the entire skybox concept. A skybox works because it's rendered from a single camera and exploits the fact that you have no depth perception. As soon as you gain depth perception, the skybox falls apart. It's not that you see yourself inside a box; it's that things don't align the way you expect, and your brain blows up again. If the skybox were a grid of lines you'd see the box, but your brain doesn't accept that clouds are printed onto one.

Goodbye Gimbal Lock

One of the big issues that FPS games suffer from is gimbal lock. I won't go into what gimbal lock actually is, but it relates to the limitation of representing 3D orientations as rotations on the X, Y, and Z axes. We'll refer to them as pitch, yaw, and roll.

In practical terms, an FPS suffers from this limitation because the control schemes for walking and looking around are limited. With a dual-stick controller, one stick handles movement and the other changes orientation. Left and right on the stick usually rotate you around the vertical axis (yaw), and up and down tilt you around the horizontal axis (pitch).

Take the scenario of watching a plane fly directly towards you and over your head. You push up on the stick to change your pitch as it passes over. But when the object is directly above you, pushing up doesn't track any further: up on the stick means "look up", not "keep pitching in this direction". To keep tracking the object, you now need to yaw 180° using the horizontal axis of the stick and reverse your pitch by pressing down. Some of this can be fixed by adding roll to the controls, but then the controls get incredibly complicated, and keeping track of orientation becomes difficult because "down" on the screen is no longer "down".
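The clamped-pitch limitation can be sketched in a few lines of JavaScript (names and numbers here are illustrative, not from any particular engine):

```javascript
// Hypothetical stick-look controller: pitch is clamped at straight up/down,
// so an object passing overhead cannot be tracked without a 180° yaw flip.
function makeStickLook() {
  let yaw = 0;   // degrees, rotation about the vertical axis
  let pitch = 0; // degrees, clamped to [-90, 90]
  return {
    move(dx, dy) {
      yaw = (yaw + dx) % 360;
      pitch = Math.max(-90, Math.min(90, pitch + dy));
    },
    state: () => ({ yaw, pitch }),
  };
}

const look = makeStickLook();
look.move(0, 120);          // push up well past the zenith...
console.log(look.state());  // ...but pitch stops at 90: { yaw: 0, pitch: 90 }
look.move(180, -30);        // to keep tracking: yaw 180° and pitch back down
console.log(look.state());  // { yaw: 180, pitch: 60 }
```

Head tracking has no such clamp, which is exactly why the Oculus sidesteps the whole problem.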

But with the Oculus, your brain inherently knows which way is down. You can lean back and keep leaning back. If you check over your shoulder, your head isn't horizontal, but you aren't disoriented. Yaw, pitch, and roll are no longer controls; your control is simply "look at X". This will change how games are played if Oculus-style devices become the norm.

A Day with Articy Draft (2013-04-14)

I started using Articy Draft the other day. It's a game development tool for putting together story lines and story arcs, and for collating all the creative material and information when designing a game.

Jelly is in its final stages, and Articy Draft is designed more for the early stages of a project, but I thought I'd give it a go to see whether it was worthwhile.

What I've found valuable is just putting the story down on virtual paper. It has made the rather linear story that goes along with Jelly feel deeper, since I'm writing the story holistically. I seriously thought there would be little to no value in putting it together, and was pleasantly surprised to find it working well.

Articy has some neat features I didn't get to use, like the location/layout tool where you can place characters within world maps or level maps, which would make it ideal for RPG design.

My only complaints are the lag when zooming and the severe lack of keyboard shortcuts (or hints as to what they are). I care less about visual presentation than about performance in a productivity tool, and if I were to use this on a complex game going forward, that would have to improve. I work on a decent machine with 32GB of RAM, and given the fixed zoom levels, I'd love to have those zoom levels cached at a minimum.

Jelly is Coming (2013-04-13)

I've been pretty busy in my spare time working on game 2. It's coming together quite nicely and we're looking to release it later this month.

The theme is very heavily chocolate/candy/ice cream based. We've been working on it for about 6-7 months, though not solidly, with plenty of delays due to real-life interruptions. It's at this point in a project that you start looking at bits of the game and wanting to redo them, but that's how games never get released. So it's being released in its current form, and if I feel it needs to be better, that can just be version 2. I should listen to my own advice and just release it.

One thing I've started to notice, however, is a lack of monitor space. I love working with multiple monitors. It's something I've been comfortable with ever since my first experience, when work gave away cheap 15-inch CRTs over 10 years ago. For most development I've found three is the right mix: logs and a browser on the left, product on the right, IDE in the centre. But game development has me craving more. I'm currently on 3 × 27″ LED monitors and I plan on adding a fourth in an upside-down T configuration.

I also can't rave enough about the 25-key Novation Impulse I picked up. It's been a while since I bought my 88-key monster, and I'm gobsmacked at just how good a 25-key keyboard is for such a low price. It does feel a little more plastic than I expected, but it's primarily for making sound effects.

Post Hack Cleanup (2013-03-09)

Looks like the site got hacked. I did a quick cleanup of most things, a fresh install to hopefully plug some security holes, and a new theme while I was at it.

Some posts have garbage characters in them. I cleaned up a few but then realised it was going to take forever. Hopefully you can cope.

Thanks to Jay for the heads up.

Launching Nokia Drive from an app in Windows Phone 8 (2013-01-31)

Being able to launch Nokia Drive from an application is something I've wanted to do for a while. It opens up a wide range of opportunities to build location applications for transient or content-based points of interest. It has a number of use cases, from service response (emergency services, damage repair) and tourism (tour guide applications), through to social gaming.

And now, with Windows Phone 8, it is possible. And it's dead simple.

var uriString = "ms-drive-to:?destination.latitude=<latitude>" +
                "&destination.longitude=<longitude>" +
                "&destination.name=<name>";
await Windows.System.Launcher.LaunchUriAsync(new Uri(uriString));

That's it. It requires no manifest updates and no additional libraries.

For further info check out the following links on MSDN:

No way! That’s offline!

Some Disbeliever

DeLorean Ipsum .NET (2013-01-30)

Blend offers fantastic text generation in Sample Data, but one issue I've always had is the lack of paragraph-based text to give a better idea of text layout. I have just been putting up with it.

However, I stumbled across DeLorean Ipsum the other day and decided I needed to port it to .NET for use in my Windows 8 and Windows Phone projects. It produces paragraph-based text, and leverages the script of Back to the Future as its source text.

An example of some generated text – 4 paragraphs of 5 sentences per paragraph:

Oh honey, he’s teasing you, nobody has two television sets. Hey, hey, I’ve seen this one, I’ve seen this one. This is a classic, this is where Ralph dresses up as the man from space. What do you mean you’ve seen this, it’s brand new. Yeah well, I saw it on a rerun. What’s a rerun?

You’ll find out. You know Marty, you look so familiar, do I know your mother? Yeah, I think maybe you do. Oh, then I wanna give her a call, I don’t want her to worry about you. You can’t, uh, that is, uh, nobody’s home.

Oh. Yet. Oh. Uh listen, do you know where Riverside Drive is? It’s uh, the other end of town, a block past Maple.

A block passed Maple, that’s John F. Kennedy Drive. Who the hell is John F. Kennedy? Mother, with Marty’s parents out of town, don’t you think he oughta spend the night, after all, Dad almost killed him with the car. That’s true, Marty, I think you should spend the night. I think you’re our responsibility. Well gee, I don’t know.


Generating 10 paragraphs of text is very straightforward. The defaults are 3 sentences per paragraph, using lines from all characters in the movie.

var d = new DeLorean();
var text = d.Generate(10);

Or to generate 20 words of text, with a maximum of 3 sentences per paragraph, using lines only from Marty.

var d = new DeLorean(3, Characters.Marty, TextTypes.Words);
var text = d.Generate(20);

Or to generate 90 characters of text, with a maximum of 2 sentences per paragraph, using lines from all characters.

var d = new DeLorean(2, Characters.All, TextTypes.Characters);
var text = d.Generate(90);

The library is a Portable Class Library, so it should work on everything.

A Standard HTML5 Standard Standard (2011-03-26)

After working on, and tinkering with other ideas since, my original thoughts on HTML5 increasing the gap between browsers are becoming apparent. Having the HTML standard is awesome, but until everyone has a standard approach to the standard, we'll still be making exceptions.

Not everything in this post is strictly HTML5; some of it is HTML and JavaScript in general. But with HTML becoming an evolving standard, consistency matters more, as adoption of HTML features will become increasingly staggered.

The core issue is interpretation, and/or certification. Who certifies that a browser is HTML* compliant? Whether it's a central body or an individual tester, they need to ensure there is consistency. But in the scenario of the individual tester, what is the benchmark? If text is being rendered on the page, what are the metrics to measure that it is rendered correctly? I'll take a step back from HTML to explain.

Bad Jump

If I were to write a specification for a computer game, one of the aspects to be described is the controls. To simplify for discussion's sake: "when a player hits 'fire', the character will jump. Jump duration is a maximum of 2 seconds, to a height of 10 units." But how this occurs has many options:

  • Is there in-air control?
  • Is there hang time at the peak, or is it a parabolic jump?
  • Does the character slow down in the air?
  • If the character hits a ceiling or wall, do they fall immediately or still hang?
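Even this tiny spec leaves the physics underdetermined, but if we assume a purely parabolic jump peaking at the midpoint, the 2-second / 10-unit numbers pin down gravity and take-off velocity. A quick sketch (helper names are hypothetical):

```javascript
// For a parabolic jump of total duration T with peak height h at T/2:
//   h = g * (T/2)^2 / 2  =>  g = 8h / T^2, and take-off speed v0 = g * T/2.
function jumpParams(T, h) {
  const g = (8 * h) / (T * T); // gravity implied by the spec
  const v0 = (g * T) / 2;      // initial vertical velocity
  return { g, v0 };
}

function heightAt(t, { g, v0 }) {
  return v0 * t - 0.5 * g * t * t; // standard ballistic height
}

const p = jumpParams(2, 10);    // the spec's numbers: T = 2 s, h = 10 units
console.log(p);                 // { g: 20, v0: 20 }
console.log(heightAt(1, p));    // 10: peak height at the midpoint
console.log(heightAt(2, p));    // 0: back on the ground at T
```

Two games can satisfy those numbers and still feel completely different once air control or hang time enters the picture.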

New initiatives in games are benchmarked against all their predecessors. There is always a parallel. This is why jump times in all 2D platform games are about the same.

Coming back to HTML: the standard may be open to interpretation, but just because a new standard is being implemented doesn't mean that its function should behave any differently than you would normally expect.

I’ll start with the basics.

Text Rendering

One of the fundamental things that every browser does and has done since the beginning, is render text.  Visually, this is the primary concern of the browser.  So one would expect that this is something that does not merit discussion. 

As such, I’m not sure how this gets through testing, but take a look at the following in Firefox and in parallel have a look in Chrome and IE9.

Chrome just anti-aliases the entire thing into a blurry mess, but at least it makes sense, if undesirable. IE9 realises it's rendering text and does more intelligent text smoothing.

But Firefox's behaviour can only be described as a bug. By default, text should render with no anti-aliasing, and if you do have smoothing, it shouldn't degrade the legibility of the text. Having to design for another browser, or cater for it in code, is bad; it just forces everyone to design around it to keep their designs predictable and maintainable. For those without HTML5, or Firefox, or supersonic zoom eyes, let's have a closer look at FF's rendering.



NOTE: You do not need 3D glasses to view the above picture.

Once your eyes stop bleeding, let's look at what is actually being rendered. Anti-aliasing with shades of red, green, and blue? What is going on here isn't the monochrome styling of an Apple IIe monitor (for those that remember); these are actual colours being rendered. A single vertical line has a line of blue, a line of green, and a line of red. Did anyone actually test this? It's as if no one knew how to do text smoothing, so they just put one of each colour around the text.

Also notice how IE is anti-aliasing the 1-pixel vertical line, making it appear transparent, while Firefox decides to render a 2-pixel-wide line to avoid the issue altogether. So while IE9 has good text smoothing, the anti-aliasing logic in both browsers is broken and inconsistent.

It also negates the awesomeness of WOFF and the Google Font API if you have to fall back to bitmap font templates for consistency (and even then not 100% reliably).

Media Handling

MP3 vs OGG.  I’ve stated my views on this.  Support OGG if you want, but don’t ignore MP3 if you want to be serious about creating a media rich web.  MP3 is out there, everywhere, and creating a barrier for innovation is dumb.

I won’t start on video codecs, that’s just too much of a mess for now.

Secondly, the extremely poor memory management around audio. In all browsers, media elements eventually hit a limit. For most sites this isn't an issue. But at about 50 megs of audio, IE9 stops loading new audio, even though the previous audio elements were removed; they had to be explicitly delete'd. Chrome hit the same issue a few tracks later.

When elements are deleted, every browser behaves consistently and smoothly except Chrome, which goes silent for no apparent reason at around 280 megs of audio.

You may ask why on earth a page would have 280 megs of audio. This isn't an isolated case: it's also a problem for audio stores where you can preview sample audio, and for online radio services that could be playing gigabytes of tracks.


JavaScript

I'm not talking about syntax; being able to crash a browser with a standard JavaScript call just isn't acceptable.

The line was 'delete a;' where a was null. I tried to reproduce the error, but in isolation it seems to work. Why does it crash Chrome, and not consistently? I'm currently assuming it's related to Chrome trying to be smart about garbage collection in a way that is not thread safe.

Never mind whether the syntax is good form: JavaScript should not crash the browser. And since it crashes the browser, it doesn't throw a JavaScript error, and has to be debugged from an external debugger like Visual Studio.
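For what it's worth, a defensive guard, sketched here with hypothetical names, avoids handing a null reference to delete at all (this is the same null-check workaround that proved stable in my game work):

```javascript
// Sketch of the defensive pattern: only delete a property whose value
// is actually set, so the engine never tries to release a null reference.
function safeDelete(obj, key) {
  if (obj[key] != null) {
    delete obj[key];
    return true;   // reference released
  }
  return false;    // nothing to release; skip the delete
}

const audioCache = { track1: { src: "a.mp3" }, track2: null };
console.log(safeDelete(audioCache, "track1")); // true
console.log(safeDelete(audioCache, "track2")); // false
console.log("track1" in audioCache);           // false
```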

Final Thoughts

My biggest issue is that we’ll end up in a world supporting multiple codecs for audio and video to support different browsers, increasing cost of implementation, and reducing budget for innovation.

It's not all doom and gloom, but for the issues discussed, and others, I feel there is no short-term win. The lowest common denominator will consistently be the target platform.

Nobody Likes A Space Chicken, Everyone Should Love HTML5 (2011-03-15)

Over the last few weeks at work I have been head down building a fun HTML5 game for EMI titled 'Way Out Wars'.


Way Out Wars is a fun game for casual gamers, and provides the much-sought-after challenge for veterans of the 'space chicken music discovery typing shoot 'em up' genre. The game was built in pure HTML5, with a heavily modified ImpactJS as the core engine. Lots of long hours were spent shaving nanoseconds off particle render times.

My best is 75 tracks back to back for around 44 million. Go play it at …, preferably before reading the rest of this post, for some insight.

Some things achieved in HTML5 that were pretty cool:

  • Real-time usable particle effects engine that we had to dumb down because it was just looking too busy (yet awesome). At one stage we had smoke, jets, trails, explosions, musical notes, and swarms. Unfortunately you couldn't see the game.
  • Infinite playlist – depending on your browser, since Chrome mysteriously dies.
  • Music discovery that's fun – songs you hate, songs you like, and songs you don't know, all of which you can review and purchase.

Some pleasant discoveries:

  • HTML5 rocks. It's definitely a platform for the future, though not necessarily the be-all and end-all. The game has no browser-dependent coding. It does have code that checks for issues that may occur in some browsers, but there isn't any code that says 'if IE do this, if FF do that'.
    NOTE: This does happen for audio, however, since Firefox hates MP3.
  • ImpactJS rocks! In the end we probably didn't need it as much as we thought, but it did so much heavy lifting at the beginning that it let us experiment and play with ideas. We simply wouldn't have had a product this far advanced without it.
  • HTML5 benchmarks don't coincide with real-world scenarios. Simply ignore them until real-world tests turn up. In every benchmark I run, IE9 doesn't come out on top, yet all our experience with the game has been that IE9 is way out in front, with no IE9-specific tuning in sight. Someone familiar with graphics pipelines, who understands the concepts behind draw calls, fill rate, etc., needs to give HTML5 the same treatment.
  • As such, IE9 rocked, and we were pleasantly surprised.
  • Opera is amazing, although I'll probably still never use it. It came in second, performance- and stability-wise.
  • Firefox and Safari were stable, however their performance lacked.
  • Chrome was fast, however its stability lacked.
  • Hardware acceleration rocks.  Browsers without it will die slowly as user experience becomes encumbered by poor performance.
  • The iPad did load it once, but the game got too big for it. This is promising for HTML5 going forward, as the processing power of these devices steps up.

Some discoveries that were as fun as being jettisoned into the sun by space chickens:

  • Having lots of <audio> elements on a page can cause issues. Clean them up, delete them, and do everything you can to remove any trace of them. This was common to all browsers, with the impact differing in each, but cleaning up the audio elements makes all the browsers behave. And whilst Chrome will still go silent after about 100 songs, without this cleanup it will crash after 57.
  • Oh, and be careful deleting audio objects (i.e. 'delete obj;') in Chrome: you can crash the browser, and all open Chrome windows will stop working. I think Chrome tries to garbage collect audio elements that are no longer in use. A quick guard, 'if(obj != null) delete obj;', tends to be stable.
  • Any browser that doesn't support MP3 needs to fix that. It is a bug, not a feature. If every music site in the world has to re-encode their audio to OGG, they will ignore the browsers that require it, or simply be ignorant of them. Firefox needs to wake up. Media companies are going to be staring at mobile devices, and Firefox is the last thing their IE- and Safari-based offices will test in.
  • Firefox 3 is an antique; Firefox 4 feels like a classic that still goes well but won't keep up for long. A shame. This feels like a Firefox bash, but it has just been the reality we've been dealing with.
  • Chrome's instability. Chrome is fast, yet buggy. We discovered a critical error that would crash Chrome on the 57th track. After writing a universal piece of code that wouldn't crash Chrome, we found that at some point Chrome refuses to play audio, and kills audio across all Chrome tabs and windows, with no error message. This came as a surprise to me, and over the course of 3 weeks Chrome has been superseded by IE9 as my default.
  • Safari underperformed.  Of the big names it was the worst performer. 
This Blog Post Best Viewed In Netscape Navigator (2011-02-02)

</2 years not posting>

HTML5 is great. I'm really looking forward to it, but with all its standardised extended capabilities comes a new frontier: browser performance.

I've been playing around with IE9's GPU-accelerated HTML5 SVG, and its performance blows Chrome and Firefox out of the water. Throw in a few physics engines and play around a bit more, though, and you start to notice the performance spread shifting dramatically. Because IE9 does all the display calculations on the GPU, the choke point is the physics engine, and that choke point is the JavaScript engine, so it generally slows down in proportion to world-object calculations (once the world-step calculation time exceeds your physics timestep delta).
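The choke point is easy to see with some back-of-the-envelope numbers. A sketch (the per-object costs are made up for illustration) of why frame rate collapses once a world step costs more than the timestep it simulates:

```javascript
// Illustrative fixed-timestep budget check: if stepping the world takes
// longer than the physics delta it simulates, the frame rate collapses
// in proportion to the number of world objects.
function framesPerSecond(objectCount, costPerObjectMs, physicsDeltaMs) {
  const stepCostMs = objectCount * costPerObjectMs; // time to compute one step
  // While a step is cheaper than the delta it covers, we run in real time.
  const frameMs = Math.max(stepCostMs, physicsDeltaMs);
  return 1000 / frameMs;
}

const delta = 1000 / 60; // 60 Hz physics step, ~16.7 ms
console.log(framesPerSecond(100, 0.05, delta).toFixed(0));  // "60": under budget
console.log(framesPerSecond(1000, 0.05, delta).toFixed(0)); // "20": choked
```

With GPU rendering effectively free in IE9, this per-step JavaScript cost is the whole story; in the other browsers, rendering eats into the same budget too.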

With Firefox and Chrome, you start to notice massive differences early on. Chrome starts to slow down, but it flies compared to Firefox, which seems to struggle to push everything through the physics steps and the graphics rendering at a decent frame rate. If I were to rate the performance in the 'Eyeball HTML5 Benchmark TM', it would be 100, 60, 20 to IE9, Chrome, Firefox.

But this isn’t a browser war rant, this is a rant about the problem this creates on the developer side.

At the moment the main considerations when doing an advanced website are limited.  The questions asked by clients are generally straight forward.

Will this run on the PC’s of our users (CPU, resolution etc)?  Will it render correctly in all browsers?

I'm not talking about a bleeding-edge benchmark of performance, but something consumer mass-market oriented. My concern is that with the new era of HTML5 RIA apps emerging, and the performance gap in rendering and JavaScript, are we going to re-enter a world of 'Best Viewed in Netscape Navigator'? WebGL is coming in Chrome and Firefox, but Microsoft has made no public announcement that it is coming to IE9. The underlying engines, when they arrive, will differ drastically in performance. SVG, Canvas, WebGL, and HTML rendering performance may vary between browser, CPU, and GPU.

But even on the day that Chrome, Firefox, and IE have relatively equal performance metrics, with mobile internet usage projected to outgrow desktop internet usage by 2014, the problem gets exponentially complex when catering for mobile devices, tablets, and the rest of the crew.

The cost of development and testing goes through the roof.  Creative designers have been trained to think in terms of Flash / Silverlight and the predictable performance metrics, and development houses without foresight will start pumping out amazing creations that only work under lab conditions.

And while as a geek I will write something to push the GPU in IE9, I don't see a compelling reason to develop bleeding-edge, clock-cycle-pushing code until all the browsers are up to speed. I do see GPU acceleration as a reason to use IE9, so hats off to Microsoft for getting that in there; I've found it much snappier than any of the other browsers at the moment. But throwing something out there publicly, knowing that in Firefox it will crawl along slowly, is not compelling at all.

I’ve been hearing a lot about the future of gaming and apps, and simply put, it’s a long way off for now.  The technology is there, but the industry is not.  While the future may hold some exciting things for HTML5, the short to mid term holds some nice standard ways of doing things that for the past 10 years have been HTML and CSS hacks.  And that’s enough to get excited about for now.

PS – Any other browser I haven’t mentioned above is still relevant for this discussion, and the ‘if everyone just used X’ argument simply reinforces the issue.

PPS – IE9 GPU acceleration is best seen with SVG, and is not to be mistaken for WebGL. The 3rd-party 3D libraries I've tried just come down to JS execution speed; rendering is the baby step in those scenarios.

Silverlight 3DSMax Exporter – Update (2009-04-15)

The exporter is coming along nicely. I'm really happy with the results. So much so that I decided to create a Q*Bert scene. Click on the image below for the full-sized version.


In the image you have 3DSMax in the background with all its wireframe goodness, the render on the right, and, as you can see, the Silverlight output in Firefox on the left.

Features in the short term will be more .NET features, as full scene rotation would give fantastic interactivity and open the door for useful Silverlight transitions. Texture mapping is another one I want to look at, mainly focusing on texture scaling and offset.