The Coolest Thing About GPS

I’m currently doing a course, through work, delivered by BetterOn Video. The aim of the course is to improve my video presentation skills, in particular my engagement with the camera and the audience.

I made this video based on the week 2 prompt “make a video 60-90 seconds long about something you’re an expert on”. The idea came from a talk I used to give at the University of Oxford.

The Secret of Magic

I’m currently doing a course, through work, delivered by BetterOn Video. The aim of the course is to improve my video presentation skills, in particular my engagement with the camera and the audience.

I made this video based on the week 2 prompt “make a video 60-90 seconds long about something you’re passionate about”. The idea came from a blog post I wrote back in 2014.

Axe Feather 2021

tl;dr?

I recreated a 16-year-old interactive ad. Experience it here. Get the source code here. Or keep reading for the full story.

What?

Back in 2005 I reblogged a Flash-based interactive advert I’d discovered via del.icio.us. And if that sentence wasn’t early-naughties enough for you, buckle up…

A woman lies on a bed with her legs crossed, playfully wagging her finger.
This screenshot isn’t from the original site but from my homage to it. More on that later.

At the end of 2004, Unilever brand Axe (Lynx here in the UK) continued their strategy of marketing their deodorant as magically transforming young men into hyper-attractive sex gods. This is, of course, an endless battle, pitting increasingly sexually-charged advertisements against the fundamental experience of their product, which smells distinctly like locker rooms and school discos. To launch 2005’s new fragrance Feather, they teamed up with London-based design agency Dare Digital to create a game at the domain AxeFeather.com (long since occupied by domain squatters).

In the game, the player’s mouse pointer becomes a feather which they can use to tickle an attractive young woman lying on a bed. The woman’s movements – which vary based on where she’s tickled – were captured in digital video and aggressively compressed using the then-new H.263-ish Sorenson Spark codec to make a download just about small enough to be tolerable for people still on dial-up Internet access (which was still almost as popular as broadband). The ad became a viral hit. I can’t tell you whether it paid for itself in sales, but it must have paid for itself in brand awareness: on Valentine’s Day 2005 it felt like it was all the Internet wanted to talk about.

Axe Feather logo visible via Archive.org, circa August 2005, in a Firefox browser window.
The site was archived by the Wayback Machine… but it doesn’t work in a modern browser.

I suspect its success also did wonders for the career of its creative consultant Olivier Rabenschlag, who left Dare a few years later, hopped around Silicon Valley for a bit, then landed himself a job as Head of Creative (now Chief Creative Officer) with Google. Kudos.

Why?

I told you about the site 16 years ago: why am I telling you again? Because this site, which made headlines at the time, is gone.

And not just a little bit gone, like a television ad no longer broadcast but which might still exist on YouTube somewhere (and here it is – you’re welcome for the earworm). The website went down in 2009, and because it was implemented in Flash the content was locked away in a compiled, proprietary format, which has ceased to be meaningfully usable on the modern web.

IE-specific CSS with a comment "Ok, so the scrollbar is IE specific...but I like it, ok?? :)"
The parts of AxeFeather.com’s code that are openly readable don’t help much, but I love this comment, which carries the scent of the adolescent web in the same way as Lynx deodorant carries the scent of an adolescent human.

The ad was pioneering. Flash had only recently gained video support (this would be used the following year for the first version of YouTube), and it had so far been used mostly for non-interactive linear video. This ad was groundbreaking… but now it’s disappeared like so much other Flash work. And for all that Flash might have been bad for the web, it’s an important part of our digital history [recommended reading].

Ruffle window showing an empty bed.
Third-party Flash emulation is imperfect. I tried to make Axe Feather work in Ruffle and got… an empty bed? What is this, a metaphor for being a lonely nerd?

So on a whim… I decided to see if I could recreate the ad.

Call it lockdown fever if you like, because it’s certainly not the work of a sane mind to attempt to resurrect a 16-year-old Internet advertisement. But that’s what I did.

How?

My plan: to reverse-engineer the digital assets (video, audio, cursor etc.) out of the original Flash file, and use them to construct a moderately-faithful recreation of the ad, suitable for use on the modern web. My version must:

  • Work in any modern browser, without Flash of course.
  • Work on mobile devices/with touchscreens, with all of the original functionality available without a keyboard (the original had secret content hidden behind keyboard keypresses). Nowadays, Rabenschlag knows to design mobile-first, but I think we can forgive him for not doing that twelve months before Flash Lite 2.0 would bring .flv support to mobile devices…
  • Indicate how much of the video content you’ve seen, because we live in an era of completionists who want to know they’ve seen it all.
  • Depend on no third-party frameworks/libraries: just vanilla HTML, CSS, and JavaScript.

Let’s get started.

Reverse-engineering

Handbrake converting 19.flv to MP4 format.
At this point I noticed that the videos had no audio tracks: the giggling and other sound effects must be stored separately.

I grabbed the compiled .swf file from archive.org and ran it through SWFExtract and an online decompiler: neither was individually able to extract all of the assets, but together they gave me a full set. I ran the .flv files through Handbrake to get myself a set of .mp4 files instead.

Two starting frames from the videos, annotated to show that they are not aligned to the same point.
In what appears to have been an exercise in size optimisation, the original authors cropped the videos differently depending on how much space was needed (e.g. if the subject stretched her arms above her head, more space would be required). Clearly, some re-alignment would be needed.

Seeing that the extracted video files were clearly designed to be carefully positioned on a static background, and weren’t all cropped to exactly the same position, I decided to make my job easier by combining them all – background layer (the picture of the bed) included – into a single video. Integrating the background with the subject meant that I was able to use video editing software to tweak the position, which I imagined would be much easier than doing so in code. Combining all of the video clips into a single file provides compression benefits as well as making it easier to encourage a browser to precache the entire video to begin with.

Four layer design. From bottom to top: web page, video (showing woman on bed), (transparent) canvas, cursor (shaped like a feather).
My design called for three “layers” above my web page: the video, a transparent (and usually hidden) canvas showing the hit areas for debugging purposes, and the feather-shaped cursor.

The longest clip was a little over 6 seconds long, so I split my timeline into blocks of 7 seconds, padding each clip with a freeze-frame of its final image to make each exactly 7 seconds long. This meant that calculating the position in the finished video to which I wanted to jump was as simple as multiplying the (0-indexed) clip number by 7 and seeking to that position. The additional “frozen” frames acted as a safety buffer in case my JavaScript code was delayed by a few milliseconds in jumping to the “next” block.
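
As an illustration (not lifted from my actual code: BLOCK_LENGTH and jumpToClip are names I’m inventing here), the seek arithmetic boils down to something like this:

// Each clip occupies one fixed-length block in the combined video, padded
// out with freeze-frames to exactly BLOCK_LENGTH seconds.
const BLOCK_LENGTH = 7; // seconds

// Hypothetical helper: seek the combined <video> to the start of a clip.
function jumpToClip(video, clipIndex) {
  video.currentTime = clipIndex * BLOCK_LENGTH;
}

// e.g. jumpToClip(document.querySelector('video'), 31); // the "blowaway" clip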

Davinci Resolve showing composition of the actress onto the bed in a timeline.
I used onion-skinning to help “line up” the actress with herself as I composited her onto the bed in a single unified video of 7-second blocks.

An additional challenge was that in the original binary, the audio files were stored separately from the video clips… and slightly longer than them! A little experimentation revealed that the ends of each clip lined up, presumably something to do with how Flash preloads and synchronises media streams. Luckily for me, the audio clips were numbered such that they mostly mapped to the order in which the videos appeared.

Once I had a video file suitable for use on the web (you can watch the entire clip here, if you really want to), it was time to write some code.

Video timeline showing that each 7-second block is comprised of the original clip plus padding, atop a background layer of the bed and each clip's associated audio.
It feels slightly wasteful that over 50% of the resulting video clip is a freeze-frame, but modern video compression algorithms like H.264 reduce the impact considerably and the resulting video file is about the same size as its more-optimised predecessor.

Regular old engineering

The theory was simple: web page, video, loop the first seven seconds until you click on it, then animate the cursor (a feather) and jump to another seven-second block before jumping back or, in some cases, on to a completely new seven-second block. Simple!
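
For a flavour of what that looks like as data (a sketch only: apart from clips 0, 31 and 33, which come up later, the clip numbers and hotspot names here are invented), the whole game can be driven by a transition map along these lines:

// Hypothetical transition map: for each "idle" clip, which clip each hotspot
// triggers and which block to fall back to afterwards.
const transitions = {
  0: {                              // idle position A
    nose:    { play: 31, then: 0 },  // the "blowaway" (more on that later)
    stomach: { play: 12, then: 33 }, // some tickles move on to the other idle state
  },
  33: {                             // idle position B
    nose:    { play: 31, then: 33 },
  },
};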

Of course, any serious web development is always a little more complex than you first anticipate.

Game map illustrating transition between the states of Axe Feather 2021.
I extracted 34 distinct animated clips from the .swf, which I numbered 0 through 33. Clips 6 and 30 appeared to be duplicates of others. Clips 0 and 33 are the two “idling” states from which interaction can lead to other states. Note that my interpretation of the order and relationship of animation sequences differs from the original.

For example: nowadays, putting a video on a web page is as easy as a <video> tag. But, in an effort to prevent background web pages from annoying you with unexpected audio, modern browsers won’t let a video play sound unless user interaction is the reason that the video starts playing (or unmutes, if it was playing-but-muted to begin with). Broadly-speaking, that means that a definitive user action like a “click” event has to be in the call stack when your code makes the video play/unmute.

But changing the .currentTime of a video to force it into a loop: that’s fine! So I set the video to autoplay muted on page load, with a script to make it loop within its first seven-second block. The actress doesn’t make any sound in block 0 (position A) anyway; so I can unmute the video when the user interacts with a hotspot.
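
In sketch form (the structure and names here are mine, not my actual code), that approach looks something like:

const video = document.querySelector('video');

// Muted autoplay is permitted without any user interaction…
video.muted = true;
video.play();

// …and so is fiddling with .currentTime to hold playback inside block 0.
// Unmuting, though, needs a genuine click/touch in the call stack, e.g. in
// the handler that responds to a hotspot being tickled:
function onHotspotActivated(clipIndex) {
  video.muted = false;
  video.currentTime = clipIndex * 7; // jump to that clip's block (as above)
}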

For best performance, I used window.requestAnimationFrame to synchronise my non-interactive events (video loops, virtual cursor repositioning). This posed a slight problem in that animation frames wouldn’t be triggered if the tab was moved to the background: the video would play through each seven-second block and into the next! Fortunately the visibilitychange event came to the rescue and I was able to pause the video when it wasn’t being actively watched.
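
A minimal sketch of that combination (again, illustrative names rather than my actual code):

const video = document.querySelector('video');
const BLOCK_LENGTH = 7; // seconds
let currentBlock = 0;

function tick() {
  // Keep the playhead inside the current block; this is also a convenient
  // place to reposition the virtual feather cursor.
  if (video.currentTime >= (currentBlock + 1) * BLOCK_LENGTH) {
    video.currentTime = currentBlock * BLOCK_LENGTH;
  }
  window.requestAnimationFrame(tick);
}
window.requestAnimationFrame(tick);

// Animation frames stop firing in background tabs, so pause the video too,
// or it would drift on into the next block unchecked.
document.addEventListener('visibilitychange', () => {
  if (document.hidden) {
    video.pause();
  } else {
    video.play();
  }
});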

I originally hoped to use the cursor: CSS directive to make the “feather” cursor, but there’d be no nice way to animate it. Comet Cursor may have been able to use animated GIFs as cursors back in 1997 (when it wasn’t busy selling all your personal information to advertisers, back when that kind of thing used to attract widespread controversy), but modern browsers don’t… presumably because it would be super annoying. They also don’t all respect cursor: none, so I used the old trick of cursor: url(null.png), none (where null.png is an almost-entirely transparent 1×1 pixel image) to hide the original cursor, then positioned an image dynamically in its place. I use getBoundingClientRect() to allow the video to resize dynamically in CSS and to convert coordinates on it represented as percentages into actual pixel values and vice-versa: this allows it to react responsively to any screen size without breakpoints or excessive code.
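
The conversion itself is only a few lines; something like this (the function names are my own, not the site’s):

const video = document.querySelector('video');

// Convert a point expressed as percentages of the video's area into viewport
// pixel coordinates, respecting whatever size CSS has currently given it.
function percentToPixels(xPercent, yPercent) {
  const rect = video.getBoundingClientRect();
  return {
    x: rect.left + (xPercent / 100) * rect.width,
    y: rect.top + (yPercent / 100) * rect.height,
  };
}

// …and the reverse, e.g. to test whether a mouse/touch event landed in a
// hotspot defined in percentage terms.
function pixelsToPercent(clientX, clientY) {
  const rect = video.getBoundingClientRect();
  return {
    x: ((clientX - rect.left) / rect.width) * 100,
    y: ((clientY - rect.top) / rect.height) * 100,
  };
}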

Once I’d gone that far I was able to drop the GIF idea entirely and used a CSS animation for the “tickling” motion.

Woman on bed in idle position B, with hotspots highlighted on each arm, her head, her chest, her stomach, her hips, the top of her legs, and the bottom of the leg that's extended straight below her.
The hotspot overlay was added as a debugging feature but I left it in the final version. Hold the space bar to highlight hit areas.

I added a transparent <canvas> element on top of the <video> on which the hit areas are dynamically drawn to help me test the “hotspots” and tweak their position. I briefly considered implementing a visual tool to help me draw the hotspots, but figured it wasn’t quite worth the time it would take.
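
In outline, the overlay is as simple as this (a sketch only: the hotspot coordinates below are invented for illustration):

const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

// Hotspots stored as percentages of the video's area so they scale with it.
const hotspots = [
  { x: 48, y: 20, width: 12, height: 8, clip: 31 }, // e.g. the nose
];

function drawHotspots() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.strokeStyle = 'rgba(255, 0, 0, 0.8)';
  for (const spot of hotspots) {
    ctx.strokeRect(
      (spot.x / 100) * canvas.width,
      (spot.y / 100) * canvas.height,
      (spot.width / 100) * canvas.width,
      (spot.height / 100) * canvas.height
    );
  }
}

// Only show the overlay while the space bar is held down.
document.addEventListener('keydown', (e) => { if (e.code === 'Space') drawHotspots(); });
document.addEventListener('keyup', (e) => {
  if (e.code === 'Space') ctx.clearRect(0, 0, canvas.width, canvas.height);
});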

As I implemented more and more of the game, I remembered one feature from the original that I’d missed: the “blowaway”. If you trigger block 31 – a result of tickling the woman’s nose – she’ll blow your cursor off the screen. It’s particularly fun because it subverts the player’s expectations of their user interface: once you’ve got past the surprise of your cursor being a feather, you quickly settle in to it moving like a regular cursor… but then control’s stolen from you and the cursor vanishes! (Well I thought it was cool… 16 years ago.)

A woman blows a feather away from her face.
Sometimes tickling her nose will make her blow your feather off the screen. That’ll show you.

So yeah: that was my project this weekend.

I can’t even begin to explain why anybody would do this. But I did it. If you haven’t already: go have a play. And if you’re interested in how it works, the source code’s free for you to explore.


Recursively repackaging .mkvs as .mp4s

A note for my own reference. If you want to repackage a lot of .mkv files as .mp4 without transcoding, here’s a one-liner (it relies on recursive ** globbing, so use zsh or enable globstar in bash, and note that it deletes each .mkv once it’s been remuxed):

for f in **/*.mkv; do echo "$f"; ffmpeg -y -hide_banner -loglevel panic -i "$f" -c copy "${f%.mkv}.mp4"; rm "$f"; done

Endpoint Encabulator

(This video is also available on YouTube.)

I’ve been part of the team working on a new application framework called the Endpoint Encabulator, and I wanted to share with you what I think makes our project so exciting: I promise it’ll make for two minutes of your time you won’t soon forget!

Naturally, this project wouldn’t have been possible without the pioneering work that preceded it by John Hellins Quick, Bud Haggart, and others. Nothing’s invented in a vacuum. However, my fellow developers and I think that our work is the first viable encabulator implementation to provide inverse reactive data binding suitable for deployment in front of a blockchain-driven backend cache. I’m not saying that all digital content will one day be delivered through Endpoint Encabulator, but… well; maybe it will.

If the technical aspects go over your head, pass it on to a geeky friend who might be able to make use of my work. Sharing is caring!

Note #17871

Watched the pilot of Webbed Briefs by @heydonworks (of Every Layout fame). It’s a sarcastic independent vlog about web technologies, so I immediately fell in love and subscribed to the feed…

Just kidding. It doesn’t have a feed! (Yet?)

Webbed Briefs logo

Dune (2020)

This is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Oh my god I’m so excited. I’m afraid they might fuck up the story even more than David Lynch did in 1984 (not that I don’t love that film, too, but in a very different way than the books). I mean: I’d have hoped a modern adaptation would have a bigger part for Chani than it clearly does. And I know nothing at all about the lead, Timothée Chalamet. If only there was something I could do about these fears?

I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.

Yeah, that’s the kind of thing.

The supporting cast look excellent. I think Josh Brolin will make an awesome Gurney Halleck, Jason Momoa will rock Duncan Idaho, and I’m looking forward to seeing Stephen McKinley Henderson play Thufir Hawat. But if there’s just one thing you should watch the trailer for… it’s to listen to fragments of Hans Zimmer’s haunting, simplistic choral adaptation of Pink Floyd’s Eclipse.

Geohashing expedition 2020-09-09 51 -1

This checkin to geohash 2020-09-09 51 -1 reflects a geohashing expedition. See more of Dan's hash logs.

Location

Edge of a field bounded by Letcombe Brook, over the A338 from Landmead Solar Farm.

Participants

Plans

We’re discussing the possibility of a Subdivision geohash achievement for people who’ve reached every “X in a Y”, and Fippe pointed out that I’m only a hash in the Vale of White Horse away from being able to claim such an achievement for Oxfordshire’s regions. And then this hashpoint appears right in the Vale of White Horse: it’s like it’s an omen!

Technically it’s a workday so this might have to be a lunchtime expedition, but I think that might be workable. I’ve got an electric vehicle with a hundred-and-something miles’ worth of batteries in the tank, and it looks like there might be a lay-by near the hashpoint (with a geocache in it!): I can drive down there at lunchtime, walk carefully back up the main road, and try to get to the hashpoint!

Expedition

I worked hard to clear an hour of my day to take a trip, then jumped in my (new) electric car and set off towards the hashpoint. As I passed Newbridge I briefly considered stopping and checking up on my geocache there but feeling pressed for time I decided to push on. I parked in the lay-by where GC5XHJG is apparently hidden but couldn’t find it: I didn’t search for long because the farmer in the adjacent field was watching me with suspicion and I figured that anyway I could hunt for it on the way back.

Walking along the A338 was treacherous! There are no paths, only a verge covered in thick grass and spiky plants, and a significant number of the larger vehicles (and virtually all of the motorbikes) didn’t seem to be obeying the 60mph speed limit!

Reaching the gate, I crawled under (reckoning that it’s probably there to stop vehicles and not humans) and wandered along the lane. I saw a red kite and a heron doing their thing before I reached the bridge, crossed Letcombe Brook, and followed the edge of the field. Stuffing my face with blackberries as I went, I soon reached the hashpoint on one edge of the field.

I took a short-cut back before realising that this would put me in the wrong place to leave a The Internet Was Here sign, so I doubled back to place it on the gate I’d crawled under. Then I returned to the lay-by, where another car had just pulled up (right over the GZ of the geocache I’d hoped to find!) and didn’t seem to be going anywhere. Sadly I couldn’t wait around all day – I had work to do! – so I went home, following the car’s satnav along a route that resulted in a figure-of-eight tracklog.

Tracklog

My GPS keeps a tracklog. Here you go:

Geohashing expedition 2020-09-09 51 -1 tracklog map

Video

You can also watch it at:

Photos

360° panoramic VR photo of the 2020-09-09 51 -1 geohashpoint


Holograms on Chocolate

This is a repost promoting content originally published elsewhere. See more things Dan's reposted.

This is incredibly cool. Using (mostly) common household tools and chemicals and a significant amount of effort, Ben (who already built himself a home electron microscope, as you do) demonstrates how you can etch a hologram directly into chocolate, resulting in a completely edible hologram. I’d never even thought about the fact that a hologram could be embossed into almost any opaque surface before, so this blew my mind. In hindsight it makes perfect sense, but it still looks like magic to see it done.

Bodley and the Bookworms – Scan and Deliver

This is a repost promoting content originally published elsewhere. See more things Dan's reposted.

You know that strange moment when you see your old coworkers on YouTube doing a cover of an Adam and the Ants song? No: just me?

Still good to see the Bodleian put a fun spin on promoting their lockdown-friendly reader services. For some reason they’ve marked this video “not embeddable” (?) in their YouTube settings, so I’ve “fixed” the copy above for you.

WHAT THE BEC?! (#01)

This is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Just another vlog update from comedian Bec Hill. Oh no, wait… this website is now T-Shirt Famous! (for a very loose definition of “famous”, I guess.) For a closer look, see Instagram.

This isn’t the silliest way I’ve put my web address on something, of course. A little over 17 years ago there was the time I wrote my web address along the central reservation of a road in West Wales using sugar cubes, for example. But it’s certainly the silliest recent way.

Anyway: this t-shirt ain’t the Million Dollar Homepage. It’s much cooler than that. Plus the money’s all going to Water Aid. (If you haven’t claimed a square yourself, you still can!)

DanQ.me on a t-shirt as drawn by comedian Bec Hill
I was pleased to see that Bec even managed to get the blue kinda-sorta on-brand.

In other Bec Hill-related news, did you see that she did a third “when you listen to music when you’re hungry” video? You should go watch that too. It’s avocado-licious.


Lindsey Stirling/Johnny Rzeznik String Session

This is a repost promoting content originally published elsewhere. See more things Dan's reposted.

One of the last “normal” things I got to do before the world went full lockdown was to attend a Goo Goo Dolls concert with Ruth, and so to see two musicians I enjoy team up to perform a song and share some words of hope and encouragement for a better future beyond these troubled times… feels fitting and inspiring.

Also awesome to see that Stirling’s perhaps as much a fan of Live in Buffalo as I am.

Fun diversion: I never know how to answer the question “what kind of music do you like?”, because I increasingly (and somewhat deliberately) find that I enjoy a wider and wider diversity of different genres and styles. But perhaps the right answer might be: “I like music that makes me feel the way I feel when I hear Cuz You’re Gone recorded from the Goo Goo Dolls’ concert in Buffalo on 4 July 2004, specifically the bit between 4 minutes 10 seconds and 4 minutes 33 seconds into the song, right at the end of the extended bridge. It’s full of anticipatory energy and building to a wild crescendo that seems to mirror the defiance of both the band and the crowd in the face of the torrential rain that repeatedly almost brought an end to the concert. Music that makes me feel like that bit; that’s the kind of music I like. Does that help?”

My 1:1 with the Queen

This is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Americans often ask what our relationship is with the Queen. I thought I’d upload my most recent 1:1 so you could see how the regular yearly 1:1 progress chats go.

As a Brit who does software engineering alongside a team from all over the rest of the world… I wish I’d thought of making this video first.

Tour of our old house

We’ve only got a couple of days left before we move to our new house. In order that she and her little brother might better remember our old house, I encouraged our six-year-old to record a video tour.

Also available via:

Sour Grapes… a Murder Mystery in Lockdown

It had been a long while since our last murder mystery party: we’ve only done one or two “kit” ones since we moved into our current house in 2013, and we’re long overdue a homegrown one (who can forget the joy of Murder at the Magic College?), but in the meantime – and until I have the time and energy to write another one of my own – we thought we’d host another.

But how? Courtesy of the COVID-19 crisis and its lockdown, none of our friends could come to visit. Technology to the rescue!

Jen, Dan, Suz, Alec, Matt, JTA and Ruth at Sour Grapes
Not being in the same room doesn’t protect you from finger-pointing.

I took a copy of Michael Akers‘ murder mystery party plan, Sour Grapes of Wrath, and used it as the basis for Sour Grapes, a digitally-enhanced (and generally-tweaked) version of the same story, and recruited Ruth, JTA, Jen, Matt R, Alec and Suz to perform the parts. Given that I’d had to adapt the materials to make them suitable for our use, I had to assign myself a non-suspect part, and so I created a police officer (investigating the murder) whose narration provided a framing device for the scenes.

Sour Grapes clue showing on an iPhone screen.
Actually, the interface didn’t work as well on an iPhone as I’d have expected, but I ran short on testing time.

I threw together a quick Firebase backend to allow data to be synchronised across a web application, then wrote a couple of dozen lines of JavaScript to tie it together. The idea was that I’d “push” documents to each participant’s phone as they needed them, in a digital analogue of the “open envelope #3” or “turn to the next page in your book” mechanism common in most murder mystery kits. I also reimplemented all of Akers’ artefacts, which were pretty-much text-only, as graphics, and set up a system whereby I could give the “finder” of each clue a copy in advance and then share it with the rest of the participants when it was appropriate, e.g. when they said, out loud, “I’ve found this newspaper clipping that seems to say…”
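
To give a flavour of the mechanism (a sketch only: I’m assuming the v8-style Firebase Realtime Database API here, and the database paths, participant names and showDocument function are all invented for illustration):

// Host side: "push" a document into one participant's queue.
firebase.initializeApp(firebaseConfig); // project config omitted
const db = firebase.database();

function sendDocument(participant, doc) {
  db.ref('participants/' + participant + '/documents').push(doc);
}

// e.g. sendDocument('detective', { title: 'Newspaper clipping', image: 'clipping.png' });

// Participant side: render each document as soon as it arrives.
const me = 'detective'; // set per participant
db.ref('participants/' + me + '/documents').on('child_added', (snapshot) => {
  showDocument(snapshot.val()); // hypothetical rendering function
});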

The party itself took place over Discord video chat, with which I’d recently had a good experience in an experimental/offshoot Abnib group (separate from our normal WhatsApp space) and my semi-associated Dungeons & Dragons group. There were a few technical hiccups, but only what you’d expect.

Sour Grapes' command centre: the Host Panel
Meanwhile, I had a web page with all kinds of buttons and things to press.

The party itself rapidly descended into the usual level of chaos. Lots of blame thrown, lots of getting completely off-topic and getting distracted solving the wrong puzzles, lots of discussion about the legitimacy of one of several red herrings, and so on. Michael Akers makes several choices in his writing – such as not revealing the identity of the murderer even to the murderer until the final statements – which I’m not a fan of but retained for the sake of honouring the original text; if I were to run a similar party again I’d adapt this, as I had a few other aspects of the setting and characters. I think it leads to a more fun game if, in the final act, the murderer knows that they committed the crime, that all of the lies they’ve already told are part of their alibi-building, and that they’re given carte blanche to lie as much as they like in an effort to “get away with it” from then on.

Sour Grapes: participants share "hearts" with Ruth
Much love was shown for the “catering”.

Of course, Ruth felt the need to cater for the event – as she’s always done with spectacular effect at every previous murder mystery she’s hosted or we’ve collectively hosted – despite the distributed partygoers. And so she’d arranged for a “care package” of wine and cheese to be sent to each household. The former was, as always, an excellent source of social lubrication among people expected to start roleplaying a random character on short notice; the latter a delightful source of snacking as we all enjoyed the closest thing we’ll get to a “night out” in many months.

This was highly experimental, and there are lessons-for-myself I’d take away from it:

  • If you’re expecting people to use their mobiles, remember to test thoroughly on mobiles. You’d think I’d know this, by now. It’s only, like, my job.
  • When delivering clues and things digitally, keep everything in one place. Switching back and forth between the timeline that supports your alibi and the new information you’ve just learned is immersion-breaking. Better yet, look into ways to deliver physical “feelies” to people for things that don’t need sharing, and consider ways to put shared clues up on everybody’s “big screen”.
  • Find time to write more murder mysteries. They’re much better than kit-style ones; I’ve got a system and it works. I really should get around to writing up how I make them some day; I think there are lessons there for other people who want to make their own, too.
Planning a murder mystery
Those who know me may be surprised to hear that the majority of my work planning an original murder mystery plot, even a highly-digital one like Murder… on the Social Network, happens on paper.

Meanwhile: if you want to see some moments from Sour Grapes, there’s a mini YouTube playlist I might get around to adding to at some point. Here’s a starter if you’re interested in what we got up to (with apologies for the audio echo, which was caused by a problem with the recording software):
