I’m currently doing a course, through work, delivered by BetterOn Video. The aim of the course is to improve
my video presentation skills, in particular my engagement with the camera and the audience.
I made this video based on the week 2 prompt “make a video 60-90 seconds long about something you’re an expert on”. The idea came from a talk I used to give at the University of Oxford.
I made this second video based on the week 2 prompt “make a video 60-90 seconds long about something you’re passionate about”. The idea came from a blog post I
wrote back in 2014.
Back in 2005 I reblogged a Flash-based interactive advert I’d discovered via del.icio.us. And if that sentence wasn’t early-noughties enough for you, buckle up…
At the end of 2004, Unilever brand Axe (Lynx here in the UK)
continued their strategy of marketing their
deodorant as magically transforming young men into hyper-attractive sex gods. This is, of course, an endless battle, pitting increasingly sexually-charged advertisements against the
fundamental experience of their product, which smells distinctly like locker rooms and school discos. To launch 2005’s new fragrance Feather, they teamed up with London-based
design agency Dare Digital to create a game at the domain AxeFeather.com (long since occupied by domain squatters).
In the game, the player’s mouse pointer becomes a feather which they can use to tickle an attractive young woman lying on a bed. The woman’s movements – which vary based on where she’s
tickled – have been captured in digital video. This was aggressively compressed using the then-new H.263-ish
Sorenson Spark codec to make a download just about small enough to be tolerable for people still on dial-up Internet access (which, at the time, was almost as popular as broadband). The ad became a viral hit. I can’t tell you whether it paid for itself in sales, but it
must have paid for itself in brand awareness: on Valentine’s Day 2005 it felt like it was all the Internet wanted to talk about.
I suspect its success also did wonders for the career of its creative consultant Olivier Rabenschlag, who left Dare a few years
later, hopped around Silicon Valley for a bit, then landed himself a job as Head of Creative (now Chief Creative Officer) with Google. Kudos.
Why?
I told you about the site 16 years ago: why am I telling you again? Because this site, which made
headlines at the time, is gone.
And not just a little bit gone, like a television ad no longer broadcast but which might still exist on YouTube somewhere (and here it is – you’re welcome for the earworm). The website went down in 2009, and because it was implemented in Flash the content
was locked away in a compiled, proprietary format, which has ceased to be meaningfully usable on the modern web.
The ad was pioneering. Flash had only recently gained video support (this would be used the following year for the first version of YouTube), and it had so far been used mostly for
non-interactive linear video. This ad was groundbreaking… but now it’s disappeared like so much other Flash work. And for all that Flash might have been bad for the web,
it’s an important part of our
digital history [recommended reading].
So on a whim… I decided to see if I could recreate the ad.
Call it lockdown fever if you like, because it’s certainly not the work of a sane mind to attempt to resurrect a 16-year-old Internet advertisement. But that’s what I did.
How?
My plan: to reverse-engineer the digital assets (video, audio, cursor etc.) out of the original Flash file, and use them to construct a moderately-faithful recreation of the ad,
suitable for use on the modern web. My version must:
Work in any modern browser, without Flash of course.
Indicate how much of the video content you’ve seen, because we live in an era of completionists who want to know they’ve seen it all.
Depend on no third-party frameworks/libraries: just vanilla HTML, CSS, and JavaScript.
Let’s get started.
Reverse-engineering
I grabbed the compiled .swf file from archive.org and ran it through
SWFExtract and an online decompiler: neither was individually able to extract
all of the assets, but together they gave me a full set. I ran the .flv files through Handbrake to get myself a set of
.mp4 files instead.
Seeing that the extracted video files were clearly designed to be carefully-positioned on a static background, and not all in the exact same position, I decided to make my job easier by
combining them all together, and including the background layer (the picture of the bed) as a single video. Integrating the background with the subject meant that I was able to use
video editing software to tweak the position, which I imagined would be much easier than doing so in code. Combining all of the video clips into a single file provides compression
benefits as well as making it easier to encourage a browser to precache the entire video to begin with.
The longest clip was a little over 6 seconds long, so I split my timeline into blocks of 7 seconds, padding each clip with a freeze-frame of its final image to make each exactly 7
seconds long. This meant that calculating the position in the finished video to which I wanted to jump was as simple as multiplying the (0-indexed) clip number by 7 and seeking to that
position. The additional “frozen” frames acted as a safety buffer in case my JavaScript code was delayed by a few milliseconds in jumping to the “next” block.
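In other words, seeking boils down to a single multiplication; something like this sketch (the names are illustrative rather than lifted from my code):

```javascript
// Each clip occupies a fixed seven-second block in the combined video, so
// seeking to a clip is a single multiplication.
const BLOCK_LENGTH = 7; // seconds per block, freeze-frame padding included

function jumpToClip(video, clipNumber) {
  // clipNumber is 0-indexed: clip 0 starts at 0s, clip 1 at 7s, and so on.
  video.currentTime = clipNumber * BLOCK_LENGTH;
}
```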
An additional challenge was that in the original binary, the audio files were stored separately from the video clips… and were slightly longer than them! A little experimentation revealed
that the ends of each clip lined up, presumably something to do with how Flash preloads and synchronises media streams. Luckily for me, the audio clips were numbered such that
they mostly mapped to the order in which the videos appeared.
Once I had a video file suitable for use on the web (you can watch the entire clip here, if you really want to), it was time to
write some code.
Regular old engineering
The theory was simple: web page, video, loop the first seven seconds until you click on it, then animate the cursor (a feather) and jump to another seven-second block before jumping
back or, in some cases, on to a completely new seven-second block. Simple!
Of course, any serious web development is always a little more complex than you first anticipate.
For example: nowadays, putting a video on a web page is as easy as a <video> tag. But, in an effort to prevent background web pages from annoying you with unexpected
audio, modern browsers won’t let a video play sound unless user interaction is the reason that the video starts playing (or unmutes, if it was playing-but-muted to
begin with). Broadly-speaking, that means that a definitive user action like a “click” event has to be in the call stack when your code makes the video play/unmute.
But changing the .currentTime of a video to force it into a loop: that’s fine! So I set the video to autoplay muted on page load, with a script to make it loop
within its first seven-second block. The actress doesn’t make any sound in block 0 (position A) anyway; so I can unmute the video when the user interacts with a hotspot.
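The setup amounts to something like this (a minimal sketch with illustrative names and file paths; the real hotspot wiring is more involved):

```javascript
// The video autoplays muted, which browsers permit:
//   <video autoplay muted playsinline src="axe-feather.mp4"></video>
const video = document.querySelector('video');
const BLOCK_LENGTH = 7; // seconds per block
let currentBlock = 0;   // block 0 is "position A", which is silent anyway

// A genuine click in the call stack satisfies the autoplay/unmute policy,
// so the first interaction is a safe place to unmute:
document.addEventListener('click', () => { video.muted = false; }, { once: true });
```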
For best performance, I used window.requestAnimationFrame to synchronise my non-interactive events (video loops, virtual cursor repositioning). This posed a slight problem
in that animation frames wouldn’t be triggered if the tab was moved to the background: the video would play through each seven-second block and into the next! Fortunately the
visibilitychange event came to the rescue and I was able to pause the video when it wasn’t being actively watched.
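Together, those two pieces look roughly like this (continuing the sketch above):

```javascript
// Keep playback inside the current seven-second block, checking once per
// animation frame:
function checkLoop() {
  if (video.currentTime >= (currentBlock + 1) * BLOCK_LENGTH) {
    video.currentTime = currentBlock * BLOCK_LENGTH;
  }
  window.requestAnimationFrame(checkLoop);
}
window.requestAnimationFrame(checkLoop);

// Animation frames stop firing in background tabs, so pause the video when
// the page is hidden and resume it when it becomes visible again:
document.addEventListener('visibilitychange', () => {
  if (document.hidden) video.pause();
  else video.play();
});
```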
I originally hoped to use the CSS cursor: property to make the “feather” cursor, but there’d be no nice way to
animate it. Comet Cursor may have been able to use animated GIFs
as cursors back in 1997 (when it wasn’t busy selling all your personal information to advertisers, back when that kind of thing used to attract widespread controversy), but modern
browsers don’t… presumably because it would be super annoying. They also don’t all respect cursor: none, so I used the old trick of using cursor: url(null.png),
none (where null.png is an almost-entirely transparent 1×1 pixel image) to hide the original cursor, then position an image dynamically. I
use getBoundingClientRect() to allow the video to resize dynamically in CSS and convert coordinates on it represented
as percentages into actual pixel values and vice-versa: this allows it to react responsively to any screen size without breakpoints or excessive code.
Once I’d gone that far I was able to drop the GIF idea entirely and use a CSS animation for the “tickling” motion.
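Put together, the virtual cursor works roughly like this (a sketch; the element and class names are illustrative inventions, not my actual markup):

```javascript
// CSS hides the real pointer over the play area, e.g.:
//   .stage { position: relative; cursor: url(null.png), none; }
// ...and an absolutely-positioned feather image follows the mouse instead.
const stage = document.querySelector('.stage');     // wrapper around the <video>
const feather = document.querySelector('.feather'); // the feather image

stage.addEventListener('mousemove', (event) => {
  const rect = stage.getBoundingClientRect();
  // Convert pixel coordinates into percentages of the video's current size,
  // so the feather (and the hotspots) survive any resize without breakpoints:
  const x = ((event.clientX - rect.left) / rect.width) * 100;
  const y = ((event.clientY - rect.top) / rect.height) * 100;
  feather.style.left = `${x}%`;
  feather.style.top = `${y}%`;
});
```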
I added a transparent <canvas> element on top of the <video> on which the hit areas are dynamically drawn to help me test the “hotspots” and tweak
their position. I briefly considered implementing a visual tool to help me draw the hotspots, but figured it wasn’t quite worth the time it would take.
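The overlay itself is nothing fancy; something like this (the hotspot format shown is illustrative):

```javascript
// Draw each hotspot's hit area onto the transparent <canvas> stacked over
// the video, purely as a testing aid. Hotspots are stored as percentages,
// so they're scaled to the canvas's current pixel dimensions:
const canvas = document.querySelector('canvas');
const ctx = canvas.getContext('2d');

function drawHotspots(hotspots) {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.strokeStyle = 'red';
  for (const spot of hotspots) {
    ctx.strokeRect(
      (spot.x / 100) * canvas.width,
      (spot.y / 100) * canvas.height,
      (spot.w / 100) * canvas.width,
      (spot.h / 100) * canvas.height
    );
  }
}
```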
As I implemented more and more of the game, I remembered one feature from the original that I’d missed: the “blowaway”. If you trigger block 31 – a result of tickling the woman’s nose –
she’ll blow your cursor off the screen. It’s particularly fun because it subverts the player’s expectations of their user interface: once you’ve got past the surprise of your
cursor being a feather, you quickly settle into it moving like a regular cursor… but then control’s stolen from you and the cursor vanishes! (Well, I thought it was cool… 16 years ago.)
I’ve been part of the team working on the new application framework called the Endpoint Encabulator and wanted to share with you what I think makes our project so
exciting: I promise it’ll make for two minutes of your time you won’t soon forget!
Naturally, this project wouldn’t have been possible without the pioneering work that preceded it by John Hellins Quick, Bud Haggart, and others. Nothing’s invented in a vacuum. However,
my fellow developers and I think that our work is the first viable encabulator implementation to provide inverse reactive data binding suitable for deployment in front of a
blockchain-driven backend cache. I’m not saying that all digital content will one day be delivered through Endpoint Encabulator, but… well; maybe it will.
If the technical aspects go over your head, pass it on to a geeky friend who might be able to make use of my work. Sharing is caring!
Watched the pilot of Webbed Briefs by @heydonworks (of Every Layout fame). It’s a sarcastic independent vlog
about web technologies, so I immediately fell in love and subscribed to the feed…
Oh my god I’m so excited. I’m afraid they might fuck up the story even more than David Lynch did in 1984 (not that I don’t
love that film, too, but in a very different way than the books). I mean: I’d have hoped a modern adaptation would have a bigger part for Chani than it clearly does. And I know nothing at all about the lead,
Timothée Chalamet. If only there was something I could do about these fears?
I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it
has gone past I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.
Yeah, that’s the kind of thing.
The supporting cast look excellent. I think Josh Brolin will make an awesome Gurney Halleck, Jason Momoa will rock Duncan Idaho, and I’m looking forward to seeing Stephen
McKinley Henderson play Thufir Hawat. But if there’s just one thing you should watch the trailer for… it’s to listen to fragments of Hans Zimmer’s haunting, simplistic choral
adaptation of Pink Floyd’s Eclipse.
We’re discussing the possibility of a Subdivision geohash achievement for people who’ve reached every “X in a
Y”, and Fippe pointed out that I’m only a hash in the Vale of White Horse from being able to claim such an
achievement for Oxfordshire’s regions. And then this hashpoint appears right in the Vale of White Horse: it’s like it’s an omen!
Technically it’s a workday so this might have to be a lunchtime expedition, but I think that might be workable. I’ve got an electric vehicle with a hundred-and-something miles’
worth of batteries in the tank and it looks like there might be a lay-by near the hashpoint (with a geocache in it!): I can drive down there at lunchtime, walk carefully back up the main road, and try to
get to the hashpoint!
Expedition
I worked hard to clear an hour of my day to take a trip, then jumped in my (new) electric car and set off towards the hashpoint. As I passed Newbridge I briefly considered stopping and
checking up on my geocache there but feeling pressed for time I decided to push on. I parked in the lay-by where
GC5XHJG is apparently hidden but couldn’t find it: I didn’t search for long because the farmer in the adjacent field
was watching me with suspicion, and I figured I could always hunt for it on the way back.
Walking along the A338 was treacherous! There are no paths, only a verge covered in thick grass and spiky plants, and a significant number of the larger vehicles (and virtually all of
the motorbikes) didn’t seem to be obeying the 60mph speed limit!
Reaching the gate, I crawled under (reckoning that it’s probably there to stop vehicles and not humans) and wandered along the lane. I saw a red kite and a heron doing their thing
before I reached the bridge, crossed Letcombe Brook, and followed the edge of the field. Stuffing my face with blackberries as I went, it wasn’t long before I reached the hashpoint on
one edge of the field.
I took a short-cut back before realising that this would put me in the wrong place to leave a The Internet Was Here sign, so I doubled-back to place it on the gate I’d crawled under.
Then I returned to the lay-by, where another car had just pulled up (right over the GZ of the geocache I’d hoped to find!) and didn’t seem to be going anywhere. Sadly I couldn’t wait
around all day – I had work to do! – so I went home, following the car’s satnav on a route that resulted in a figure-of-eight tracklog.
This is incredibly cool. Using (mostly) common household tools and chemicals and a significant amount of effort, Ben (who already built himself a home electron microscope, as you do) demonstrates how you can etch a hologram directly into chocolate, resulting in a
completely edible hologram. I’d never even thought before about the fact that a hologram could be embossed into almost any opaque surface, so this blew my mind. In
hindsight it makes perfect sense, but it still looks like magic to see it done.
You know that strange moment when you see your old coworkers on YouTube doing a cover of an Adam and the Ants song? No: just me?
Still good to see the Bodleian put a fun spin on promoting their lockdown-friendly reader services. For some reason they’ve marked this video “not embeddable” (?) in their YouTube
settings, so I’ve “fixed” the copy above for you.
Just another vlog update from comedian Bec Hill. Oh no, wait… this website is now T-Shirt Famous! (for a very loose definition of
“famous”, I guess.) For a closer look, see Instagram.
This isn’t the silliest way I’ve put my web address on something, of course. A little over 17 years ago there was the time I wrote my web address along the central reservation of a road
in West Wales using sugar cubes, for example. But it’s certainly the silliest recent way.
Anyway: this t-shirt ain’t the Million Dollar Homepage. It’s much cooler than that. Plus the money’s all going to Water Aid.
(If you haven’t claimed a square yourself, you still can!)
One of the last “normal” things I got to do before the world went full lockdown was to attend a Goo Goo Dolls concert with Ruth, and so to see
two musicians I enjoy team up to perform a song and share some words of hope and encouragement for a better future beyond these troubled times… feels fitting and inspiring.
Also awesome to see that Stirling’s perhaps as much a fan of Live in Buffalo as I am.
Fun diversion: I never know how to answer the question “what kind of music do you like?”, because I increasingly (and somewhat deliberately) find that I enjoy a wider and wider
diversity of genres and styles. But perhaps the right answer might be: “I like music that makes me feel the way I feel when I hear Cuz You’re Gone recorded from the
Goo Goo Dolls’ concert in Buffalo on 4 July 2004, specifically the bit between 4 minutes 10 seconds and 4 minutes 33 seconds into the song, right at the end of the extended bridge. It’s
full of anticipatory energy and building to a wild crescendo that seems to mirror the defiance of both the band and the crowd in the face of the torrential rain that repeatedly almost
brought an end to the concert. Music that makes me feel like that bit; that’s the kind of music I like. Does that help?”
Americans often ask what our relationship is with the Queen. I thought I’d upload my most recent 1:1 so you could see how the regular yearly 1:1 progress chats go.
As a Brit who does software engineering alongside a team from all over the rest of the world… I wish I’d thought of making this video first.
We’ve only got a couple of days left before we move to our new house. In order that she and her little brother might better remember our
old house, I encouraged our 6 year-old to record a video tour.
It had been a long while since our last murder mystery party: we’ve only done one or two “kit” ones since we moved in to our current
house in 2013, and we’re long-overdue a homegrown one (who can forget the joy of Murder at the Magic College?), but in
the meantime – and until I have the time and energy to write another one of my own – we thought we’d host another.
But how? Courtesy of the COVID-19 crisis and its lockdown, none of our friends could come to visit. Technology to the rescue!
I took a copy of Michael Akers’ murder mystery party plan, Sour Grapes of Wrath, and used it as the basis for Sour Grapes, a digitally-enhanced (and generally-tweaked) version of the same story, and recruited Ruth, JTA, Jen, Matt R, Alec and Suz to perform the parts. Given that I’d had to adapt the materials
to make them suitable for our use, I had to assign myself a non-suspect part, and so I created a police officer (investigating the murder) whose narration provided a framing device for the
scenes.
I threw together a quick Firebase backend to allow data to be synchronised across a web application, then wrote a couple of dozen lines of
JavaScript to tie it together. The idea was that I’d “push” documents to each participant’s phone as they needed them, in a digital analogue of the “open envelope #3” or “turn to the
next page in your book” mechanism common in most murder mystery kits. I also reimplemented all of Akers’ artefacts, which were pretty-much text-only, as graphics, and set up a system
whereby I could give the “finder” of each clue a copy in advance and then share it with the rest of the participants when it was appropriate, e.g. when they said, out loud, “I’ve found
this newspaper clipping that seems to say…”
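The synchronisation really was only a couple of dozen lines; something like this sketch against the (then-current) namespaced Firebase Realtime Database API, with illustrative paths and names throughout:

```javascript
// Assumes the Firebase SDK is loaded and firebase.initializeApp(config)
// has already been called.
const db = firebase.database();

// Host side: "push" a document to a particular participant's phone.
function pushDocument(playerId, documentId) {
  db.ref(`players/${playerId}/currentDocument`).set(documentId);
}

// Participant side: re-render whenever a new document arrives.
db.ref(`players/${myPlayerId}/currentDocument`).on('value', (snapshot) => {
  showDocument(snapshot.val()); // e.g. swap in the relevant clue image
});
```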
The party itself took place over Discord video chat, with which I’d recently had a good experience in an experimental/offshoot Abnib group (separate from our normal WhatsApp space) and
my semi-associated Dungeons & Dragons group. There were a few technical hiccups, but only what you’d expect.
The party itself rapidly descended into the usual level of chaos. Lots of blame thrown, lots of getting completely off-topic and getting distracted solving the wrong puzzles, lots of
discussion about the legitimacy of one of several red herrings, and so on. Michael Akers makes several choices in his writing – such as not revealing the
identity of the murderer even to the murderer until the final statements – which I’m not a fan of but retained for the sake of honouring the original text; if I were to run
a similar party again I’d adapt this, as I had already adapted a few other aspects of the setting and characters. I think it leads to a more fun game if, in the final act, the murderer knows that they
committed the crime, that all of the lies they’ve already told are part of their alibi-building, and they’re given carte blanche to lie as much as they like in an effort to
“get away with it” from then on.
Of course, Ruth felt the need to cater for the event – as she’s always done with spectacular effect at every previous murder mystery she’s hosted or we’ve collectively hosted – despite
the distributed partygoers. And so she’d arranged for a “care package” of wine and cheese to be sent to each household. The former was, as always, an excellent source of social
lubrication among people expected to start roleplaying a random character on short notice; the latter a delightful source of snacking as we all enjoyed the closest thing we’ll get to a
“night out” in many months.
This was highly experimental, and there are lessons-for-myself I’d take away from it:
If you’re expecting people to use their mobiles, remember to test thoroughly on mobiles. You’d think I’d know this, by now. It’s only, like, my job.
When delivering clues and things digitally, keep everything in one place. Switching back and forth between the timeline that supports your alibi and the new
information you’ve just learned is immersion-breaking. Better yet, look into ways to deliver physical “feelies” to people if it’s things that don’t need sharing, and consider ways to
put shared clues up on everybody’s “big screen”.
Find time to write more murder mysteries. They’re much better than kit-style ones; I’ve got a system and it works. I really should get around to writing up
how I make them, some day; I think there are lessons there for other people who want to make their own, too.
Meanwhile: if you want to see some moments from Sour Grapes, there’s a mini YouTube
playlist I might get around to adding to at some point. Here’s a starter if you’re interested in what we got up to (with apologies for the audio echo, which was caused by a problem
with the recording software):