How to beat Skyrim without walking

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

I don’t normally watch videos of other people playing video games. I’m even less inclined to watch “walkthroughs”.

This, though, isn’t a walkthrough. It’s basically the opposite of a walkthrough: this is somebody (slowly, painstakingly) playing through Skyrim: Special Edition without using any of the movement controls (WASD/left stick) whatsoever. Wait, what? How is such a thing possible?

That’s what makes the video so compelling. The creator used so many bizarre quirks and exploits to even make this crazy stupid idea work at all. Like (among many, many more):

  • Dragging a bucket towards yourself to “push” yourself backwards (although not upstairs unless you do some very careful pushing “under” your feet).
  • Doing an unarmed heavy attack to “stumble” forward a little at a time, avoiding the stamina loss by eating vegetable soup or by cancelling the attack (e.g. by switching quickselected arrows), which apparently works better if you’re overencumbered.
  • Mid-stumble, consuming a reagent that paralyses you to glitch through thin doors, and exploiting a bug in dropping gear for your companion near an area-change doorway to get all of the reagent you’ll ever need.
  • Rush-grinding your way to the Whirlwind Sprint shout and Vampire Lord “Bats” ability so you’ve got a way to move forward quickly, then pairing them with paralysis to catapult yourself across the map.
  • When things get desperate, exploiting the fact that you can glitch-teleport yourself places by commanding your companion to go somewhere, quicksaving before they get there, then quickloading to appear there yourself.

This video’s just beautiful: the culmination of what must be hundreds or thousands of person-hours of probing the “edges” of Skyrim’s engine to discover all of the potentially exploitable bugs that make it possible.

Polyam Lingo

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Dr. Doe’s latest Sexplanations vlog is on polyamorous language, and despite being – or, perhaps, because I’m – a bit of a long-toothed polyamorist these days, fully a quarter or more of the terms she introduced were new to me! Fascinating!

I Will Never Stop Learning

I’ve been doing a course provided through work to try to improve my ability to connect with an audience over video.

This is my fourth week in the course, and I opted to revisit a video I made during my second week and try to do it again with more engagement, more focus, more punch, and more emotion. I’m pretty pleased with how it turned out. Interestingly, it somewhat mirrors my Howdymattic video from when I first started at Automattic, but I pivoted my “origin story” a little bit and twisted it to fit one of my favourite parts of the Automattic Creed.

Shot during the same outing as the Devil’s Quoits one. Also available on YouTube.

The Devil’s Quoits

I’ve been doing a course provided through work to try to improve my ability to connect with an audience over video. For one of my assignments in this, my fourth week, I picked a topic out from the “welcome” survey I filled out when I first started the course. The topic: the Devil’s Quoits. This stone circle – not far from my new house – has such a bizarre history of construction, demolition, and reconstruction… as well as a fun folk myth about its creation… that I’d thought it’d make a great follow-up to my previous “local history” piece, Oxford’s Long-Lost Zoo. I’d already hidden a “virtual” geocache at the henge, as I previously did for the zoo: a video seemed like the next logical step.

My brief required that the video be only about a minute long, which presented its own challenge in cutting down the story I’d like to tell to a bare minimum. Then on top of that, it took me at least eight takes until I was confident that I’d have one I was happy with, and there are still things I’d do differently if I did it again (including a better windbreak on my lapel mic, and timing my takes for when geese weren’t honking their way past overhead!).

In any case: part of the ritual of this particular course encourages you to “make videos… as if people will see them”, and I’ve been taking that seriously! Firstly, I’ve been sharing many of my videos with others either at work or on my blog, like the one about how GPS works or the one about the secret of magic. Secondly, I’ve been doing “extra credit” by recording many of my daily-standup messages as videos, in addition to providing them through our usual Slack bot.

Anyway, the short of it is: you’re among the folks who get to see this one. Also available on YouTube.

Tribute to Peter Huntley

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

While I was traipsing off around the countryside to commemorate the anniversary of the death of my dad, one of his former colleagues uploaded to YouTube a video that he originally produced for the UK Bus Awards Presentation Ceremony 2012.

As his son, it felt a little weird for me to be marking the occasion on what: the ninth anniversary of his death? It’s not even a nice round number. But clearly I’m not the only one whose mind drifted to my father on 19 February.

Fun fact: this photo – extracted from the video – was originally taken by me:

Peter Huntley, circa 1985, in a bus depot.
My dad’s crouching to make sure he’s in the frame, because I was less than half his height at this point. The horizon is wonky because I’m crouching too, in order to imitate him, and I’ve lost my balance. Altogether, I rate this piece of photographic art… umm: not bad for a preschooler?

Maybe I should be asking for royalties! Or at least, using the video as an excuse to springboard my career as a professional photographer.

YouTube ID badge showing that Chris Cheek has only one subscriber.
What kind of exposure could I get? Oh.

Well, maybe not then.


The Coolest Thing About GPS

I’m currently doing a course, through work, delivered by BetterOn Video. The aim of the course is to improve my video presentation skills, in particular my engagement with the camera and the audience.

I made this video based on the week 2 prompt “make a video 60-90 seconds long about something you’re an expert on”. The idea came from a talk I used to give at the University of Oxford.

The Secret of Magic

I’m currently doing a course, through work, delivered by BetterOn Video. The aim of the course is to improve my video presentation skills, in particular my engagement with the camera and the audience.

I made this video based on the week 2 prompt “make a video 60-90 seconds long about something you’re passionate about”. The idea came from a blog post I wrote back in 2014.

Axe Feather 2021

tl;dr?

I recreated a 16-year old interactive ad. Experience it here. Get the source code here. Or keep reading for the full story.

What?

Back in 2005 I reblogged a Flash-based interactive advert I’d discovered via del.icio.us. And if that sentence wasn’t early-naughties enough for you, buckle up…

A woman lies on a bed with her legs crossed, playfully wagging her finger.
This screenshot isn’t from the original site but from my homage to it. More on that later.

At the end of 2004, Unilever brand Axe (Lynx here in the UK) continued their strategy of marketing their deodorant as magically transforming young men into hyper-attractive sex gods. This is, of course, an endless battle, pitting increasingly sexually-charged advertisements against the fundamental experience of their product, which smells distinctly like locker rooms and school discos. To launch 2005’s new fragrance Feather, they teamed up with London-based design agency Dare Digital to create a game at domain AxeFeather.com (long since occupied by domain squatters).

In the game, the player’s mouse pointer becomes a feather which they can use to tickle an attractive young woman lying on a bed. The woman’s movements – which vary based on where she’s tickled – have been captured in digital video. This was aggressively compressed using the then-new H.263-ish Sorensen Spark codec to make a download just-about small enough to be tolerable for people still on dial-up Internet access (which was still almost as popular as broadband). The ad became a viral hit. I can’t tell you whether it paid for itself in sales, but it must have paid for itself in brand awareness: on Valentine’s Day 2005 it felt like it was all the Internet wanted to talk about.

Axe Feather logo visible via Archive.org, circa August 2005, in a Firefox browser window.
The site was archived by the WayBack Machine… but it doesn’t work in a modern browser.

I suspect its success also did wonders for the career of its creative consultant Olivier Rabenschlag, who left Dare a few years later, hopped around Silicon Valley for a bit, then landed himself a job as Head of Creative (now Chief Creative Officer) with Google. Kudos.

Why?

I told you about the site 16 years ago: why am I telling you again? Because this site, which made headlines at the time, is gone.

And not just a little bit gone, like a television ad no longer broadcast but which might still exist on YouTube somewhere (and here it is – you’re welcome for the earworm). The website went down in 2009, and because it was implemented in Flash the content was locked away in a compiled, proprietary format, which has ceased to be meaningfully usable on the modern web.

IE-specific CSS with a comment "Ok, so the scrollbar is IE specific...but I like it, ok?? :)"
The parts of AxeFeather.com’s code that are openly readable don’t help much, but I love this comment, which carries the scent of the adolescent web in the same way as Lynx deodorant carries the scent of an adolescent human.

The ad was pioneering. Flash had only recently gained video support (this would be used the following year for the first version of YouTube), and it had so far been used mostly for non-interactive linear video. This ad was groundbreaking… but now it’s disappeared like so much other Flash work. And for all that Flash might have been bad for the web, it’s an important part of our digital history [recommended reading].

Ruffle window showing an empty bed.
Third-party Flash emulation is imperfect. I tried to make Axe Feather work in Ruffle and got… an empty bed? What is this, a metaphor for being a lonely nerd?

So on a whim… I decided to see if I could recreate the ad.

Call it lockdown fever if you like, because it’s certainly not the work of a sane mind to attempt to resurrect a 16-year-old Internet advertisement. But that’s what I did.

How?

My plan: to reverse-engineer the digital assets (video, audio, cursor etc.) out of the original Flash file, and use them to construct a moderately-faithful recreation of the ad, suitable for use on the modern web. My version must:

  • Work in any modern browser, without Flash of course.
  • Work on mobile devices/with touchscreens, with all of the original functionality available without a keyboard (the original had secret content hidden behind keyboard keypresses). Nowadays, Rabenschlag knows to design mobile-first, but I think we can forgive him for not doing that twelve months before Flash Lite 2.0 would bring .flv support to mobile devices…
  • Indicate how much of the video content you’ve seen, because we live in an era of completionists who want to know they’ve seen it all.
  • Depend on no third-party frameworks/libraries: just vanilla HTML, CSS, and JavaScript.

Let’s get started.

Reverse-engineering

Handbrake converting 19.flv to MP4 format.
At this point I noticed that the videos had no audio tracks: the giggling and other sound effects must be stored separately.

I grabbed the compiled .swf file from archive.org and ran it through SWFExtract and an online decompiler: neither was individually able to extract all of the assets, but together they gave me a full set. I ran the .flv files through Handbrake to get myself a set of .mp4 files instead.

Two starting frames from the videos, annotated to show that they are not aligned to the same point.
In what appears to have been an exercise in size optimisation, the original authors cropped the videos differently depending on how much space was needed (e.g. if the subject stretched her arms above her head, more space would be required). Clearly, some re-alignment would be needed.

Seeing that the extracted video files were clearly designed to be carefully-positioned on a static background, and not all in the exact same position, I decided to make my job easier by combining them all together, and including the background layer (the picture of the bed) as a single video. Integrating the background with the subject meant that I was able to use video editing software to tweak the position, which I imagined would be much easier than doing so in code. Combining all of the video clips into a single file provides compression benefits as well as making it easier to encourage a browser to precache the entire video to begin with.

Four layer design. From bottom to top: web page, video (showing woman on bed), (transparent) canvas, cursor (shaped like a feather).
My design called for three “layers” above my web page: the video, a transparent (and usually hidden) canvas showing the hit areas for debugging purposes, and the feather-shaped cursor.

The longest clip was a little over 6 seconds long, so I split my timeline into blocks of 7 seconds, padding each clip with a freeze-frame of its final image to make each exactly 7 seconds long. This meant that calculating the position in the finished video to which I wanted to jump was as simple as multiplying the (0-indexed) clip number by 7 and seeking to that position. The additional “frozen” frames acted as a safety buffer in case my JavaScript code was delayed by a few milliseconds in jumping to the “next” block.
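In code, that arithmetic is about as simple as it sounds. Here’s a rough sketch of the idea (the constant and function names are mine for illustration, not lifted from the actual source):

```javascript
const BLOCK_LENGTH = 7; // each padded clip is exactly seven seconds long

// Jump the <video> to the start of a given (0-indexed) clip.
function seekToClip(video, clipNumber) {
  video.currentTime = clipNumber * BLOCK_LENGTH;
}

// Which block is the playhead currently inside?
function currentClip(video) {
  return Math.floor(video.currentTime / BLOCK_LENGTH);
}

// Has playback reached the end of this block? The padded freeze-frames mean
// that even if this check fires a few milliseconds late, the viewer only ever
// sees the clip's final frame, never the opening frames of the next clip.
function hasFinishedBlock(video, clipNumber) {
  return video.currentTime >= (clipNumber + 1) * BLOCK_LENGTH - 0.05;
}
```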

Davinci Resolve showing composition of the actress onto the bed in a timeline.
I used onion-skinning to help “line up” the actress with herself as I composited her onto the bed in a single unified video of 7-second blocks.

An additional challenge was that in the original binary, the audio files were stored separately from the video clips… and slightly longer than them! A little experimentation revealed that the ends of each clip lined up, presumably something to do with how Flash preloads and synchronises media streams. Luckily for me, the audio clips were numbered such that they mostly mapped to the order in which the videos appeared.

Once I had a video file suitable for use on the web (you can watch the entire clip here, if you really want to), it was time to write some code.

Video timeline showing that each 7-second block is comprised of the original clip plus padding, atop a background layer of the bed and each clip's associated audio.
It feels slightly wasteful that over 50% of the resulting video clip is a freeze-frame, but modern video compression algorithms like H.264 reduce the impact considerably and the resulting video file is about the same size as its more-optimised predecessor.

Regular old engineering

The theory was simple: web page, video, loop the first seven seconds until you click on it, then animate the cursor (a feather) and jump to another seven-second block before jumping back or, in some cases, on to a completely new seven-second block. Simple!

Of course, any serious web development is always a little more complex than you first anticipate.

Game map illustrating transition between the states of Axe Feather 2021.
I extracted from the .swf 34 distinct animated clips, which I numbered 0 through 33. 6 and 30 appeared to be duplicates of others. 0 and 33 are the two “idling” states from which interaction can lead to other states. Note that my interpretation of the order and relationship of animation sequences differs from the original.

For example: nowadays, putting a video on a web page is as easy as a <video> tag. But, in an effort to prevent background web pages from annoying you with unexpected audio, modern browsers won’t let a video play sound unless user interaction is the reason that the video starts playing (or unmutes, if it was playing-but-muted to begin with). Broadly-speaking, that means that a definitive user action like a “click” event has to be in the call stack when your code makes the video play/unmute.

But changing the .currentTime of a video to force it into a loop: that’s fine! So I set the video to autoplay muted on page load, with a script to make it loop within its first seven-second block. The actress doesn’t make any sound in block 0 (position A) anyway, so I can unmute the video when the user interacts with a hotspot.
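A minimal sketch of that setup, assuming a single &lt;video&gt; element on the page (the pointerdown wiring here is illustrative; the real thing works out which hotspot was hit before unmuting and jumping):

```javascript
const video = document.querySelector('video');

// Muted autoplay is permitted by modern autoplay policies, so the silent
// idle loop can start as soon as the page loads.
video.muted = true;
video.play();

// Unmuting only works while a genuine user gesture is in the call stack,
// so it has to happen inside the interaction handler itself.
document.addEventListener('pointerdown', () => {
  video.muted = false;
  // ...then decide which hotspot (if any) was hit and jump to its block.
});
```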

For best performance, I used window.requestAnimationFrame to synchronise my non-interactive events (video loops, virtual cursor repositioning). This posed a slight problem in that animation frames wouldn’t be triggered if the tab was moved to the background: the video would play through each seven-second block and into the next! Fortunately the visibilitychange event came to the rescue and I was able to pause the video when it wasn’t being actively watched.
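Put together, the skeleton looks roughly like this (a simplified sketch under my own naming: the real implementation juggles two idle blocks and many more state transitions):

```javascript
const video = document.querySelector('video');
const IDLE_BLOCK = 0;          // the silent "idle" loop
let currentBlock = IDLE_BLOCK; // whichever 7-second block is playing now

function frame() {
  // If playback has run into the padded freeze-frame at the end of the
  // current block, fall back to looping the idle block.
  if (video.currentTime >= (currentBlock + 1) * 7 - 0.05) {
    currentBlock = IDLE_BLOCK;
    video.currentTime = IDLE_BLOCK * 7;
  }
  // ...the virtual feather cursor gets repositioned here too...
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);

// requestAnimationFrame stops firing in background tabs, which would let the
// video play straight on into the next block, so pause it while hidden.
document.addEventListener('visibilitychange', () => {
  if (document.hidden) {
    video.pause();
  } else {
    video.play();
  }
});
```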

I originally hoped to use the cursor: CSS directive to make the “feather” cursor, but there’d be no nice way to animate it. Comet Cursor may have been able to use animated GIFs as cursors back in 1997 (when it wasn’t busy selling all your personal information to advertisers, back when that kind of thing used to attract widespread controversy), but modern browsers don’t… presumably because it would be super annoying. They also don’t all respect cursor: none, so I used the old trick of using cursor: url(null.png), none (where null.png is an almost-entirely transparent 1×1 pixel image) to hide the original cursor, then position an image dynamically. I use getBoundingClientRect() to allow the video to resize dynamically in CSS and convert coordinates on it, represented as percentages, into actual pixel values and vice-versa: this allows it to react responsively to any screen size without breakpoints or excessive code.

Once I’d gone that far I was able to drop the GIF idea entirely and use a CSS animation for the “tickling” motion.
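Roughly speaking, the cursor trickery boils down to something like this sketch (the #feather element and the helper names are my own illustration; only the null.png trick comes from the original):

```javascript
// #feather is assumed to be a position: fixed <img> of a feather.
const video = document.querySelector('video');
const feather = document.querySelector('#feather');

// Hide the native pointer over the stage; the url() fallback catches
// browsers that don't respect "none" on its own.
video.style.cursor = 'url(null.png), none';

// Convert page-pixel coordinates to percentages of the video's rendered
// size (and back), so hotspot positions survive responsive resizing.
function toPercent(pixelX, pixelY) {
  const rect = video.getBoundingClientRect();
  return {
    x: ((pixelX - rect.left) / rect.width) * 100,
    y: ((pixelY - rect.top) / rect.height) * 100,
  };
}

function toPixels(percentX, percentY) {
  const rect = video.getBoundingClientRect();
  return {
    x: rect.left + (percentX / 100) * rect.width,
    y: rect.top + (percentY / 100) * rect.height,
  };
}

// The feather image simply follows the (hidden) real pointer around.
document.addEventListener('mousemove', (event) => {
  feather.style.transform = `translate(${event.clientX}px, ${event.clientY}px)`;
});
```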

Woman on bed in idle position B, with hotspots highlighted on each arm, her head, her chest, her stomach, her hips, the top of her legs, and the bottom of the leg that's extended straight below her.
The hotspot overlay was added as a debugging feature but I left it in the final version. Hold the space bar to highlight hit areas.

I added a transparent <canvas> element on top of the <video> on which the hit areas are dynamically drawn to help me test the “hotspots” and tweak their position. I briefly considered implementing a visual tool to help me draw the hotspots, but figured it wasn’t quite worth the time it would take.
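A simplified version of that debugging overlay might look like this (the hotspot names and coordinates below are made-up placeholders, not the real ones):

```javascript
const video = document.querySelector('video');
const canvas = document.querySelector('#hotspot-debug'); // transparent overlay above the video
const ctx = canvas.getContext('2d');

// Hotspots stored as percentages of the video so they track its rendered size.
const hotspots = [
  { name: 'head',    x: 44, y:  8, w: 14, h: 16 },
  { name: 'stomach', x: 42, y: 46, w: 18, h: 14 },
];

function drawHotspots() {
  const rect = video.getBoundingClientRect();
  canvas.width = rect.width;
  canvas.height = rect.height;
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = 'rgba(255, 0, 0, 0.3)';
  for (const spot of hotspots) {
    ctx.fillRect(
      (spot.x / 100) * rect.width,
      (spot.y / 100) * rect.height,
      (spot.w / 100) * rect.width,
      (spot.h / 100) * rect.height
    );
  }
}

// Only show the overlay while the space bar is held down.
document.addEventListener('keydown', (e) => {
  if (e.code === 'Space') drawHotspots();
});
document.addEventListener('keyup', (e) => {
  if (e.code === 'Space') ctx.clearRect(0, 0, canvas.width, canvas.height);
});
```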

As I implemented more and more of the game, I remembered one feature from the original that I’d missed: the “blowaway”. If you trigger block 31 – a result of tickling the woman’s nose – she’ll blow your cursor off the screen. It’s particularly fun because it subverts the player’s expectations of their user interface: once you’ve got past the surprise of your cursor being a feather, you quickly settle in to it moving like a regular cursor… but then control’s stolen from you and the cursor vanishes! (Well I thought it was cool… 16 years ago.)

A woman blows a feather away from her face.
Sometimes tickling her nose will make her blow your feather off the screen. That’ll show you.

So yeah: that was my project this weekend.

I can’t even begin to explain why anybody would do this. But I did it. If you haven’t already: go have a play. And if you’re interested in how it works, the source code’s free for you to explore.


Recursively repackaging .mkvs as .mp4s

A note for my own reference. If you want to repackage a lot of .mkv files as .mp4, without transcoding, here’s a one-liner:

for f in **/*.mkv; do echo "$f"; ffmpeg -y -hide_banner -loglevel panic -i "$f" -c copy "${f%.mkv}.mp4"; rm "$f"; done

Endpoint Encabulator

(This video is also available on YouTube.)

I’ve been part of the team working on the new application framework called the Endpoint Encabulator and wanted to share with you what I think makes our project so exciting: I promise it’ll make for two minutes of your time you won’t soon forget!

Naturally, this project wouldn’t have been possible without the pioneering work that preceded it by John Hellins Quick, Bud Haggart, and others. Nothing’s invented in a vacuum. However, my fellow developers and I think that our work is the first viable encabulator implementation to provide inverse reactive data binding suitable for deployment in front of a blockchain-driven backend cache. I’m not saying that all digital content will one day be delivered through Endpoint Encabulator, but… well; maybe it will.

If the technical aspects go over your head, pass it on to a geeky friend who might be able to make use of my work. Sharing is caring!

Note #17871

Watched the pilot of Webbed Briefs by @heydonworks (of Every Layout fame). It’s a sarcastic independent vlog about web technologies, so I immediately fell in love and subscribed to the feed…

Just kidding. It doesn’t have a feed! (Yet?)

Webbed Briefs logo

Dune (2020)

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Oh my god I’m so excited. I’m afraid they might fuck up the story even more than David Lynch did in 1984 (not that I don’t love that film, too, but in a very different way than the books). I mean: I’d have hoped a modern adaptation would have a bigger part for Chani than it clearly does. And I know nothing at all about the lead, Timothée Chalamet. If only there was something I could do about these fears?

I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain.

Yeah, that’s the kind of thing.

The supporting cast look excellent. I think Josh Brolin will make an awesome Gurney Halleck, Jason Momoa will rock Duncan Idaho, and I’m looking forward to seeing Stephen McKinley Henderson play Thufir Hawat. But if there’s just one thing you should watch the trailer for… it’s to listen to fragments of Hans Zimmer’s haunting, simplistic choral adaptation of Pink Floyd’s Eclipse.

Geohashing expedition 2020-09-09 51 -1

This checkin to geohash 2020-09-09 51 -1 reflects a geohashing expedition. See more of Dan's hash logs.

Location

Edge of a field bounded by Letcombe Brook, over the A338 from Landmead Solar Farm.

Participants

Plans

We’re discussing the possibility of a Subdivision geohash achievement for people who’ve reached every “X in a Y”, and Fippe pointed out that I’m only a hash in the Vale of White Horse from being able to claim such an achievement for Oxfordshire’s regions. And then this hashpoint appears right in the Vale of White Horse: it’s like it’s an omen!

Technically it’s a workday so this might have to be a lunchtime expedition, but I think that might be workable. I’ve got an electric vehicle with a hundred-and-something miles worth of batteries in the tank and it looks like there might be a lay-by nearby the hashpoint (with a geocache in it!): I can drive down there at lunchtime, walk carefully back up the main road, and try to get to the hashpoint!

Expedition

I worked hard to clear an hour of my day to take a trip, then jumped in my (new) electric car and set off towards the hashpoint. As I passed Newbridge I briefly considered stopping and checking up on my geocache there but feeling pressed for time I decided to push on. I parked in the lay-by where GC5XHJG is apparently hidden but couldn’t find it: I didn’t search for long because the farmer in the adjacent field was watching me with suspicion and I figured that anyway I could hunt for it on the way back.

Walking along the A338 was treacherous! There are no paths, only a verge covered in thick grass and spiky plants, and a significant number of the larger vehicles (and virtually all of the motorbikes) didn’t seem to be obeying the 60mph speed limit!

Reaching the gate, I crawled under (reckoning that it’s probably there to stop vehicles and not humans) and wandered along the lane. I saw a red kite and a heron doing their thing before I reached the bridge, crossed Letcombe Brook, and followed the edge of the field. Stuffing my face with blackberries as I went, it wasn’t long before I reached the hashpoint on one edge of the field.

I took a short-cut back before realising that this would put me in the wrong place to leave a The Internet Was Here sign, so I doubled-back to place it on the gate I’d crawled under. Then I returned to the lay-by, where another car had just pulled up (right over the GZ of the geocache I’d hoped to find!) and didn’t seem to be going anywhere. Sadly I couldn’t wait around all day – I had work to do! – so I went home, following the satnav in the car on a route that resulted in a figure-of-eight tracklog.

Tracklog

My GPS keeps a tracklog. Here you go:

Geohashing expedition 2020-09-09 51 -1 tracklog map

Video

You can also watch it at:

Photos

360° panoramic VR photo of the 2020-09-09 51 -1 geohashpoint
