Hi, ONS! I know we haven’t really spoken since you ghosted me in 2011, but
I just wanted to clear something up for you –
This is not a mistake (except for the missing last names):
It’s perfectly possible for somebody to live with multiple partners, even if they’re forbidden from marrying more than one.
Back in 2011 you thought it was a mistake, and this prevented my partner, her husband and I from filling out the digital version of the
census. I’m sure it’s not common for somebody to have multiple cohabiting romantic relationships (though it’s possibly more common than some other things you track…), but
surely an “Are you sure?” would be better than a “No you don’t!”
For all I know, you already fixed it. If not: I mocked-up a UI for you.
We worked around it in 2011 by using the paper forms. Apparently this way you still end up “correcting” our relationship status for
us (gee, thanks!) but at least – I gather – the originals are retained. So maybe in a more-enlightened time, future statisticians might be able to ask about the demographics
of domestic nonmonogamy and have at least some data to work with from the early 21st century.
I know you’re keen for as many people as possible to do the census digitally this year. But
unless you’ve fixed your forms then my family and I – and thousands of others like us – will either have to use the paper copies you’re trying to phase out… or else
knowingly lie on the digital versions. Which would you prefer?
Enfys published an article this week to their personal blog: How to use gender-inclusive language. It spun out from a post that they co-authored on an internal Automattic blog, and while the whole thing is pretty awesome as a primer for anybody you need to show it to, it introduced a new word to my lexicon for
which I’m really grateful.
The Need for a New Word
I’ve long bemoaned the lack of a gender-neutral term encompassing “aunts and uncles” (and, indeed, anybody else in the same category: your parents’ siblings and their spouses). Words
like sibling have been well-established for a century or more; nibling has gained a lot of
ground over the last few decades and appears in many dictionaries… but we don’t have a good opposite to nibling!
Why do we need such a word?
As a convenient collective noun: “I have 5 aunts and uncles” is clumsier than it needs to be.
Where gender is irrelevant: “Do you have any aunts and/or uncles?” is clumsier still.
Where gender is unknown: “My grandfather has two children: my father and Jo.” “Oh; so you have an Aunt or Uncle Jo?” Ick.
Where gender is nonbinary: “My Uncle Chris’s spouse uses ‘they/them’ pronouns. They’re my… oh fuck I don’t even remotely have a word for this.”
New Words I Don’t Like
I’m not the first to notice this gap in the English language, and others have tried to fill it.
I’ve heard pibling used, but I don’t like it. I can see what its proponents are trying to do: combine
“parent” and “sibling” (although that in itself feels ambiguous: is this about my parents’ siblings or my siblings’ parents, which aren’t necessarily the same thing?). Moreover, the
-ling suffix feels like a diminutive, even if that’s not its etymological root in this particular case, and it feels backwards to use a diminutive to describe somebody
typically in an older generation than yourself.
I’ve heard that some folks use nuncle, and I hate that word even more. Nuncle already has a meaning, albeit an archaic
one: it means “uncle”. Read your Shakespeare! Don’t get me wrong: I’m all for resurrecting useful archaic words: I’m
on a personal campaign to increase use of ereyesterday and, especially, overmorrow (German has übermorgen, Afrikaans has
oormôre, Romanian has poimâine: I want a word for “the day
after tomorrow” too)! But if you bring back a word only to define it as almost-the-opposite of what it used to mean, you’re in for trouble.
Auntle is another candidate – a simple fusion of “aunt” and “uncle”… but it still feels a bit connected to the gendered terms it comes from, plus if you look around enough you
find it being used for everything from an affectionate mutation of “aunt” to a term to refer to your uncle’s husband. We can do better.
A New Word I Do!
But Enfys’ post gave me a new word, and I love it:
…
Here are some gender-neutral options for gendered words we hear a lot. They’re especially handy if you’re not sure of the gender of the person you’re addressing:
Mx.: An honorific, alternative to Mr./Mrs./Ms.
Sibling: instead of brother/sister
Spouse: instead of husband/wife
Partner, datefriend, sweetheart, significant other: instead of boyfriend/girlfriend
Parent: instead of mother/father
Nibling: instead of niece/nephew
Pibling, Entle, Nuncle: instead of aunt/uncle
…
Entle! Possibly invented here, this is the best gender-neutral term for “the sibling of your
parents, or the spouse of the sibling of your parents, or another family member who fulfils a similar role” that I’ve ever seen. It brings “ent” from “parent” which, while
etymologically the wrong part of the word for referring to blood relatives (that comes from a PIE root pere- meaning “to produce or bring forth”), feels similar to the contemporary slang root rent (clipped form of “parent”).
It feels new and fresh enough to not be “auntle”, but it’s similar enough to the words “aunt” and “uncle” that it’s easy to pick up and start using without that “what’s that new word I
need to use here?” moment.
I’m totally going to start using entle. I’m not sure I’ll find a use for it today or even tomorrow. But overmorrow? You never know.
Back in 2005 I reblogged a Flash-based interactive advert I’d discovered via del.icio.us. And if that sentence wasn’t early-naughties enough for you, buckle up…
This screenshot isn’t from the original site but from my homage to it. More on that later.
At the end of 2004, Unilever brand Axe (Lynx here in the UK)
continued their strategy of marketing their
deodorant as magically transforming young men into hyper-attractive sex gods. This is, of course, an endless battle, pitting increasingly sexually-charged advertisements against the
fundamental experience of their product, which smells distinctly like locker rooms and school discos. To launch 2005’s new fragrance Feather, they teamed up with London-based
design agency Dare Digital to create a game at domain AxeFeather.com (long since occupied by domain squatters).
In the game, the player’s mouse pointer becomes a feather which they can use to tickle an attractive young woman lying on a bed. The woman’s movements – which vary based on where she’s
tickled – have been captured in digital video. This was aggressively compressed using the then-new H.263-ish
Sorenson Spark codec to make a download just-about small enough to be tolerable for people still on dial-up Internet access (which was still almost as popular as broadband). The ad became a viral hit. I can’t tell you whether it paid for itself in sales, but it
must have paid for itself in brand awareness: on Valentine’s Day 2005 it felt like it was all the Internet wanted to talk about.
I suspect its success also did wonders for the career of its creative consultant Olivier Rabenschlag, who left Dare a few years
later, hopped around Silicon Valley for a bit, then landed himself a job as Head of Creative (now Chief Creative Officer) with Google. Kudos.
Why?
I told you about the site 16 years ago: why am I telling you again? Because this site, which made
headlines at the time, is gone.
And not just a little bit gone, like a television ad no longer broadcast but which might still exist on YouTube somewhere (and here it is – you’re welcome for the earworm). The website went down in 2009, and because it was implemented in Flash the content
was locked away in a compiled, proprietary format, which has ceased to be meaningfully usable on the modern web.
The parts of AxeFeather.com’s code that are openly readable don’t help much, but I love this comment, which carries the scent of the adolescent web in the same way as Lynx deodorant
carries the scent of an adolescent human.
The ad was pioneering. Flash had only recently gained video support (this would be used the following year for the first version of YouTube), and it had so far been used mostly for
non-interactive linear video. This ad was groundbreaking… but now it’s disappeared like so much other Flash work. And for all that Flash might have been bad for the web,
it’s an important part of our
digital history [recommended reading].
Third-party Flash emulation is imperfect. I tried to make Axe Feather work in Ruffle and got… an empty bed? What is this, a metaphor for being a
lonely nerd?
So on a whim… I decided to see if I could recreate the ad.
Call it lockdown fever if you like, because it’s certainly not the work of a sane mind to attempt to resurrect a 16-year-old Internet advertisement. But that’s what I did.
How?
My plan: to reverse-engineer the digital assets (video, audio, cursor etc.) out of the original Flash file, and use them to construct a moderately-faithful recreation of the ad,
suitable for use on the modern web. My version must:
Work in any modern browser, without Flash of course.
Indicate how much of the video content you’ve seen, because we live in an era of completionists who want to know they’ve seen it all.
Depend on no third-party frameworks/libraries: just vanilla HTML, CSS, and JavaScript.
Let’s get started.
Reverse-engineering
At this point I noticed that the videos had no audio tracks: the giggling and other sound effects must be stored separately.
I grabbed the compiled .swf file from archive.org and ran it through
SWFExtract and an online decompiler: neither was individually able to extract
all of the assets, but together they gave me a full set. I ran the .flv files through Handbrake to get myself a set of
.mp4 files instead.
In what appears to have been an exercise in size optimisation, the original authors cropped the videos differently depending on how much space was needed (e.g. if the subject
stretched her arms above her head, more space would be required). Clearly, some re-alignment would be needed.
Seeing that the extracted video files were clearly designed to be carefully-positioned on a static background, and not all in the exact same position, I decided to make my job easier by
combining them all together, and including the background layer (the picture of the bed) as a single video. Integrating the background with the subject meant that I was able to use
video editing software to tweak the position, which I imagined would be much easier than doing so in code. Combining all of the video clips into a single file provides compression
benefits as well as making it easier to encourage a browser to precache the entire video to begin with.
My design called for three “layers” above my web page: the video, a transparent (and usually hidden) canvas showing the hit areas for debugging purposes, and the feather-shaped
cursor.
The longest clip was a little over 6 seconds long, so I split my timeline into blocks of 7 seconds, padding each clip with a freeze-frame of its final image to make each exactly 7
seconds long. This meant that calculating the position in the finished video to which I wanted to jump was as simple as multiplying the (0-indexed) clip number by 7 and seeking to that
position. The additional “frozen” frames acted as a safety buffer in case my JavaScript code was delayed by a few milliseconds in jumping to the “next” block.
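If you’re curious what that looks like in code, here’s a minimal sketch (simplified, with names of my choosing rather than the exact code I shipped):

```
const BLOCK_LENGTH = 7; // seconds; every clip is padded out to exactly this long

// Jump the single combined <video> to the start of a given (0-indexed) clip:
function playBlock(video, clipNumber) {
  video.currentTime = clipNumber * BLOCK_LENGTH;
  video.play();
}
```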
I used onion-skinning to help “line up” the actress with herself as I composited her onto the bed in a single unified video of 7-second blocks.
An additional challenge was that in the original binary, the audio files were stored separately from the video clips… and slightly longer than them! A little experimentation revealed
that the ends of each clip lined up, presumably something to do with how Flash preloads and synchronises media streams. Luckily for me, the audio clips were numbered such that
they mostly mapped to the order in which the videos appeared.
Once I had a video file suitable for use on the web (you can watch the entire clip here, if you really want to), it was time to
write some code.
It feels slightly wasteful that over 50% of the resulting video clip is a freeze-frame, but modern video compression algorithms like H.264 reduce the impact considerably and the
resulting video file is about the same size as its more-optimised predecessor.
Regular old engineering
The theory was simple: web page, video, loop the first seven seconds until you click on it, then animate the cursor (a feather) and jump to another seven-second block before jumping
back or, in some cases, on to a completely new seven second block. Simple!
Of course, any serious web development is always a little more complex than you first anticipate.
I extracted from the .swf 34 distinct animated clips, which I numbered 0 through 33. 6 and 30 appeared to be duplicates of others. 0 and 33 are the two “idling” states
from which interaction can lead to other states. Note that my interpretation of the order and relationship of animation sequences differs from the original.
For example: nowadays, putting a video on a web page is as easy as a <video> tag. But, in an effort to prevent background web pages from annoying you with unexpected
audio, modern browsers won’t let a video play sound unless user interaction is the reason that the video starts playing (or unmutes, if it was playing-but-muted to
begin with). Broadly-speaking, that means that a definitive user action like a “click” event has to be in the call stack when your code makes the video play/unmute.
But changing the .currentTime of a video to force it into a loop: that’s fine! So I set the video to autoplay muted on page load, with a script to make it loop
within its first seven-second block. The actress doesn’t make any sound in block 0 (position A) anyway; so I can unmute the video when the user interacts with a hotspot.
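As a rough sketch of that dance (element IDs and handler wiring here are illustrative, not lifted from my actual source):

```
const video = document.getElementById('axe-feather');

// Muted autoplay is permitted on page load...
video.muted = true;
video.play();

// ...and unmuting is permitted later because a genuine click is in the call stack:
document.getElementById('hotspots').addEventListener('click', () => {
  video.muted = false;
  // ...then seek to whichever block the clicked hotspot maps to
});
```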
For best performance, I used window.requestAnimationFrame to synchronise my non-interactive events (video loops, virtual cursor repositioning). This posed a slight problem
in that animation frames wouldn’t be triggered if the tab was moved to the background: the video would play through each seven-second block and into the next! Fortunately the
visibilitychange event came to the rescue and I was able to pause the video when it wasn’t being actively watched.
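Sketched out, the housekeeping loop looks something like this (again simplified; blockStart just tracks wherever the current seven-second block begins):

```
let blockStart = 0; // start time, in seconds, of the block we're currently looping

function housekeeping() {
  // If playback has run past the end of its seven-second block, snap back to its start:
  if (video.currentTime >= blockStart + BLOCK_LENGTH) {
    video.currentTime = blockStart;
  }
  // (virtual cursor repositioning happens here too)
  window.requestAnimationFrame(housekeeping);
}
window.requestAnimationFrame(housekeeping);

// requestAnimationFrame stops firing in background tabs, so pause the video there instead:
document.addEventListener('visibilitychange', () => {
  document.hidden ? video.pause() : video.play();
});
```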
I originally hoped to use the cursor: CSS directive to make the “feather” cursor, but there’d be no nice way to
animate it. Comet Cursor may have been able to use animated GIFs
as cursors back in 1997 (when it wasn’t busy selling all your personal information to advertisers, back when that kind of thing used to attract widespread controversy), but modern
browsers don’t… presumably because it would be super annoying. They also don’t all respect cursor: none, so I used the old trick of using cursor: url(null.png),
none (where null.png is an almost-entirely transparent 1×1 pixel image) to hide the original cursor, then position an image dynamically. I
use getBoundingClientRect() to allow the video to resize dynamically in CSS and convert coordinates on it represented
as percentages into actual pixel values and vice-versa: this allows it to react responsively to any screen size without breakpoints or excessive code.
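The coordinate conversion boils down to a little helper along these lines (a sketch, not verbatim):

```
// Convert a mouse event's pixel position into percentage coordinates on the video,
// so hotspots can be defined once and work at whatever size the video is rendered:
function toPercentCoords(video, event) {
  const rect = video.getBoundingClientRect();
  return {
    x: ((event.clientX - rect.left) / rect.width) * 100,
    y: ((event.clientY - rect.top) / rect.height) * 100,
  };
}
```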
Once I’d gone that far I was able to drop the GIF idea entirely and used a CSS animation for the “tickling” motion.
The hotspot overlay was added as a debugging feature but I left it in the final version. Hold the space bar to highlight hit areas.
I added a transparent <canvas> element on top of the <video> on which the hit areas are dynamically drawn to help me test the “hotspots” and tweak
their position. I briefly considered implementing a visual tool to help me draw the hotspots, but figured it wasn’t quite worth the time it would take.
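Drawing the overlay is only a few lines; roughly this (the hotspot data shape is illustrative):

```
// Redraw each hotspot (stored as percentages) onto the transparent canvas over the video:
function drawHotspots(canvas, video, hotspots) {
  const rect = video.getBoundingClientRect();
  canvas.width = rect.width;
  canvas.height = rect.height;
  const ctx = canvas.getContext('2d');
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.strokeStyle = 'red';
  for (const spot of hotspots) {
    ctx.strokeRect(
      (spot.x / 100) * rect.width,
      (spot.y / 100) * rect.height,
      (spot.w / 100) * rect.width,
      (spot.h / 100) * rect.height
    );
  }
}
```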
As I implemented more and more of the game, I remembered one feature from the original that I’d missed: the “blowaway”. If you trigger block 31 – a result of tickling the woman’s nose –
she’ll blow your cursor off the screen. It’s particularly fun because it subverts the player’s expectations of their user interface: once you’ve got past the surprise of your
cursor being a feather, you quickly settle in to it moving like a regular cursor… but then control’s stolen from you and the cursor vanishes! (Well I thought it was cool… 16 years ago.)
Sometimes tickling her nose will make her blow your feather off the screen. That’ll show you.
Earlier this month, I made my first attempt at cooking pizza in an outdoor wood-fired oven. I’ve been making pizza for years: how hard can it be?
Do you know what temperature Teflon (PTFE) burns at? I do. Now. (About 470ºC.)
It turned out: pretty hard. The oven was way hotter than I’d appreciated and I burned a few crusts. My dough was too wet to slide nicely off my metal peel (my wooden peel
disappeared, possibly during my house move last year), and my efforts to work around this by transplanting cookware in and out of the
oven quickly led to flaming Teflon and a shattered pizza stone. I set up the oven outside the front door and spent all my time running between the kitchen (at the back of the house)
and the front door, carrying hot tools, while hungry children snapped at my ankles. In short: mistakes were made.
I roped in Robin to help make dough, because he’s damn good at it. And because I had a mountain of work to do today.
I suspect that cooking pizza in a wood-fired oven is challenging in the same way that driving a steam locomotive is. I’ve not driven one, except in simulators, but it seems like you’ve
got a lot of things to monitor at the same time. How fast am I going? How hot is the fire? How much fuel is in it? How much fuel is left? How fast is it burning through it? How far to
the next station? How’s the water pressure? Oh fuck I forgot to check on the fire while I was checking the speed…
“How hot is it?” is a question I’m now much better at answering. Thanks, technology! This oven’s still warming up. (!)
So it is with a wood-fired pizza oven. If you spend too long preparing a pizza, you’re not tending the fire. If you put more fuel on the fire, the temperature drops before it climbs
again. If you run several pizzas through the oven back-to-back, you leech heat out of the stone (my oven’s not super-thick, so it only retains heat for about four consecutive pizzas
then it needs a few minutes break to get back to an even temperature). If you put a pizza in and then go and prepare another, you’ve got to remember to come back 40 seconds later to
turn the first pizza. Some day I’ll be able to manage all of those jobs alone, but for now I was glad to have a sous-chef to hand.
Also, if too much of the flour you use to keep your peel slick falls into the oven, it catches fire. Now you have two fires to attend to.
Today I was cooking out amongst the snow, in a gusty crosswind, and I learned something else new. Something that perhaps I should have thought of already: the angle of the pizza
oven relative to the wind matters! As the cold wind picked up speed, its angle meant that it was blowing right across the air intake for my fire, and it was sucking all of
the heat out of the back of the oven rather than feeding the flame and allowing the plasma and smoke to pass through the top of the oven. I rotated the pizza oven so that the air blew
into rather than across the oven, but this fanned the flames and increased fuel consumption, so I needed to increase my refuelling rate… there are just so many
variables!
You know what, though? Everything turned out pretty-much okay.
The worst moment of the evening was probably when I took a bite out of a pizza that, it turned out, I’d shunted too-deep into the oven and it had collided with the fire. How do I know?
Because I bit into a large chunk of partially-burned wood. Not the kind of smoky flavour I was looking for.
But apart from that, tonight’s pizza-making was a success. Cooking in a sub-zero wind was hard, but with the help of my excellent sous-chef we churned out half a dozen good pizzas (and
a handful of just-okay pizzas), and more importantly: I learned a lot about the art of cooking pizza in a box full of burning wood. Nice.
Dialect could be described as a rules-light, GM-less (it has a “facilitator” role, but they have no more authority than any player on
anything), narrative-driven/storytelling roleplaying game based on the concept of isolated groups developing their own unique dialect and using the words they develop as a vehicle to
tell their stories.
It’s also super-pretty to leaf through and hold.
This might not be the kind of RPG that everybody likes to play – if you like your rules more-structured, for example, or
you’re not a fan of “one-shot”/”beer and pretzels” gaming – but I was able to grab a subset of our usual roleplayers – Alec, Matt R, Penny, and
I – and have a game (with thanks to Google Meet for videoconferencing and Roll20 for the virtual tabletop: I’d have used Foundry but its card support is still pretty terrible!).
The Outpost
A game of Dialect begins with a backdrop – what other games might call a scenario or adventure – to set the scene. We opted for The Outpost, which put the four of us
among the first two thousand humans to colonise Mars, landing in 2045. With help from some prompts provided by the backdrop we expanded our situation in order to declare the “aspects”
that would underpin our story, and then expand on these to gain a shared understanding of our world and society:
Refugees from plague: Our expedition left Earth to escape from a series of devastating plagues that were ravaging the planet, to try to get a fresh start on another
world.
Hostile environment: Life on Mars is dominated by the ongoing struggle for sufficient food and water; we get by, but only thanks to ongoing effort and discipline and
we lack some industries that we haven’t been able to bootstrap in the five years we’ve been here (we had originally thought that others would follow).
Functionalist, duty-driven society: The combination of these two factors led us to form a society based on supporting its own needs; somewhat short of a caste system,
our culture is one of utilitarianism and unity.
Our finished game board, or tableau.
It soon became apparent that communication with Earth had been severed, at least initially, from our end: radicals, seeing the successes of our new social and economic systems,
wanted to cement our differences by severing ties with the old world. And so our society lives in a hub-and-spoke cave system beneath the Martian desert, self-sustaining except for the
need to send rovers patrolling the surface to scout for and collect valuable surface minerals.
In this world, and prompted by our cards, we each developed a character. I was Jeramiah, the self-appointed “father” of the expedition and of this unusual new social order, who
remembers the last disasters and wars of old Earth and has revolutionary plans for a better world here on Mars, based on controlled growth and a planned economy. Alec played Sandy –
“Tyres” to their friends – a rover-driving explorer with one eye always on the horizon and fresh stories for the colony brought back from behind every new crater and mountain. Penny
played Susie, acting not only as the senior medic to the expedition but something more: sort-of the “mechanic” of our people-driven underground machine, working to keep alive the
genetic records we’d brought from Earth and keep them up-to-date as our society eventually grew, in order to prevent the same kinds of catastrophe happening here. “Picker” Ben was our
artist, for even a functionalist society needs somebody to record its stories, celebrate its accomplishments, and inspire its people. It’s possible that the existence of his position
was Jeramiah’s doing: the two share a respect for the stark, barren, undeveloped beauty of the Martian surface.
We developed our language using prompt cards, improvised dialogue, and the needs of our society. But the decades that followed brought great change. More probes began to land from
Earth, more sophisticated than the ones that had delivered us here. They brought automated terraforming equipment, great machines that began to transform Mars from a barren wasteland
into a place for humans to thrive. These changes fractured our society: there were those that saw opportunity in this change – a chance to go above ground and live in the sun, to expand
across the planet, to make easier the struggle of our day-to-day lives. But others saw it as a threat: to our way of life, which had been shaped by our challenging environment; to our
great social experiment, which could be ruined by the promise of an excessive lifestyle; to our independence, as these probes were clearly the harbingers of the long-promised second
wave from Earth.
Even as new colonies were founded, the Martians of the Hub (the true Martians, who’d been here for yams time, lived and defibed here, not these tanning desert-dwellers that
followed) resisted the change, but it was always going to be a losing battle. Jeramiah took his last breath in an environment suit atop a dusty Martian mountain a day’s drive from the
Hub, watching the last of the nearby deserts that was still untouched by the new green plants that had begun to spread across the surface. He was with his friend Sandy, for despite all
of the culture’s efforts to paint them as diametrically opposed leaders with different ideas of the future, they remained friends until the end. As the years went by and more and more
colonists arrived, Sandy left for Phobos, always looking for a new horizon to explore. Sick of the growing number of people who couldn’t understand his language or his art, Ben
pioneered an expedition to the far side of the planet where he lived alone, running a self-sustaining agri-home and exploring the hills until his dying day. We were never sure where
Susie ended up, but it wasn’t Mars: she’d talked about joining humanity’s next big jump, to the moons of Jupiter, so perhaps she’s out there on one of the colonies of Titan or Europa.
Maybe, low clicks, she’s even keeping our language alive out there.
Retrospective
The whole event was a lot of fun and I’m keen to repeat it, perhaps with a different group and a different backdrop. The usual folks know who they are, but if you’re not one of
those and you want in next time we play, drop me a message of some kind.
It’s that time of year again when I comparison-shop for car insurance, and every time I come across a new set of reasons to hate the developers at Confused.com. How do you confuse me?
Let me count the ways.
No means yes
I was planning to enumerate my concerns to them directly, via their contact form, but when I went to do so I spotted this bit of
genius, which clinched it and made me write a blog post instead:
Clicking the word “Yes” means “Yes”. Clicking the word “No” means “Yes” as well.
Turns out that there’s a bit of the old sloppy-paste going on there:
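I can’t reproduce their exact markup here, but a hypothetical reconstruction of the kind of copy-paste slip that produces “everything means Yes” might look like this (all names invented):

```
<input type="radio" id="keeper-yes" name="registeredKeeper" value="yes">
<label for="keeper-yes">Yes</label>

<input type="radio" id="keeper-no" name="registeredKeeper" value="no">
<label for="keeper-yes">No</label> <!-- oops: the "for" attribute got copy-pasted,
                                        so clicking "No" ticks the "Yes" radio button -->
```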
Honestly, I’m used to my unusual name causing trouble by now and I know how to work around it in the way that breaks the fewest systems (I can even usually
get airline tickets without too much difficulty nowadays). But these kinds of (arbitrary) restrictions must frustrate folks like Janice Keihanaikukauakahihulihe’ekahaunaele.
I guess their developers didn’t realise that this blog post was parody?
Also, that’s not my title!
This one, though, pisses me off:
As everybody knows, there are only six titles, and two of them are “Dr”.
This is a perfect example of why your forms should ask for what you actually want to know, not for what you think people want to tell you. Just ask!
If you want to know my gender, ask for my gender! (I’m a man, by the way.)
I don’t understand why you want to know – after all, it’s been illegal since 2012 to risk-assess/price car insurance differently on the grounds of gender – but maybe you’ve
got a valid reason. Which hopefully you’ll tell me in a tooltip. Like if you’re using it as a (terrible) checksum when you check my driving license details, that’s fine!
If you want to know my title, ask for my title! (I prefer not to use one, but if you must use one I’d prefer Mx.)
This ought to be an optional field, of course, and ideally you want a free text input or else you’ll always have missed somebody (Lord, Reverend, Prince, Wing Commander…).
It’s in your interests because I’m totally going to pick at random otherwise. Today I’m a Ms.
Consistency? Never heard of it.
It’s not a big thing, but if you come up with a user interface paradigm like “clicking More… shows more buttons”, you ought to stick to it.
Maybe their internal style guide says “a More… button with three additional options should use buttons, but four additional options should be a drop-down”. But it seems more-likely
that they just don’t have one.
Again, I’m not sure exactly what all of this data is used for, nor why there’s a need to differentiate between married couples and civil partnerships, but let’s just assume this is all
necessary and legitimate and just ask ourselves: why are we using drop-downs now for “More…”? We were using buttons just a second ago!
This was just crying out for a type-in field. But I guess the same developer who did the “Title” question did this one too, and wanted to show off the fancy “more buttons” control
they’d written. (Imaginary style guide be damned!)
What’s my occupation again?
There’s so much to unpack in the “occupation” part of the form that I’m not even sure where to begin. Let’s just pick out a few things:
I never answered a question this hard even in the exams I did when I was a student. Why do we care where students live… except if they’re postgrads? If I’m a mature student
studying a postgraduate course in medicine while living at home with my parents… which of the five possible options should I pick? And, again: what difference could it conceivably
make?
The student thing is just the beginning, though. You can declare up to two jobs, but if the first one is “house person/parent” you can’t have a second one. If you’re self-employed, that
has to be your first job even though the guidance says that the one you spend most time on must be the first one (this kind of thing infuriated me when I used to spend 60% of
my work time employed, 20% self-employed, and 20% studying).
I’m not saying it’s easy to make a form like this. I know from experience that it’s not. I am saying that Confused.com make it look a lot harder than it is.
Well that clears everything up. Also, I think you mean “houseperson”, unless you’re referring to somebody who is half-house/half-person, like some kind of architectural werewolf.
What do you mean, you live with your partner?
At a glance, this sounds like a “poly world problem”, but hear me out:
What you’re seeing here is a reference-identity error. I can’t possibly be living together with somebody as a couple if their marital status isn’t “Living With Partner”.
I put Ruth‘s marital status as married, because she’s married to JTA. But then when it asked how she was related to me, it wouldn’t accept
“Living together (couple)”.
If I put Ruth as the primary policyholder (proposer) though, I don’t even get the option of “living together (couple)” to describe her relationship with me. ‘Cos it’s physically
impossible to have a partner and be married, right?
Even if you don’t think it’s odd that they hide the “living with partner” button as an option to describe a married person’s relationship to somebody other than their spouse… you’ve still
got to agree that it’s a little bit odd that they don’t hide the “spouse” button. In other words, this user interface is more-okay with you having multiple spouses than it is
with you having a spouse and an unmarried partner!
And of course this isn’t just about polyamorous folks: there are perfectly “normal” reasons that a person might end up confused by this interface, too. For example a separated (but not
yet divorced) couple, one of whom has a new partner (it’s not even inconceivable that such a pair might share custody of a car). Also interesting is the fact that the form doesn’t
care about the gender of your spouse (it doesn’t ask for “husband” or “wife”) but does care about the gender of your parent, child, or sibling. What gives?
Half a dozen easy fixes. Go for it, Confused.com.
Given that their entire marketing plan for most of the last two decades has been that they reduce customer confusion, Confused.com’s user interface leaves a lot to be
desired. As I’ve mentioned before – and speaking as a web developer that’s been in the game for longer than their company has – it’s not necessarily easy to get this kind of
thing right. But you can improve a form like this, a little at a time. And every little win counts for something: a more-satisfied returning customer, perhaps, or a new word-of-mouth
recommendation.
Or you can just let it languish and continue to have the kind of form that people mock on the public Internet.
It’ll be a year until I expect to comparison-shop for car insurance again: let’s see how they get on, shall we?
Update (21 January 2021): Confused.com Respond!
I didn’t expect to receive any response to this post: most organisations don’t when I call-out the problems with their websites (not least
because I’m more than a little bit sarcastic about it!). I never heard back from the Digital Climate Strike folks, for example,
when I pointed out that their website was a great example of exactly the kind of problem they were protesting. But Confused.com
passed on my thoughts to Product Manager Gareth who took a look at them and gave me a £20 Amazon gift card by way of thanks. Nice one, Confused.com!
On account of the pandemic, I’d expected my fortieth birthday to be a somewhat more-muted affair than I’d hoped.
I had a banner, I got trolled by bagels, and I received as a gift a pizza oven with which I immediately set fire to several pieces of cookware, but I hadn’t expected to be able to do anything like the
“surprise” party of my thirtieth, and that saddened me a little. So imagine my surprise when I came back from an evening walk the day after my birthday to discover that an
actual (remote) surprise party really had been arranged without my knowing!
“Hello, remote guests! What are you doing here?”
Not content with merely getting a few folks together for drinks, though, Ruth and team had gone to great trouble (involving lots of use of the
postal service) arranging a “kit” murder mystery party in the Inspector McClue series – The Diamonds, The Dagger, and One Classy Dame – for us all to play. The story is sort-of
a spiritual successor to The Brie, The Bullet, and The Black Cat, which we’d played fifteen years earlier. Minor
spoilers follow.
“Hello, local guests. Wait… why are you all in costumes…?”
Naturally, I immediately felt underdressed, having not been instructed that I might need a costume, and underprepared, having only just heard for the first time that I would be playing
the part of German security sidekick Lieutenant Kurt Von Strohm minutes before I had to attempt my most outrageous German accent.
Fortunately I was able to quickly imbibe a few glasses of champagne and quickly get into the spirit. Hic.
The plot gave me in particular a certain sense of deja vu. In The Brie, The Bullet, and The Black Cat, I played a French nightclub owner who later turned out to be an English
secret agent supplying the French Resistance with information. But in The Diamonds, The Dagger, and One Classy Dame I played a Gestapo officer who… also later turned out to be
an English secret agent infiltrating the regime and, you guessed it, supplying the French Resistance.
As she had previously with Sour Grapes, Ruth had worked to ensure that a “care package” had reached each murder mystery guest. Why yes,
it was a boozy care package.
It was not the smoothest nor the most-sophisticated “kit” murder mystery we’ve enjoyed. The technology made communication challenging, the reveal was less-satisfying than some others
etc. But the company was excellent. (And the acting was pretty good too, especially by our murderer, whose character was exquisitely played.)
The largest bottle, though, was with us: we opened the Jeroboam of champagne Ruth and JTA had been saving from their
anniversary (they have a tradition involving increasing sizes of bottle; it’s a whole thing; I’ll leave them to write about it someday).
And of course the whole thing quickly descended into a delightful shouting match with accusations flying left, right, and centre and nobody having a clue what was going on. Like all of
our murder mystery parties!
I’m not sure how I feel about Google Meet’s automatic transcription feature. It was generally pretty accurate, but it repeatedly thought that it heard the word “Jewish” being spoken
by those of us who were putting on German accents, even though none of us said that.
In summary, the weekend of my fortieth birthday was made immeasurably better by getting to hang out with (and play a stupid game with) some of my friends despite the lockdown, and I’m
ever so grateful that those closest to me were able to make such a thing happen (and without me even noticing in advance).
Clearly those closest to me know me well, because for my birthday today I received a beautiful (portable: it packs into a bag!) wood-fired pizza oven, which I immediately assembled,
test-fired, cleaned, and prepped with the intention of feeding everybody some homemade pizza using some of Robin‘s fabulous bread dough, this
evening.
Fuelled up with wood pellets the oven was a doddle to light and bring up to temperature. It’s got a solid stone slab in the base which looked like it’d quickly become ideal for some
fast-cooked, thin-based pizzas. I was feeling good about the whole thing.
But then it all began to go wrong.
The confined space quickly heats up to a massive 400-500ºC.
If you’re going to slip pizzas onto hot stone – especially using a light, rich dough like this one – you really need a wooden peel. I own a wooden peel… somewhere: I haven’t seen it
since I moved house last summer. I tried my aluminium peel, but it was too sticky, even with a dusting of semolina or a light layer of
oil. This wasn’t going to work.
I’ve got some stone slabs I use for cooking fresh pizza in a conventional oven, so I figured I’d just preheat them, assemble pizzas directly on them, and shunt the slabs in. Easy as
(pizza) pie, right?
Within 60 seconds the pizza was cooked and, in its elevated position atop a second layer of stone, the crust began to burn. The only-mildly-charred bits were delicious, though.
This oven is hot. Seriously hot. Hot enough to cook the pizza while I turned my back to assemble the next one, sure. But also hot enough to crack apart my old pizza
stone. Right down the middle. It normally never goes hotter than the 240ºC of my regular kitchen oven, but I figured that it’d cope with a hotter oven. Apparently not.
So I changed plan. I pulled out some old round metal trays and assembled the next pizza on one of those. I slid it into the oven and it began to cook: brilliant! But no sooner had I
turned my back than… the non-stick coating on the tray caught fire! I didn’t even know that was a thing that could happen.
Hello fire. I failed to respect you sufficiently when I started cooking. I’ve learned my lesson now.
Those first two pizzas may have each cost me a piece of cookware, but they tasted absolutely brilliant. Slightly coarse, thick, yeasty dough, crisped up nicely and with a hint of
woodsmoke.
But I’m not sure that the experience was worth destroying a stone slab and the coating of a metal tray, so I’ll be waiting until I’ve found (or replaced) my wooden peel before I tangle
with this wonderful beast again. Lesson learned.
While talking about external CSS, Jeremy hinted at what I consider to be a distinct fourth way with its own unique use
cases: using the Link: HTTP header. I’d like to share with you how it works and why I think it needs to be
kept in people’s minds, even if it’s not suitable for widespread deployment today.
Injecting CSS using the Link: HTTP Header
Every one of Jeremy’s suggestions involves adding markup to the HTML document itself. Which makes sense; you almost always
want to associate styles with a document regardless of where it’s stored or the medium over which it’s transmitted. The most popular approach to adding CSS to a page uses the <link> HTML element, but did you know… the <link> element has a semantically-equivalent HTTP header, Link:.
A webserver adds headers when it serves a document anyway. Adding one more is no big deal.
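In other words, instead of (or as well as) a <link> element in the markup, the server can attach the stylesheet at the HTTP layer; the header takes the same rel as its element counterpart (the filename below is just for illustration):

```
Link: </styles/site.css>; rel="stylesheet"
```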
Why is this important?
This isn’t something you should put on your website right now. This (21-year-old!) standard is still only really supported in Firefox and pre-Blink Opera, so you lose perhaps 95% of the
Web (it could be argued that because CSS ought to be considered
progressive enhancement, it’s tolerable so long as your HTML is properly-written).
If it were widely-supported, though, that would be a really good thing: HTTP headers beat meta/link tags for configurability, performance management, and separation of concerns. Need some specific examples? Sure:
here’s what you could use HTTP stylesheet linking for:
You have no idea how many times in my career I’d have injected CSS Link: headers using a reverse proxy server, if only the
standard were universally-implemented (there’s a sketch of the idea after this list). This technique would have made one of my final projects at the Bodleian so much easier…
Performance improvement using aggressively preloaded “top” stylesheets before the DOM parser even fires up.
Stylesheet injection by edge caches to provide regionalised/localised changes to brand identity.
Strong separation of content and design by hosting content and design elements in different systems.
Branding your staff intranet differently when it’s accessed from outside the network than inside it.
Rebranding proprietary services on your LAN without deep inspection, using reverse proxies.
Less-destructive user stylesheet injection by plugins etc. that doesn’t risk breaking icky on-page Javascript (e.g. theme switchers).
Browser detection? 😂 You could use this technique today to detect Firefox. But you absolutely
shouldn’t; if you think you need browser detection in CSS, use this instead.
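To make that reverse-proxy use case a little more concrete, here’s a toy sketch in Node.js (not a real reverse proxy, and every name in it is made up) of a server injecting a stylesheet via the header:

```
const http = require('http');

http.createServer((request, response) => {
  response.setHeader('Content-Type', 'text/html');
  // Equivalent to <link rel="stylesheet" href="/styles/site.css">, but done at the HTTP layer,
  // so a proxy could bolt it on without touching the HTML at all:
  response.setHeader('Link', '</styles/site.css>; rel="stylesheet"');
  response.end('<!DOCTYPE html><h1>Hello!</h1>');
}).listen(8080);
```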
Unfortunately right now though, stylesheet Link: headers remain consigned to the bin of “cool stylesheet standards that we could probably use if it weren’t for fucking Google”; see also
alternate stylesheets.
My friend still uses a seriously retro digital music player, rather than his phone, to listen to music. It’s not a Walkman or a Minidisc player, I suppose, but it’s still pretty
elderly. But it’s not one of these.
I’m not here to speak about the legality of retaining offline copies of music from streaming services. YouTube Music seems to permit you to do this using their app, but I’ll bet there’s
something in their terms and conditions that specifically prohibits doing so any other way. Not least because Google’s arrangement with rights holders probably stipulates that they
track how many times tracks are played, and using a different player (like my friend’s portable device) would throw that off.
But what I’m interested in is the feasibility. And in answering that question, in explaining how to work out that it’s feasible.
The web interface to YouTube Music shows playlists of songs and streaming is just a click away.
Spoiler: I came up with an approach, and it looks like it works. My friend can fill up their Zune or whatever the hell
it is with their tunes and bop away. But what I wanted to share with you was the underlying technique I used to develop this approach, because it involves skills that as a web
developer I use most weeks. Hold on tight, you might learn something!
youtube-dl can download “playlists” already, but to download a personal playlist requires that you faff about with authentication and it’s a bit of a drag. Just extracting
the relevant metadata from the page is probably faster, I figured: plus, it’s a valuable lesson in extracting data from web pages in general.
Here’s what I did:
Step 1. Load all the data
I noticed that YouTube Music playlists “lazy load”, and you have to scroll down to see everything. So I scrolled to the bottom of the page until I reached the end of the playlist: now
everything was in the DOM, I could investigate it with my inspector.
Step 2. Find each track’s “row”
Using my browser’s debugger “inspect” tool, I found the highest unique-sounding element that seemed to represent each “row”/track. After a little investigation, it looked like
a playlist always consists of a series of <ytmusic-responsive-list-item-renderer> elements wrapped in a <ytmusic-playlist-shelf-renderer>. I tested
this by running document.querySelectorAll('ytmusic-playlist-shelf-renderer ytmusic-responsive-list-item-renderer') in my debug console and sure enough, it returned a number
of elements equal to the length of the playlist, and hovering over each one in the debugger highlighted a different track in the list.
The web application captured right-clicks, preventing the common right-click-then-inspect-element approach… so I just clicked the “pick an element” button in the debugger.
Step 3. Find the data for each track
I didn’t want to spend much time on this, so I looked for a quick and dirty solution: and there was one right in front of me. Looking at each track, I saw that it contained several
<yt-formatted-string> elements (at different depths). The first corresponded to the title, the second to the artist, the third to the album title, and the fourth to
the duration.
Better yet, the first contained an <a> element whose href was the URL of the piece of music.
Extracting the URL and the text was as simple as a .querySelector('a').href on the first
<yt-formatted-string> and a .innerText on the others, respectively, so I ran [...document.querySelectorAll('ytmusic-playlist-shelf-renderer
ytmusic-responsive-list-item-renderer')].map(row=>row.querySelectorAll('yt-formatted-string')).map(track=>[track[0].querySelector('a').href, `${track[1].innerText} -
${track[0].innerText}`]) (note the use of [...*] to get an array) to check that I was able to get all the data I needed:
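That one-liner’s a bit dense, so here’s exactly the same thing spread over a few lines and stashed in a variable (I’ll reuse tracks in the later steps):

```
const tracks = [...document.querySelectorAll(
  'ytmusic-playlist-shelf-renderer ytmusic-responsive-list-item-renderer'
)].map(row => row.querySelectorAll('yt-formatted-string'))
 .map(track => [
   track[0].querySelector('a').href,                 // URL of the track
   `${track[1].innerText} - ${track[0].innerText}`   // "Artist - Title"
 ]);
```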
Lots of URLs and the corresponding track names in my friend’s preferred format (me, I like to separate my music into folders
by album, but I suppose I’ve got a music player with more than a floppy disk’s worth of space on it).
Step 4. Sanitise the data
We’re not quite good-to-go, because there’s some noise in the data. Sometimes the application’s renderer injects line feeds into the innerText (e.g. when escaping an
ampersand). And of course some of these song titles aren’t suitable for use as filenames, if they’ve got e.g. question marks in them. Finally, where there are multiple spaces in a row
it’d be good to coalesce them into one. I do some experiments and decide that .replace(/[\r\n]/g, '').replace(/[\\\/:><\*\?]/g, '-').replace(/\s{2,}/g, ' ') does a
good job of cleaning up the song titles so they’re suitable for use as filenames.
I probably should have it fix quotes too, but I’ll leave that as an exercise for the reader.
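Wrapped up as a little helper for the next step (it’s the same replace() chain as above; the quote-fixing is, as I say, left as an exercise):

```
function toFilename(name) {
  return name
    .replace(/[\r\n]/g, '')          // strip stray line feeds injected by the renderer
    .replace(/[\\\/:><\*\?]/g, '-')  // swap characters that are illegal in filenames
    .replace(/\s{2,}/g, ' ');        // coalesce runs of whitespace into a single space
}
```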
Step 5. Produce youtube-dl commands
Okay: now we’re ready to combine all of that output into commands suitable for running at a terminal. After a quick dig through the documentation, I decided that we needed the following
switches (there’s a sketch of gluing them all together after the list):
-x to download/extract audio only: it defaults to the highest quality format available, which seems reasonable
-o "the filename.%(ext)s" to specify the output filename but accept the format provided by the quality requirement (transcoding to your preferred format is a
separate job not described here)
--no-playlist to ensure that youtube-dl doesn’t see that we’re coming from a playlist and try to download it all (we have our own requirements of each song’s
filename)
--download-archive downloaded.txt to log what’s been downloaded already so successive runs don’t re-download and the script is “resumable”
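Gluing the earlier snippets together (tracks from step 3, toFilename() from step 4), something like this spits out one command per song, ready for the terminal:

```
const commands = tracks.map(([url, name]) =>
  `youtube-dl -x -o "${toFilename(name)}.%(ext)s" --no-playlist ` +
  `--download-archive downloaded.txt "${url}"`
).join('\n');

console.log(commands);
```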
The output isn’t pretty, but it’s suitable for copy-pasting into a terminal or command prompt where it ought to download a whole lot of music for offline play.
This isn’t an approach that most people will ever need: part of the value of services like YouTube Music, Spotify and the like is that you pay a fixed fee to stream whatever you like,
wherever you like, obviating the need for a large offline music collection. And people who want to maintain a traditional music collection offline are most-likely to want to do
so while supporting the bands they care about, especially as (with DRM-free digital downloads commonplace) it’s never been
easier to do so.
But for those minority of people who need to play music from their streaming services offline but don’t have or can’t use a device suitable for doing so on-the-go, this kind of approach
works. (Although again: it’s probably not permitted, so be sure to read the rules before you use it in such a way!)
Step 6. Learn something
But more-importantly, the techniques of exploring a page and writing console Javascript demonstrated here are really useful for extracting all kinds of data from web pages (data scraping), writing your own userscripts, and much more. If there’s
one lesson to take from this blog post it’s not that you can steal music on the Internet (I’m pretty sure everybody who’s lived on this side of 1999 knows that by now), but
that you can manipulate the web pages you see. Once you’re viewing it on your computer, a web page works for you: you don’t have to consume a page in the way that the
author expected, and knowing how to extract the underlying information empowers you to choose for yourself a more-streamlined, more-personalised, more-powerful web.
So the NHS blood donation rules are changing again. And while they’re certainly getting closer, they’re still not quite hitting the bullseye yet.
That’s great. Prior to 2011 men who’d ever had sex with men, as well as women who’d had sex with such a man within the last 6 months, were banned from donating blood. That rule
clearly spun out of the AIDS hysteria of the 1980s and generally entrenched homophobia. It probably did little to
protect the recipients of blood, and certainly did a lot to increase the stigma experienced by non-straight men.
You throw enough policies at a problem, eventually one will get close-enough, right?
The 2011 change permitted donation by men who’d previously had sex with men… so long as they hadn’t done so within the last year. Which opened the doors to donation by a lot of men:
e.g. bisexual men who’d been in relationships exclusively with women, gay men who’d been celibate for a period, etc. It still wasn’t great, but it was a step in the right
direction.
So when I saw that the rules were changing to better target only risky behaviours, rather than behaviours that are so broad-brush as to target identities, I was
initially delighted. Evidence-based medicine, you say? For the win.
Go on! Stick it in me! I’ll still be able to give blood, right?
But… it’s not all sunshine and rainbows. The new rules prohibit blood donation regardless of gender by people who’ve had sex with more than one person in the last three months.
Sorry Brandon, we only want Andre and Carlos’ blood.
So if, for example, there’s a V-shaped relationship consisting of three men, who only have sex within their thruple… two of them are now allowed to give blood but the third isn’t?
(This isn’t a contrived example. I know such a thruple.)
Stranger still: if you swap Brandon in the diagram above for a woman then you get a polycule that’s a lot like mine, but the woman in the middle used to be allowed to give
blood… and now can’t! My partner Ruth is in exactly this position: her situation hasn’t changed, but because she’s been in a long-term
relationship with exactly two people she’s now not allowed to give blood. Wot?
On the whole, this rule change is an improvement. We’re getting closer to a perfect answer. But it’s amusing to see where the policy misses again and excludes
donors who would otherwise be perfectly viable.
Update: as this is attracting a lot of attention I just wanted to remind people that the whole discussion is, of course, a lot
more complicated than can be summarised in a single, short, opinionated blog post. Take a look at the FAIR steering
group’s recommendations and compare to the government’s press release.
Update #2: justifying choice of words – “AIDS hysteria”
refers specifically to the media (and to a lesser extent the policy) reactions to the (very real, very devastating) pandemic. For a while there it was perfectly normal to see (often
misguided, sometimes homophobic) scaremongering news coverage suggesting that everybody was at enormous risk from HIV.
So I made a COVID conspiracy theory-themed lorem ipsum generator:
I blame my friend Bryn, who put the idea into my head while he was coming up with fake COVID conspiracy theories (I realise this sentence makes it sound like there are real COVID
conspiracy theories) on a WhatsApp group we’re both in:
This is about the minimum level of encouragement I need to do just about anything in tech.
It’s implemented using perchance, a platform for creating random text generators that I’ve been
playing with – sometimes with the kids – lately. It’s really easy to use and provides a kind of instant-satisfaction that I think is important
if you want to inspire the next generation of software engineers. This means, among other things, that you can clone, edit, and mashup my tool:
perhaps you can make it better! Or perhaps you’ll use perchance to write some fiction, or poetry, or something else entirely. But regardless, I’d encourage you to have a play.
Mostly my generator comes up with meaningless gibberish, nonsense, and laughable claims. So it’s marginally more-trustworthy than your typical COVID conspiracy theorist.
I’ve been having a tough time these last few months. Thanks to COVID, I’m sure I’m not alone in that.
Times are strange, and even when you get a handle on how they’re strange they can still affect you: lockdown stress can quickly magnify anything else you’re already going
through.
We’ve all come up with our own coping strategies; here’s part of mine.
Only people who are highly-allergic to pine needles normally look like this when they’re shopping for a Christmas tree.
These last few months have occasionally seen me as emotionally low as… well, a particularly tough spell a decade ago. But this time around I’ve
benefited from the self-awareness and experience to put some solid self-care into practice!
By way partly of self-accountability and partly of sharing what works for me, let me tell you about the silly mnemonic that reminds me what I need to keep track of as part of each day:
GEMSAW! (With thanks to Amy Blankson for, among other things, the idea of this kind of acronym.)
Because it’s me, I’ve cited a few relevant academic sources for you in my summary, below:
Gratitude
Taking the time to stop and acknowledge the good things in your life, however small, is associated with lower stress levels (Taylor, Lyubomirsky & Stein, 2017) to a degree that can’t just be explained by the placebo effect (Cregg
& Cheavens, 2020).
Frankly, the placebo effect would be fine, but it’s nice to have my practice of trying to intentionally recognise something good in each day validated by the science too!
Exercise
I don’t even need a citation; I’m sure everybody knows that aerobic exercise is associated with reduced risk and severity of depression: the biggest problem comes from the
fact that it’s an exceptionally hard thing to motivate yourself to do if you’re already struggling mentally!
But it turns out you don’t need much to start to see the benefits (Josefsson, Lindwall & Archer,
2014): I try to do enough to elevate my heart rate each day, but that’s usually nothing more than elevating my desk to standing height, putting some headphones on, and dancing
while I work!
Warming up. Things only get nuts when the bass drops, but I’ll spare you having to watch that.
Meditation/Mindfulness
Understandably a bit fuzzier, and tainted by being a “hip” concept. A short meditation break or mindfulness exercise might be verifiably therapeutic, but more
(non-terrible) studies are needed (Vonderlin, Biermann, Bohus & Lyssenko 2020). For me, a 2-5 minute
meditation break punctuates a day and feels like it contributes towards the goal of staying-sane-in-challenging-times, so it makes it into my wellbeing plan.
Maybe it’s doing nothing. But I’m not losing much time over it so I’m not worried.
Sunlight
During my 20s I gradually began to suffer more and more from “winter blues”. Nobody’s managed to make an argument for the underlying cause of seasonal affective disorder that
hasn’t been equally-well debunked by some other study. Small-scale studies often justify light therapy (e.g. Lam, Levitan & Morehouse 2006) but it’s possibly
no-more-effective than a placebo at scale (SBU 2007).
Since my early 30s, I’ve always felt better for getting myself 30 minutes of lightbox time on winter mornings (I use one of these bad boys). I admit it’s possible that the benefits are just the result of tricking my brain into waking-up more promptly and therefore feeling like I’m being more-productive with my
waking hours! But either way, getting some sunlight – whether natural or artificial – makes me feel better, so it makes it onto my daily self-care checklist.
10 minutes of overhead, unoccluded sunlight is the minimum therapeutic dose. That translates to about 30 minutes of winter sun at my latitude, or a 10,000 lux full-spectrum sunlamp.
Acts of kindness
It’s probably not surprising that a person’s overall happiness correlates with their propensity for kindness (Lyubomirsky, King & Diener 2005). But what’s more interesting is that the causal link can be “gamed”. That is: a
deliberate effort to engage in acts of kindness results in increased happiness (Buchanan & Bardi
2010)!
Beneficial acts of kindness can be as little as taking the time to acknowledge somebody’s contribution or compliment somebody’s efforts. The amount of effort it takes is far
less-important for happiness than the novelty of the experience, so the type of kindness you show needs to be mixed-up a bit to get the best out of it. But demonstrating kindness
helps to make the world a better place for other humans, so it pays off even if you’re coming from a fully utilitarian perspective.
Writing
I write a lot anyway, often right here, and that’s very-definitely for my own benefit first and foremost. But off the back of
some valuable “writing therapy” (Baikie
& Wilhelm 2005) I undertook earlier this year, I’ve been continuing with the simpler, lighter approach of trying to no more than three sentences about something that’s had an
impact on me that day.
As an approach, it doesn’t help everybody (Zachariae 2015), but writing a little about your day – not even
about how you feel about it, just the facts will do (Koschwanez, Robinson, Beban, MacCormick, Hill, Windsor, Booth, Jüllig &
Broadbent 2017; fuck me that’s a lot of co-authors) – helps to keep you content, and I’m loving it.
Despite the catchy acronym (Do I need to come up with a GEMSAW logo?
I’m pretty sure real gemcutting is actually more of a grinding process…) and stack of references, I’m not actually writing a self-help book; it just sounds like I am.
I don’t claim to be an authority on anything beyond my own head, and I’m not very confident on that subject! I just wanted to share with you something that’s been working
pretty well at keeping me sane for the last month or two, just in case it’s of any use to you. These are challenging times; do what you need to find the happiness you can, and
hang in there.
This weekend I announced and then hosted Homa Night II, an effort to use
technology to help bridge the chasms that’ve formed between my diaspora of friends, mostly as a result of COVID. To a lesser extent
we’ve been made to feel distant from one another for a while as a result of our very diverse locations and lifestyles, but the resulting isolation was certainly compounded by lockdowns
and quarantines.
Long gone are the days when I could put up a blog post to say “Troma Night tonight?” and expect half a dozen friends to turn up at my house.
Back in the day we used to have a regular weekly film night called Troma Night, named after the studio
who dominated our early events and whose… genre… influenced many of our choices thereafter. We had over 300 such film
nights, by my count, before I eventually left our shared hometown of Aberystwyth ten years ago. I wasn’t the last one of the Troma Night
regulars to leave town, but more left before me than after.
Observant readers will spot a previous effort I made this year at hosting a party online.
Earlier this year I hosted Sour Grapes, a murder mystery party (an irregular highlight of our Aberystwyth social calendar,
with thanks to Ruth) run entirely online using a mixture of video chat and “second screen”
technologies. In some ways that could be seen as the predecessor to Homa Night, although I’d come up with most of the underlying technology to make Homa Night possible on a
whim much earlier in the year!
The idea spun out of a few conversations on WhatsApp but the final name – Homa Night – wasn’t agreed until early in November.
How best to make such a thing happen? When I first started thinking about it, during the first of the UK’s lockdowns, I considered a few options:
Streaming video over a telemeeting service (Zoom, Google Meet, etc.)
Very simple to set up, but the quality – as anybody who’s tried this before will attest – is appalling. Being optimised for speech rather than music and sound effects gives the audio a flat, scratchy sound; video compression artefacts that are tolerable when you’re chatting to your boss are really annoying when they stop you reading a crucial subtitle; audio and video often get desynchronised in a way that’s frankly infuriating; and everybody’s download speed is limited by the upload speed of the host, among other issues. The major benefit of these platforms – full-duplex audio – is destroyed by feedback, so everybody needs to stay muted while watching anyway. No thanks!
Teleparty or a similar tool Teleparty (formerly Netflix Party, but it now supports more services) is a pretty clever way to get almost exactly what I want:
synchronised video streaming plus chat alongside. But it only works on Chrome (and some related browsers) and doesn’t work on tablets, web-enabled TVs, etc., which would exclude some
of my friends. Everybody requires an account on the service you’re streaming from, potentially further limiting usability, and that also means you’re strictly limited to the media
available on those platforms (and further limited again if your party spans multiple geographic distribution regions for that service). There are definitely things I can learn from
Teleparty, but it’s not the right tool for Homa Night.
“Press play… now!”
The relatively low-tech solution might have been to distribute video files in advance, have people download them, and get everybody to press “play” at the same time! That’s at least
slightly less-convenient because people can’t just “turn up”, they have to plan their attendance and set up in advance, but it would certainly have worked and I seriously
considered it. There are other downsides, though: if anybody has a technical issue and needs to e.g. restart their player then they’re basically doomed in any attempt to get back
in-sync again. We can do better…
A custom-made synchronised streaming service…?
A custom solution that leveraged existing infrastructure for the “hard bits” proved to be the right answer.
So obviously I ended up implementing my own streaming service. It wasn’t even that hard. In case you want to try your own, here’s how I did it:
Media preparation
First, I used Adobe Premiere to create a video file containing both of the night’s films, bookended and separated by “filler” content to provide an introduction/lobby, an intermission,
and a closing “you should have stopped watching by now” message. I made sure that the “intro” was a nice round duration (90s) and suitable for looping because I planned to hold people
there until we were all ready to start the film. Thanks to Boris & Oliver for the background
music!
Honestly, the intermission was just an excuse to keep my chroma key gear out following its most-recent use.
Next, I ran the output through Handbrake to produce “web optimized” versions in 1080p and 720p output sizes. “Web optimized” in this case means that
metadata gets added to the start of the file to allow it to start playing without downloading the entire file (streaming) and to allow the calculation of what-part-of-the-file
corresponds to what-part-of-the-timeline: the latter, when coupled with a suitable webserver, allows browsers to “skip” to any point in the video without having to watch the intervening
part. Naturally I’m encoding with H.264 for the widest possible compatibility.
Even using my multi-GPU computer for the transcoding I had time to get up and walk around a bit.
Real-Time Synchronisation
To keep everybody’s viewing experience in-sync, I set up a Firebase account for the application: Firebase provides an easy-to-use Websockets
platform with built-in data synchronisation. Ignoring the authentication and chat features, there wasn’t much
shared here: just the currentTime of the video in seconds, whether or not introMode was engaged (i.e. everybody should loop the first 90 seconds, for now), and
whether or not the video was paused:
Firebase makes schemaless real-time databases pretty easy.
To reduce development effort, I didn’t bother implementing an administrative front-end; once I’d connected, I just manually went into the Firebase database and flagged “my” computer as an administrator, then ran a little Javascript in my browser’s debugger to tell it to start pushing my video’s currentTime to the server every few seconds. Anything else I needed to edit, I edited directly from the Firebase interface.
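That debugger snippet is long gone, so here’s a minimal sketch of the idea instead – assuming Firebase’s “namespaced” Realtime Database JavaScript SDK, and with the “state” path, field names, and interval all made up for illustration:
// Sketch only: assumes firebase.initializeApp(...) has already been called.
const stateRef = firebase.database().ref("state");

setInterval(() => {
  stateRef.update({
    currentTime: video.currentTime, // seconds into the combined video file
    paused:      video.paused,
    introMode:   introMode          // true while everybody loops the intro
  });
}, 3000); // "every few seconds"; the exact interval is a guess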
Other web clients had Javascript to instruct them to monitor these variables from the Firebase database and, if they were desynchronised by more than 5 seconds, “jump” to the correct
point in the video file. The hard part of the code… wasn’t really that hard:
// Rewind if we're past the end of the intro loop
function introModeLoopCheck() {
  if (!introMode) return;
  if (video.currentTime > introDuration) video.currentTime = 0;
}

function fixPlayStatus() {
  // Handle "intro loop" mode
  if (remotelyControlled && introMode) {
    if (video.paused) video.play(); // always play
    introModeLoopCheck();
    return; // don't look at the rest
  }

  // Fix current time
  const desync = Math.abs(lastCurrentTime - video.currentTime);
  if (
    (video.paused && desync > DESYNC_TOLERANCE_WHEN_PAUSED) ||
    (!video.paused && desync > DESYNC_TOLERANCE_WHEN_PLAYING)
  ) {
    video.currentTime = lastCurrentTime;
  }

  // Fix play status
  if (remotelyControlled) {
    if (lastPaused && !video.paused) {
      video.pause();
    } else if (!lastPaused && video.paused) {
      video.play();
    }
  }

  // Show/hide paused notification
  updatePausedNotification();
}
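For completeness, a sketch of the other half – again assuming the namespaced Firebase SDK and the same hypothetical path and variable names as above – showing how a client might keep those shared values fresh and re-check sync whenever they change:
// Sketch only: subscribe to the shared state and re-run the fix-up on each update.
firebase.database().ref("state").on("value", (snapshot) => {
  const state = snapshot.val();
  if (!state) return;
  lastCurrentTime = state.currentTime;
  lastPaused      = state.paused;
  introMode       = state.introMode;
  fixPlayStatus(); // defined above
});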
Web front-end
Finally, there needed to be a web page everybody could go to to get access to this. As I was hosting the video on S3+CloudFront anyway, I put the HTML/CSS/JS there too.
I decided to carry the background theme of the video through to the web interface too.
I tested in Firefox, Edge, Chrome, and Safari on desktop, and (slightly less) on Firefox, Chrome and Safari on mobile. There were a few quirks to work around, mostly to do with browsers
not letting videos make sound until the page has been interacted with after the video element has been rendered, which I carefully worked-around by putting a popup “over” the
video to “enable sync”, but mostly it “just worked”.
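The workaround itself amounts to very little code; a simplified sketch (the element ID is invented for illustration) looks like this:
// Autoplay policies demand a user gesture before unmuted playback can begin,
// so the "enable sync" overlay's click doubles as that gesture.
const overlay = document.getElementById("enable-sync");
overlay.addEventListener("click", () => {
  video.muted = false;
  video.play().catch(console.error); // play() returns a promise that can reject
  overlay.remove();
});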
Delivery
On the night I shared the web address and we kicked off! There were a few hiccups as some people’s browsers got disconnected early on and tried to start playing the film before it was time, and one of these, even once fixed, ran about a minute behind the others, leading to minor spoilers leaking via the rest of us riffing about them! But on the whole, it worked. I’ve
had lots of useful feedback to improve on it for the next version, and I might even try to tidy up my code a bit and open-source the results if this kind of thing might be useful to
anybody else.
I’ve been working as part of the team on the new application framework called the Endpoint Encabulator and wanted to share with you what I think makes our project so
exciting: I promise it’ll make for two minutes of your time you won’t soon forget!
Naturally, this project wouldn’t have been possible without the pioneering work that preceded it by John Hellins Quick, Bud Haggart, and others. Nothing’s invented in a vacuum. However,
my fellow developers and I think that our work is the first viable encabulator implementation to provide inverse reactive data binding suitable for deployment in front of a
blockchain-driven backend cache. I’m not saying that all digital content will one day be delivered through Endpoint Encabulator, but… well; maybe it will.
If the technical aspects go over your head, pass it on to a geeky friend who might be able to make use of my work. Sharing is caring!