Normally this kind of thing would go into the ballooning dump of “things I’ve enjoyed on the Internet” that is my reposts archive. But sometimes something is
so perfect that you have to try to help it see the widest audience it can, right? And today, that thing is: Mackerelmedia
Fish.
Historical fact: escaped fish was one of the primary reasons for websites failing in 1996.
What is Mackerelmedia Fish? I’ve had a thorough and pretty complete experience of it, now, and I’m still not sure. It’s one or more (or none)
of these, for sure, maybe:
A point-and-click, text-based, or hypertext adventure?
A statement about the fragility of proprietary technologies on the Internet?
An ARG set in a parallel universe in which the 1990s never ended?
A series of surrealist art pieces connected by a loose narrative?
Rock Paper Shotgun’s article about it opens with “I
don’t know where to begin with this—literally, figuratively, existentially?” That sounds about right.
This isn’t the reward for “winning” the “game”. But I was proud of it anyway.
What I can tell you with confidence is what playing feels like. And what it feels like is the moment when you’ve gotten bored waiting for page 20 of Argon Zark to finish appearing, so you decide to reread your already-downloaded copy of the 1997 a.r.k best-of book, and for a moment you think to yourself: “Whoah; this must be what living in the future
feels like!”
Because back then you didn’t yet have any concept that “living in the future” would involve scavenging for toilet paper while complaining that you can’t stream your favourite shows in 4K on your pocket-sized
supercomputer until the weekend.
I was always more of a Bouncing Blocks than a Hamster Dance guy, anyway.
Mackerelmedia Fish is a mess of half-baked puns, retro graphics, outdated browsing paradigms and broken links. And that’s just part of what makes it great.
It’s also “a short story that’s about the loss of digital history”, its creator Nathalie Lawhead
says. If that was her goal, I think she managed it admirably.
Everything about this, right down to the server signature (Artichoke), is perfect.
If I wasn’t already in love with the game, I would have been when I got to the bit where you navigate through the directory indexes of a series of deepening folders,
choose-your-own-adventure style. Nathalie writes, of it:
One thing that I think is also unique about it is using an open directory as a choose your own adventure. The directories are branching. You explore them, and there’s text at the
bottom (an htaccess header) that describes the folder you’re in, treating each directory as a landscape. You interact with the files that are in each of these folders, and uncover the
story that way.
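For anybody who hasn’t poked at Apache’s directory indexes before: the effect she describes needs surprisingly little configuration. Here’s a rough, hypothetical sketch of an .htaccess for one of those “landscape” directories (assuming Apache’s mod_autoindex; certainly not Nathalie’s actual setup):

# Hypothetical .htaccess: HEADER.html is shown above the file listing,
# README.html below it – the text describing the “room” you’re standing in.
Options +Indexes
IndexOptions FancyIndexing
HeaderName HEADER.html
ReadmeName README.html
IndexIgnore HEADER.html README.html .htaccess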
Back in the noughties I experimented with making choose-your-own-adventure games in exactly this way, exploring different media by which this kind of
branching-choice game could be presented. I envisaged a project in which I’d showcase the same (or a set of related) stories through different approaches. One was “print” (or at least
“printable”): I came up with a Twee1-to-PDF
converter to make printable gamebooks. A second was Web hypertext. A third – and this is the one which was most-similar to what Nathalie has now so expertly made real – was
FTP! My thinking was that this would be an adventure game that could be played in a browser or even from the command line on any
(then-contemporary: FTP clients aren’t so commonplace nowadays) computer. And then, like so many of my projects, the half-made
version got put aside “for later” and forgotten about. My solution involved abusing the FTP protocol terribly, but it
worked.
(I also looked into ways to make Gopher-powered hypertext fiction and toyed with the idea of using YouTube
annotations to make an interactive story web [subsequently done amazingly by Wheezy Waiter, though the death of YouTube
annotations in 2017 killed it]. And I’ve still got a prototype I’d like to get back to, someday, of a text-based adventure played entirely through your web browser’s debug
console…! But time is not my friend… Maybe I ought to collaborate with somebody else to keep me on-course.)
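If you’re wondering what that last idea even looks like: a browser-console adventure needs barely any code. Here’s a hypothetical sketch – far simpler than my actual half-finished prototype – in which the player “plays” by typing function calls into the console:

// Hypothetical sketch: a tiny adventure driven from the browser's debug console.
const rooms = {
  start:  { text: "You wake at a deserted bus stop.", exits: { north: "arcade" } },
  arcade: { text: "Rows of dead CRTs hum faintly.",   exits: { south: "start" } },
};
let here = "start";
window.look = () => console.log(rooms[here].text);
window.go = (direction) => {
  const next = rooms[here].exits[direction];
  if (next) { here = next; window.look(); }
  else { console.log("You can't go that way."); }
};
window.look(); // the player continues with look() and go("north") typed into the console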
My first batch of pet frogs died quite quickly, but these ones did okay.
In any case: Mackerelmedia Fish is fun, weird, nostalgic, inspiring, and surreal, and you should give it a go. You’ll need to be on a Windows
or OS X computer to get everything you can out of it, but there’s nothing to stop you starting out on your mobile, I imagine.
So long as you’re capable of at least 800 × 600 at 256 colours and have 4MB of RAM,
if you know what I mean.
Back in 2011, some folks cross-compiled Doom (the original, not the reboot, obviously) to JavaScript, leveraging the capabilities of the then-relatively-young
<canvas> element and APIs. I was really impressed to see that JavaScript had come so far and that
performance on desktop devices was so slick. Sure, this was an 18-year-old video game, but it was playable in a browser, which was a long way from the environment for which it
was originally developed.
Now Doom 3‘s playable in a browser, and my mind’s blown all over again. This follows almost the same curve – Doom 3’s 16 years old – but it still goes to show that there’s
little limit to the power of client-side browser programming. They’ve done this magic with WebAssembly; while WebAssembly goes
slightly against my ideas about the open-source nature of the Web, I still respect the power it commands to do heavyweight crunching tasks like this one.
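If you’ve never looked at how these ports bootstrap themselves: from the JavaScript side, loading a WebAssembly build generally looks something like the sketch below. It’s generic and illustrative only – the filename and import shims are placeholders, and it certainly isn’t the actual Doom 3 port’s loader:

// Generic sketch of bootstrapping a WebAssembly module from JavaScript.
const imports = {
  env: {
    // A real loader (e.g. Emscripten's generated glue code) supplies the memory,
    // table and syscall shims that the compiled module declares it needs.
  },
};
WebAssembly.instantiateStreaming(fetch("game.wasm"), imports)
  .then(({ instance }) => {
    instance.exports.main(); // hand control to the compiled entry point
  });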
How long until AAA developers start developing with the Web as an additional platform?
I know only a small percentage of you use VR and to everyone else I might as well be telling you how spiffy the handrails are up in this ivory tower, but for what it’s worth,
Boneworks is the first game in a while to make me think VR might be getting somewhere. It’s not there yet. The physics is full of little niggles as you might expect from a game
trying to juggle so much. The major issue with the climbing is only your hands and head can be moved and your in-game legs just flop around getting in the way of things like two
stubborn trails of cum dangling off your mum’s chin, but forget all that.
…
Speaking of VR, Yahtzee’s still playing with it and thinks it’s improving, which is high praise.
So there’s hope yet.
I really need to dig my heavyweight gear out of the attic, but I’m waiting until we (eventually) move house. And I absolutely agree with Yahtzee’s observation about the value of
VR games in which you can sit down, sometimes.
As part of preparing to leave the Bodleian I’ve been revisiting a lot of the documentation I’ve written over the last eight
years. It occurred to me that I’ve never written publicly about how the Bodleian’s digital signage/interactives actually work; there are possible lessons to learn.
The Bodleian‘s digital signage is perhaps more-diverse, both in terms of technology and audience, than that of most organisations. We’ve got
signs in areas that are exclusively reader-facing to help students and academics find what they’re looking for, signs in publicly accessible rooms that advertise and educate, and signs
in gallery spaces upon which we try to present engaging and often-interactive content to support exhibitions.
Getting an extra touchscreen for the office for prototyping/user testing purposes was great, even when it wasn’t showing MLP: FiM.
Throughout those three spheres, we’ve routinely delivered a diversity of content (let’s just ignore the countdown clock, for now…). Traditional
directional signage, advertisements, games, digital exhibitions, interpretation, feedback surveys…
In the vast majority of cases – and this is where the Bodleian’s been unusual (though certainly not unique) among cultural sector institutions – we’ve created
those in-house rather than outsourcing them.
Using off-the-shelf technology also allows the Bodleian to in-house much of their hardware maintenance, as a secondary part of other job roles. Singing into your screwdriver remains
optional, though.
To do this economically – the volume of work on interactive signage is inconsistent throughout the year – we needed to align the skills required with skills used elsewhere in the
organisation. That’s why we use the web as our medium! Collectively, the Bodleian’s Digital Communications team already had at least some experience in programming, web design, graphic
design, research, user testing, copyediting etc.: the essential toolkit for web application development.
Whether you were playing Pong on the video wall at the back or testing your Middle-earth knowledge on the touchscreen at the front… behind the
scenes you were interacting with a web page I wrote.
By shifting our digital signage platform to lean heavily on web technologies, we were able to leverage talented people we already had to produce things that we might otherwise
have had to outsource. This, in turn, meant that more exhibitions and displays get digital enhancement, on a shorter turnaround.
It also means that there’s a tighter integration between exhibition content and content for web and social media: it’s easier for us to re-use content across multiple platforms.
Sometimes we’ve even made our digital interactives, or adapted versions of them, available directly online, allowing our exhibitions to reach people that can’t get to our physical spaces
at all.
Because we’re able to produce our own content on-demand, even our smaller, shorter-duration displays can have hands-on digital interactives associated with them.
On to the technology! We’re using a real mixture of tech: when it’s donated or reclaimed from previous projects (and when the bidding and acquisition processes are, well… as you’d
expect at the University of Oxford), you learn not to say no to freebies. Our fleet includes:
Samsung Android tablets with freestanding kiosk frames. We run the excellent-value Kiosk Browser Lockdown app on
these, which loads on boot and prevents access to anything but a specified website.
Onelan NTBs connected to a mixture of
touch and non-touch screens, wall-mounted or in kiosk frames. We use Onelan’s standard digital signage features as well as – for interactive content – their built-in touch-capable web
browser.
Dell PCs of the standard variety supplied by University IT services, connected to wall-mounted touch screens, running Google Chrome in Kiosk Mode. More on this below.
The browsers’ responsive simulators are invaluable when we’re targeting signage at five (!) different resolutions.
When you’re developing content for a very small number of browsers and a limited set of screen sizes, you quickly learn to throw a lot of “best practice” web development out of the
window. You’ll never come across a text browser or screen reader, so alt-text doesn’t matter. You’ll never have to rescale responsively, so you might as well absolutely-position almost
everything. The devices are all your own, so you never need to ask permission to store cookies. And because you control the platform, you can get away with making configuration tweaks
to e.g. allow autoplaying videos with audio. Coming to digital signage content from a conventional web development background feels incredibly lazy.
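By way of illustration – this particular switch isn’t necessarily part of our setup, which is described below – Chrome will happily autoplay media with sound if it’s launched with something along the lines of:

chrome --autoplay-policy=no-user-gesture-required https://example.com/some/url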
Helping your users see your interactive as “app-like” rather than “web-like” encourages them to feel comfortable engaging with it in ways uncharacteristic of web pages. In our Shakespeare’s Dead interactive, for example, we started the experience in the middle of a long horizontally-scrolling “page”, which might
feel very unusual in a conventional browser.
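That start-in-the-middle trick takes only a couple of lines of JavaScript. A sketch (illustrative only; the selector is made up and isn’t the real Shakespeare’s Dead markup):

// Start a long horizontally-scrolling experience at its midpoint on load.
window.addEventListener("load", () => {
  const stage = document.querySelector(".scrolling-stage"); // hypothetical container element
  stage.scrollLeft = (stage.scrollWidth - stage.clientWidth) / 2;
});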
Using Chrome to run digital signage requires, in the Bodleian’s case, a couple of configuration tweaks and the right command-line switches. We use:
chrome://flags/#overscroll-history-navigation – disabling this prevents users from triggering “back”/”forward” by swiping with two fingers
chrome://flags/#pull-to-refresh – disabling this prevents the user from triggering a “refresh” by scrolling up beyond the top of the page (this only happens on some
kinds of devices)
chrome://flags/#system-keyboard-lock – we don’t use attached keyboards, but if you do, you might want to set this flag so you can use the keyboard.lock()
API to intercept e.g. ALT+F4 so users can’t escape the application (there’s a rough sketch of this just after these notes)
running on startup with e.g. chrome --kiosk --noerrdialogs --allow-file-access-from-files --disable-touch-drag-drop --incognito https://example.com/some/url
Kiosk mode makes the browser run fullscreen and prevents e.g. opening additional tabs, giving an instant “app-like” experience. As we don’t have keyboards attached to our
digital signage, this also prevents visitors from closing Chrome.
Turning off error dialogs reduces the risk that an error will result in an unsightly message to the user.
Enabling “file access from files” allows content hosted at file:// addresses to access content at other file:// addresses, which makes it possible to write “offline” sites
(sometimes useful where we’re serving large videos or on previous occasions when WiFi has been shaky) that can still take advantage of features like the Fetch API.
Unless you need drag-and-drop, it’s simpler to disable it; this prevents a user long-press-and-dragging an image around the screen.
Incognito mode ensures that the browser doesn’t remember what site was showing last time it ran; our computers often end up switched off at the wall at the end of the day, and
without this the browser will offer to load the site it had open last time, when it runs.
We usually host our interactives directly on the web, at “secret” addresses, and this is generally preferable to us as we can more-easily make on-the-fly adjustments to
content (plus it makes it easier to hook up analytic tools).
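Returning to the keyboard.lock() point from the list above: if you do attach a keyboard, the call looks roughly like this. It’s a sketch only – the key list is illustrative, and the lock only takes effect while the page is fullscreen:

// Ask the browser to deliver reserved key combinations (e.g. ALT+F4) to the page.
// requestFullscreen() needs a user gesture, so trigger this from e.g. a first tap.
document.documentElement.requestFullscreen()
  .then(() => navigator.keyboard.lock(["AltLeft", "AltRight", "F4", "Escape"]))
  .then(() => console.log("Reserved key combinations are now delivered to the page."));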
Be sure to test the capabilities of your hardware! Our Onelan NTBs, unlike your desktop PCs, can’t handle multitouch input, which
affects the design of our user interfaces for these devices.
Meanwhile, in the application’s CSS code, we set * { user-select: none; } to prevent the user from highlighting
text by selecting it with their finger. We also make heavy use of absolutely-sized/positioned, overflow: hidden blocks to ensure that scrollbars never appear, and
CSS animations to make content feel dynamic and to draw attention to particular elements.
There’s no substitute for good testing. And there’s no stress-testing quite like letting a 5 year-old loose on your work.
Altogether, this approach gives the Bodleian the capability to produce engaging interactive content at low cost, using the existing skills of their digital and exhibitions teams.
It’s not an approach that would work for every cultural institution: in particular, some of the Bodleian’s sister institutions already
outsource the technical parts of their web work, and so don’t have the expertise in-house to share with a web-powered digital signage solution.
A few minor CSS tweaks to make the buttons finger-friendly, and our Halloween game Shadows Out Of
Time, which I’d already made web-friendly, was touchscreen-ready too. I wonder if they’ll get this one out again, this
Halloween?
But for those museums that can fit into this model – or can adapt to do so in future – using the web to produce interactive digital content and digital signage is a highly
cost-effective way to engage with visitors, even (or especially!) when dealing with short-lived and/or rotating displays.
It’s also been among my favourite parts of my job at the Bod these last 8½ years, and I’m sure I’ll miss it!
But while we’ve learned to distrust user names and text more generally, pictures are different. You can’t synthesize a picture out of nothing, we assume; a picture had to be of
someone. Sure a scammer could appropriate someone else’s picture, but doing so is a risky strategy in a world with google reverse search and so forth. So we tend to trust pictures.
A business profile with a picture obviously belongs to someone. A match on a dating site may turn out to be 10 pounds heavier or 10 years older than when a picture was taken, but if
there’s a picture, the person obviously exists.
No longer. New adversarial machine learning algorithms allow people to rapidly generate synthetic ‘photographs’ of people who have never existed. Already faces of this sort are
being used in espionage.
Computers are good, but your visual processing systems are even better. If you know what to look for, you can spot these fakes at a single glance — at least for the time being. The
hardware and software used to generate them will continue to improve, and it may be only a few years until humans fall behind in the arms race between forgery and detection.
Our aim is to make you aware of the ease with which digital identities can be faked, and to help you spot these fakes at a single glance.
…
I was at a conference last month where research was presented which concluded pretty solidly that the mechanisms used to
make “deepfakes” meant that it was probably impossible to create artificial intelligence that can learn to distinguish between real and fake pictures of humans. Simply put, this is
because the way we make such images is with generative adversarial networks, an AI technique which thrives upon having an effective discriminator component, and any research into differentiating between real and fake images
feeds the capability of the next generation of discriminators!
Instead, then, the best medium-term defence against deepfakes is training humans to be able to identify them, and that’s what this website aims to do. I was pleased that I did
very well on my first attempt (I sort-of knew what to look for already, based on a basic understanding of the underlying technologies) but I was also pleased that I was able to learn to
do better with the aid of the authors’ tips. Nice.
Back in February my friend Katie shared with me an already four-year-old piece of interactive fiction, Bus Station: Unbound, that I’d somehow managed to miss the first time around. In the five months since then I’ve
periodically revisited and played through it and finally gotten around to writing a review:
All of the haunting majesty of its subject, and a must-read-thrice plot
Perhaps it helps to be as intimately familiar with Preston Bus Station – in many ways, the subject of the piece – as the protagonist. This work lovingly and faithfully depicts the
space and the architecture in a way that’s hauntingly familiar to anybody who knows it personally: right down to the shape of the rubberised tiles near the phone booths, the
forbidding shadows of the underpass, and the buildings that can be surveyed from its roof.
But even without such a deep recognition of the space… which, ultimately, soon comes to diverge from reality and take on a different – darker, otherworldly – feel… there’s a magic
to the writing of this story. The reader is teased with just enough backstory to provide a compelling narrative without breaking the first-person illusion. No matter how
many times you play (and I’ve played quite a few!), you’ll be left with a hole of unanswered questions, and you’ll need to be comfortable with that to get the most out of the story,
but that in itself is an important part of the adventure. This is a story of a young person who doesn’t – who can’t – know everything that they need to bring them comfort
in the (literally and figuratively) cold and disquieting world that surrounds them, and it’s a world that’s presented with a touching and tragic beauty.
Through multiple playthroughs – or rewinds, which it took me a while to notice were an option! – you’ll find yourself teased with more and more of the story. There are a few
frankly-unfair moments where an unsatisfactory ending comes with little or no warning, and a handful of places where it feels like your choices are insignificant to the story, but
these are few and far between. Altogether this is among the better pieces of hypertext fiction I’ve enjoyed, and I’d recommend that you give it a try (even if you don’t share the
love-hate relationship with Preston Bus Station that is so common among those who spent much of their youth sitting in it).
It’s no secret that I spent a significant proportion of my youth waiting for or changing buses at (the remarkable) Preston
Bus Station, and that doubtless biases my enjoyment of this game by tingeing it with nostalgia. But I maintain that it’s a well-written piece of hypertext interactive fiction with a
rich, developed world. You can play it starting from here, and you should. It looks like the story’s
accompanying images died somewhere along the way, but you can flick through them all here and get a feel for the shadowy,
brutalist, imposing place.
I can’t get enough of this comic. Despite the fact that it’s got no written dialogue at all I must’ve read it half a dozen times and seen more in it each time. Go read the whole thing…
My nephew, Michael, died on 22 May 2019. He was 15 years old.
He loved his family, tractors, lorries, tanks, spaceships and video games (mostly about tractors, lorries, tanks and spaceships), and confronted every challenge in his short,
difficult life with a resolute will that earned him much love and respect. Online in his favourite game, Elite Dangerous by Frontier Developments, he was known as CMDR Michael
Holyland.
In Michael’s last week of life, thanks to the Elite Dangerous player community, a whole network of new friends sprang up in our darkest hour and made things more bearable with a
magnificent display of empathy, kindness and creativity. I know it was Michael’s wish to celebrate the generosity he was shown, so I’ve written this account of how Frontier and
friends made the intolerable last days of a 15-year-old boy infinitely better.
…
I’m not crying, you’re crying.
A beautiful article which, despite its tragedy, does an excellent job of showcasing how video gaming communities can transcend barriers of distance, age, and ability and bring joy to
the world. I wish that all gaming communities could be this open-minded and caring, and that they could do so more of the time.
You could fit almost the entire history of videogames into the time span covered by the silent film era, yet we consider it a mature medium, rather than one just breaking out of its
infancy. Like silent movies, classic games are often incomplete, damaged, or technically limited, but have a beauty all their own. In this spirit, indie game developer Joe Blair and I
built Metropoloid, a remix of Fritz Lang’s Metropolis which replaces its famously lost score with that of its contemporaries from the early days of games.
I’ve watched Metropolis a number of times over the decades, in a variety of the stages of its recovery, and I
love it. I’ve watched it with a pre-recorded but believed-to-be-faithful soundtrack and I’ve watched it with several live accompaniments. But this is the first time I’ve watched it to
the soundtrack of classic (and contemporary-retro) videogames: the Metroid, Castlevania, Zelda, Mega Man and Final Fantasy
series, Doom, Kirby, F-Zero and more. If you’ve got a couple of hours to spare and a love of classic film and classic videogames, then you’re in
the slim minority that will get the most out of this fabulous labour of love (which, at the time of my writing, has enjoyed only a few hundred views and a mere 26 “thumbs up”: it
certainly deserves a wider audience!).
Google can’t be trusted to maintain the services of theirs that you depend upon (relevant XKCD?). That’s not a phenomenon unique to Google,
of course: it’s perhaps just that they produce so many new and often-experimental services that they inevitably end up discontinuing more of them than the many other providers who’ve killed off silos that people depended upon.
How could things be better? For a start, Google could make a better commitment to open-source and developing standards rather than platforms. But if you don’t think you can trust them
to do that – and you can’t – then the only solution for individuals is to use fewer Google products to break the Google-monoculture. Encourage the competition to weaken their
position, and break free from silos in general where it’s possible to do so.
148+ projects and services dead. But hey, we’re getting Stadia so everything’s okay, right? <sigh>
Human beings leave physical impressions upon the things they love and use just as much as they do upon the lives of people and the planet they live upon. For every action, there’s a
reaction. For every pressure, there’s an effect on mass and volume. And in the impressions left by that combination, particularly if you’re lucky enough to see the sides of a rare,
unrestored vintage Pac-Man cabinet, lies the never before told story of how we really played the game.
Until now, I don’t believe anyone has ever written about it.
…
Interesting exploration of the history of the cabinets housing Pac-Man, observing the ergonomic impact of the controls on the way that people would hold the side of the machine and, in
turn, how that would affect where and how the paint would wear off.
The Bodleian has a specific remit for digital archiving… but sometimes they just like collecting stuff, too, I’m sure.
The team responsible for digital archiving had plans to spend World Digital Preservation Day running a stand in Blackwell Hall for some
time before I got involved. They’d asked my department about using the Heritage Window – the Bodleian’s 15-screen video wall – to show a carousel of slides with relevant content over
the course of the day. Or, they added, half-jokingly, “perhaps we could have Pong up there as it’ll be its 46th birthday?”
Free rein to play about with the Heritage Window while smarter people talk to the public about digital archives? Sure, sign me up.
But I didn’t take it as a joke. I took it as a challenge.
Emulating Pong is pretty easy. Emulating Pong perfectly is pretty hard. Indeed, a lot of the challenge in the preservation of (especially digital) archives in general is in
finding the best possible compromise in situations where perfect preservation is not possible. If these 8″ disks are degrading, is it acceptable to copy them onto a different medium? If this video file is unreadable in
modern devices, is it acceptable to re-encode it in a contemporary format? These are the kinds of questions that digital preservation specialists have to ask themselves all the damn
time.
The JS Gamepad API lets your web browser talk to controller devices.
Emulating Pong in a way that would work on the Heritage Window but be true to the original raised all kinds of complications. (Original) Pong’s aspect ratio doesn’t fit nicely on a 16:9
widescreen, much less on an 80:27 ultrawide. Like most games of its era, its speed is tied to the clock rate of the processor. And of course, it should be controlled using a
“dial”.
Once I realised that there was no way that I could thoroughly replicate the experience of the original game, I decided to take a different tack. Instead, I opted to
reimplement Pong. A reimplementation could stay true to the idea of Pong but serve as a jumping-off point for discussion about how the experience of playing the game
may be superficially “like Pong” but that this still wasn’t an example of digital preservation.
Bip… boop… boop… bip… boop… bip…
Here’s the skinny:
A web page, displayed full-screen, contains both a <canvas> (for the game, sized appropriately for a 3 × 3 section of the video wall) and a
<div> full of “slides” of static content to carousel alongside (filling a 2 × 3 section).
Javascript writes to the canvas, simulates the movement of the ball and paddles, and accepts input from the JS
Gamepad API (which is awesome, by the way; there’s a minimal sketch of it just after this list). If there’s only one player, a (tough! – only three people managed to beat it over the course of the day!) AI plays the other paddle.
A pair of SNES controllers adapted for use as USB
controllers which I happened to own already.
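If you’ve not used the Gamepad API before, the controller-reading part really is delightfully simple. A rough sketch – not the actual game code, and the button/axis indices assume the “standard” gamepad mapping, which a SNES-to-USB adapter may or may not report:

// Poll the first connected gamepad each frame and nudge a paddle up/down.
let paddleY = 0;
function frame() {
  const pad = navigator.getGamepads()[0]; // null until a controller is connected
  if (pad) {
    if (pad.buttons[12]?.pressed) paddleY -= 4; // d-pad up (standard mapping)
    if (pad.buttons[13]?.pressed) paddleY += 4; // d-pad down
    paddleY += (pad.axes[1] || 0) * 4;          // or an analogue stick
  }
  // ...redraw the <canvas> here using the updated paddle position...
  requestAnimationFrame(frame);
}
window.addEventListener("gamepadconnected", () => requestAnimationFrame(frame), { once: true });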
Increasingly, the Bodleian’s spaces seem to be full of screens running Javascript applications I’ve written.
I felt that the day, event, and game were a success. A few dozen people played Pong and explored the other technology on display. Some got nostalgic about punch tape, huge floppy disks,
and even mechanical calculators. Many more talked to the digital archives folks and me about the challenges and importance of digital archiving. And a good time was had by all.
I’ve open-sourced the entire thing with a super-permissive license so you can deploy it yourself (you know, on your ultrawide
video wall) or adapt it as you see fit. Or if you’d just like to see it for yourself on your own computer, you can (but unless
you’re using a 4K monitor you’ll probably need to use your browser’s mobile/responsive design simulator set to 3200 × 1080 to make it fit your screen). If you don’t have
controllers attached, use W/S to control player 1 and the cursor keys for player 2 in a 2-player game.