Ordnance Survey, the national mapping agency for Great Britain, is set to publish revised maps for the whole of the British Isles in the wake of social distancing measures. The new 1:50,000 scale maps are simply a revision of the older 1:25,000 scale maps, but all geographical features have been moved further apart.
…
Gareth’s providing a daily briefing including all the important things that the government wants you to know about the coronavirus crisis… and a few things that they didn’t think to
tell you, but perhaps they should’ve. Significantly more light-hearted than wherever you’re getting your news from right now.
This is the best diplomatic report I’ve read in a long time. A real gem. Recounts the story of the horses
gifted by Saparmurat Niyazov (Turkmenbashi) to the British government. A fantastic snapshot of life in the FSU in the early 1990s. Posting the whole thing – enjoy the read.
The performance tradeoff isn’t about where the bottleneck is. It’s about who has to carry the burden. It’s one thing for a developer to push the burden onto a server they control. It’s another thing entirely to expect visitors to carry that load when connectivity and device performance aren’t constants.
…
This is another great take on the kind of thing I was talking about the other day: some developers who favour heavy frameworks (e.g. React) argue for the performance benefits, both in development velocity and TTFB (time-to-first-byte). But TTFB alone is not a valid metric of a user’s perception of an application’s performance: if you’re sending a fast payload that then requires extensive execution and/or additional calls to the server-side, it stands to reason that you’re not solving a performance bottleneck, you’re just moving it.
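If you want to see that gap for yourself, the browser’s built-in Performance API will show it. A minimal sketch, pasteable into any page’s developer console; the variable names are mine, not from any framework:

```javascript
// Compare "first byte arrived" with "page actually usable" using the
// Navigation Timing API (no libraries required).
const [nav] = performance.getEntriesByType('navigation');

const ttfb = nav.responseStart - nav.startTime;          // time-to-first-byte
const interactive = nav.domInteractive - nav.startTime;  // DOM parsed, blocking scripts run

console.log(`TTFB: ${ttfb.toFixed(0)} ms`);
console.log(`DOM interactive: ${interactive.toFixed(0)} ms`);
console.log(`Wait after the "fast" TTFB: ${(interactive - ttfb).toFixed(0)} ms`);
```

On a framework-heavy page the first number can look superb while the second balloons: that’s the moved bottleneck in action.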
I, for one, generally disfavour solutions that move a Web application’s bottleneck to the user’s device (unless there are other compelling reasons that it should be there, for example
as part of an Offline First implementation, and even then it should be done with care). Moving the burden of the bottleneck to the user’s device disadvantages those on slower or older
devices and limits your ability to scale performance improvements through carefully-engineered precaching, e.g. static compilation. It also produces a tendency towards a thick-client
solution, which is sometimes exactly what you need but more-often just means a larger initial payload and more power consumption on the (probably mobile) user’s device.
Next time you improve performance, ask yourself where the time saved has gone. Some performance gains are genuine; others are just moving the problem around.
New rules for old games! The Board Game Remix Kit is a collection of tips, tweaks, reimaginings and completely new games that you can play with the board and
pieces from games you might already own: Monopoly, Cluedo, Scrabble and Trivial Pursuit.
The 26 rule tweaks and new games include:
Full Houses: poker, but played with Monopoly properties
Citygrid: a single-player city-building game
Use Your Words: Scrabble with storytelling
Them’s Fightin’ Words: a game of making anagrams, and arguing about which one would win in a fight
Hunt the Lead Piping: hiding and searching for the Cluedo pieces in your actual house
Guess Who Done It: a series of yes/no questions to identify the murderer (contributed by Meg Pickard)
Zombie Mansion: use the lead piping to defend the Cluedo mansion
Judy Garland on the Moon with a Bassoon: a drawing game that uses the answers to trivia questions as prompts
The Board Game Remix Kit was originally released in 2010 by the company Hide&Seek (which closed in 2014). We are releasing it here as a pdf (for
phones/computers) and an epub (for ereaders) under a CC-BY-SA license.
If you enjoy the Kit and can afford it, please consider a donation to the World Health Organisation’s COVID-19
Response Fund.
Confined to your house? What a great opportunity to play board games with your fellow confinees.
Only got old family classics like Monopoly, Cluedo and Scrabble? Here’s a guide to mixing them up into new, fun, and highly-playable alternatives. Monopoly certainly needs it.
Normally this kind of thing would go into the ballooning dump of “things I’ve enjoyed on the Internet” that is my reposts archive. But sometimes something is
so perfect that you have to try to help it reach the widest audience it can, right? And today, that thing is: Mackerelmedia
Fish.
What is Mackerelmedia Fish? I’ve had a thorough and pretty complete experience of it, now, and I’m still not sure. It’s one or more (or none)
of these, for sure, maybe:
A point-and-click, text-based, or hypertext adventure?
A statement about the fragility of proprietary technologies on the Internet?
An ARG set in a parallel universe in which the 1990s never ended?
A series of surrealist art pieces connected by a loose narrative?
Rock Paper Shotgun’s article about it opens with “I
don’t know where to begin with this—literally, figuratively, existentially?” That sounds about right.
What I can tell you with confidence is what playing feels like. And what it feels like is the moment when you’ve gotten bored waiting for page 20 of Argon Zark to finish appearing, so you decide to reread your already-downloaded copy of the 1997 a.r.k bestof book, and for a moment you think to yourself: “Whoah; this must be what living in the future
feels like!”
Because back then you didn’t yet have any concept that “living in the future” would involve scavenging for toilet paper while complaining that you can’t stream your favourite shows in 4K on your pocket-sized
supercomputer until the weekend.
Mackerelmedia Fish is a mess of half-baked puns, retro graphics, outdated browsing paradigms and broken links. And that’s just part of what makes it great.
It’s also “a short story that’s about the loss of digital history”, its creator Nathalie Lawhead
says. If that was her goal, I think she managed it admirably.
If I wasn’t already in love with the game, I would have been when I got to the bit where you navigate through the directory indexes of a series of deepening folders,
choose-your-own-adventure style. Nathalie writes, of it:
One thing that I think is also unique about it is using an open directory as a choose your own adventure. The directories are branching. You explore them, and there’s text at the
bottom (an htaccess header) that describes the folder you’re in, treating each directory as a landscape. You interact with the files that are in each of these folders, and uncover the
story that way.
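If you fancy trying Nathalie’s trick yourself, Apache’s mod_autoindex can do it out of the box. A minimal sketch, assuming a server that permits .htaccess overrides; the filenames are just the conventional defaults:

```apache
# .htaccess — turn a bare directory listing into a "room" in a story.
Options +Indexes

# Contents of this file appear above the file listing (the room description)…
HeaderName HEADER.html
# …and this one appears below it (hints about where the story branches next).
ReadmeName README.html

# Keep the scaffolding out of the listing itself.
IndexIgnore HEADER.html README.html
```

Each subdirectory gets its own HEADER and README, and the listing’s folder links become your branching choices.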
Back in the noughties I experimented with making choose-your-own-adventure games in exactly this way, exploring the different media by which this kind of
branching-choice game could be presented. I envisaged a project in which I’d showcase the same (or a set of related) stories through different approaches. One was “print” (or at least
“printable”): I came up with a Twee1-to-PDF
converter to make “printable” gamebooks. A second was Web hypertext. A third – and this is the one which was most-similar to what Nathalie has now so expertly made real – was
FTP! My thinking was that this would be an adventure game that could be played in a browser or even from the command line on any
(then-contemporary: FTP clients aren’t so commonplace nowadays) computer. My solution involved abusing the FTP protocol terribly, but it worked. And then, like so many of my projects, the half-made version got put aside “for later” and forgotten about.
(I also looked into ways to make Gopher-powered hypertext fiction and toyed with the idea of using YouTube
annotations to make an interactive story web [subsequently done amazingly by Wheezy Waiter, though the removal of YouTube
annotations in 2017 killed it]. And I’ve still got a prototype I’d like to get back to, someday, of a text-based adventure played entirely through your web browser’s debug
console…! But time is not my friend… Maybe I ought to collaborate with somebody else to keep me on-course.)
In any case: Mackerelmedia Fish is fun, weird, nostalgic, inspiring, and surreal, and you should give it a go. You’ll need to be on a Windows
or OS X computer to get everything you can out of it, but there’s nothing to stop you starting out on your mobile, I imagine.
So long as you’re capable of at least 800 × 600 at 256 colours and have 4MB of RAM,
if you know what I mean.
How many more developers have to point out how bloated we’ve made the web with our frameworks, tracking scripts, and other third-party solutions before we take things seriously? We’ve been banging on about this for ages. It’s like a plague!
And as Bridget goes on to say, it’s especially important at this unusual time, with many people confined to home and using the Internet to try to get accurate and up-to-date information and resources (and sometimes overwhelming major websites), that performance, accessibility, and availability matter more than ever.
There’ll be many lessons to learn from the coronavirus crisis. But these lessons aren’t just related to healthcare, or work, or supermarket logistics. No field will be left untouched.
And one of the things that I hope my field learns, when this is over, is the importance of things working even when things get tough. Test the sad paths, test your site under
heavy load, test your site with its CDNs simulated “down”, and develop accordingly. Because this isn’t the worst kind of
crisis we could face, and we have to be ready for the worse ones that could come.
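If you’ve never simulated a CDN outage, one low-tech way is to point the CDN’s hostname somewhere unroutable on the machine you’re testing from and see what still works. A sketch, with a hypothetical hostname standing in for whatever CDN your site actually uses:

```
# /etc/hosts — temporarily "take down" a CDN for local testing:
0.0.0.0  cdn.example.com
```

Reload with your cache disabled and watch what breaks; if the answer is “everything”, that’s your cue to add fallbacks.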
I started at Automattic on November 20, 2019, and it’s an incredible place to work. I’m constantly impressed by my coworkers’ kindness, intelligence, and compassion. If you’re
looking for a rewarding remote job that you can work from anywhere in the world, definitely apply.
I’m still overjoyed and amazed I was hired. While going through the hiring process, I devoured the blog posts from people describing their journeys. Here’s my contribution to the
catalog. I hope it helps someone.
…
I’ve written about my own experience of Automattic’s hiring process and how awesome it is, but if you’re looking for a more-concise summary of
what to expect from applying to and interviewing for a position, this is pretty great.
The BEST THING I’ve seen on Twitter this week (month?) is Justin Alexander’s thread documenting “The Dungeon of Drezzar,” Peter Heeringa and Troy Wilhelmson’s spectacular multilevel
dungeon built into a series of dresser drawers.
…
Well now I feel like my DM isn’t trying hard enough. Move aside, Roll20!
Going to that page results in about 14 MB of data being transmitted from their server to your device (which you’ll pay for if you’re on a metered connection). For comparison, reading my recent post about pronouns results in about 356 kB of data. In other words, their page consumes forty times the bandwidth, despite the fact that my page has about four times the word count. The page you’re reading right now, thanks to its images, weighs in at about 650 kB: you could still download it more than twenty times while you were waiting for theirs.
Worse still, the most-heavyweight of the content they deliver is stuff that’s arguably strictly optional and doesn’t add to the message:
Eight different font files are served from three different domains (the fonts alone consume about 140 kB) – seven more are queued but not used.
Among the biggest JavaScript files they serve is that of Hotjar analytics: I understand the importance of measuring your impact, but making
your visitors – and the planet – pay for it is a little ironic.
The biggest JavaScript file seems to be for Mapbox, which as far as I can see is never actually used: that map on the page is a static image which, incidentally, I was able to reduce from 0.5 MB to 0.2 MB just by running it through a free online image compressor.
And because the site sets virtually no caching headers, even if you’ve visited the website before, you’re likely to have to download the whole thing again. Every single time.
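For what it’s worth, a single response header on their static assets (fonts, images, CSS) would fix that last problem. A sketch of the sort of thing I mean, not their actual configuration; the immutable directive works best when a file’s name changes whenever its content does:

```
Cache-Control: public, max-age=31536000, immutable
```

That one line tells every returning visitor’s browser that it already has the file and needn’t even ask again for a year.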
It’s not just about bandwidth: all of those fonts, that JavaScript, and their 60 kB of CSS (this page sent you 13 kB) have to be parsed and interpreted by your device. If you’re on a mobile device or a laptop, that means you’re burning through lithium (a non-renewable resource whose extraction and disposal are highly polluting), and regardless of the device you’re using, you’re using more electricity to visit their site than you need to. Coding antipatterns like document.write() and active event listeners that execute every time you scroll the page keep your processor working hard, turning electricity into waste heat. It took me over 12 seconds on a high-end smartphone and a good 4G connection to load their page to the point of usability. That’s 12 seconds of a bright screen, a processor running full tilt, a data connection working its hardest, and a battery ticking away. And I assume I’m not the only person visiting the website today.
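The scroll-listener problem, at least, has a well-known mitigation. A sketch of the difference, with updateThings() as a hypothetical stand-in for whatever work a page does on scroll:

```javascript
// Antipattern: runs on every single scroll event, keeping the CPU busy.
window.addEventListener('scroll', () => updateThings());

// Kinder: a passive listener (never blocks scrolling) that coalesces the
// work into at most one run per animation frame.
let scheduled = false;
window.addEventListener('scroll', () => {
  if (scheduled) return;
  scheduled = true;
  requestAnimationFrame(() => {
    updateThings(); // hypothetical stand-in for the real per-scroll work
    scheduled = false;
  });
}, { passive: true });
```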
This isn’t really about this particular website, of course (and I certainly don’t want to discourage anybody from the important cause of saving the planet!). It’s about the
bigger picture: there’s a widespread and long-standing trend in web development towards bigger, heavier, more power-hungry websites, built on top of heavyweight frameworks that push the
hard work onto the user’s device and which favour developer happiness over user experience. This is pretty terrible: it makes the Web slow, and brittle, and it increases the digital
divide as people on slower connections and older devices get left behind.
I walked the dog, bought myself a wax jacket from a local garden centre – so I could feel more ‘Country’ – and in the late afternoon my boss called me to say… you’re fired.
…
So begins Robin‘s latest blog (you may remember I’ve shared, talked about, and contributed to the success of his previous blogging adventure, 52 Reflect). This time around, it sort-of reads like a contemporaneously-written “what I did on my holidays” report, except that it’s pretty-much about the
opposite of a holiday: it chronicles Robin’s adventures in the North of England during these strange times.
I’ve no doubt that this represents the start of another riotous series of posts, and it’s perhaps exactly what you need to lift your spirits in these trying times. A second chapter is already online.
I’ve seen a fair share of tutorial links floating around in newsletters and Twitter and the like recently. They all promise the same thing, namely how to use React to create a
résumé.
I mean, I get it. It’s important to have something to build towards when learning a new skill, especially with development.
At first blush a résumé seems like a good thing to build towards: they’re relatively small in terms of complexity and can probably use content that already exists on your LinkedIn
profile. If you’re looking for a job, it’s also a handy way to double-dip on a skill that is in high demand.
I checked out a few of these tutorials, and after noticing some patterns, I’d like to mention a few things you could do to your résumé instead. I’m not going to link to the ones I
tested because I don’t want to give bad advice more exposure than it is already getting.
…
I can’t even begin to conceive of the kind of mind that, when faced with the question of how to put their résumé/CV online, starts by installing a JavaScript framework. My CV’s online (and hey, it got me my
current job so that’s awesome) and I think it’s perfectly fabulous. Simple, human-readable, semantic HTML with microformats support. Perfectly readable on anything from lynx upwards and
you’d probably get by in telnet. Total size including all images, fonts, style and script is under 140 kB, and can all be inlined with a quick command so I can have a single-file version that looks just as great (I use this version to email to people, but I’m thinking I ought to just inline everything, all the time). Under 1 kB of my payload is JavaScript, and it’s all
progressive enhancement: using an IntersectionObserver (which I’ve written about before) to highlight the current “section” of the document in
the menu. Print CSS so it looks right when you put it onto dead trees. Etc. etc.
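That IntersectionObserver enhancement is only a handful of lines. A simplified sketch of the idea; the selectors and class name are illustrative rather than lifted from my actual CV:

```javascript
// Highlight the menu link for whichever section is currently in view.
// Progressive enhancement: without IntersectionObserver support the menu
// simply stays unhighlighted and the page still works fine.
if ('IntersectionObserver' in window) {
  const observer = new IntersectionObserver(entries => {
    for (const entry of entries) {
      const link = document.querySelector(`nav a[href="#${entry.target.id}"]`);
      if (link) link.classList.toggle('current', entry.isIntersecting);
    }
  }, { rootMargin: '-40% 0px -40% 0px' }); // "current" ≈ middle of the viewport

  document.querySelectorAll('main section[id]').forEach(s => observer.observe(s));
}
```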
My entire CV requires a quarter of the bandwidth of just the JavaScript of any of the handful of React-based ones I looked
up. The mind boggles. I tried disabling JavaScript on a few of them (even if you believe “nobody uses the Web without JavaScript” – and you’re wrong – then you have to admit that
sometimes JavaScript fails) and they did horrific things like failing to load images or breaking links, as if <img> and <a> were tags that require you to npm install html@0.9 before they work.
A simpler, faster, more-accessible, more-secure Web is possible. It’s not even particularly hard. It just requires a little thought. Don’t take a sledgehammer to a walnut: the
best developers are the ones who choose the right tool for the job. Your résumé/CV is not a real-time backendless application on a
post-relational-backed microservices architecture, or whatever’s “hip” this week. It’s a page that you want to be as easy as possible for the widest number of people to read. Why make life harder for yourself, and for them?