After “Monty Python’s Flying Circus” ended, Graham Chapman worked with an up-and-coming young writer named Douglas Adams on a new sketch comedy show for the BBC. It was called “Out of
the Trees,” and it bombed. Only one episode was made, and that aired only once, on January 10, 1976.
Once the Beeb gave up on “Out of the Trees,” they did to it what they did to so many other programs of that era: they erased it.
…
Chapman had recorded the show on one of the very earliest home videotape formats… it took two years to build a compatible player.
It’s neither Chapman’s nor Adams’ best work, and you can see how it got canned after only a pilot episode. But it’s not terrible.
The lesson here, though, is about the challenge of archiving non-print media. Anything that needs a device to “play” it, whether it’s as simple as a vinyl record or as complex as a
videogame, is at greater risk of being lost forever. And the faster technology moves on, the more gets left behind. Is a digital dark age looming?
Are we already in it, and we just won’t know until some future date?
Found while walking into High Wycombe to work after dropping my canine caching-companion (pictured) off at the nearby veterinary hospital for an operation. Didn’t need her help with
this easy find, luckily! Shame about all the fly tipping littering this otherwise pleasant path. Greetings from Oxfordshire!
Now I’ve added support for Spartan3 too and, seeing as the implementations shared functionality, I’ve
combined all three – Gemini, Spartan, and Gopher – into a single package: CapsulePress.
CapsulePress is a Gemini/Spartan/Gopher to WordPress bridge. It lets you use WordPress as a CMS for any or all of
those three non-Web protocols in addition to the Web.
For example, that means that this post is available on all of:
It’s also possible to write posts that selectively appear via different media: if I want to put something exclusively on my gemlog, I can, by assigning metadata that
tells WordPress to suppress a post but still expose it to CapsulePress. Neat!
I’ve open-sourced the whole thing under a super-permissive license, so if you want your own WordPress blog to “feed” your Gemlog… now you can. With a few caveats:
It’s hard to use. While not as hacky as the disparate piles of code it replaced, it’s still not the cleanest. To modify it you’ll need a basic comprehension of all
three protocols, plus Ruby, SQL, and sysadmin skills.
It’s super opinionated. It’s very much geared towards my use case. It’s improved by the use of templates, but it’s still probably only suitable for this
site for the time being, until you make changes.
It’s very-much unfinished. I’ve got a growing to-do list, which should
be a good clue that it’s Not Finished. Maybe it never will be. But there’ll be changes yet to come.
Whether or not your WordPress blog makes the jump to Geminispace4, I hope you’ll come take a look at mine at one of the URLs linked above,
and then continue to explore.
If you’re nostalgic for the interpersonal Internet – or just the idea of it, if you’re too young to remember it… you’ll find it there. (That Internet never actually went away,
but it’s harder to find on today’s big Web than it is on lighter protocols.)
It turns out that by default, WordPress replaces emoji in its feeds (and when sending email) with images of those emoji, using the Twemoji set, and with the alt-text set to the original emoji. These images are hosted at https://s.w.org/images/core/emoji/…-based
URLs.
I can see why this functionality was added: what if the feed reader didn’t support Unicode or didn’t have a font capable of showing the appropriate emoji?
But I can also see reasons why it might not be desirable to everybody. For example:
Downloading an image will always be slower than rendering an emoji.
The code to include an image is always more-verbose than simply including an emoji.
As seen above: a feed reader which imposes a minimum size on embedded images might well render one “wrong”.
It’s marginally more-verbose for screen reader users to say “Image: heart emoji” than just “heart emoji”, I imagine.
Serving a third-party image when a feed item is viewed has potential privacy implications that I try hard to avoid.
Replacing emoji with images is probably unnecessary for modern feed readers anyway.
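If it bothers you too, the hooks make it easy to switch off. Something like the following, dropped into a small plugin or your theme’s functions.php, ought to do it – these are the core filters that perform the emoji-to-image substitution for feeds, comment feeds, and outgoing mail:

// stop WordPress swapping emoji for Twemoji images in feeds and in outgoing email:
remove_filter( 'the_content_feed', 'wp_staticize_emoji' );
remove_filter( 'comment_text_rss', 'wp_staticize_emoji' );
remove_filter( 'wp_mail', 'wp_staticize_emoji_for_email' );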
That’s all there is to it. Now, my feed reader shows my system’s emoji instead of a huge image:
I’m always grateful to discover that a piece of WordPress functionality, whether core or in an extension, makes proper use of hooks so that its functionality can be changed, extended,
or disabled. One of the single best things about the WordPress open-source ecosystem is that you almost never have to edit somebody else’s code (and remember to re-edit it
every time you install an update).
After a break of nine and a half years, webcomic Octopuns is back. I have two thoughts:
That’s awesome. I love Octopuns and I’m glad it’s back. If you want a quick taster – a quick slice, if you will – of its kind of humour, I suggest starting with Pizza.
How did I know that Octopuns was back? My RSS reader told me. RSS remains a magical way to keep an eye on what’s happening on the Internet: it’s like a subscription service that delivers you exactly what
you want, as soon as it’s available.
If I ran a fast food franchise affected by this kind of legal action, do you know what I would do? I’d try to turn it back around into a marketing exercise with a bit of crowdsourcing!
Think about it: get your customers to take photos and send them to you. For every franchisee that uses a photo you take, you get a voucher for a free meal (redeemable at any
outlet, of course). And where it appears on the digital signage menus they all seem to have nowadays, your photo will have your name on it too.
Most submissions will be… unsuitable, of course. You’ll need a team of people vetting submissions. But for every 50 people who send a blurry picture of an unappetising bit of
sludge-meat in a bun; for every 10 people who actually try hard but get too much background in or you can see the logo on their clothing or whatever; for every 5 people that
deliberately send something offensive… you might get one genuinely good candid burger picture. Those pics get pushed out to franchisees to use. Sorted.
Now if anybody complains that you fake your photos you can explain that every one of your food pictures was taken by a real-life customer, and their name or handle is on the
bottom of each one. Sure, you get to vet them, but they’re still all verifiably genuine pictures of your food.
And you probably only have to do this gimmick for a year and then everybody will forget. Crowdsourcing as a marketing opportunity: that’s what I’d be doing if I were crowned
Burger King.
Earlier this year, for reasons of privacy/love of selfhosting, I moved the DanQ.me mailing list from Mailchimp to Listmonk (there’s a blog post about how I set it up), relaying
outbound messages via an SMTP server provided by my domain registrar, Gandi.
And because I learned a few things while doing so, I wrote this blog post so that next time I have to configure Postfix + DKIM, I’ll know where to find a guide. If it helps you in the meantime, that’s just a bonus.
Postfix
Running your own mailserver is a pain. I used to do it for all of my email, but – like many other nerds – when spam reached its peak and deliverability became an issue, I gave
up and outsourced it1.
Luckily, I don’t need it to do much. I just need a mail transfer agent with an (unauthenticated, but local-only) SMTP endpoint: something that Listmonk can dump emails into, which will then reach out to the mailservers representing each of the recipients and
relay them on. A default install of Postfix does all that out-of-the-box, so I ran sudo apt install postfix, accepted all the default
options, and put the details into Listmonk.
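Before pointing anything real at it, it’s worth a quick sanity check that the new MTA is up and accepting mail locally. A rough sketch – the recipient address here is a placeholder, not a real one:

# is Postfix listening on port 25, and on which interfaces? (inet_interfaces controls this)
sudo ss -ltnp | grep ':25'
# push a test message through Postfix's sendmail compatibility interface:
printf 'Subject: Postfix test\n\nHello from the new relay.\n' | sendmail someone@example.net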
Next, I tweaked my DNS configuration to add an SPF record, and tested it.
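For reference, an SPF record is just a TXT record on the domain, listing which hosts are allowed to send mail for it. A minimal sketch, using example.com and a documentation-range IP rather than my real details:

; allow the domain's A and MX hosts plus one named IP to send; soft-fail everything else
example.com.  3600  IN  TXT  "v=spf1 a mx ip4:203.0.113.10 ~all"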
This ought to have been enough to achieve approximate parity with what Gandi had been providing me with. Not bad.
I sent a test email to a Gmail account, where I noticed two problems:
The first problem was that Postfix on Debian isn’t configured by-default to use opportunistic TLS when talking to other
mailservers. That’s a bit weird, but I’m sure there’s a good reason for it. The solution was to add smtp_tls_security_level = may to my
/etc/postfix/main.cf.
The second problem was that without a valid DKIM signature on them, about half of my test emails were going straight to the
spam folder. Again, it seems that since the last time I seriously ran a mailserver 20 years ago, this has become something that isn’t strictly required… but your emails aren’t
going to get through if you don’t.
I’ve put it off this long, but I think it’s finally time for me to learn some practical DKIM.
Understanding DKIM
What’s DKIM, then?
A server that wants to send email from a domain generates a cryptographic keypair.
The public part of the key is published using DNS. The private part is kept securely on the server.
When the server relays mail on behalf of a user, it uses the private key to sign the message body and a stated subset of the headers3,
and attaches the signature as an email header.
When a receiving server (or, I suppose, a client) receives mail, it can check the signature by acquiring the public key via DNS and validating the signature.
In this way, a recipient can be sure that an email received from a domain was sent with the authorisation of the owner of that domain. Properly-implemented, this is a strong mitigation
against email spoofing.
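To make that concrete, here’s roughly what the two artefacts look like, with every value a placeholder: the public key gets published as a TXT record at <selector>._domainkey.<domain>, and each signed message carries a DKIM-Signature header naming the domain (d=), selector (s=), and signed headers (h=):

; published in DNS:
default._domainkey.example.com.  IN  TXT  "v=DKIM1; k=rsa; p=<base64-encoded public key>"

DKIM-Signature: v=1; a=rsa-sha256; d=example.com; s=default; c=relaxed/relaxed;
    h=from:to:subject:date; bh=<hash of the message body>; b=<signature over the body hash and listed headers>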
OpenDKIM
To set up my new server to sign outgoing mail, I installed OpenDKIM and its keypair generator using sudo apt install opendkim
opendkim-tools. Its configuration file at /etc/opendkim.conf needed the following lines added to it:
# set up a socket for Postfix to connect to:
Socket inet:12301@localhost
# set up a file to specify which IPs/hosts can send through us without authentication and get their messages signed:
ExternalIgnoreList refile:/etc/opendkim/TrustedHosts
InternalHosts refile:/etc/opendkim/TrustedHosts
# set up a file to specify which selector/domain is used to sign mail from each sender address:
SigningTable refile:/etc/opendkim/SigningTable
# set up a file to specify which signing key to use for each selector/domain:
KeyTable refile:/etc/opendkim/KeyTable
Into /etc/opendkim/TrustedHosts I put a list of local IPs/domains that would have their emails signed by this server. Mine looks like this (in this example I’m using
example.com as my domain name, and default as the selector for it: the selector can be anything you like; it only matters if you’ve got multiple
mailservers signing mail for the same domain). Note that 192.168.0.0/16 is the internal subnet on which my sending VM will
run.
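A minimal version of that file, sticking with those placeholders, might be:

127.0.0.1
::1
localhost
192.168.0.0/16
example.com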
/etc/opendkim/SigningTable maps email addresses (I’m using a wildcard) to the subdomain whose TXT record will hold the public key for the signature. This also goes on to
inform the KeyTable which private key to use:
*@example.com default._domainkey.example.com
And then /etc/opendkim/KeyTable says where to find the private key for that:
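default._domainkey.example.com example.com:default:/etc/opendkim/keys/example.com/default.private

(That’s a sketch with the same example.com/default placeholders; the format is selector._domainkey.domain, then domain:selector:path-to-private-key.) If you haven’t generated that keypair yet, OpenDKIM’s tools can do it – something like this, assuming the same domain and selector and the Debian package’s opendkim service user:

sudo mkdir -p /etc/opendkim/keys/example.com
sudo opendkim-genkey -D /etc/opendkim/keys/example.com -d example.com -s default
sudo chown opendkim:opendkim /etc/opendkim/keys/example.com/default.private

That drops the private key exactly where the KeyTable line above points.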
The public key needs publishing via DNS. Conveniently, when you create a keypair using its tools, OpenDKIM provides a sample (in
BIND-style) for you to copy-paste from or adapt: look in /etc/opendkim/keys/example.com/default.txt!
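One more bit of wiring that’s easy to forget: Postfix needs telling to pass outgoing mail through that OpenDKIM socket. A sketch of the lines for /etc/postfix/main.cf, assuming the inet:12301@localhost socket configured earlier:

# hand both SMTP-submitted and locally-submitted mail to the OpenDKIM milter for signing:
milter_default_action = accept
smtpd_milters = inet:localhost:12301
non_smtpd_milters = inet:localhost:12301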
Once we’ve restarted both services (sudo service postfix restart; sudo service opendkim restart), we can test it!
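If you’d like something more methodical than just mailing a Gmail address and crossing your fingers, a couple of checks, using the placeholder domain and selector from above:

# confirm the public key is correctly published in (and retrievable from) DNS:
opendkim-testkey -d example.com -s default -vvv
# then send yourself a message and look for "dkim=pass" in the Authentication-Results
# header at the receiving end.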
So I learned something new today.
If you, too, love to spend your Saturday mornings learning something new, have a look at those subscription
options to decide how you’d like to hear about whatever I get up to next.
Footnotes
1 I still outsource my personal email, and I sing the praises of the excellent folks
behind ProtonMail.
2 My desktop email client also doubles as my newsreader, because, yes, of course
you can still find me on USENET. Which, by the way, is undergoing a mini-revival…
3 Why doesn’t DKIM sign
all the headers in an email? Because intermediary servers and email clients will probably add their own headers, thereby invalidating the signature! DKIM gets used to sign the From: header, for obvious reasons, and ought to be used for other headers whose tampering could be
significant, such as the Date: and Subject:, but it’s really up to the signing server to choose a subset.
Woodward Draw by Daniel Linssen is the kind of game that my inner
Scrabble player both loves and hates. I’ve been playing on and off for the last three days to complete it, and it’s been great. While not perfectly polished1 and with a few rough
edges2, it’s still a great example of
what one developer can do with a little time.
It deserves a hat tip of respect, but I hope you’ll give it more than that by going and playing it (it’s free, and you can play online or download a copy3). I should probably check
out their other games!
Footnotes
1 At one point the background colour, in order to match a picture word, changed to almost
the same colour as the text of the three words to find!
2 The tutorial-like beginning is a bit confusing until you realise that you have to play
the turn you’re told to, to begin with, for example.
The poor little geopup’s only got tiny legs, and the 8km we’ve walked so-far has got her pretty tired-out, so this’ll be the last cache of the series before we go and find ourselves
some lunch and go home. It’s been a very enjoyable series so far, and I fully intend to return to complete it (and perhaps find some of those earlier caches that I failed to spot).
For this final cache of the morning (well, afternoon: barely!), I found the likely spot straightaway and picked up something that looked out of place. Nope, no sign of the cache though;
that’s strange. It took a few seconds to realise that yes, the cache was hidden behind the thing I’d picked up… it was just also covered with leaf litter and detritus.
Soon had it retrieved in the end, though.
A huge number of butterflies flocked in the field to our right: it was quite impressive. I’ve snapped a picture showing just one, so that I can later look up what kind of butterfly it
is!
Sometimes the geo-sense “just works”. This was one of those moments. I was approaching the area and checking the distance. Then I walked straight to a likely location. Then I picked up
the cache. Done and done.
Turning South and crossing our own path, the sun came out at last and we were bathed in glorious warm light. Between that, and the familiarity of the trail we passed, the geopup and I
completely forgot for a moment that we were out to look for this next cache and overshot it: we had to turn back to get to the coordinates and find the cache. TFTC!
Worra lorra porkers! The geopup is a huge fan of sausages but I don’t think she understood that the cornucopia she was looking at across the field was the same thing, just a few years
off being ripe. Great cache container too. TFTC, and let’s chuck an FP in because this series as a whole
has definitely earned another one in my mind by now…
The geopup and I tried a couple of likely hiding places before we found this one. A nice-sized container and well-suited to its hiding place, here, TFTC!
Came by this location while doing the nearby WAG series. Was delighted to see that an OpenCache was on the route too so the geopup and I dedicated the time to a decent search. We think
we found what was once the hiding place, but the cache itself was sadly nowhere to be seen.
Between nearby GC7QC7R, which acts as a spur to this series, and OK00F7, which sits on (and predates) this series, I was feeling confident of a find here… but after an extended search the
geopup and I had to admit defeat. To be honest, she was willing to give up and press on immediately, having seen a muddy puddle up ahead that she wanted to play about in, and her
persistent lead-pulling in that direction might have reduced both my patience and the efficacy of my search! We found a few things that might match the hint, but didn’t see success,
soo… 🤷‍♂️