Dynamically-Deployed Static Site Subdomains on Caddy

I’ve recently been experimenting with where I host my small and open-source static sites. In my latest experiment, I wanted to try a low-maintenance selfhosting solution1. Here’s what I wanted:

  • Pushing to the main branch of my GitHub/Codeberg/wherever repo would send a webhook to my server.
  • Upon receiving the webhook, my server would pull the latest changes2.
  • Using a wildcard certificate, my webserver automatically mounts each project at a subdomain matching its project name3.

Here’s what I came up with:

Step 1: webhook handler

I’m using Caddy as my webserver, because despite its considerable power and versatility it’s a breeze to set up. To sort wildcard DNS later I’ll want to swap in a custom build, but to get started I just ran apt install caddy. Then I used apt install webhook to install Adnan Hajdarević’s webhook endpoint, and tied the two together in my Caddyfile:

webhook.duckling.danq.me {
  reverse_proxy localhost:9000
}
My static server’s called duckling.danq.me, so you’ll see that turn up a lot in these configs.

Then I created a webhook in a GitHub repository:

GitHub webhook pointing to https://webhook.duckling.danq.me/hooks/github-push, with a secret set and SSL verification enabled, triggered by a push event.
I generated a long random string to use as the secret, and kept a copy for later.

When you create a webhook in GitHub it immediately sends a test event, but it doesn’t quite look like a real push event so I pushed an inconsequential change to the repo to trigger another. Once you’ve got a “real” one sent, you can re-send it via the “Recent Deliveries” tab as many times as you like, to help with testing.
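If you'd rather not keep pushing dummy commits, you can hand-roll a delivery from the command line. This is a sketch, not GitHub's exact payload — the secret and payload below are made-up stand-ins, and a real push event carries many more fields — but `ref` and `repository.name` are the ones my hook configuration cares about:

```shell
#!/bin/bash
# Fake a GitHub push delivery for testing. SECRET and PAYLOAD are
# placeholders; substitute your real webhook secret and repo name.
SECRET='my-long-random-secret'
PAYLOAD='{"ref":"refs/heads/main","repository":{"name":"embed-html"}}'

# GitHub signs the raw request body with HMAC-SHA256 using the shared
# secret, and sends the hex digest in the X-Hub-Signature-256 header.
SIG=$(printf '%s' "$PAYLOAD" | openssl dgst -sha256 -hmac "$SECRET" | sed 's/.* //')

echo "X-Hub-Signature-256: sha256=$SIG"
curl -s -o /dev/null -w '%{http_code}\n' --max-time 10 \
  -H 'Content-Type: application/json' \
  -H "X-Hub-Signature-256: sha256=$SIG" \
  -d "$PAYLOAD" \
  https://webhook.duckling.danq.me/hooks/github-push || true
```

A `200` back means the signature matched and the trigger rules passed; tamper with one byte of the payload and you should get a refusal instead.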

Then, on the server, I checked-out a copy of the code (anonymously: this is a public repository so I don’t need keys to read from it anyway) and set up my /etc/webhook.conf to expect these calls:

[
    {
      "id": "github-push",
      "execute-command": "/var/www/github-push/webhook.sh",
      "command-working-directory": "/var/www/github-push/",
      "pass-arguments-to-command": [
        {
          "source": "payload",
          "name": "repository.name"
        }
      ],
      "trigger-rule": {
        "and": [
          {
            "match": {
              "type": "payload-hash-sha256",
              "secret": "[MY SECRET KEY HERE]",
              "parameter": {
                "source": "header",
                "name": "X-Hub-Signature-256"
              }
            }
          },
          {
            "match": {
              "type": "value",
              "value": "refs/heads/main",
              "parameter": {
                "source": "payload",
                "name": "ref"
              }
            }
          }
        ]
      }
    }
  ]
  
The trigger-rule directives ensure that (a) the secret key is correct (it's used as the key for an HMAC over the entire JSON request body, so it prevents payload tampering too) and (b) the event only triggers on pushes to the main branch. The execute-command specifies the Bash script I want to run when the webhook is triggered. The pass-arguments-to-command configuration says to send the repo name on to that script.

Now all I needed to do was write the /var/www/github-push/webhook.sh Bash script so that it pulled the latest copy of the code when triggered:

#!/bin/bash
cd "/var/www/github-push/$1" && git pull

I was able to test this by pushing inconsequential changes to my codebase and watching them get replicated down to my webserver. Neat!

Step 2: low-maintenance webserver

After pointing the DNS for *.static.duckling.danq.me at my static server, I set about configuring Caddy to be able to use DNS-01 challenges to get itself wildcard SSL certificates4. Caddy can’t do DNS-01 challenges out of the box, so you either need to write your own renewal script or compile Caddy with plugins corresponding to your DNS provider. My domains’ DNS are managed by a mixture of AWS Route 53, Gandi, and Namecheap, so my xcaddy build step looked like this:

xcaddy build \
  --with github.com/caddy-dns/route53 \
  --with github.com/caddy-dns/gandi \
  --with github.com/caddy-dns/namecheap

Of course, if I’d preferred somebody else to build it, CaddyServer’s download configurator would have done it on-demand.

For Gandi and Namecheap I just need a personal access token or API key, respectively, but Route 53’s configuration is slightly more-involved: I needed to create a new user via IAM and give it permission to write DNS TXT records for the appropriate hosted zone. Fortunately the guide for the caddy-dns/route53 repo had an almost copy-pastable example.
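For reference, a minimal policy looks something like this — the hosted zone ID is a placeholder, and you should check the action list against the plugin’s current README rather than trusting my recollection:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "route53:GetChange",
      "Resource": "arn:aws:route53:::change/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "route53:ChangeResourceRecordSets",
        "route53:ListResourceRecordSets"
      ],
      "Resource": "arn:aws:route53:::hostedzone/ZXXXXXXXXXXXXX"
    },
    {
      "Effect": "Allow",
      "Action": "route53:ListHostedZonesByName",
      "Resource": "*"
    }
  ]
}
```

Scoping the record-set permissions to the one hosted zone means that even if the credentials leak, the damage is limited to that zone’s DNS.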

I added the AWS access key and secret key as environment variables (like this!) into my /etc/systemd/system/multi-user.target.wants/caddy.service service definition, and then told my Caddyfile to make use of them when renewing the wildcard certificate:

*.static.duckling.danq.me {
	tls {
		dns route53 {
			access_key_id {env.AWS_ACCESS_KEY_ID}
			secret_access_key {env.AWS_SECRET_ACCESS_KEY}
		}
	}
	root * /var/www/github-push/{http.request.host.labels.4}
	file_server
}
The {http.request.host.labels.4} refers to the fourth part of the domain name, when separated at the dots and counted from the right, so 0 = me, 1 = danq, 2 = duckling, 3 = static, and 4 = the part that we’re interested in. So long as I don’t store any other directories in the /var/www/github-push/ directory then this will simply map each subdomain onto its git repository name and return a 404 for any other request.
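If you’d rather not edit the unit file in place, a systemd drop-in override achieves the same thing — the key values here are obviously placeholders:

```ini
# /etc/systemd/system/caddy.service.d/override.conf
[Service]
Environment=AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
Environment=AWS_SECRET_ACCESS_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
```

Run systemctl daemon-reload && systemctl restart caddy afterwards, and the drop-in survives package upgrades that would overwrite an edited unit file.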

DNS-01 challenges are necessarily slower than HTTP-01/ALPN challenges, because they’re limited by DNS propagation, so it took a while before the certificate was issued. I ran Caddy in the foreground to watch the logs while it did so:

Caddy webserver logs, with a highlighted section showing a DNS-01 challenge for *.static.duckling.danq.me repeatedly fail and then eventually succeed, then a certificate chain being installed.

You can see the whole thing working (for now at least; I don’t know if I’m keeping this approach!) by going to e.g. embed-html.static.duckling.danq.me, which dynamically tracks the main branch of the embed-html repo on GitHub.

I don’t yet know if this is going to be the future forever-home of my many static site side projects, but it’s certainly been the most-satisfying experiment to run so-far.

Footnotes

1 I’ve drifted away from selfhosting simple static sites lately because I’ve accidentally broken them with configuration changes too many times! But I figured I’d be open to in-housing them again if I had a single simple architecture for them all, so I spun up a VPS and gave it a go.

2 Running a build script or some other static site generation tool is out of scope for now, but I want to be able to confirm that it would be possible in the future.

3 It also needs to be possible for me to map other domain names to it, but that’s a triviality.

4 It’s absolutely possible to use tls { on_demand } to do this, but it’s better to use a wildcard certificate which can be pre-generated and doesn’t let people trick your server into making ludicrous numbers of certificate requests by hammering random subdomain names.
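For comparison, the on-demand version would look something like this — the ask endpoint is a hypothetical service of your own that approves or rejects each requested hostname, which is precisely the guard-rail that makes on-demand issuance tolerable:

```
{
	on_demand_tls {
		# Hypothetical allow-list service: Caddy calls this with
		# ?domain=... before requesting each certificate.
		ask http://localhost:9002/check
	}
}

*.static.duckling.danq.me {
	tls {
		on_demand
	}
	root * /var/www/github-push/{http.request.host.labels.4}
	file_server
}
```

Without the ask endpoint, anybody hammering random subdomains could drive your server into the certificate authority’s rate limits.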


Moving a static site from GitHub to Codeberg Pages

Late to the party,1 I finally got around to experimentally moving a GitHub Pages-hosted static site to Codeberg. I wanted a low-risk site to try first, so I moved Beige Buttons, the site hosting my “90s PC turbo button simulator” web component.


Mostly for my own benefit later, here’s the steps I took and the things I learned along the way:

  1. You can migrate a repository across in about two clicks. Easy!
  2. Codeberg Pages is deployed from the pages branch. If there’s no build step to the static site, all you need to do is rename the main branch to pages (and probably make it the default branch).2
  3. The default URL is https://username.codeberg.page/repository.
  4. You can use a custom domain by adding a .domains file that lists domains; if migrating from GitHub Pages you can just rename your CNAME file to .domains.
  5. You’ll need to tweak your DNS CNAME, ALIAS (or, worst-case, A/AAAA) record to point at Codeberg Pages.3
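Step 2’s branch rename can be done entirely from the command line. Here’s a runnable sketch — it fabricates a tiny repository and a bare “origin” standing in for Codeberg, purely so the commands can be exercised end-to-end:

```shell
#!/bin/sh
# Sketch of the main -> pages rename. A throwaway repo and bare
# "origin" stand in for the migrated Codeberg repository.
set -e
tmp=$(mktemp -d)
git init -q --bare "$tmp/origin.git"
git init -q -b main "$tmp/site"
cd "$tmp/site"
echo '<h1>Beige Buttons</h1>' > index.html
git add index.html
git -c user.name=demo -c user.email=demo@example.com commit -qm 'initial'
git remote add origin "$tmp/origin.git"
git push -q origin main

# The actual migration: rename main to pages and publish it.
git branch -m main pages
git push -q -u origin pages
# After flipping the default branch in the web UI, the old branch can go.
# (Deleting the remote's current default branch is refused, hence the fallback.)
git push -q origin --delete main || true
```

In a real migration you’d of course skip the fabricated repo and run just the last three commands inside your clone.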

Change propagation feels slightly slower than GitHub, but perfectly tolerable.

The one thing that’s causing me trouble is that Codeberg Pages’ CORS headers prevent people from hotlinking the Beige Buttons JS, so there are some projects for which this wouldn’t be a suitable migration (issues are raised). But for most static sites, it’d probably Just Work and seems to be a great alternative.

Footnotes

1 With thanks to Kev for reminding me I’d had this on my list.

2 There are other ways to deploy but they don’t support custom domains yet.

3 Like GitHub Pages, Codeberg Pages uses LetsEncrypt for certificate provision, so you don’t need to change any CAA records.

Note #26999

In my first few weeks at my new employer, my code contributions have added 218 lines of code, deleted 2,663. Only one of my PRs has resulted in a net increase in the size of their codebases (by two lines).

GitHub change summary showing +218 lines added, -2,663 lines removed.

I need to pick up the pace if I’m going to reach the ultimate goal of deleting ALL of the code within my lifetime. (That’s the ultimate aim, right?)


Note #25341

Not every code review is fireworks, but most Three Rings changes come from the actual needs of the voluntary organisations that Three Rings supports. Some of our users were confused by the way the Admin > Roles page was laid out, so one of our volunteers wrote an improved version.

And because we’re all about collaboration, discussion, learning from one another, and volunteer-empowerment… I’ve added a minor suggestion… but approved their change “with or without” it. I trust my fellow volunteer to either accept my suggestion (if it’s right), reject it (if it’s wrong), or solicit more reviews or bring it to Slack or our fortnightly dev meeting (if it requires discussion).

GitHub screenshot showing a suggested change to a pull request. Dan-Q is suggesting that the new sentence 'These roles belong to .' should not have a full stop at the end, for consistency with other similar headings.


Note #25337

My first task this International Volunteer Day is to test a pull request that aims to fix a bug with Three Rings’ mail merge fields functionality. As it’s planned to be a hotfix (direct into production) we require extra rigor and more reviewers than code that just goes into the main branch for later testing on our beta environment.

My concern is that fixing this bug might lead to a regression not described by our automated tests, so I’m rolling back to a version from a couple of months ago to compare the behaviour of the affected tool then and now. Sometimes you just need some hands-on testing!

Screenshot showing web application Three Rings viewed on a local development machine, being used to compose an email about International Volunteer Day. A merge field called 'first_name' is included in the email, and it's hand-annotated with the question 'working?'. Alongside, GitHub Desktop shows that we've rolled-back to a revision from two months ago.


Note #24772

Day #2 of my sabbatical had a morning in which I’ve mostly been roped into some charity-related digital forensics… until I got distracted by dndle.app, which apparently I accidentally broke yesterday! Move Fast and Fix Things!

Dan, looking concerned, in front of Github and Dndle.app.


Zero

✅ Inbox Zero
✅ Slack Notification Zero
✅ Assigned PR Reviews Zero
✅ Owned PRs… one, but it’s approved and just waiting for the right moment to merge

That’s got to be the first time in… literally years… that I’ve ended a workday so “clean”. Feels amazing.

There’ll be a mess again tomorrow, but hopefully only of a manageable size, because I’m particularly keen to finish this week at “Work Zero”.

Digital Dustbusting

tl;dr: I’m tidying up and consolidating my personal hosting; I’ve made a little progress, but I’ve got a way to go – fortunately I’ve got a sabbatical coming up at work!

At the weekend, I kicked-off what will doubtless be a multi-week process of gradually tidying and consolidating some of the disparate digital things I run around the Internet.

I’ve a long-standing habit of having an idea (e.g. gamebook-making tool Twinebook, lockpicking puzzle game Break Into Us, my Cheating Hangman game, and even FreeDeedPoll.org.uk!), deploying it to one of several servers I run, and then finding it a huge headache when I inevitably need to upgrade or move said server because there’s such an insane diversity of different things that need testing!

Screenshot from Cheating Hangman: I guessed an 'E', but when I guessed an 'O' I was told that there was one (the computer was thinking of 'CLOSE'), but now there isn't because it's switched to a different word that ends with 'E'.
My “cheating hangman” game spun out from my analysis of the hardest words for an optimal player to guess, which was in turn inspired by the late Nick Berry’s examination of optimal strategy.

I can simplify, I figured. So I did.

And in doing so, I rediscovered several old projects I’d neglected or forgotten about. I wonder if anybody’s still using any of them?

Hosting I’ve tidied so far…

  • Cheating Hangman is now hosted by GitHub Pages.
  • DNDle, my Wordle-clone where you have to guess the Dungeons & Dragons 5e monster’s stat block, is now hosted by GitHub Pages. Also, I fixed an issue reported a month ago that meant that I was reporting Giant Scorpions as having a WIS of 19 instead of 9.
  • Abnib, which mostly reminds people of upcoming birthdays and serves as a dumping ground for any Abnib-related shit I produce, is now hosted by GitHub Pages.
  • RockMonkey.org.uk, which doesn’t really do much any more, is now hosted by GitHub Pages.
  • EGXchange, my implementation of a digital wallet for environmentally-friendly cryptocurrency EmmaGoldCoin, which I’ve written about before, is now hosted by GitHub Pages.
  • Sour Grapes, the single-page promo for a (remote) murder mystery party I hosted during a COVID lockdown, is now hosted by GitHub Pages.
  • A convenience-page for giving lost people directions to my house is now hosted by GitHub Pages.
  • Dan Q’s Things is now automatically built on a schedule and hosted by GitHub Pages.
  • Robin’s Improbable Blog, which spun out from 52 Reflect, wasn’t getting enough traffic to justify “proper” hosting so now it sits in a Docker container on my NAS.
  • My μlogger server, which records my location based on pings from my phone, has also moved to my NAS. This has broken Find Dan Q, but I’m not sure if I’ll continue with that in its current form anyway.
  • All of my various domain/subdomain redirects have been consolidated on, or are in the process of moving to, a tiny Linode/Akamai instance. It’s a super simple plain Nginx server that does virtually nothing except redirect people – this is where I’ll park the domains I register but haven’t found a use for yet, in future.
Screenshot showing EGXchange, saying "everybody has an EGX wallet, log in to yours now".
I was pretty proud of EGXchange.org, but I’ll be first to admit that it’s among the stupider of my throwaway domains.

It turns out GitHub Pages is a fine place to host simple, static websites that were already open-source. I’ve been working on improving my understanding of GitHub Actions anyway as part of what I’ve been doing while wearing my work, volunteering, and personal hats, so switching some static build processes like DNDle’s to GitHub Actions was a useful exercise.

Stuff I’m still to tidy…

There’s still a few things I need to tidy up to bring my personal hosting situation under control:

DanQ.me

Screenshot showing this blog post.
You’re looking at it. But later this year, you might be looking at it… elsewhere?

This is the big one, because it’s not just a WordPress blog: it’s also a Gemini, Spartan, and Gopher server (thanks CapsulePress!), a Finger server, a general-purpose host to a stack of complex stuff only some of which is powered by Bloq (my WordPress/PHP integrations): e.g. code to generate the maps that appear on my geopositioned posts, code to integrate with the Fediverse, a whole stack of configuration to make my caching work the way I want, etc.

FreeDeedPoll.org.uk

Right now this is a Ruby/Sinatra application, but I’ve got a (long-running) development branch that will make it run completely in the browser, which will further improve privacy, allow it to run entirely-offline (with a service worker), and provide a basis for new features I’d like to provide down the line. I’m hoping to get to finishing this during my Automattic sabbatical this winter.

Screenshot showing freedeedpoll.org.uk
The website’s basically unchanged for most of a decade and a half, and… umm… it looks it!

A secondary benefit of it becoming browser-based, of course, is that it can be hosted as a static site, which will allow me to move it to GitHub Pages too.

Geohashing.site

When I took over running the world’s geohashing hub from xkcd‘s Randall Munroe (and davean), I flung the site together on whatever hosting I had sitting around at the time, but that’s given me some headaches. The outbound email transfer agent is a pain, for example, and it’s a hard host on which to apply upgrades. So I want to get that moved somewhere better this winter too. It’s actually the last site left running on its current host, so it’ll save me a little money to get it moved, too!

Screenshot from Geohashing.site's homepage.
Geohashing’s one of the strangest communities I’m honoured to be a part of. So it’d be nice to treat their primary website to a little more respect and attention.

My FreshRSS instance

Right now I run this on my NAS, but that turns out to be a pain sometimes because it means that if my home Internet goes down (e.g. thanks to a power cut, which we have from time to time), I lose access to the first and last place I go on the Internet! So I’d quite like to move that to somewhere on the open Internet. Haven’t worked out where yet.

Next steps

It’s felt good so far to consolidate and tidy-up my personal web hosting (and to rediscover some old projects I’d forgotten about). There’s work still to do, but I’m expecting to spend a few months not-doing-my-day-job very soon, so I’m hoping to find the opportunity to finish it then!
