I’m not sure that I process death in the same way that “normal” people do. I blame my family.
My sisters and I have wished one another a “Happy Dead Dad Day” every 19 February since his death.
When my grandmother died in 2006 I was just in the process of packing up the car with Claire to
try to get up to visit her before the inevitable happened. I received the phone call to advise me that she’d passed, and – ten emotional
minutes later – Claire told me that she’d “never seen anybody go through the five stages of grief as fast as that
before”. Apparently I was a textbook example of the Kübler-Ross model, only at speed. Perhaps I should volunteer to stand in front of introductory psychology classes and feel things, or
something.
I guess there isn’t actually a market for Happy Dead Dad Day greetings cards?
Since my dad’s death seven years ago, I’ve marked Dead Dad Day every 19 February in a way that’s definitely “mine”: with a pint or three of Guinness
(which my dad enjoyed… except if there were a cheaper Irish stout on draught, because he never quite shook off his working-class roots), some time outdoors, and ideally a hill, although
Oxfordshire makes the latter a little difficult. On the second anniversary of my dad’s death, I commemorated his love of setting out and checking the map later by making
my first geohashing expedition: it seemed appropriate that even without him, I could make a journey without either of us
being sure of either the route… or the destination.
Eating cornflakes together in the garden had been a tradition of my dad’s and mine since at least 23 years before this photo was taken.
As I implied at his funeral, I’ve always been far more-interested in celebrating life than
mourning death (that might be why I’m not always the best at supporting those in grief). I’m not saying that it isn’t sad
that he went before his time: it is. What’s worst, I think, is when I remember how close-but-not-quite he came to getting to meet his grandchildren… who’d have doubtless called him
“Grandpeter”.
We all get to live, and we’re all going to die, and I’d honestly be delighted if I thought that people might remember me with the same kind of smile (and just occasionally tear) that
finds my face every Dead Dad Day.
With 20+ years of kottke.org archives, I’ve been thinking about this issue [continuing to host old content that no longer reflects its authors views] as well. There are many
posts in the archive that I am not proud of. I’ve changed my mind in some cases and no longer hold the views attributed to me in my own words. I was too frequently a young
and impatient asshole, full of himself and knowing it all. I was unaware of my privilege and too frequently assumed things of other people and groups that were incorrect and
insensitive. I’ve amplified people and ideas in the past that I wouldn’t today.
…
Very much this! As another blogger with a 20+ year archive, I often find myself wondering how much of an impression of me is made upon my readers by some of my older posts, and what it
means to retain them versus the possibility – never yet exercised – of deleting them. I certainly have my fair share of posts that don’t represent me well or that are frankly
embarrassing, in hindsight!
I was thinking about this recently while following a thread on BoardGameGeek in which a poster
advocated for the deletion of a controversial article from the site because, as they said:
…people who stumble on our site and see this game listed could get a very (!!!) bad impression of the hobby…
This is a similar concern: a member of an online community is concerned that a particular piece of archived content does not reflect well on them. They don’t see any way in which the
content can be “fixed”, and so they propose that it is removed from the community. Parallels can be drawn to the deletionist faction within Wikipedia (if you didn’t know that Wikipedia had large-scale philosophical
disputes before now, you’re welcome: now go down the meta-wiki rabbit hole).
As for my own blog, I fall on the side of retention: it’s impossible to completely “hide” my past by self-censorship anyway as there’s sufficient archives and metadata to reconstruct
it, and moreover it feels dishonest to try. Instead, though, I do occasionally append rebuttals to older content – where I’ve time! – to help contextualise them and show that they’re
outdated. I’ve even considered partially automating this by e.g. adding a “tag” that I can rapidly apply to older posts that haven’t aged well which would in turn add a disclaimer to
the top of them.
Cool URIs don’t change. But the content behind them can. The fundamental message ought to be preserved, where possible, and so
appending and retaining history seems to be a more-valid approach than wholesale deletion.
There is a distinct lack of coloration in today’s automobiles, with the majority seemingly finished in a shade that could be found on a greyscale chart. Things are no better in the
interior; nearly always black, beige or grey, colours that architectural and couture designers refer to as neutrals. To make matters worse, these shades are all too often matched to
the exterior pigment (i.e. black with black, silver with grey) to create insidious and mind-numbing monochrome vehicles that appear to have simply been dipped whole into a large vat
of colourant.
1937 Delahaye 135, ivory and navy blue with dark red leather
Things were not always this gloomy. From the dawn of motoring through the 1920s, cars were painted in a full spectrum of colours, often in vivid combinations. The world’s first motor
vehicle, the 1886 Benz Patent-Motorwagen was green, with its fully-exposed engine finished in bright red. At the Villa d’Este or Pebble Beach Concours d’Elegance one sees a veritable
riot of colour that would likely be a bit shocking to today’s consumers: black with orange, yellow with orange, dark and light blue, dark and light green, red with blue, maroon with
red; the palette was limitless.
…
I’m not even remotely “into” cars but I loved this article… and I do think that it’s a bit of a shame that cars no longer exhibit the variety of colour that they once did. As a
kid, I remember that the old chap who lived on the other side of our street kept a remarkably old-fashioned but regal looking car (I’ve no idea what it was: I was only very young) in
racing green with maroon trim and leather, and chrome window frames. I used to think how cool it was that he got to have a car that was so distinctive and unusual, because it was
already rare to see things that didn’t just fit into the same boxy, bland palettes. Since then, things have only gotten worse: I can’t remember the last time that my daily commute took me
past a car that wasn’t painted in an all-encompassing single-colour coat of metallic black, white, silver, red, or blue and with interior plastic entirely in one of two shades of dark
grey.
Hopefully it’s just a phase that we, as a society, are going through.
I’m a big believer in the idea that the hardware I lay my hands on all day, every day, needs to be the best for its purpose. On my primary desktop, I type on a Das Keyboard 4 Professional (with Cherry MX brown switches) because it looks, feels, and sounds spectacular. I use
the Mac edition of the keyboard because it, coupled with a few tweaks, gives me the best combination of features and compatibility across all of the Windows, MacOS, and Linux (and
occasionally other) operating systems I control with it. These things matter.
I don’t know what you think these buttons do, but if you’re using my keyboard, you’re probably wrong. Also, they need a clean. I swear they don’t look this grimy
when I’m not doing macro-photography.
I also care about the mouse I use. Mice are, for the most part, for the Web and for gaming and not for use in most other applications (that’s what keyboard shortcuts are for!) but
nonetheless I spend plenty of time holding one and so I want the tool that feels right to me. That’s why I was delighted when, in replacing my four-year-old Logitech MX1000 in 2010 with my first Logitech Performance MX, I felt
able to declare it the best mouse in the world. My Performance MX lived for about four years, too – that seems to be how long a mouse can stand the kind of use that I give it –
before it started to fail and I opted to replace it with an identical make and model. I’d found “my” mouse, and I was sticking with it. It’s a great shape (if you’ve got larger hands),
is full of features including highly-configurable buttons, vertical and horizontal scrolling (or whatever you want to map them to), and a cool “flywheel” mouse wheel that can
be locked to regular operation or unlocked for controlled high-speed scrolling at the touch of a button: with practice, you can even use it as a speed control by gently depressing the
switch like it was a brake pedal. Couple all of that with incredible accuracy on virtually any surface, long battery life, and charging “while you use” and you’ve a recipe for success,
in my mind.
My second Performance MX stopped properly charging its battery this week, and it turns out that they don’t make them any more, so I bought its successor, the Logitech MX Master 2S.
On the left, the (new) Logitech MX Master 2S. On the right, my (old) Logitech Performance MX.
The MX Master 2S is… different… from its predecessor. Mostly in good ways, sometimes less-good. Here’s the important differences:
Matte coating: only the buttons are made of smooth plastic; the body of the mouse is now a slightly coarser plastic: you’ll see in the photo above how much less light
it reflects. It feels like it would dissipate heat less-well.
Horizontal wheel replaces rocker wheel: instead of the Performance MX’s “rocker” scroll wheel that can be pushed sideways for horizontal scroll, the MX Master 2S adds
a dedicated horizontal scroll (or whatever you reconfigure it to) wheel in the thumb well. This is a welcome change: the rocker wheel in both my Performance MXes became less-effective
over time and in older mice could even “jam on”, blocking the middle-click function. This seems like a far more-logical design.
New back/forward button shape: to accommodate the horizontal wheel, the “back” and “forward” buttons in the thumb well have been made smaller and pushed closer
together. This is the single biggest failing of the MX Master 2S: it’s clearly a mouse designed for larger hands, and yet these new buttons are slightly, but noticeably, harder to
accurately trigger with a large thumb! It’s tolerable, but slightly annoying.
Bluetooth support: one of my biggest gripes about the Performance MX was its dependence on Unifying, Logitech’s proprietary wireless protocol. The MX Master 2S
supports Unifying but also supports Bluetooth, giving you the best of both worlds.
Digital flywheel: the most-noticeable change when using the mouse is the new flywheel and braking mechanism, which is comparable to the change in contemporary cars
from a mechanical to a digital handbrake. The flywheel “lock” switch is now digital, turning on or off the brake in a single stroke and so depriving you of the satisfaction of using
it to gradually “slow down” a long spin-scroll through an enormous log or source code file. But in exchange comes an awesome feature called SmartShift, which dynamically
turns on or off the brake (y’know, like an automatic handbrake!) depending on the speed with which you throw the wheel. That’s clever and intuitive and “just works” far better than
I’d have imagined: I can choose to scroll slowly or quickly, with or without the traditional ratchet “clicks” of a wheel mouse, with nothing more than the way I flick my finger (and
all fully-configurable, of course). And I’ve still got the button to manually “toggle” the brake if I need it. It took some getting used to, but this change is actually really cool!
(I’m yet to get used to the sound of the digital brake kicking in automatically, but that’s true of my car too).
Basic KVM/multi-computing capability: with a button on the underside to toggle between different paired Unifying/Bluetooth transceivers and software support for
seamless edge-of-desktop multi-computer operation, Logitech are clearly trying to target folks who, like me, routinely run multiple computers simultaneously from a single keyboard and
mouse. But it’s a pointless addition in my case because I’ve been quite happy using Synergy to do this for
the last 7+ years, which does it better. Still, it’s a harmless “bonus” feature and it might be of value to others, I suppose.
All in all, the MX Master 2S isn’t such an innovative leap forward over the Performance MX as the Performance MX was over the MX1000, but it’s still great that this spectacular series
of heavyweight workhorse feature-rich mice continues to innovate and, for the most part, improve upon the formula. This mouse isn’t cheap, and it isn’t for everybody, but if you’re a
big-handed power user with a need to fine-tune all your hands-on hardware to get it just right, it’s definitely worth a look.
A man threatened to sue a technology magazine for using his image in a story about why all hipsters look the same, only to find out the picture was of a completely different guy.
Asynchronous JavaScript in the form of Single Page Applications (SPA) offer an incredible opportunity for improving the user experience of your web
applications. CSS frameworks like Bootstrap enable developers to quickly contribute styling as they’re working on the structure and behaviour of things.
Unfortunately, SPA and CSS frameworks tend to result in relatively complex solutions where traditionally separated concerns – HTML-structure, CSS-style, and JS-behaviour – are blended
together as a matter of course, counter to the lessons learned by previous generations.
This blending of concerns can prevent entry-level developers and valued specialists (e.g. visual design, accessibility, search engine optimization, and internationalization) from
making meaningful contributions to a project.
In addition to the increasing cost of the few developers somewhat capable of juggling
all of these concerns, it can also result in other real world business implications.
…
What is a front-end developer? Does anybody know, any more? And more-importantly, how did we get to the point where we’re actively encouraging young developers into habits like
writing (cough React cough) files containing a bloaty, icky mixture of content, HTML (markup), CSS (style), and Javascript (behaviour)? Yes, I get that the idea is that individual components should be packaged
together (if you’re thinking in a React-like worldview), but that alone doesn’t justify this kind of bullshit antipattern.
It seems like the Web used to have developers. Then it got complex, so we started differentiating back-end from front-end developers and describing those who, like me, spanned the divide,
as full-stack developers. We gradually became a minority as more and more new developers arrived who’d been deprived of the opportunity to learn each new facet organically in this newly-complicated
landscape, but that’s fine. But then… we started treating the front-end as the only end, and introducing all kinds of problems as a result… and most people don’t seem to have
noticed, yet, exactly how much damage we’re doing to Web applications’ security, maintainability, future-proofability, archivability, addressability…
The current iteration of my blog diverges from an architectural principle common to most of its previous versions over the last 20 years. While
each previous change in design and layout was intended to provide a single monolithic upgrade, this version tries to provide me with a platform for continuous ongoing
experimentation and change.
I’ve been trying to make better use of my blog as a vehicle for experimenting with web technologies, as I used to with personal sites back in the 1990s and early 2000s; to see a vanity
site like this one as a living playground rather than something that – like most of the sites I’m paid to work on – has a design which is, for the most part, static for
long periods of time.
The “popular” flag and associated background colour in the “Blog” top-level menu became permanent after a period of A/B testing. Thanks, unwitting testers!
I’m not entirely happy with the design of these boxes, but that’s a job for another day.
The grid of recent notes, shares, checkins and videos on my
homepage is powered by the display: grid; CSS directive. The number of columns varies by screen width from six
on the widest screens down to three or just one on increasingly small screens. Crucially, grid-auto-flow: dense; is used to ensure an even left-to-right filling of the
available space even if one of the “larger” blocks (with grid-column: span 2; grid-row: span 2;) is forced for space reasons to run onto the next line. This means that
content might occasionally be displayed in a different order from that in which it is written in the HTML (which is reverse
order of publication), but in exchange the items are flush with both sides.
The large “5 Feb” item in this illustration should, reverse-chronologically, appear before the “3 Feb” item, but there isn’t room for it on the previous line. grid-auto-flow:
dense; means that the “3 Feb” item is allowed to bubble-up and fill the gap, appearing out-of-order but flush with the edge.
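For anyone curious, a minimal sketch of the kind of rules involved might look like the following; the selector names and breakpoint widths are illustrative, not the ones my stylesheet actually uses:

.recent-posts {
  display: grid;
  grid-template-columns: repeat(6, 1fr); /* six columns on the widest screens */
  grid-auto-flow: dense;                 /* let later items bubble up to fill earlier gaps */
  gap: 1em;
}

/* "Larger" tiles span a 2×2 area of the grid. */
.recent-posts .featured {
  grid-column: span 2;
  grid-row: span 2;
}

/* Fewer columns as the viewport narrows. */
@media (max-width: 1200px) {
  .recent-posts { grid-template-columns: repeat(3, 1fr); }
}
@media (max-width: 600px) {
  .recent-posts { grid-template-columns: 1fr; }
}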
Not all web browsers support display: grid; and while that’s often only a problem of design and not of readability because these browsers will fall back to usually-very-safe
default display modes like block and inline, as appropriate, sometimes there are bigger problems. In Internet Explorer 11, for example, I found (with thanks to
@_ignatg) a problem with my directives specifying the size of these cells (which are actually <li> elements because, well,
semantics matter). Because it understood the directives that ought to impact the sizing of the list items but not
the one that redeclared its display type, IE made… a bit of a mess of things…
Thanks, Internet Explorer. That’s totally what I was looking for.
Do websites need to look the same in every browser? No. But the content should be readable
regardless, and here my CSS was rendering my content unreadable. Given that Internet Explorer users represent a little
under 0.1% of visitors to my site I don’t feel the need to hack it to have the same look-and-feel: I just need it to have the same content readability. CSS Feature Queries to the rescue!
CSS Feature Queries – the @supports selector – make it possible to apply parts of your stylesheet if and only if
the browser supports specific CSS features, for example grids. Better yet, using it in a positive manner (i.e. “apply these
rules only if the browser supports this feature”) is progressive enhancement, because browsers that don’t understand the @supports selector act in
the same way as those that understand it but don’t support the specified feature. Fencing off the relevant parts of my stylesheet in a @supports (display: grid) { ... }
block instructed IE to fall back to displaying that content as a boring old list: exactly what I needed.
It isn’t pretty, but it’s pretty usable!
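A stripped-down sketch of that fencing, with hypothetical selectors standing in for my real ones:

/* Browsers that don't understand @supports, or that do but can't render grids,
   skip this whole block and just show the plain (boring, but readable) list. */
@supports (display: grid) {
  .recent-posts {
    display: grid;
    grid-auto-flow: dense;
  }
  .recent-posts > li.featured {
    grid-column: span 2;
    grid-row: span 2;
  }
}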
Reduced-motion support
I like to put a few “fun” features into each design for my blog, and while it’s nowhere near as quirky as having my head play peek-a-boo when you
hover your cursor over it, the current header’s animations are in the same ballpark: hover over or click on some of the items in the header menu to see for yourself.
I’m most-pleased with the playful “bounce” of the letter Q when you hover over my name.
These kinds of animations are fun, but they can also be problematic. People with inner ear disorders (as well as people who’re just trying to maximise the battery life on their portable
devices!) might prefer not to see them, and web designers ought to respect that choice where possible. Luckily, there’s an emerging standard to acknowledge that: prefers-reduced-motion. Alongside its cousins inverted-colors, prefers-reduced-transparency, prefers-contrast and
prefers-color-scheme (see below for that last one!), these new CSS tools allow developers to optimise based on the accessibility
features activated by the user within their operating system.
In Windows you turn off animations while in MacOS you turn on not-having animations, but the principle’s the same.
If you’ve tweaked your accessibility settings to reduce the amount of animation your operating system shows you, this website will respect that choice as well by not animating the
contents of the title, menu, or the homepage “tiles” any more than is absolutely necessary… so long as you’re using a supported browser, which right now means Safari or Firefox (or the
“next” version of Chrome). Making the change itself is pretty simple: I just added a @media screen and (prefers-reduced-motion: reduce) { ... } block to disable or
otherwise cut-down on the relevant animations.
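In case it’s useful, here’s roughly what that looks like; the selectors and animation name are placeholders rather than the real ones this site uses:

.site-title .letter-q {
  animation: bounce 0.5s ease-in-out; /* hypothetical playful animation */
}

@media screen and (prefers-reduced-motion: reduce) {
  .site-title .letter-q,
  .header-menu a,
  .homepage-tile {
    animation: none;  /* drop the decorative animations... */
    transition: none; /* ...and any hover transitions too */
  }
}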
Dark-mode support
…
Similarly, operating systems are beginning to
support “dark mode”, designed for people trying to avoid eyestrain when using their computer at night. It’s possible for your browser to respect this and try to “fix” web pages for
you, of course, but it’s better still if the developer of those pages has anticipated your need and designed them to acknowledge your choice for you. It’s only supported in Firefox and
Safari so far and only on recent versions of Windows and MacOS, but it’s a start and a helpful touch for those nocturnal websurfers out there.
Come to the dark side, Luke. Or just get f.lux, I suppose.
It’s pretty simple to implement. In my case, I just stacked some overrides into a @media (prefers-color-scheme: dark) { ... } block, inverting the background and primary
foreground colours, softening the contrast, removing a few “bright” borders, and darkening rather than lightening background images used on homepage tiles. And again, it’s an example of
progressive enhancement: the (majority!) of users whose operating systems and/or browsers don’t yet support this feature won’t be impacted by its inclusion in my stylesheet, but those
who can make use of it can appreciate its benefits.
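Sketched out, that block looks something like this; the colour values and selectors are illustrative rather than the exact ones in my stylesheet:

@media (prefers-color-scheme: dark) {
  body {
    background-color: #111; /* invert the background... */
    color: #ccc;            /* ...and soften the foreground contrast a little */
  }
  .homepage-tile {
    border-color: transparent;            /* lose the "bright" borders */
    background-color: rgba(0, 0, 0, 0.6); /* darken, rather than lighten, the tile's background image */
    background-blend-mode: multiply;
  }
}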
This isn’t the end of the story of CSS experimentation on my blog, but it’s a part of it that I hope you’ve enjoyed.
One of the most common and effective ways to manage the caching of your assets is via the Cache-Control HTTP header. This header applies to
individual assets, meaning everything on our pages can have a very bespoke and granular cache policy. The amount of control we’re granted makes for very intricate and powerful caching
strategies.
A Cache-Control header might look something like this:
Cache-Control: public, max-age=31536000
Cache-Control is the header, and each of public and max-age=31536000 are directives. The Cache-Control header can accept one or more directives, and it is these
directives, what they really mean, and their optimum use-cases that I want to cover in this post.
…
A great reference for configuring your HTTP caching headers.
…Jesse James Garrett of Adaptive Path finally gave this use of Javascript a name. He called it Asynchronous Javascript and XML (thankfully shortened to Ajax). Ajax wasn’t a single
method or technology, but a group of principles based on the work being done at Google that described how to handle JavaScript in more demanding web applications. The short version is
actually fairly simple. Use XMLHttpRequest to make a request to the server, take the XML data returned, and lay it out on the page using JavaScript and semantic HTML.
Ajax took off after that. It underpins major parts of modern web development, and has spawned a number of frameworks and methodologies. The technology itself has been swapped out over
time. XML was replaced by JSON, and XMLHttpRequest by fetch. That doesn’t matter though. Ajax showed a lot of developers that the web wasn’t just about documents.
The web could be used to build applications. That might seem like a small notion, but trust me when I say, it was momentous.
The History of The Web is great, and you should read it, but this piece is particularly interesting. Ajax and its spiritual successors laid the groundwork for rich Internet
applications, shell apps, and the real-time Web, but they also paved the way for a handful of the inevitable practices that went alongside: Javascript-required websites, API-driven web
and mobile applications with virtually no HTML content whatsoever… and these things have begun to become problematic for the health of the Web as a whole.
I love Ajax, but it absolutely must be employed as progressive enhancement where possible.
Human beings leave physical impressions upon the things they love and use just as much as they do upon the lives of people and the planet they live upon. For every action, there’s a
reaction. For every pressure, there’s an effect on mass and volume. And in the impressions left by that combination, particularly if you’re lucky enough to see the sides of a rare,
unrestored vintage Pac-Man cabinet, lies the never before told story of how we really played the game.
Until now, I don’t believe anyone has ever written about it.
…
Interesting exploration of the history of the cabinets housing Pac-Man, observing the ergonomic impact of the controls on the way that people would hold the side of the machine and, in
turn, how that would affect where and how the paint would wear off.
Apparently the NCSF (US) are trying to make 28 February into Metamour Day: a
celebration of one’s lover’s lovers. While I’m not convinced that’ll ever get Hallmark’s interest, I thought it provided a good opportunity to sing the praises of my metamour, JTA.
This is a man who knows how to use Greek myths and legends to add magic to his daughter’s museum visit.
I first met JTA 15 years ago at Troma Night XX, when his girlfriend Ruth – an attendee of Troma Night since its earliest days the previous year – brought him along and we all mocked his three-letter initialism.
Contrary to our previous experience, thanks to Liz, of people bringing boyfriends once but never again (we always assumed that we
scared them off), JTA became a regular, even getting to relive some of the early nights that he’d missed in our nostalgic 50th event. Before long, I felt glad to count him among my friends.
Almost 13 years ago I described JTA thusly, and I stand by it:
You have a fantastic temper which you keep carefully bottled away and of which you draw out only a little at a time and only where it is genuinely justly deserved. Conversely, your
devotion to the things you love and care about is equally inspiring.
We’d be friends anyway, but having a partner-in-common has given us the opportunity for a closer relationship still. I love you, man: y’know, in the Greek way. Happy metamour
appreciation day.
Thread by @LaurenHerschel: “After what has been a surprisingly okayish Christmas, I had a moment today in SuperStore. Saw a lady who reminded me of my 92yo grandma, who even in the early stages of
dementia, completely understood that my mom died. I thought I’d share t […]”
After what has been a surprisingly okayish Christmas, I had a moment today in SuperStore. Saw a lady who reminded me of my 92yo grandma, who even in the early stages of
dementia, completely understood that my mom died.
I thought I’d share the Ball in the Box analogy my Dr told me
So grief is like this:
There’s a box with a ball in it. And a pain button.
And no, I am not known for my art skills.
In the beginning, the ball is huge. You can’t move the box without the ball hitting the pain button. It rattles around on its own in there and hits the button over and over. You
can’t control it – it just keeps hurting. Sometimes it seems unrelenting.
Over time, the ball gets smaller. It hits the button less and less but when it does, it hurts just as much. It’s better because you can function day to day more easily. But the
downside is that the ball randomly hits that button when you least expect it.
For most people, the ball never really goes away. It might hit less and less and you have more time to recover between hits, unlike when the ball was still giant.
I thought this was the best description of grief I’ve heard in a long time.
I told my step dad about the ball in the box (with even worse pictures). He now uses it to talk about how he’s feeling.
“The Ball was really big today. It wouldn’t lay off the button. I hope it gets smaller soon.”