How to explain academic publishing to a five-year-old


Last week I tweeted a cow-based academic publishing analogy in response to the prompt in the title, and the replies and quote-tweets extended the metaphor so gloriously, so creatively, so bleakly and hilariously at the same time, that I’ve pulled my favourites together below.

Here’s the original tweet:

Speaking as a goat, I approve of open access.

When I took a diversion from my various computer-science-related qualifications to study psychotherapy for a while, I was amazed to discover how fortunate we computer scientists are that so much of our literature is published open access. It probably comes from the culture of the discipline, whose forefathers were publishing their work as open-source software or on the Internet long before academic journals reached the online space. But even here, there’s journal drama and all the kinds of problems that Ned (and the people who replied to his tweet) joke about.

New browser on the block: Flow


2020 is only three weeks old, but there has been a lot of browser news that decreases rendering engine diversity. It’s time for some good news on that front: a new rendering engine, Flow. Below I conduct an interview with Piers Wombwell, Flow’s lead developer.

This year alone, on the negative side, Mozilla announced it’s laying off 70 people, most of whom appear to come from the browser side of things, while it turns out that Opera’s main cash cow is now providing loans in Kenya, India, and Nigeria, and it is looking to use ‘improved credit scoring’ (from browsing data?) for its business practices.

On the positive side, the Chromium-based Edge is here, and it looks good. Still, rendering engine diversity took a hit, as we knew it would ever since the announcement.

So let’s up the diversity a notch by welcoming a new rendering engine to the desktop space. British company Ekioh is working on the Flow browser, which sports a completely new multi-threaded rendering engine that does not have any relation to WebKit, Gecko, or Blink.

Well, I didn’t expect this bit of exciting news!

I’m not convinced that Flow is the solution to all the world’s problems (its target platforms and use-cases alone make it unlikely to make it onto the most-used-browsers leaderboard any time soon), but I’m really glad that my doomsaying about the death of browser diversity being a one-way street might… might… turn out not to be true.

Time will tell. But for now, this is Good News for the Web.

CSS4 is here!


I think that CSS would be greatly helped if we solemnly state that “CSS4 is here!” In this post I’ll try to convince you of my viewpoint.

I am proposing that we web developers, supported by the W3C CSS WG, start saying “CSS4 is here!” and excitedly chatter about how it will hit the market any moment now and transform the practice of CSS.

Of course “CSS4” has no technical meaning whatsoever. All current CSS specifications have their own specific versions ranging from 1 to 4, but CSS as a whole does not have a version, and it doesn’t need one, either.

Regardless of what we say or do, CSS4 will not hit the market and will not transform anything. It also does not describe any technical reality.

Then why do it? For the marketing effect.

Hurrah! CSS4 is here!

I’m sure that, like me, you’re excited to start using the latest CSS technologies, like paged media, hyphen control, the zero-specificity :where() selector, and new accessibility features like the ‘prefers-reduced-motion’ media query. The browser support might not be “there” yet, but so long as you’ve got a suitable commitment to progressive enhancement then you can be using all of these and many more today (I am!). Fantastic!
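By way of illustration, here’s a minimal sketch of two of those features in use (the selectors, class names, and animation are invented for the example):

```css
/* :where() carries zero specificity, so these defaults can be
   overridden by any later rule without a specificity battle: */
:where(article, aside) a {
  text-decoration: underline;
}

/* Only animate for visitors who haven't asked for reduced motion;
   the slide-in keyframes are assumed to be defined elsewhere.
   Browsers without support skip the block entirely, which is
   progressive enhancement working as intended: */
@media (prefers-reduced-motion: no-preference) {
  .banner {
    animation: slide-in 0.5s ease-out;
  }
}
```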

But if you’ve got more than a little web savvy you might still be surprised to hear me say that CSS4 is here, or even that it’s a “thing” at all. Well… that’s because it isn’t. Not officially. Just like JavaScript’s versioning has gone all evergreen these last few years, CSS has gone the same way, with different “modules” each making their way through the standards and implementation processes independently. Which is great, in general, by the way – we’re seeing faster development of long-overdue features now than we have through most of the Web’s history – but it does make it hard to keep track of what’s “current” unless you follow along very closely. Who’s got time for that?

When CSS2 gained prominence at around the turn of the millennium it was revolutionary, and part of the reason for that – aside from the fact that it gave us all some features we’d wanted for a long time – was that it gave us a term to rally behind. This browser already supports it, that browser’s getting there, this other browser supports it but has a f**ked-up box model (you all know the one I’m talking about)… we at last had an overarching term to discuss what was supported, what was new, what was ready for people to write articles and books about. Nobody’s going to buy a book that promises to teach them “Selectors Level 3, Fonts Level 3, Writing Modes Level 3, and Containment Level 1”: that title’s not even going to fit on the cover. But if we wrapped up a snapshot of what’s current and called it CSS4… now that’s going to sell.

Can we show the CSS WG that there’s mileage in this idea and make it happen? Oh, I hope so. Because while the modular approach to CSS is beautiful and elegant and progressive… I’m afraid that we can’t use it to inspire junior developers.

Also: I don’t want this joke to forever remain among the top results when searching for CSS4.

Reinventing Git interface


I also ask you not to discard all this nonsense right away, but to at least give it a fair round of thought. My recommendations, if applied in their entirety, can radically change the Git experience. They can move Git from the advanced, hacky league to the mainstream and enable the adoption of VCS by much wider groups of people. While this is still a concept and obviously requires a hell of a job to become reality, my hope is that somebody working on a Git interface can borrow from these ideas and we’ll all get a better Git client someday.

I love Git. But I love it more conceptually than I do practically. Everything about its technical design is elegant and clever. But most of how you have to act when you’re using it just screams “I was developed by lots of engineers and by exactly zero UX developers.” I can’t imagine why.

Nikita proposes ways in which it can be “fixed” while retaining 100% backwards-compatibility, and that’s bloody awesome. I hope the Git core team are paying attention, because these ideas are gold.

I Sexually Identify as an Attack Helicopter


I sexually identify as an attack helicopter.

I lied. According to US Army Technical Manual 0, The Soldier as a System, “attack helicopter” is a gender identity, not a biological sex. My dog tags and Form 3349 say my body is an XX-karyotype somatic female.

But, really, I didn’t lie. My body is a component in my mission, subordinate to what I truly am. If I say I am an attack helicopter, then my body, my sex, is too. I’ll prove it to you.

When I joined the Army I consented to tactical-role gender reassignment. It was mandatory for the MOS I’d tested into. I was nervous. I’d never been anything but a woman before.

But I decided that I was done with womanhood, over what womanhood could do for me; I wanted to be something furiously new.

To the people who say a woman would’ve refused to do what I do, I say—

Isn’t that the point?

This short story almost-certainly isn’t what you’d expect, based on the title. What it is sits at the intersection of science fiction and gender identity, and it’s pretty damn good.

Looks like the original’s gone down, but here’s an archived copy.

Gosh, even the archive.org copy’s gone. Here’s another.

A year and a half on, here’s a good follow-up including an explanation for it going offline.

Parent-child play: It’s constant and exhausting, but is there a better way?


It’s worth noting here that the idea that a parent should be a caretaker, educator, and entertainer rolled into one is not only historically, but also culturally specific. “There are lots of cultures where [parent-child play is] considered absolutely inappropriate—a parent would never get down on their knees and play with the children. Playing is something children do, not something adults do,” developmental psychologist Angeline Lillard said in an interview. “And that’s just fine. There’s no requirement for playing.”

Differences in practices around parent-child play exist within American subgroups, too. Sociologist Annette Lareau has observed a gap in beliefs about parent-child play between working-class/poor parents and middle-class parents in the United States. Working-class and poor parents in her study held a view that they were responsible for “supervision in custodial matters” (Did the child get to sleep on time? Does the child have sneakers that fit?) and “autonomy in leisure matters,” while the middle-class parents engaging in what Lareau termed “concerted cultivation” invested themselves heavily in children’s play. Ultimately, the poorer kids, Lareau found, “tended to show more creativity, spontaneity, enjoyment, and initiative in their leisure pastimes than we saw among middle-class children at play in organized activities.”

Interesting article (about 10 minutes’ reading), so long as you come at it from at least a little bit of an academic, anthropological perspective and so aren’t expecting to come out of it with concrete, actionable parenting advice!

Engaging in some kinds of play with your kids can be difficult. I’ve lost count of the hours spent in imaginative play with our 6-year-old, trying to follow along with the complex narrative and characters she’s assembled and ad-lib along (and of how many times she’s told me off for my character not making the choices she’d hoped they would, because she’s at least a little controlling over the stories she tells!). But I feel like it’s also a great way to engage with them, so it’s worth putting your devices out of sight, getting down on the carpet, and playing along… at least some of the time. The challenge is finding the balance between being their perpetual playmate and ensuring that they’re encouraged to “make their own fun”, which can be an important skill for fighting off boredom for the rest of their lives.

If I ever come up with a perfect formula, I’ll tell you; don’t hold your breath! In the meantime, reading this article might help reassure you that despite there almost-certainly not being a “right way”, there are plenty of “pretty good ways”, and the generally-good human values of authenticity and imagination and cooperation are great starting points for playing with your children, just like they are for so many other endeavours. Your kids are probably going to be okay.

Running Code Over Time


I’m posting this on the last day of 2019.  As I write it, the second post I ever made on meyerweb says it was published “20 years, 6 days ago”.  It was published on the second-to-last day of 1999, which was 20 years and one day ago.

What I realized, once the discrepancy was pointed out to me (hat tip: Eric Portis), is the five-day error is there because in the two decades since I posted it, there have been five leap days.  When I wrote the code to construct those relative-time strings, I either didn’t think about leap days, or if I did, I decided a day or two here and there wouldn’t matter all that much.

Which is to say, I failed to think about the consequences of my code running over long periods of time.  Maybe a day or two of error isn’t all that big a deal, in human-friendly relative-time output.  If a post was six years and two days ago but the code says 6 and 1, well, nobody will really care that much even if they notice.  But five days is noticeable, and what’s more, it’s a little human-unfriendly.  It’s noticeable.  It jars.
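To illustrate the class of bug Eric’s describing, here’s a sketch of my own (not his actual code, which we never see): divide a millisecond difference by a flat 365 days per year and the leap days silently pile up, whereas doing calendar arithmetic on the dates themselves stays honest.

```typescript
const MS_PER_DAY = 24 * 60 * 60 * 1000;

// Naive: pretend every year is exactly 365 days long. Over two
// decades, the five intervening leap days inflate the "days" count.
function naiveAge(published: Date, now: Date): string {
  const days = Math.floor((now.getTime() - published.getTime()) / MS_PER_DAY);
  return `${Math.floor(days / 365)} years, ${days % 365} days ago`;
}

// Calendar-aware: count whole years by stepping to the most recent
// anniversary of publication, then count the days left over from
// there. (Pluralisation is left as an exercise.)
function calendarAge(published: Date, now: Date): string {
  let years = now.getFullYear() - published.getFullYear();
  const anniversary = new Date(published.getTime());
  anniversary.setFullYear(published.getFullYear() + years);
  if (anniversary > now) {
    years -= 1;
    anniversary.setFullYear(published.getFullYear() + years);
  }
  const days = Math.round((now.getTime() - anniversary.getTime()) / MS_PER_DAY);
  return `${years} years, ${days} days ago`;
}

// Eric's scenario: posted 1999-12-30, viewed 2019-12-31
// (JavaScript Date months are zero-indexed, hence the 11s):
const posted = new Date(1999, 11, 30);
const today = new Date(2019, 11, 31);
console.log(naiveAge(posted, today));    // "20 years, 6 days ago" – five days off
console.log(calendarAge(posted, today)); // "20 years, 1 days ago" – correct
```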

As I mentioned in my comments on a repost last week, I work to try to make the things I publish to this site last. But that’s not to say that problems can’t creep in, either because of fundamental bugs left unnoticed until later on (such as the image recompression problem that’s recently led to some of my older images going wonky; I’m working on it) or else because of environmental changes, e.g. in the technologies that are supported and the ways in which they’re used. The latter are helped by standards and by an adherence to them, but the former will trip up Web developers time and time again, and it’s possible that there’s nothing we can do about it.

No system is perfect, and we don’t have time to engineer every system, every site, every page in a way that near-guarantees its longevity; not by a long shot. I tripped myself up just the year before last when I added Content-Security-Policy headers to my site and promptly broke every embedded YouTube or Vimeo video, because they didn’t fit the newly (and retroactively) enforced pattern of allowable content. Such problems are easy to create when you’re maintaining a long-running system with a lot of data. I’m only talking about my blog, but larger, older and/or more-complex systems (of which I’ve worked on a few!) come with their own multitudinous challenges.
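To give a concrete example (a hypothetical, simplified policy; I’m not claiming it’s my actual configuration): a header like the first one below blocks third-party iframe embeds site-wide, including in years-old posts, until you remember to explicitly allow the relevant origins, as in the second:

```
Content-Security-Policy: default-src 'self'

Content-Security-Policy: default-src 'self'; frame-src https://www.youtube.com https://player.vimeo.com
```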

That said, the Web has demonstrated a resilience that surpasses most of what is expected in consumer computing. If you want to run a video game from 1994 or even 2001 on a modern computer, you’re likely to find that you have to put in considerably more work than you would have on the day it was released! But even some of the oldest webpages still in existence remain usable today.

Occasionally, though, a “hip” modern technology without the backing of widespread browser standards comes along and creates a dark age. Flash created such a dark age; now there are millions of Flash-dependent web pages that simply don’t work any longer. Java created another. And I worry that the unnecessary overuse of front-end rendering technologies is creating a third that we’re living through right now, oblivious to the data we’re creating and losing.

Let’s think about longevity.

Which emoji scissors close?


Ah, scissors. They’re important enough that we have an emoji for them. On your device, it appears as ✂️. Unlike the real world tool it represents, the emoji’s job is to convey the idea, especially at small sizes. It doesn’t need to be able to swing or cut things. Nevertheless, let’s judge them on that irrelevant criterion.

I’ve watched from afar as the Internet collapsed in on itself during a debate about what a cheeseburger looks like and I’ve raged a little myself at the popular depiction of juggling, but this newly-identified emoji failure is a whole new thing. Sure, emoji are supposed to be representative, not realistic. But if they have to cover sufficient diversity to include gender-neutral representations (which they absolutely should, and should have done from the start, but at least we’re fixing some of the issues in hindsight, like Ido did) then perhaps they could also include sufficient attention to detail that the tools they depict would actually, y’know, work?

It’s 2020 and you’re in the future


West Germany’s 1974 World Cup victory happened closer to the first World Cup in 1930 than to today.

The Wonder Years aired from 1988 to 1993 and depicted the years between 1968 and 1973. When I watched the show, it felt like it was set in a time long ago. If a new Wonder Years premiered today, it would cover the years between 2000 and 2005.

Also, remember when Jurassic Park, The Lion King, and Forrest Gump came out in theaters? Closer to the moon landing than today.

These things come around now and again, but I’m not sure of the universal validity of observing that a memorable event is now closer to another memorable event than it is to the present day. I don’t think that the relevance of events is as linear as that. Instead, perhaps, it looks something like this:

Graph showing that recent events matter a lot, but rapidly tail off for a while before levelling out again as they become long-ago events.
Recent events matter more than ancient events to the popular consciousness, all other things being equal, but relative to one another the ancient ones are less-relevant and there’s a steep drop-off somewhere between the two.

Where the drop-off in relevance occurs is hard to pinpoint and it probably varies a lot by the type of event that’s being remembered: nobody seems to care about what damn terrible thing Trump did last month or the month before when there’s some new terrible thing he did just this morning, for example (I haven’t looked at the news yet this morning, but honestly whenever you read this post he’ll probably have done something awful).

Nonetheless, this post on Wait But Why was a fun distraction, even if it’s been done before. Maybe the last time it happened was so long ago it’s irrelevant now?

XKCD 1393: Timeghost - 'Hello, Ghostbusters?' 'ooOOoooo people born years after that movie came out are having a second chiiiild right now ooOoooOoo'
Of course, there’s a relevant XKCD. And it was published closer to the theatrical releases of Cloudy with a Chance of Meatballs and Paranormal Activity than it was to today. OoooOOoooOOoh.

The Legend of the Homicidal Fire-Proof Salamander


In the first century AD, Roman naturalist Pliny the Elder threw a salamander into a fire. He wanted to see if it could indeed not only survive the flames, but extinguish them, as Aristotle had claimed such creatures could. But the salamander didn’t … uh … make it.

Yet that didn’t stop the legend of the fire-proof salamander (a name derived from the Persian meaning “fire within”) from persisting for 1,500 more years, from the Ancient Romans to the Middle Ages on up to the alchemists of the Renaissance. Some even believed it was born in fire, like the legendary Phoenix, only slimier and a bit less dramatic. And that its fur (huh?) could be used to weave fire-resistant garments.

Back when the world felt bigger and more-mysterious it was easier for people to come to the conclusion, based on half-understood stories passed-on many times, that creatures like unicorns, dragons, and whatever the Vegetable Lamb of Tartary was supposed to be, might exist just beyond the horizons. Nature was full of mystery and the simple answer – that salamanders might live in logs and then run to escape when those logs are thrown onto a fire – was far less-appealing than the idea that they might be born from the fire itself! Let’s not forget that well into the Middle Ages it was widely believed that many forms of life appeared not through reproduction but by spontaneous generation: clams forming themselves out of sand, maggots out of meat, and so on… with this underlying philosophy, it’s easy to make the leap that sure, amphibians from fire makes sense too, right?

Perhaps my favourite example of such things is the barnacle goose, which – prior to the realisation that birds migrate, and coupled with the fact that it was never seen to nest in England – led to the widespread belief that these geese spontaneously developed (at the appropriate point in the season) from shellfish… this may be the root of the word “barnacle” as used to describe the filter-feeders with which we’re familiar. So prevalent was this belief that well into the 15th century (and in some parts of the world the late 18th century) this particular species of goose was treated as being a fish, not a bird, for the purpose of Christian fast-days.

Anyway; that diversion aside, this article’s an interesting look at the history of mythological beliefs about salamanders.

This Page is Designed to Last: A Manifesto for Preserving Content on the Web


How do we make web content that can last and be maintained for at least 10 years? As someone studying human-computer interaction, I naturally think of the stakeholders we aren’t supporting. Right now putting up web content is optimized for either the professional web developer (who uses the latest frameworks and workflows) or the non-tech-savvy user (who uses a platform).

But I think we should consider both 1) the casual web content “maintainer”, someone who doesn’t constantly stay up to date with the latest web technologies, which means the website needs to have low maintenance needs; and 2) the crawlers who preserve the content and personal archivers, the “archiver”, which means the website should be easy to save and interpret.

So my proposal is seven unconventional guidelines in how we handle websites designed to be informative, to make them easy to maintain and preserve. The guiding intention is that the maintainer will try to keep the website up for at least 10 years, maybe even 20 or 30 years. These are not controversial views necessarily, but are aspirations that are not mainstream—a manifesto for a long-lasting website.

This page is designed to last, too. In fact, virtually every post of any type I’ve made to this blog (since 2003; older content may vary) has been designed with the intention that it ought to be accessible without dependence on CSS, JavaScript, or any proprietary technology, that the code should be as human-readable as possible, and that the site itself should be as “archivable” as possible, just as a matter of course.
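If you’ve not seen it spelled out before, here’s a contrived sketch of the flavour of markup I mean (the post itself is invented): plain, semantic HTML that any browser, crawler, or archiver of the last couple of decades can interpret, with no build step or script required.

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>An Example Post</title>
  </head>
  <body>
    <article>
      <h1>An Example Post</h1>
      <p>Published <time datetime="2003-12-14">14 December 2003</time>.</p>
      <p>The content remains readable even with stylesheets and
         scripts stripped away, which is most of what “archivable”
         demands.</p>
    </article>
  </body>
</html>
```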

But that’s only 15 years of effort dedicated to longevity, and I’ve still not achieved 100% success! For example, consider my blog post of 14 December 2003, describing the preceding Troma Night, whose content was lost during the great server failure of July 2004 and which the backups were unable to completely restore. I’m more-careful now, with more redundancies and backups, but it’s still always going to be the case that a sufficiently-devastating set of simultaneous failures could take this content away. All information has fragility and we can work to mitigate it but we can never completely solve it.

The large number of dead outbound links on the older parts of my site is both worrying – that most others don’t seem to have the same level of commitment to the retention of articles on the Web – and reassuring – that I’m doing significantly better than the average. So next, I guess, I need to focus my attention – like Jeff is – on how we can make such efforts usable by other people, too. The Web belongs to all of us, after all.

I ignored warnings from friends and family not to marry my husband


When I was 20, a man I barely knew proposed without a ring.

I said yes.

Our friends were alarmed about our fast decisions to marry and move from Tennessee to New York City. I got a handwritten letter from an elder at church suggesting I wait to get to know my fiance better. His friends held a tearful intervention. One of our beloved professors questioned the decision. My mother referred to my fiance not by his name — David — but by the nickname “rank stranger.”

But we were in love. After refusing premarital counseling (we didn’t need it, we insisted), David and I got married and moved to Gramercy Park. We could see the Empire State Building at night when it was illuminated, if we craned our necks while sitting on our creaky fire escape.

My life was as romantic as a love song. Then, after one week of marriage, the phone rang.

Delightful story full of twists and turns on The Washington Post (warning: their adwall has a less-than-ethical/probably-not-legal approach to GDPR compliance for those of us in Europe, so you might like to obfuscate your footprint or at least use privacy mode when visiting); it seems like it’s going to be much darker than it is, but turns out surprisingly uplifting. Give it a read.