How the Knights Hospitaller ‘accidentally’ became a major European air power

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Sod Pepsi’s navy.

Let’s talk about the point after WW2 where the Knights Hospitaller, of medieval crusading fame, ‘accidentally’ became a major European air power.

I shitteth ye not.

So, if I asked you to imagine the Knights Hospitaller you probably picture:

1) Angry Christians on armoured horses
2) Them being wiped out long ago like the Templars.
3) Some Dan Brown bullshit

And you would be (mostly) wrong about all three. Which is sort of how this happened.

From the beginning (1113 or so), the Hospitallers were never quite as committed to the angry, horsey thing as the Templars. They had always (ostensibly) been more about protecting pilgrims and healthcare.

They also quite liked boats. Which were useful for both.

Over the next 150 years (or so), as the Christian grip on the Holy Lands waned, both military orders got more involved in their other hobbies – banking for the Templars, mucking around in boats for the Hospitallers.

This proved to be a surprisingly wise decision on the Hospitallers’ part. By 1290ish, both Orders were homeless and weakened.

As the Templars fatally discovered, being weak AND having the King of France owe you money is a bad combo.

Being a useful NAVY, however, wins you friends.

And this is why your first vision of the Hospitallers is wrong. Because they spent the next 500 YEARS, backed by France and Spain, as one of the most powerful naval forces in the Mediterranean, blocking efforts by the Ottomans to expand westwards by sea.

To give you an idea of the trouble they caused: in 1480 Mehmet II sent 70,000 men (against the Knights’ 4,000) to try and boot them out of Rhodes. He failed.

Suleiman the Magnificent FINALLY managed it in 1522 with 200,000 men. But even he had to agree to let the survivors leave.

The surviving Hospitallers hopped on their ships (again) and sailed away. After some vigorous lobbying, in 1530 the King of Spain agreed to rent them Malta, in return for a single Maltese falcon every year.

Because that’s how good rents were pre-housing crisis in Europe.

The Knights turned Malta into ANOTHER fortified island. For the next 200 years ‘the Pope’s own navy’ waged a war of piracy, slavery and (occasionally) pitched sea battles against the Ottomans.

From Malta, they blocked Ottoman strategic access to the western Mediterranean. A point that was not lost on the Ottomans, who sent 40,000 men to try and take the island in 1565 – the ‘Great Siege of Malta’.

The Knights, fighting almost to the last man, held out and won.

Now the important thing here is the CONTINUED EXISTENCE AS A SOVEREIGN STATE of the Knights Hospitaller. They held Malta right up until 1798, when Napoleon finally managed to boot them out on his way to Egypt.

(Partly because the French contingent of the Knights swapped sides)

The British turned up about three months later and the French were sent packing, but, well, it was the British, so:

THE KNIGHTS: Can we have our strategically important island back please?
THE BRITISH: What island?
THE KNIGHTS: That island
THE BRITISH: Nope. Can’t see an island

After the Napoleonic wars no one really wanted to bring up the whole Malta thing with the British (the Putin’s Russia of the era), so the European powers fudged it. They said the Knights were still a sovereign state and tried to sort them out with a new country. But they never did.

The Russian Emperor let them hang out in St Petersburg for a while, but that was awkward (Catholicism vs Orthodoxy). Then the Swedes were persuaded to offer them Gotland.

But every offer was conditional on the Knights dropping their claim to Malta. Which they REFUSED to do.

~ wobbly lines ~

It’s the 1900s. The Knights are still a stateless state complaining about Malta. What that means legally is a can of worms NO ONE wants to open in international law, but they’ve also rediscovered their original mission (healthcare), so everyone kinda ignores them.

The Knights become a pseudo-Red Cross organisation. In WW1 they run ambulance trains and have medical battalions, loosely affiliated with the Italian army (still do). In WW2 they do it too.

Italy surrenders. The Allies move on, then…

Oh dear.

Who wrote this peace deal again?

It turns out the Treaty of Peace with Italy should go FIRMLY into the category of ‘things that seemed a good idea at the time’.

This is because it presupposes that relations between the west and the Soviets will be good, and so limits Italy’s MILITARY.

This is a problem.

Because as the early Cold War ramps up, the US needs to build up its Euro allies ASAP.

But the treaty limits the Italians to 400 airframes, and bans them from owning ANYTHING that might be a bomber.

This can be changed, but not QUICKLY.

Then someone remembers about the Knights.

The Knights might not have any GEOGRAPHY, but because everyone avoided dealing with the tricky international law problem it can be argued – with a straight face – that they are still TECHNICALLY A EUROPEAN SOVEREIGN STATE.

And they’re not bound by the WW2 peace treaty.

Italy (with US/UK/French blessing) approaches the Knights and explains the problem.

The Knights reasonably point out that they’re not in the business of fighting wars anymore, but anything that could be called a SUPPORT aircraft is another matter.

So, in the aftermath of WW2, this is the ballet that happens:

The Italians transfer all of their support and training aircraft to the Knights.

This then frees up the ‘cap room’ to allow the US to boost Italy’s warfighting ability WITHOUT breaking the WW2 peace treaty.

This is why, in the late forties/fifties, a good chunk of the ‘Italian’ air force is flying with a Maltese Cross Roundel.

Because they were not TECHNICALLY Italian. They were the air force of the Sovereign Military Order of Malta.

And that’s how the Knights Hospitaller ended up becoming a major air power.

Eventually the treaties were reworked, and everything was quietly transferred back. I suspect it’s a reason why the sovereign status of the Knights remains unchallenged even today, though.

And that’s why today, even though they are now fully committed to the Red-Cross-esque stuff, they can still issue passports, are a permanent observer at the UN, have a currency…

…and even have a tiny bit of Malta back.


Metropoloid: A Metropolis Remix

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Yaz writes, by way of partial explanation:

You could fit almost the entire history of videogames into the time span covered by the silent film era, yet we consider it a mature medium, rather than one just breaking out of its infancy. Like silent movies, classic games are often incomplete, damaged, or technically limited, but have a beauty all their own. In this spirit, indie game developer Joe Blair and I built Metropoloid, a remix of Fritz Lang’s Metropolis which replaces its famously lost score with that of its contemporaries from the early days of games.

I’ve watched Metropolis a number of times over the decades, in a variety of the stages of its recovery, and I love it. I’ve watched it with a pre-recorded but believed-to-be-faithful soundtrack and I’ve watched it with several different live accompaniments. But this is the first time I’ve watched it to the soundtrack of classic (and contemporary-retro) videogames: the Metroid, Castlevania, Zelda, Mega Man and Final Fantasy series, Doom, Kirby, F-Zero and more. If you’ve got a couple of hours to spare and a love of classic film and classic videogames, then you’re in the slim minority that will get the most out of this fabulous labour of love (which, at the time of my writing, has enjoyed only a few hundred views and a mere 26 “thumbs up”: it certainly deserves a wider audience!).

remysharp comments on “Bringing back the Web of 1990”

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Hi @avapoet, I’m the author of the JavaScript for the WorldWideWeb project, and I did read your thread on the user-agent missing and I thought I’d land the fix ;-)

The original WorldWideWeb browser that we based our work on was 0.12 with screenshots from 0.16. Both browsers supported HTTP 0.9 which didn’t send headers. Obviously unintentional that I send the `request` user-agent, so I spent some painful hours trying to get my emulator running NeXT with a networked connection _and_ the WorldWideWeb version 1.0 – which _did_ use HTTP 1.0 and would send a User-Agent, so I could copy it accurately into the emulator code base.

So now metafilter.com renders in the emulator, and the User Agent sent is: CERN-NextStep-WorldWideWeb.app/1.1 libwww/2.07

Thanks again :)

I blogged about the reimplementation of WorldWideWeb by a hackathon team at CERN, and posted a commentary to MetaFilter, too. In doing so, some others observed that it wasn’t capable of showing MetaFilter pages, which was obviously going to be the first thing that anybody did with it and I ought to have checked first. In any case, I later checked out the source code and did some debugging, finding and proposing a fix. It feels cool to be able to say “I improved upon some code written at CERN,” even if it’s only by a technicality.

This comment on the MetaFilter thread, which I only just noticed, is by Remy Sharp, who was part of the team that reimplemented WorldWideWeb as part of that hackathon (his blog posts about the experience: 1, 2, 3, 4, 5), and acknowledges my contribution. Squee!

Yet Another JavaScript Framework

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

It is impossible to answer all of these questions simply. They can, however, be framed by the ideological project of the web itself. The web was built to be open, both technologically as a decentralized network, and philosophically as a democratizing medium. These questions are tricky because the web belongs to no one, yet was built for everyone. Maintaining that spirit takes a lot of work, and requires sometimes slow, but always deliberate decisions about the trajectory of web technologies. We should be careful to consider the mountains of legacy code and libraries that will likely remain on the web for its entire existence. Not just because they are often built with the best of intentions, but because many have been woven into the fabric of the web. If we pull on any one thread too hard, we risk unraveling the whole thing.

A great story about how Firefox nearly broke tens of thousands of websites by following standards, and then didn’t. tl;dr: Javascript has a messy history.

What Does AJAX Even Stand For?

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

…Jesse James Garrett of Adaptive Path finally gave this use of Javascript a name. He called it Asynchronous Javascript and XML (thankfully shortened to Ajax). Ajax wasn’t a single method or technology, but a group of principles based on the work being done at Google that described how to handle JavaScript in more demanding web applications. The short version is actually fairly simple. Use XMLHttpRequest to make a request to the server, take the XML data returned, and lay it out on the page using JavaScript and semantic HTML.

Ajax took off after that. It underpins major parts of modern web development, and has spawned a number of frameworks and methodologies. The technology itself has been swapped out over time. XML was replaced by JSON, and XMLHttpRequest by fetch. That doesn’t matter though. Ajax showed a lot of developers that the web wasn’t just about documents. The web could be used to build applications. That might seem like a small notion, but trust me when I say, it was momentous.

The History of The Web is great, and you should read it, but this piece is particularly interesting. Ajax and its spiritual successors laid the groundwork for rich Internet applications, shell apps, and the real-time Web, but they also paved the way for a handful of the inevitable practices that went alongside: Javascript-required websites, API-driven web and mobile applications with virtually no HTML content whatsoever… and these things have begun to become problematic for the health of the Web as a whole.

I love Ajax, but it absolutely must be employed as progressive enhancement where possible.
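
To make that concrete, here’s roughly what I mean – a minimal sketch of my own, not from the linked article – in which a plain HTML form keeps working with JavaScript disabled, and fetch merely upgrades it where it can. The /search endpoint, element IDs and JSON shape are all made up for illustration:

  // Progressive-enhancement sketch: the form is assumed to be an ordinary
  // <form id="search-form" action="/search" method="get"> that works fine
  // without JavaScript; fetch() only upgrades the experience when available.
  // The "/search" endpoint, element IDs and JSON shape are hypothetical.
  const form = document.querySelector('#search-form');

  if (form && 'fetch' in window) {
    form.addEventListener('submit', async (event) => {
      event.preventDefault(); // only intercept when we can do something better

      const params = new URLSearchParams(new FormData(form));
      try {
        const response = await fetch(`${form.action}?${params}`, {
          headers: { Accept: 'application/json' },
        });
        if (!response.ok) throw new Error(`HTTP ${response.status}`);
        const results = await response.json();
        document.querySelector('#results').textContent =
          results.map((r) => r.title).join('\n');
      } catch (err) {
        form.submit(); // on any failure, fall back to a normal full-page submission
      }
    });
  }

The design point is the fallback: when the enhanced path fails for any reason, the user gets the ordinary form submission rather than a broken page.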

Pac-Man: The Untold Story of How We Really Played The Game

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Unrestored Pac-Man machine with worn paint in a specific place on the left-hand side.

Human beings leave physical impressions upon the things they love and use just as much as they do upon the lives of people and the planet they live upon. For every action, there’s a reaction. For every pressure, there’s an effect on mass and volume. And in the impressions left by that combination, particularly if you’re lucky enough to see the sides of a rare, unrestored vintage Pac-Man cabinet, lies the never-before-told story of how we really played the game.

Until now, I don’t believe anyone has ever written about it.

Interesting exploration of the history of the cabinets housing Pac-Man, observing the ergonomic impact of the controls on the way that people would hold the side of the machine and, in turn, how that would affect where and how the paint would wear off.

I love that folks care about this stuff.

Debugging WorldWideWeb

Earlier this week, I mentioned the exciting hackathon that produced a moderately-faithful reimagining of the world’s first Web browser. I was sufficiently excited about it that I not only blogged here but I also posted about it to MetaFilter. Of course, the very first thing that everybody there did was try to load MetaFilter in it, which… didn’t work.

MetaFilter failing to load on the reimagined WorldWideWeb.
500? Really?

People were quick to point this out and assume that it was something to do with the modernity of MetaFilter:

honestly, the disheartening thing is that many metafilter pages don’t seem to work. Oh, the modern web.

Some even went so far as to speculate that the reason related to MetaFilter’s use of CSS and JS:

CSS and JS. They do things. Important things.

This is, of course, complete baloney, and it’s easy to prove to oneself. Firstly, simply using the View Source tool in your browser on a MetaFilter page reveals source code that’s quite comprehensible, even human-readable, without going anywhere near any CSS or JavaScript.

MetaFilter in Lynx: perfectly usable browsing experience
As late as the early 2000s I’d occasionally use Lynx for serious browsing, but any time I’ve used it since it’s been by necessity.

Secondly, it’s pretty simple to try browsing MetaFilter without CSS or JavaScript enabled! I tried in two ways: first, by using Lynx, a text-based browser that’s never supported either of those technologies. I also tried by using Firefox but with them disabled (honestly, I slightly miss when the Web used to look like this):

MetaFilter in Firefox (with CSS and JS disabled)
It only took me three clicks to disable stylesheets and JavaScript in my copy of Firefox… but I’ll be the first to admit that I don’t keep my browser configured like “normal people” probably do.

And thirdly: the error code being returned by the simulated WorldWideWeb browser is a HTTP code 500. Even if you don’t know your HTTP codes (I mean, what kind of weirdo would take the time to memorise them all anyway <ahem>), it’s worth learning this: the first digit of a HTTP response code tells you what happened:

  • 1xx means “everything’s fine, keep going”;
  • 2xx means “everything’s fine and we’re done”;
  • 3xx means “try over there”;
  • 4xx means “you did something wrong” (the infamous 404, for example, means you asked for a page that doesn’t exist);
  • 5xx means “the server did something wrong”.

Simple! The fact that the error code begins with a 5 strongly implies that the problem isn’t in the (client-side) reimplementation of WorldWideWeb: if this had been a CSS/JS problem, I’d expect to see a blank page, scrambled content, “filler” content, or incomplete content.
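
If you prefer your rules of thumb as code, the first-digit trick fits in a few lines – a throwaway sketch of my own, nothing to do with the WorldWideWeb project itself:

  // Rough-and-ready gist of an HTTP status code, going only by its first digit.
  function statusGist(code) {
    switch (Math.floor(code / 100)) {
      case 1: return "everything's fine, keep going";
      case 2: return "everything's fine and we're done";
      case 3: return "try over there";
      case 4: return "you did something wrong";
      case 5: return "the server did something wrong";
      default: return "that isn't an HTTP status code I recognise";
    }
  }

  statusGist(404); // "you did something wrong"
  statusGist(500); // "the server did something wrong"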

So I found myself wondering what the real problem was. This is, of course, where my geek flag becomes most-visible: what we’re talking about, let’s not forget, is a fringe problem in an incomplete simulation of an ancient computer program that nobody uses. Odds are incredibly good that nobody on Earth cares about this except, right now, for me.

Dan's proposed "Geek Flag"
I searched for a “Geek Flag” and didn’t like anything I saw, so I came up with this one based on… well, if you recognise what it’s based on, good for you, you’re certainly allowed to fly it. If not… well, you can too: there’s no geek-gatekeeping here.

Luckily, I spotted Jeremy’s note that the source code for the WorldWideWeb simulator was now available, so I downloaded a copy to take a look. Here’s what’s happening:

  1. The (simulated) copy of WorldWideWeb is asked to open a document by reference, e.g. “https://www.metafilter.com/”.
  2. To work around same-origin policy restrictions, the request is sent to an API which acts as a proxy server.
  3. The API makes a request using the Node package “request” with this line of code: request(url, (error, response, body) => { ... }).  When the first parameter to request is a (string) URL, the module uses its default settings for all of the other options, which means that it doesn’t set the User-Agent header (an optional part of a Web request where the computer making the request identifies the software that’s asking).
  4. MetaFilter, for some reason, blocks requests whose User-Agent isn’t set. This is weird! And nonstandard: while web browsers should – in RFC2119 terms – set their User-Agent: header, web servers shouldn’t require that they do so. MetaFilter returns a 403 and a message to say “Forbidden”; usually a message you only see if you’re trying to access a resource that requires session authentication and you haven’t logged-in yet.
  5. The API is programmed to handle response codes 200 (okay!) and 404 (not found), but if it gets anything else back it’s supposed to throw a 400 (bad request). Except there’s a bug: when trying to throw a 400, it requires that an error message has been set by the request module, and if there hasn’t been… it instead throws a 500 with the message “Internal Server Fangle” and no clue what actually went wrong. So MetaFilter’s 403 gets translated by the proxy into a 400, which it fails to render because a 403 doesn’t actually produce an error message, and so it gets translated again into the 500 that you eventually see. What a knock-on effect!
Illustration showing conversation between simulated WorldWideWeb and MetaFilter via an API that ultimately sends requests without a User-Agent, gets a 403 in response, and can't handle the 403 and so returns a confusing 500.
If you’re having difficulty visualising the process, this diagram might help you to continue your struggle with that visualisation.

The fix is simple: change the line:

request(url, (error, response, body) => { ... })

to:

request({ url: url, headers: { 'User-Agent': 'WorldWideWeb' } }, (error, response, body) => { ... })

This then sets a User-Agent header and makes servers that require one, such as MetaFilter, respond appropriately. I don’t know whether WorldWideWeb originally set a User-Agent header (CERN’s source file archive seems to be missing the relevant C sources so I can’t check) but I suspect that it did, so this change actually improves the fidelity of the emulation as a bonus. A better fix would also add support for and appropriate handling of other HTTP response codes, but that’s a story for another day, I guess.
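
If you’re curious what that “better fix” might look like, here’s my guess at the general shape of it – a hypothetical sketch, not code from the actual project; the Express-style handler and the “uri” query parameter are assumptions – in which the proxy simply relays whatever status code and body the upstream server returned:

  // Hypothetical proxy handler: relay the upstream status code and body as-is,
  // so a 403 from MetaFilter arrives as a 403 rather than a mysterious 500.
  // The Express-style (req, res) signature and the "uri" query parameter are
  // assumptions made for the sake of the sketch.
  const request = require('request');

  function proxyHandler(req, res) {
    const url = req.query.uri;

    request({ url: url, headers: { 'User-Agent': 'WorldWideWeb' } }, (error, response, body) => {
      if (error) {
        // A genuine network or DNS failure: report it as a gateway error.
        return res.status(502).send(`Upstream request failed: ${error.message}`);
      }
      // Otherwise, pass along exactly what the upstream server said.
      res.status(response.statusCode).send(body);
    });
  }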

I know the hackathon’s over, but I wonder if they’re taking pull requests…


WorldWideWeb, 30 years on

This month, a collection of some of my favourite geeks got invited to CERN in Geneva to participate in a week-long hackathon with the aim of reimplementing WorldWideWeb – the first web browser, circa 1990-1994 – as a web application. I’m super jealous, but I’m also really pleased with what they managed to produce.

DanQ.me as displayed by the reimagined WorldWideWeb browser circa 1990
With the exception of a few character entity quirks, this site remains perfectly usable in the simulated WorldWideWeb browser. Clearly I wasn’t the only person to try this vanity-check…

This represents a huge leap forward from their last similar project, which aimed to recreate the line mode browser: the first web browser that didn’t require a NeXT computer to run it and so a leap forward in mainstream appeal. In some ways, you might expect reimplementing WorldWideWeb to be easier, because its functionality is more-similar to that of a modern browser, but there were doubtless some challenges too: this early browser predated the concept of the DOM and so there are distinct processing differences that must be considered to get a truly authentic experience.

Geeks hacking on WorldWideWeb reborn
It’s just like any other hackathon, if you ignore the enormous particle collider underneath it.

Among their outputs, the team also produced a cool timeline of the Web, which – thanks to some careful authorship – is as legible in WorldWideWeb as it is in a modern browser (if, admittedly, a little less pretty).

WorldWideWeb screenshot by Sir Tim Berners-Lee
When Sir Tim took this screenshot, he could never have predicted the way the Web would change, technically, over the next 25-30 years. But I’m almost more-interested in how it’s stayed the same.

In an age of increasing Single Page Applications and API-driven sites and “apps”, it’s nice to be reminded that if you develop right for the Web, your content will be visible (sort-of; I’m aware that there are some liberties taken here in memory and processing limitations, protocols and negotiation) on machines 30 years old, and that gives me hope that adherence to the same solid standards gives us a chance of writing pages today that look just as good in 30 years to come. Compare that to a proprietary technology like Flash whose heyday 15 years ago is overshadowed by its imminent death (not to mention Java applets or ActiveX <shudders>), iOS apps which stopped working when the operating system went 64-bit, and websites which only work in specific browsers (traditionally Internet Explorer, though as I’ve complained before we’re getting more and more Chrome-only sites).

The Web is a success story in open standards, natural and by-design progressive enhancement, and the future-proof archivability of human-readable code. Long live the Web.

Update 24 February 2019: After I submitted news of the browser to MetaFilter, I (and others) spotted a bug. So I came up with a fix…


Finding Lena Forsen, the Patron Saint of JPEGs

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Lena, then and now

Every morning, Lena Forsen wakes up beneath a brass-trimmed wooden mantel clock dedicated to “The First Lady of the Internet.”

It was presented to her more than two decades ago by the Society for Imaging Science and Technology, in recognition of the pivotal—and altogether unexpected—role she played in shaping the digital world as we know it.

Among some computer engineers, Lena is a mythic figure, a mononym on par with Woz or Zuck. Whether or not you know her face, you’ve used the technology it helped create; practically every photo you’ve ever taken, every website you’ve ever visited, every meme you’ve ever shared owes some small debt to Lena. Yet today, as a 67-year-old retiree living in her native Sweden, she remains a little mystified by her own fame. “I’m just surprised that it never ends,” she told me recently.

I’m not sure that it’s fair to say that Lena “remained a mystery” until now – the article itself identifies several events she’s attended in her capacity as “first lady of the Internet” – but this is still a great article about a picture that you might have seen but never understood the significance of, nor the person in front of the lens. Oh, and it’s pronounced “lee-na”; did you know?

RFC-20

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

The choice of this encoding has made ASCII-compatible standards the language that computers use to communicate to this day.

Even casual internet users have probably encountered a URL with “%20” in it where there logically ought to be a space character. If we look at this RFC we see this:

   Column/Row  Symbol      Name

   2/0         SP          Space (Normally Non-Printing)

Hey would you look at that! Column 2, row 0 (2,0; 20!) is what stands for “space”. When you see that “%20”, it’s because of this RFC, which exists because of some bureaucratic decisions made in the 1950s and 1960s.
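
If you want to see that connection for yourself, a couple of lines in a browser console will do it (my aside, not part of the quoted post):

  // Column 2, row 0 of the ASCII table is hexadecimal 0x20 (decimal 32): the space.
  String.fromCharCode(0x20);          // " "
  ' '.charCodeAt(0).toString(16);     // "20"

  // Percent-encoding just writes that hex value after a "%".
  encodeURIComponent('hello world');  // "hello%20world"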

Darius Kazemi is reading a single RFC every day throughout 2019 and writing up his understanding as to the content and importance of each. It’s good reading if you’re “into” RFCs and it’s probably pretty interesting if you’re just a casual Internet historian.

Edge may be becoming Chromium-powered, and that’s terrible

Microsoft engineers have been spotted committing code to Chromium, the backend of Google Chrome and many other web browsers. This, among other things, has led to speculation that Microsoft’s browser, Edge, might be planned to switch from its current rendering engine (EdgeHTML) to Blink (Chromium’s). This is bad news.

This page in Microsoft Edge
This post, as it would appear if you were looking at it in Edge. Which you might be, I suppose.

The younger generation of web developers are likely to hail this as good news: one fewer engine to develop for and test in, they’re all already using Chrome or something similar (and certainly not Edge) for development and debugging anyway, etc. The problem, perhaps, is that they’re too young to remember the First Browser War and its aftermath. Let me summarise:

  1. Once upon a time – let’s call it the mid-1990s – there were several web browsers: Netscape Navigator, Internet Explorer, Opera, etc. They all used different rendering engines and so development was sometimes a bit of a pain, but only if you wanted to use the latest most cutting-edge features: if you were happy with the standard, established features of the Web then your site would work anywhere, as has always been the case.
    Best viewed with... any damn browser
  2. Then, everybody started using just one browser: following some shady dealings and monopoly abuse, 90%+ of Web users started using just one web browser, Internet Explorer. By the time anybody took notice, their rivals had been economically crippled beyond any reasonable chance of recovery, but the worst had yet to come…
    Best viewed with Internet Explorer
  3. Developers started targeting only that one browser: instead of making websites, developers started making “Internet Explorer sites” which were only tested in that one browser or, worse yet, only worked at all in that browser, actively undermining the Web’s position as an open platform. As the grip of the monopoly grew tighter, technological innovation was centred around this single platform, leading to decade-long knock-on effects.
  4. The Web ceased to grow new features: from the release of Internet Explorer 6 there were no significant developments in the technology of the Web for many years. The lack of competition pushed us into a period of stagnation. A decade and a half later, we’re only just (finally) finishing shaking off this unpleasant bit of our history.
    "Netscape sux"

History looks set to repeat itself. Substitute Chrome in place of Internet Explorer and update the references to other web browsers and the steps above could be our future history, too. Right now, we’re somewhere in or around step #2 – Chrome is the dominant browser – and we’re starting to see the beginnings of step #3: more and more “Chrome only” sites. More-alarmingly this time around, Google’s position in providing many major Web services allows them to “push” even harder for this kind of change, even just subtly: if you make the switch from Chrome to e.g. Firefox (and you absolutely should) you might find that YouTube runs slower for you because YouTube’s (Google) engineers favour Google’s web browser.

Chrome is becoming the new Internet Explorer 6, and that’s a huge problem. Rachel Nabors wrote in her excellent article The Ecological Impact of Browser Diversity:

So these are the three browser engines we have: WebKit/Blink, Gecko, and EdgeHTML. We are unlikely to get any brand new bloodlines in the foreseeable future. This is it.

If we lose one of those browser engines, we lose its lineage, every permutation of that engine that would follow, and the unique takes on the Web it could allow for.

And it’s not likely to be replaced.

The Circle of Browsers, by Rachel Nabors

Imagine a planet populated only by hummingbirds, dolphins, and horses. Say all the dolphins died out. In the far, far future, hummingbirds or horses could evolve into something that could swim in the ocean like a dolphin. Indeed, ichthyosaurs in the era of dinosaurs looked much like dolphins. But that creature would be very different from a true dolphin: even ichthyosaurs never developed echolocation. We would wait a very long time (possibly forever) for a bloodline to evolve the traits we already have present in other bloodlines today. So, why is it ok to stand by or even encourage the extinction of one of these valuable, unique lineages?

We have already lost one.

We used to have four major rendering engines, but Opera halted development of its own rendering engine Presto before adopting Blink.

Three left. Spend them wisely.

As much as I don’t like having to work around the quirks in all of the different browsers I test in, daily, it’s way preferable to a return to the dark days of the Web circa most of the first decade of this century. Please help keep browsers diverse: nobody wants to start seeing this shit –

Best viewed with Google Chrome

Update: this is now confirmed. A sad day for the Web.


The peculiar history of the Ordnance Survey

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

The history of the organisation known as OS is not merely that of a group of earnest blokes with a penchant for triangulation and an ever-present soundtrack of rustling cagoules.

From its roots in military strategy to its current incarnation as producer of the rambler’s navigational aid, the government-owned company has been checking and rechecking all 243,241 sq km (93,916 sq miles) of Great Britain for 227 years. Here are some of the more peculiar elements in the past of the famous map-makers.

Learning BASIC Like It’s 1983

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

C64 showing coloured bars

Now, it’s Saturday morning and you’re eager to try out what you’ve learned. One of the first things the manual teaches you how to do is change the colors on the display. You follow the instructions, pressing CTRL-9 to enter reverse type mode and then holding down the space bar to create long lines. You swap between colors using CTRL-1 through CTRL-8, reveling in your sudden new power over the TV screen.

As cool as this is, you realize it doesn’t count as programming. In order to program the computer, you learned last night, you have to speak to it in a language called BASIC. To you, BASIC seems like something out of Star Wars, but BASIC is, by 1983, almost two decades old. It was invented by two Dartmouth professors, John Kemeny and Tom Kurtz, who wanted to make computing accessible to undergraduates in the social sciences and humanities. It was widely available on minicomputers and popular in college math classes. It then became standard on microcomputers after Bill Gates and Paul Allen wrote the MicroSoft BASIC interpreter for the Altair. But the manual doesn’t explain any of this and you won’t learn it for many years.

One of the first BASIC commands the manual suggests you try is the PRINT command. You type in PRINT "COMMODORE 64", slowly, since it takes you a while to find the quotation mark symbol above the 2 key. You hit RETURN and this time, instead of complaining, the computer does exactly what you told it to do and displays “COMMODORE 64” on the next line.

Now you try using the PRINT command on all sorts of different things: two numbers added together, two numbers multiplied together, even several decimal numbers. You stop typing out PRINT and instead use ?, since the manual has advised you that ? is an abbreviation for PRINT often used by expert programmers. You feel like an expert already, but then you remember that you haven’t even made it to chapter three, “Beginning BASIC Programming.”

I had an Amstrad CPC, myself, but I had friends with C64s and ZX Spectrums and – being slightly older than the author – I got the opportunity to experiment with BASIC programming on all of them (and went on to write all manner of tools on the CPC 464, 664, and 6128 models). I’m fortunate to have been able to get started in programming in an era when your first experience of writing code didn’t have to start with an examination of the different language choices nor downloading and installing some kind of interpreter or compiler: microcomputers used to just drop you at a prompt which was your interpreter! I think it’s a really valuable experience for a child to have.


People don’t change

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Fundamentally, people haven’t changed much in tens of thousands of years. If ancient Egyptians had smartphones, you know full well that they’d have been posting cat pictures too. What can we learn from this and how should we look at our role when developing front-end Web experiences?