Official Post from The Video Game History Foundation:
Something pretty fun happened yesterday that I wanted to share with you all: a bot on Twitter accidentally provided the clue that finally solved a 28-year-old mystery about a DOS game
that never shipped.
Yesterday, the VGHF Twitter account was tagged in a thread by
@awesomonster, who was frantically trying to figure out the origins of a screenshot:
HackerRank has published its 2018 Developer Skills Report. The paper looks at a number of things essential to understanding the developer landscape, exploring the perks coders demand from their workplaces, the technologies they prefer to use, and how they entered the software development industry in the first place.
While perusing the paper, something struck me as particularly interesting. One of the questions HackerRank asked its community was when they started coding. It then organized the data
by age and country.
Almost immediately, you notice an interesting trend. Those in the 18 to 24 age group overwhelmingly started their programming journey in their late teens: 68.2 percent started coding between the ages of 16 and 20.
When you look at older generations, you notice another striking trend: a comparatively larger proportion started programming between the ages of five and ten. 12.2 percent of those
aged between 35 and 44 started programming then.
It’s obvious why that is. That generation was lucky enough to be born at the start of the home computing revolution, when machines bearing the logos of Acorn and Commodore first
entered the living rooms of ordinary people.
…
This survey parallels my own experience: that among developers, those of us who grew up using an 80s microcomputer at home were likely to have started programming a decade or so younger
than those who grew up later, when the PC had come to dominate. I’ve written before about why I care about programming education, and I still think
that we’re not doing enough to show young learners what’s “under the bonnet” of our computer systems. A computer isn’t just a machine you can use; it’s a tool you can adapt: unlike the other machines you use, which are typically built for a particular purpose, a computer is a general-purpose tool and can be made to do an infinite number of different tasks!
And even if programming professionally isn’t “for you” (and it shouldn’t be for everyone!), understanding broadly how a tool – a tool that we all come into contact with every single day – can be adapted makes us hugely better able to understand what it’s capable of, and pushes us forwards. Imagine how many young inventors would be able to realise their idea for the “killer app” they’ve dreamed up (even if they remained unable to program it themselves) if they understood the fundamental limitations and strengths of the platforms, how to express their idea unambiguously in a way that a programmer could develop, and how to assess its progress without falling into the “happy path” testing problem.
I’m not claiming that late Gen Xers are better programmers than Millennials, by the way: absolutely not saying that! I’m saying that they were often lucky enough to be shaped by
an experience that got them into programming earlier. And that I wish we could find a way to offer that opportunity to today’s children too.
Official Post from Rob Sheridan:
That goober you see above is me as a nerdy high school kid in my bedroom in 1998, being interviewed on TV for a dumb website I made. Allow me to explain.
20 years ago this month, an episode of the TV show Ally McBeal featured a
strange animated baby dancing the cha-cha in a vision experienced by the show’s titular
character. It immediately became an unlikely pop culture sensation, and by the tail end of the 90s you couldn’t pass a mall t-shirt kiosk or a Spencer’s Gifts without seeing corny
merchandise for The Dancing Baby, or “Oogachaka Baby” as it was sometimes
known. This child of the Uncanny Valley was an offensively banal phenomenon: It had no depth, no meaning, no commentary, no narrative. It was just a dumb video loop from the internet,
something your nerdiest co-worker would have emailed you for a ten-second chuckle. We know these frivolous bite-sized jokes as memes now, and they’re wildly pervasive in popular
culture. You can get every type of Grumpy Cat merchandise imaginable, for example, despite the property being nothing more than a photo of a cranky-looking feline with some
added text. We know what memes are in 2018 but in 1997, we didn’t. The breathtaking stupidity of The Dancing Baby’s popularity was a strange development with online origins that had
no cultural precedent. It’s a cringe-worthy thing to look back on, appropriately relegated to the dumpster of regrettable 90s fads. But I have a confession to make: The Dancing Baby
was kinda my fault.
…
Internet memes of the 1990s were a very different beast to those you see today. A combination of slow connection speeds, a smaller population of “netizens” (can you believe we used to call ourselves that?), and the fact that many of the things we take for granted today – animated GIFs, web pages with music – were then cutting-edge or experimental technologies meant that memes spread more slowly and lived for longer. Whereas today a meme can be born and die in the fraction of a heartbeat it takes for you to discover it, the memes of the 1990s grew gradually and truly organically: there was not yet any market for attempting to “manufacture” a meme. Even if you were thoroughly plugged-in to Net culture, by the time you
discovered a new meme it could be weeks or months old and still thriving, and spin-off memes (like the dozens of sites that followed the theme of the Hampster Dance) almost
existed to pay homage to the originals, rather than in an effort to supplant them.
I’m aware that meme culture predates the Dancing Baby, and I had the privilege of seeing it develop on e.g. newsgroups beforehand. But the early Web provided a fascinating breeding
ground for a new kind of meme: one that brushed up against mainstream culture and helped to put the Internet onto more people’s mental maps: consider the media reaction to the
appearance of the Dancing Baby on Ally McBeal. So as much as you might want to wrap your hands around the throat of the greasy teenager in the picture, above, I think that in a
way we should be thanking him for his admittedly-accidental work in helping bring geek culture into the sight of popular culture.
And I’m not just saying that because I, too, spent the latter half of the 1990s putting things online that I ought by rights to have been embarrassed by in hindsight. ;-)
‘The Pig War’ is perhaps one of the most obscure and unusual wars in history. The story begins back in 1846 when the Oregon Treaty was signed between the US and Britain. The treaty aimed to put to rest a long standing border dispute between
the US and British North America (later to be Canada), specifically relating to the land between the Rocky Mountains and the Pacific coastline.
The Oregon Treaty stated that the US / British American border be drawn at the 49th parallel, a division which remains to this day. Although this all sounds rather straightforward,
the situation became slightly more complicated when it came to a set of islands situated to the south-west of Vancouver. Around this region, the treaty stated that the border be drawn through ‘the middle of the channel separating the continent from Vancouver’s Island.’ As you can see from the map below, simply drawing a line through the middle of
the channel was always going to be difficult due to the awkward positioning of the islands.
The time capsule was buried in a secluded square in Murmansk in 1967 on the eve of the fiftieth anniversary of the Russian Revolution. Inside was a
message dedicated to the citizens of the Communist future. At short notice, the authorities brought forward the capsule’s exhumation by ten days, to coincide with the city’s 101st
birthday. With the stroke of an official’s pen, a mid-century Soviet relic was enlisted to honour one of the last acts of Tsar (now Saint) Nicholas II, who founded my hometown in
October 1916. From socialism to monarchism in ten days. Some of the city’s pensioners accused the local government of trying to suppress the sacred memory of the revolution. ‘Our
forefathers would be turning in their graves,’ one woman wrote in a letter to the local paper. The time capsule ‘is not some kind of birthday present to the city; it’s a reminder of
the centenary of the great October Revolution and its human cost.’
My father had watched the time capsule being buried. He came to Murmansk aged 17. From his remote village, he had dreamed of the sea but he failed the navy’s eye test. In October
1967, he was a second-year student at the Higher Marine Engineering Academy, an elite training school for the Soviet Union’s massive fishing fleet. As a year-round warm water port,
Murmansk – the largest human settlement above the Arctic Circle – is a major fishing and shipping hub, home to the world’s only fleet of nuclear-powered ice-breakers…
My dinner-party party piece for many years was to say, “Well, actually, I invented Baileys. You know, Baileys Irish Cream. I did that back in 1973.”
If one of the unfortunate listening group is a woman – and this is based on actual past experience – she is likely to respond something like this: “Oh-my-God. Baileys. My mother
absolutely adores it. Did you hear that, Jocasta? This man invented Baileys. It’s unreal. I don’t believe it. He must be terribly rich. Baileys Cream. Wow!”
And it’s not as if these rather posh people really adore Baileys. Or even hold it in the same esteem as, say, an obscure Islay single malt or a fine white burgundy from Meursault. Not
a bit of it. They might have respected it years ago but most people of legal drinking age regard Baileys as a bit naff. To my mind, they’d be very wrong…
This is a story of a country that journeyed from rags to riches and back to rags. It’s a cautionary tale of what happens when a nation exploits its natural resources at the expense of
people’s lives…
It was September 1738, and Benjamin Lay had walked 20 miles, subsisting on “acorns and peaches,” to reach the Quakers’ Philadelphia Yearly Meeting. Beneath his overcoat he wore a
military uniform and a sword — both anathema to Quaker teachings. He also carried a hollowed-out book with a secret compartment, into which he had tucked a tied-off animal bladder
filled with bright red pokeberry juice.
When it was Lay’s turn to speak, he rose to address the Quakers, many of whom had grown rich and bought African slaves. He was a dwarf, barely four feet tall, but from his small body
came a thunderous voice. God, he intoned, respects all people equally, be they rich or poor, man or woman, white or black.
Throwing his overcoat aside, he spoke his prophecy: “Thus shall God shed the blood of those persons who enslave their fellow creatures.” He raised the book above his head and plunged
the sword through it. As the “blood” gushed down his arm, several members of the congregation swooned. He then splattered it on the heads and bodies of the slave keepers. His message
was clear: Anyone who failed to heed his call must expect death — of body and soul.
Lay did not resist when his fellow Quakers threw him out of the building. He knew he would be disowned by his beloved community for his performance, but he had made his point. As long
as Quakers owned slaves, he would use his body and his words to disrupt their hypocritical routines…
As an engineer for the U.S. Digital Service, Marianne Bellotti has encountered vintage mainframes that are still being used in production — sometimes even powering web apps. Last month she
entertained a San Francisco audience with tales about some of them, in a talk called “7074 says Hello World,” at Joyent’s “Systems We Love” conference.
Created under the Obama administration, the U.S. Digital Service was designed as a start-up-style consultancy to help government agencies modernize their IT operations, drawing engineering talent from Google, Facebook, and other web-scale companies.
Or, as President Obama put it last March, it’s “a SWAT team — a world-class technology office.”
So it was fascinating to hear Bellotti tell stories about some of the older gear still running, and the sometimes unusual ways it was paired with more contemporary technology…
I have a vivid, recurring dream. I climb the stairs in my parents’ house to see my old bedroom. In the back corner, I hear a faint humming.
It’s my old computer, still running my 1990s-era bulletin board system (BBS, for short), “The Cave.” I thought I had shut it down ages ago, but it’s been chugging away this whole time
without me realizing it—people continued calling my BBS to play games, post messages, and upload files. To my astonishment, it never shut down after all…
In the beginning there was NCSA Mosaic, and Mosaic called itself NCSA_Mosaic/2.0 (Windows 3.1), and Mosaic displayed pictures along with text, and there was much rejoicing…
Have you ever wondered why every major web browser identifies itself as “Mozilla”? Wonder no longer…
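The linked article tells the full story, but you can verify the legacy yourself: every mainstream browser today still leads its User-Agent header with a `Mozilla/5.0` token, a fossil of the compatibility games described in the article. A minimal sketch (the UA string below is a typical, illustrative Chrome-on-Windows value, not one captured from a real browser):

```python
# Every major browser's User-Agent still begins with the legacy "Mozilla" token.
# The string below is a typical Chrome-on-Windows value, used here for illustration.
ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36")

# The first space-separated token is the product identifier.
product, _, rest = ua.partition(" ")
print(product)  # Mozilla/5.0
```

Firefox, Safari, Edge and Chrome all do the same, which is exactly the oddity the article sets out to explain.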