Additional Processors

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Computerphile at its best, here tackling the topic of additional (supplementary) processors, like FPUs, GPUs, sound processors, etc., to which CPUs outsource some of their work under specific circumstances. Even speaking as somebody who’s upgraded a 386/SX to a 386/DX through the addition of a “math co-processor” (an FPU) and seen the benefit in applications for which floating point arithmetic was a major part (e.g. some early 3D games), I didn’t really think about what was actually happening until I saw this video. There’s always more to learn, fellow geeks!

The Case Against Quantum Computing

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

Quantum computing is all the rage. It seems like hardly a day goes by without some news outlet describing the extraordinary things this technology promises. Most commentators forget, or just gloss over, the fact that people have been working on quantum computing for decades—and without any practical results to show for it.

We’ve been told that quantum computers could “provide breakthroughs in many disciplines, including materials and drug discovery, the optimization of complex manmade systems, and artificial intelligence.” We’ve been assured that quantum computers will “forever alter our economic, industrial, academic, and societal landscape.” We’ve even been told that “the encryption that protects the world’s most sensitive data may soon be broken” by quantum computers. It has gotten to the point where many researchers in various fields of physics feel obliged to justify whatever work they are doing by claiming that it has some relevance to quantum computing.

Meanwhile, government research agencies, academic departments (many of them funded by government agencies), and corporate laboratories are spending billions of dollars a year developing quantum computers. On Wall Street, Morgan Stanley and other financial giants expect quantum computing to mature soon and are keen to figure out how this technology can help them.

It’s become something of a self-perpetuating arms race, with many organizations seemingly staying in the race if only to avoid being left behind. Some of the world’s top technical talent, at places like Google, IBM, and Microsoft, are working hard, and with lavish resources in state-of-the-art laboratories, to realize their vision of a quantum-computing future.

In light of all this, it’s natural to wonder: When will useful quantum computers be constructed? The most optimistic experts estimate it will take 5 to 10 years. More cautious ones predict 20 to 30 years. (Similar predictions have been voiced, by the way, for the last 20 years.) I belong to a tiny minority that answers, “Not in the foreseeable future.” Having spent decades conducting research in quantum and condensed-matter physics, I’ve developed my very pessimistic view. It’s based on an understanding of the gargantuan technical challenges that would have to be overcome to ever make quantum computing work.

Great article undermining all the most-widespread popular arguments about how quantum computing will revolutionise absolutely everything, any day now. Let’s stay realistic, here: despite all the hype, it might well be the case that it’s impossible to build a quantum computer of sufficient complexity to have any meaningful impact on the world beyond the most highly-experimental and theoretical applications. And even if it is possible, its applications might well be limited: the “great potential” it carries is highly hypothetical.

Don’t get me wrong, I’m super excited about the possibility of quantum computing, too. But as Mikhail points out, we must temper our excitement with a little realism and not give in to the hype.

Reply to Hardware Issue – when did hard drive space get like this?

Hardware Issue (Aquarionics)

So, I am a professional system administrator. It says it on my business cards and everything. Every couple of months, when I have to explain to the receptionist at the London office that yes, I do work here, and so

Nicholas Avenell (Aquarionics) wrote:

(My first hard drive for the Amiga 600 was second hand from my dad’s old laptop. It was SIXTY MEGABYTES. It held DOZENS of games. I would need over EIGHT HUNDRED of those drives to hold a 50Gb World of Warcraft install).

I remember my first hard drive. It was 40Mb, and that felt flipping MASSIVE because I’d previously, like most people, been using floppy disks of no larger than 1.44Mb. My second hard drive was 105Mb and it felt like a huge step-up; I ripped my first MP3s onto that drive, and didn’t care for a moment that they each consumed 2%-3% of the available space (and took about 15 minutes each to encode).
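
If you fancy checking the arithmetic behind those nostalgic numbers, here’s a quick back-of-the-envelope sketch in Python (it assumes a typical early MP3 of roughly 3MB, which is my guess rather than a figure from the post):

```python
# Rough sanity checks on the drive-size claims above (all figures approximate).
print(50_000 / 60)     # ~833: how many 60MB drives a ~50GB install would need
print(3 / 105 * 100)   # ~2.9: a ~3MB MP3 as a percentage of a 105MB drive
```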

Nowadays I look at my general-purpose home desktop’s 12TB RAID array and I think to myself… yeah, but it’s over half full… probably time to plan for the next upgrade. What happened‽ Somewhere along the line, hard drive space became what mobile phone battery level became before it: something you start to worry about if you have less than half left. I don’t know how we got here and I’m not sure I’m happy about it, but suffice to say: technology today is nuts.

What Happens When You Mix Java with a 1960 IBM Mainframe

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

IBM Mainframe

As an engineer for the U.S. Digital Service, Marianne Bellotti has encountered vintage mainframes that are still being used in production — sometimes even powering web apps. Last month she entertained a San Francisco audience with tales about some of them, in a talk called “7074 says Hello World,” at Joyent’s “Systems We Love” conference.

Created under the Obama administration, the U.S. Digital Service was designed as a start-up-styled consultancy to help government agencies modernize their IT operations, drawing engineering talent from Google, Facebook and other web-scale companies.

Or, as President Obama put it last March, it’s “a SWAT team — a world-class technology office.”

So it was fascinating to hear Bellotti tell stories about some of the older gear still running, and the sometimes unusual ways it was paired with more contemporary technology…

The Lost Civilization of Dial-Up Bulletin Board Systems

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

I have a vivid, recurring dream. I climb the stairs in my parents’ house to see my old bedroom. In the back corner, I hear a faint humming.

It’s my old computer, still running my 1990s-era bulletin board system (BBS, for short), “The Cave.” I thought I had shut it down ages ago, but it’s been chugging away this whole time without me realizing it—people continued calling my BBS to play games, post messages, and upload files. To my astonishment, it never shut down after all…

The author’s computer connecting to a BBS in 1996 (Benj Edwards)

The Golden Age of x86 Gaming

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

I’ve been happy with my 2016 HTPC, but the situation has changed, largely because of something I mentioned in passing back in November: The Xbox One and PS4 are effectively plain old PCs, built on:

- Intel Atom class (aka slow) AMD 8-core x86 CPU
- 8 GB RAM
- AMD Radeon 77xx / 78xx GPUs
- cheap commodity…

The 2016 HTPC Build

This article is a repost promoting content originally published elsewhere. See more things Dan's reposted.

I’ve loved many computers in my life, but the HTPC has always had a special place in my heart. It’s the only always-on workhorse computer in our house, it is utterly silent, totally reliable, sips power, and it’s at the center of our home entertainment, networking, storage, and gaming. This handy box does it all,…

Post-It Minesweeper

Remember Minesweeper? It’s probably been forever since you played, so go have a game online now. And there went your afternoon.

A game of Microsoft Minesweeper in progress.
This is actually a pretty tough move.

My geek-crush Ben Foxall posted on Twitter on Monday morning to share that he’d had a moment of fun nostalgia when he’d come into the office to discover that somebody in his team had covered his monitor with two layers of Post-It notes. The bottom layer contained numbers – and bombs! – to represent the result of a Minesweeper board, and the upper layer ‘covered’ them so that individual Post-Its could be removed to reveal what lay beneath. Awesome.

Ben Foxall discovers Post-It Minesweeper
Unlike most computerised implementations of Minesweeper, the first move isn’t guaranteed to be safe. Tread carefully…

Not to be outdone, I hunted around my office and found some mini-Post-Its. Being smaller meant that I could fit more of them onto a monitor and thus make a more-sophisticated (and more-challenging!) play space. But how to generate the board? Sure: I could do it by hand, but that doesn’t seem very elegant at all – plus, humans make really bad random number generators! I didn’t need quantum-tunnelling-seeded Minesweeper (yes, that’s a thing) levels of entropy, sure, but it’d still be nice to outsource the heavy lifting to a computer, right?

Screenshot of my Post-It Minesweeper board generator.
Yes, I’m quite aware of the irony of using a computer to generate a paper-based version of a computer game, why do you ask?

So naturally, I wrote a program to do it for me. Want to see? It’s at danq.me/minesweeper. Just line up some Post-Its on a co-worker’s monitor to work out how many you can fit in each dimension (I found that I could comfortably fit 6 × 4 standard-sized Post-Its, or 7 × 5 or even 8 × 5 mini-sized ones, onto one of the typical widescreen monitors in my office), decide how many mines you want, and click Generate. Don’t like the board you get? Click it again!
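
If you’re curious how a generator like this might work under the hood, here’s a minimal sketch in Python. To be clear, it’s not the actual code behind danq.me/minesweeper, just an illustration of the idea: scatter the requested number of mines at random, then label every other cell with a count of its mined neighbours.

```python
import random

def generate_board(width, height, mines):
    """Randomly place mines, then mark each remaining cell with the
    number of mines in the eight cells surrounding it."""
    mine_cells = random.sample(range(width * height), mines)
    mine_set = {(i % width, i // width) for i in mine_cells}
    board = []
    for y in range(height):
        row = []
        for x in range(width):
            if (x, y) in mine_set:
                row.append("*")                     # a bomb Post-It
            else:
                neighbours = sum((nx, ny) in mine_set
                                 for nx in (x - 1, x, x + 1)
                                 for ny in (y - 1, y, y + 1))
                row.append(str(neighbours))         # 0 means a nice safe sweep
        board.append(row)
    return board

# For example, a 7 x 5 mini-Post-It board with six mines:
for row in generate_board(7, 5, 6):
    print(" ".join(row))
```

Copy the resulting grid onto the bottom layer of Post-Its, cover each one with a blank, and you’ve got a playable board.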

Liz McCarthy tweets about her experience of being given a Post-It Minesweeper game to play.
I set up the first game on my colleague Liz’s computer, before she came in this morning.

And because I was looking for a fresh excuse to play with Periscope, I broadcast the first game I set up live to the Internet. In the end, 66 people ended up watching some or all of a paper-based game of Minesweeper played by my colleague Liz, including moments of cheering her on and, in one weird moment, despair at the revelation that she was married. The internet’s strange, yo.

Anyway: in case you missed the Periscope broadcast, I’ve put it on YouTube. Sorry about the portrait-orientation filming: I think it’s awful, too, but it’s a Periscope thing and I haven’t installed the new update that fixes it yet.

Now go set up a game of Post-It Minesweeper for a friend or co-worker.

Teaching Kids to Code – Why It Matters

The BBC ran a story this week about changes to the National Curriculum that’ll introduce the concepts of computer programming to children at Key Stage 1: that is, between the ages of five and seven. I for one think that this is a very important change, long overdue in our schools. But I don’t feel that way because I think there’ll be a huge market for computer programmers in 13+ years, when these children leave school: rather, I think that learning these programming skills provides – as a secondary benefit – an understanding of technology that kids today lack.

Computer program that asks if you're a boy or a girl and then says it likes that gender. Photograph copyright Steven Luscher, used under Creative Commons license.
Ignoring the implied gender binary (fair enough) and the resulting inefficiency (why do you need to ask two questions?), this is a great example of a simple algorithm.
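
For illustration, the sort of program pictured boils down to something like the sketch below. It’s a hypothetical reconstruction rather than the code actually in the photo, but it shows both the branching and the two-questions-where-one-would-do inefficiency the caption pokes fun at.

```python
# A guess at the kind of beginner program shown above: ask a question,
# branch on the answer, and respond accordingly.
if input("Are you a boy? (yes/no) ").strip().lower() == "yes":
    print("I like boys!")
elif input("Are you a girl? (yes/no) ").strip().lower() == "yes":
    print("I like girls!")
# A single question ("Are you a boy or a girl?") would cover both cases.
```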

Last year, teacher and geek Marc Scott wrote an excellent blog post entitled Kids Can’t Use Computers… And This Is Why It Should Worry You. In it, he spoke of an argument with a colleague who subscribed to the popular belief that children who use computers are more technically-literate than even computer-literate adults. Marc refutes this, retorting that while children today make use of computers more than most adults do (and far more than was typical during the childhood of today’s adults), they typically know far less about what Marc calls “how to use a computer”. His article is well worth reading: if you don’t have the time, you should make the time, and if you can’t do that then here’s the bottom line: competency with Facebook, YouTube, Minecraft, and even Microsoft Office does not in itself demonstrate an understanding of “how to use a computer”. (Marc has since written a follow-up post which is also worth reading.)

Children programming on laptops. Photograph copyright Steven Luscher, used under Creative Commons license.
If the can of Mountain Dew wasn’t clue enough, these children are coding.

An oft-used analogy is that of the automobile. A hundred years ago, very few people owned cars, but those who did knew a lot about the maintenance and inner workings of their vehicles; today you can get by knowing very little (I’ve had car-owning friends who wouldn’t know how to fit the spare tyre after a puncture, for example). In future, the requirements will be lower still: little Annabel might be allowed to ‘drive’ without ever taking a driving test, albeit in a ‘driverless’ computerised car. A similar thing happened with computers: when I was young, few homes had a computer, but in those that did, one or more members of the family invariably knew a lot about setting up, configuring, maintaining, and often programming it. Nowadays, most of the everyday tasks that most people do with a computer (Facebook, YouTube, Minecraft, Microsoft Office etc.) don’t need that level of knowledge. But I still think it’s important.

A 2-year-old using a MacBook. Photograph copyright Donnie Ray Jones, used under Creative Commons license.
A future computer-literate, or just another computer “user”?

Why? Because understanding computers remains fundamental to getting the most out of them. Many of us now carry powerful general-purpose computers in our pockets (disguised as single-purpose devices like phones) and most of us have access to extremely powerful general-purpose computers in the form of laptops and desktops (but only a handful of us use them in a ‘general purpose’ way; for many people, they’re nothing more than a web browser and a word processor). However, we expect people to be able to understand the issues when we ask them – via their elected officials – to make sweeping decisions that affect all of us: decisions about the censorship of the ‘net (should we do it, and to what extent, and can we expect it to work?) or about the automation of our jobs (is it possible, is it desirable, and what role will that leave for humans?). We expect people to know how to protect themselves from threats like malicious hackers and viruses and online scams, but we give them only a modicum of support (“be careful, and install anti-virus software”), knowing full well that most people don’t have the foundation of understanding to follow that instruction. And increasingly, we expect people to trust that software will work in the way that it’s instructed to without being able to observe any feedback. Unlike with your car – where you know it’s not working when it doesn’t go (or, alarmingly, doesn’t stop) – how is the average person to know whether their firewall is working? You can find out how fast your car can go by pressing the pedals, but how are you to know what your computer is capable of without a deeper understanding than is commonplace?

A simple program in Hackety Hack.
I first started to learn to program in Locomotive BASIC. My microcomputer was ready to receive code from the second it booted: no waiting, just programming. Nowadays, there’s a huge barrier to entry (although tools like Hackety Hack, pictured, are trying to make it easier).

A new generation of children taught to think in terms of how computers and their programs actually work – even if they don’t go on to write programs as adults – has the potential to usher in innovative new ways to use our technology. Just as learning a foreign language, even if you don’t go on to regularly use it, helps make you better at your native language, as well as smarter in other ways (and personally, I think we should be teaching elementary Esperanto – or better yet, Ido – to primary school children in order to improve their linguistic skills generally), learning the fundamentals of programming will give children a far greater awareness about computers in general. They’ll be better-able to understand how they work, and thus why they sometimes don’t do what you expect, and better-equipped to solve problems when they see them. They’ll have the comprehension to explain what they want their computer to be able to do, and to come up with new ideas for ways in which general-purpose computers can be used. And, I’ve no doubt, they’ll be better at expressing logical concepts in mutually-intelligible ways, which improves human communication on the whole.

Let’s teach our kids to be able to understand computers, not just “use” them.