John Graham-Cumming’s recent TED Talk, “The Greatest Machine That Never Was,” centers on the idea of the world’s earliest computer. As Graham-Cumming explains, Charles Babbage never built his computing machines; he merely designed them. It wasn’t until British mathematician Alan Turing came along in the 20th century that computers became electrical, like they are today.
Swelling with steampunk imagination and a 1K memory, Babbage’s Analytical Engine is a monster device about the size of a steam engine. With gears and cogs spinning, the materialized version of this machine would be quite impressive, but far beyond outdated. Graham-Cumming proposes to build this archaic computer, which raises the question: Why?
As I watched the video, I’ll admit I drooled over the epic scale of this computer and the old sketches of massive cogs spinning in sequence to calculate multiplication tables. It is a cool idea, but why would we spend our time and money on something like this? I am a history nerd as much as the next editor (I’m an avid reader of Thomas Paine and Thomas Jefferson), but there isn’t any practicality in creating this machine that Babbage left on his to-do list. I’ve always enjoyed pictures, videos, and experiencing old-school technology – everything from old muscle cars to the earliest light bulb. I would relish seeing a device designed nearly two centuries ago finally realized.
The construction of this machine could serve some purpose as a museum display. I find this especially true for teaching children about the inner workings of computers, but can’t we find a cheaper, more efficient way to do so? This brings me to my larger point: Can we continue to glorify the past while keeping our eyes trained on the future? It is a tricky razor’s edge to dance on.
We need to remember our past in order to learn from it – I’m a big advocate for remembering your roots – but embarking on the development of archaic predecessors to modern designs may not be the best route. This device may unveil some of the mystery behind computer technology, but the sheer cost outweighs the learning value. Besides, with 3D printers, we could probably create a scale model quickly and without such encumbering costs.
A project like this begs to be taken on, but it shouldn’t be presented as anything other than a ‘cool’ pet project. People will donate money to this project thinking that they are promoting progress in computer technology when they are really only funding an awesome museum diorama.
Is this deception, or just a guy who genuinely believes his project will change the world (even though, in a sense, it already has)? I’m sure Graham-Cumming would admit that this is a vanity project, and that’s fine. My problem comes when money is dumped (and I put emphasis on the word dumped) into a project that has no practical value. I’m a logical person – or at least I’d like to think so – and I feel I can find more important things to throw my money at. Some might say, “So what’s the big deal? He’s pitching his project. If people want to give him money, that’s their business.” I agree, but the project should be presented more as vanity and less as necessity.
If we want to see advances in computer technology, we need to be smart about our investments. Babbage’s Analytical Engine is a great opportunity for publications like PD&D to show cool things happening in the industry, but the project has little value beyond nostalgia and warm, fuzzy feelings. If you want to see progress, give your money to an engineering grad student who is working on quantum levitation (http://leejinha.com/) or to one of the teams participating in the American Solar Challenge (http://americansolarchallenge.org/). These are the not-for-profit researchers who will challenge the standard and bring us into a new age of technology.
What’s your take? Email email@example.com or comment below.