Nuts, bolts and butterflies’ wings
This week I had a bout of incontinent nostalgia. The BBC posted an article celebrating the release, thirty years ago, of a seminal British home computer – the Sinclair ZX81. This machine was the brainchild of maverick inventor Sir Clive Sinclair. It had 1K of RAM. One kilobyte. It had no screen – you plugged it straight into the TV. The multifunction keys were crowded, squishy and unresponsive. Graphical interface? Ha! This was three years before Apple released their legendary Macintosh. What you saw when you turned on was just black text on a white background – onto which you had to type instructions in the BASIC programming language.
But you know what? It worked. And it was AMAZINGLY cheap. In 1981 the market-leading Apple III cost $3,500. The Sinclair was just $100. This meant that eighteen-year-olds like me could ask their hard-working parents for a home computer without being laughed out of the room. The ZX81 transformed popular computing, and produced an entire generation of computer programmers. I could reminisce for hours about this machine, but I’ll spare you. You can read the gory details here and here.
The key point about the ZX81 was that to make it do anything, you had to get your hands dirty. You had to teach yourself some programming – loops, variables, conditional statements. There were no installed programs. This was no sealed-off, iPad “user experience”. This was pure nuts and bolts.
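And those nuts and bolts were tiny: a loop, a variable, a conditional. Here is a sketch of the kind of first program a ZX81 owner might have typed in – rendered in Python rather than ZX81 BASIC for readability, and invented for illustration rather than taken from any historical listing:

```python
# A loop, a variable and a conditional – the whole toolkit of a
# first home-computer program, in modern dress.
parity = []                      # collect one line of output per number
for n in range(1, 11):           # loop over the numbers 1..10
    if n % 2 == 0:               # conditional: test for evenness
        parity.append(f"{n} is even")
    else:
        parity.append(f"{n} is odd")
print("\n".join(parity))
```

Trivial, yes – but typing something like this and watching the machine obey was the whole point.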
Yet Sinclair sold over a million of these things. People loved them. So why was it considered fine back then for ordinary folk to code their own devices, but not now? Is it just that computing technology has matured so much that we can forget the nuts and bolts, and focus on the shiny surface? Does all technology – radio, TV, motor car – eventually evolve into a sealed black box, inside which all complexity is locked away?
A trip to my son’s school this week threw this question into sharp relief. My son is 14, and making subject choices that will determine his future education. He is scientifically inclined – so his choices are influenced by methods, formulae and data. Nuts and bolts. I was surprised, therefore, at one omission on his list of favourite courses. Computing – now called Information and Communication Technology. He’s fine at it – but never mentions it. I asked him why. The answer was that he found it dull.
Dull? A boy who loves to solve, make and calculate, finds computing dull?
I asked the teacher about the ICT course content – and was astonished to discover that fundamental elements of computer science, engineering and programming have effectively been removed from the subject. The new focus for ICT – as outlined in the UK GCSE curriculum – is on evaluation and presentation of information – not on understanding the processes by which this information is calculated, stored or accessed. Pupils learn to use programs like Word and PowerPoint – they don’t code programs of their own. They study “how to use ICT safely and responsibly”, and learn to know “when and where to use ICT to enhance their learning and the quality of their work”.
Work. That’s the key word. This course is workplace training. Its intention is to make citizens “computer literate” in terms of computer use – so that they can participate in a modern knowledge economy. But in doing so ICT has removed low-level detail to focus on vocational abstractions. ICT believes that study should be directly relevant to people’s lives and work – it shouldn’t get lost in obscure details. I can see a whole legion of business leaders – no doubt educated in obscure details – nodding their heads in satisfaction.
There’s just one problem. ICT is now utterly and depressingly DULL.
It is unchallenging and uninspired. It is like listening to a management PowerPoint presentation for two whole years without a toilet break. It’s turning a whole generation of students away from computer studies. And the consequences of this are now being felt in education and the economy. Data from the Joint Council for Qualifications in the UK show that the number of students studying computing has declined by 33% in just three years. This has had a knock-on effect in higher education and industry, where Britain has historically performed very strongly.
In short, the UK is losing expertise in the nuts and bolts of a technology that runs and connects every single device on earth, from washing machines to nuclear power stations. It is teaching competency in the use of devices and applications that will presumably be engineered and programmed by others. As an education policy, this seems staggeringly stupid.
Thankfully, alarm bells are now ringing. In 2010 the Royal Society in the UK launched an 18-month study into the issue, in association with industry and education bodies. This will be released in winter 2011 – with recommendations that will hopefully reverse the decline. It can’t come soon enough.
Am I just speaking out of educational prejudice here? Am I a computing dinosaur who wishes to inflict dull code on a generation of work-hungry children? I don’t believe so. I accept that there is real value in studying the practical issues of computing in society. But there has to be a balance between the nature of computing and the use of computing. I firmly believe that the nuts and bolts I learned on the ZX81 have a universal benefit in education. Here’s why:
1. Knowledge is power
Computers control everything in 21st-century life. Every game, every cash machine, every bank account. Your medical records. Your government data. Your car’s brakes. Hospital machines. Military weapons. At the core of all of these things is digital processing and logic. This core has a surprising simplicity. But if we never learn anything about this core, then we delegate the control of our society to those who do. That is never a good thing.
2. Life is full of algorithms
At the heart of all computer programming is the algorithm. An algorithm is just a recipe for problem solving. If I ask you to direct me to the library, you give me a set of instructions that will, if followed, get me to my destination. Computer programs do all of their processing using these recipes. And by doing these ourselves, we become skilled at decomposing problems and constructing organised ways of solving them. Are you telling me that this has no applicability in the wider world?
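To make that concrete, here is a toy Python sketch of “directions as an algorithm” – the street grid, the instruction names and the `follow` function are all my own invention, purely for illustration:

```python
# Directions as a recipe: each instruction is one step, and following
# the steps in order gets you to the destination.
MOVES = {"north": (0, 1), "south": (0, -1), "east": (1, 0), "west": (-1, 0)}

def follow(directions, start=(0, 0)):
    """Apply each instruction in turn and return the final position."""
    x, y = start
    for step in directions:
        dx, dy = MOVES[step]
        x, y = x + dx, y + dy
    return (x, y)

# "Directions to the library": two blocks east, then one block north.
route = ["east", "east", "north"]
print(follow(route))  # → (2, 1)
```

The interesting part isn’t the code – it’s the decomposition: a vague request (“how do I get to the library?”) becomes a sequence of unambiguous steps that anyone, or anything, can follow.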
3. Complexity emerges from simplicity
Computing is staggeringly simple at its core. Just 1’s and 0’s. Yet using these two simple states, we’ve connected an entire world. Anybody who knows about DNA is aware of the complexity that grows from simplicity. There are only four ‘letters’ in the genetic code – A, C, T and G. But with just these four, we can code a whole human. What you get when you tinker with 1’s and 0’s is a sense of potential and consequence. Your actions can build beautiful things from simple statements – but they can also create unforeseen problems. The smallest insight is rewarded – the tiniest error can bring down the whole endeavour. Coding is a lesson in the “butterfly’s wing” – an example of the connectedness between great and small that characterises all the best and worst aspects of human interactions.
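If you want to see the butterfly’s wing in miniature, here is a small Python sketch – the encoding helper and the choice of which bit to flip are my own illustration, nothing to do with the ZX81 itself:

```python
# Everything is 1s and 0s – and flipping a single bit changes the meaning.
def to_bits(text):
    """Render a string as its 8-bit binary representation."""
    return " ".join(f"{ord(c):08b}" for c in text)

word = "A"                             # the letter A is the byte 01000001
print(to_bits(word))                   # → 01000001

flipped = chr(ord(word) ^ 0b00000100)  # flip just one bit
print(flipped)                         # → E  – one bit, a different letter
```

One bit out of eight, and ‘A’ becomes ‘E’. Scale that up to a million lines of code and you have both the potential and the peril.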
When he built the ZX81, I don’t think Clive Sinclair had these lofty ideas in his mind. But then that’s the real beauty of nuts and bolts. You don’t just tell people what stuff means – you give them the tools to make their own consequences.