Today Michael Gove announced that coding is going to be taught in schools. I think this is great news, but to be honest I’m feeling a little uneasy about it. Glad as I am that coding will be taught at all, I’m concerned that they might screw it up. In some cases it’s down to the government to avoid teaching coding the wrong way; in others it will be the responsibility of individual schools and teachers.
So please excuse my pessimism. But I think it’s important that the government does this well, and that people understand precisely how not to teach programming.
It’s all too easy for schools to pick bad technologies and languages to teach their students to program with.
At my former sixth form, one of the small number to offer A-level Computing, the primary teaching language was Pascal. The argument was that it’s roughly ‘middle-of-the-road’: higher-level than C and assembly, lower-level than Ruby and other scripting languages. Supposedly this was a good thing, because students taught on Pascal could take their skills either way along the language spectrum. The real reason was more likely that Pascal was the teacher’s Blub.
Another bad way to choose a teaching language is to look at what’s popular in ‘industry.’ In fact, popular languages are rarely good. Java, for instance, is a terrible language, but it got popular because Sun spent millions on marketing and PR to get pointy-haired bosses to insist on it. (They also got it into universities: it’s nearly impossible to get a CS degree in the UK without learning Java. And people ask me why I’m not going to university. I suspect a similar process got Microsoft Office associated so closely with ICT lessons.)
A similar process might happen to Objective-C, which is a fine language for writing iPhone apps — but I don’t think anyone would argue that it’s a more powerful language than Ruby or Scheme.
So I would advise the government to recommend a carefully-picked set of languages to be taught in schools. Don’t choose languages that are ‘popular in industry’ hoping that it will get people a job. Choose good teaching languages in the knowledge that those who would thrive in a computing job will learn the languages they ‘need’ to get a job by themselves.
Too much systematisation.
In his essay A Mathematician’s Lament, Paul Lockhart bewailed the teaching of maths in school. The same danger faces computer science.
The joy of programming is in making things just because it’s interesting to make them, like Bayesian text filters in your own programming language. There’s the joy of writing programs you use every day, which are perfectly suited to your needs and tastes. There’s the joy of releasing that code, knowing that your work is making a positive difference to other people’s lives.
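A toy version of that first kind of project might look like this in Python. (The corpus, the word-splitting, and the smoothing are all invented for illustration; a real filter would need far more than this.)

```python
from collections import Counter
import math

# A toy Bayesian text filter: the kind of thing that's fun to
# build purely because it's interesting to build.

spam_docs = ["buy cheap pills now", "cheap pills cheap"]
ham_docs = ["let's meet for lunch", "lunch now or later"]

def train(docs):
    # Count how often each word appears across all documents.
    counts = Counter(word for doc in docs for word in doc.split())
    return counts, sum(counts.values())

spam_counts, spam_total = train(spam_docs)
ham_counts, ham_total = train(ham_docs)

def spam_score(text):
    # Log-odds that the text is spam, with add-one smoothing so
    # unseen words don't zero everything out.
    score = 0.0
    for word in text.split():
        p_spam = (spam_counts[word] + 1) / (spam_total + 2)
        p_ham = (ham_counts[word] + 1) / (ham_total + 2)
        score += math.log(p_spam / p_ham)
    return score
```

With that tiny corpus, `spam_score("cheap pills")` comes out positive and `spam_score("lunch later")` negative; nothing more is claimed for it than that.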
The joy of programming is not in knowing the difference between formal and actual parameters, being able to describe tail-call optimisation (ha), or describing what the function ‘print’ does. Testing these things is like testing an artist by asking them to define the word ‘palette.’ They’re useful terms to know when you’re talking about concepts with other programmers, but you don’t need to know them to do real programming.
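For what it’s worth, that jargon mostly names things beginners already write without knowing the words for them. A minimal Python sketch (the names here are invented for illustration):

```python
def greet(name):           # 'name' is the formal parameter
    return "Hello, " + name

message = greet("Ada")     # "Ada" is the actual parameter (the argument)

# A tail call: the recursive call is the very last thing the
# function does. (CPython does not actually perform tail-call
# optimisation, so deep recursion here would still overflow
# the stack.)
def countdown(n):
    if n == 0:
        return "done"
    return countdown(n - 1)   # call in tail position
```

Anyone who has written a function has used both kinds of parameter; the terminology only matters when talking about it.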
The most important things are that students write good, working code, and that they work on finding their own problems and solving them in their own way. Reducing programming to sets of algorithms and data types, in the way that maths is reduced to formulae and constants, would be a terrible thing.
I didn’t take A-level Computing myself, but if I had a free period, or my teacher was away while a Computing lesson was on, I would sometimes ask to sit in on it to see what they were teaching.
The teacher kept talking about “real programmers,” not in the sense of someone who doesn’t use Pascal or refuses to let his blackjack program cheat, but simply to mean ‘professional’ programmers with ‘real programming jobs.’ He talked as though the professionals are the “real programmers” and you students are just learning the basics, so don’t get too excited now.
If you’re writing any code at all, you’re a real programmer. No kidding. Even if that code is Pascal (or Java, etc.), all code that works in some sense is real code, and if you’ve ever written any then you’re a real programmer.
I suppose my problem is with the word ‘real.’ If he had talked about ‘professional programmers’ instead, I wouldn’t have minded: that’s a perfectly valid distinction to make, though still perhaps not the most constructive or motivating one.
Maltreating the geniuses.
When I was in my school’s ICT lessons and they were teaching how to use Excel, I rarely looked at the actual instructions for what they wanted us to do. I just looked at the end result, then worked on my own to get something like what they wanted. (And I still don’t know how to use a spreadsheet properly.) Somehow I still got good marks.
Things were less rosy in the classes covering basic HTML. They were teaching HTML 1995-style, but I wrote beautiful XHTML and CSS. The teacher wasn’t impressed. (I got my just deserts. I had to test my pages in Internet Explorer 6 — no other browser was installed on the school computers. When I looked at them in Safari or Firefox at home, they looked terrible.)
Teachers are often slow to recognise that sometimes a few students may have advanced to the point where they really do know better than their teachers. This risk is particularly acute in computing, where the state of the art advances so quickly, and the teaching establishment is often so far behind the times.
Many programmers are self-taught from an early age (usually around ten or eleven). Most good programmers taught themselves. All the true experts at programming either taught themselves, or learned at university only because computers weren’t around when they were ten or eleven. A successful computer science curriculum will embrace students who already know some, or even most, of the material.