Popular Tropes

And now for something completely different ...

Monday, 2 April 2012

C is the new Latin

John Naughton joined the clamour for ICT education to be redesigned around programming in yesterday's Observer, something I've commented on previously. He rejects the instrumental justification that this will help the future economy and instead stresses the moral dimension. Because networked computing is central to how the modern world operates, if our children do not have a good understanding of the underlying technology, they will be "short-changed" and "intellectually crippled", he says in language shorn of all emotion. "They will grow up as passive consumers of closed devices and services, leading lives that are increasingly circumscribed by technologies created by elites working for huge corporations".

The implication is that an understanding of programming will be a guarantee of liberty and democracy in the future. That's possibly over-egging it. The real fear of the middle-class parents who read the Observer is less the coming dystopia and more that learning how to use Word and Excel will condemn young Emily to life as a public sector clerk. In contrast, computer science will give her access to the intellectual elite, working for a huge corporation. In fact, the likes of Apple and Google employ very few knowledge workers relative to their size, and programmers generally will always be a small percentage of the workforce (I'm talking here of real programmers, i.e. software developers, not website designers or marketeers who input content).

Naughton's claim, and the apocalyptic style in which it is couched, is typical of most debate around education, particularly with respect to the assumed failure to value science and technology. He even wheels out CP Snow of "Two Cultures" fame in support of this. The dichotomy in British education is real, but the basis is class, not some inter-disciplinary antagonism. He gives an unintentional insight into this by dragging in the bête noire of Latin.

"That's why the fashionable mantra that emerged recently – that "code is the new Latin" – is so perniciously clueless. It implies that programming is an engaging but fundamentally useless and optional skill. Latin is an intriguing, but dead, language; computer code is the lingo of networked life – and also, it turns out, of genetic replication."

This misses the point. Latin is a token that provides access to a certain class of education. When I applied to go to university in the late 70s to study archaeology, there was a clear correlation between the requirement for O-level Latin and certain prestigious colleges, regardless of the substance of the course. Interestingly, no university required an O-level in technical drawing (something mainly taught in comprehensive schools), despite this having a far greater practical value. You spend much more time drawing pots and field plans than deciphering epigraphs.

It's also worth pointing out that computing languages can themselves become de facto "dead", even as their use grows. For example, you wouldn't teach kids to program in C, despite this being the language in which Unix and Linux are written. You'd do better to teach them a more modern, higher-level language such as Java or PHP. C is to PHP as Latin is to Italian.
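To make that gap in level concrete, here is a rough sketch of my own (not Naughton's example): reversing a string. In PHP the job is a single built-in call, strrev($s); in C you allocate the output buffer, write the loop and free the memory yourself.

/* Reversing a string in C. In PHP: $r = strrev($s); one line, no
   memory management. In C, the plumbing is the lesson. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Return a newly allocated reversed copy of s; caller must free it. */
char *reverse(const char *s)
{
    size_t len = strlen(s);
    char *out = malloc(len + 1);
    if (out == NULL)
        return NULL;
    for (size_t i = 0; i < len; i++)
        out[i] = s[len - 1 - i];
    out[len] = '\0';
    return out;
}

int main(void)
{
    char *r = reverse("ars longa, vita brevis");
    if (r != NULL) {
        printf("%s\n", r);
        free(r);
    }
    return 0;
}

Neither version is "better" in the abstract, but only one of them is a sensible place to start with an eleven-year-old.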

In fact, you'd do better to not worry too much about the language at all, unless you want to replicate the Latin problem. What is of most value, both in terms of intellectual satisfaction and potential application, is learning the underlying principles and common methodologies. Naughton is strong on the former but weak on the latter, though it is methodology that is key to creativity and problem-solving. This in turn reflects the fact that software development, as a body of theory and practice, was hugely disrupted by the Wild West Web and is still struggling to get a grip. There were three reasons for this disruption.

First, the low cost of entry, notably the ease of learning Web scripting, resulted in the effective deprofessionalisation of programming: you didn't have to serve an apprenticeship (learn through others) but could hack your way to glory. The influx of cowboy coders led to rates crashing at the bottom end of the labour market, which in turn made it cheaper to throw bodies at any problem rather than seek to raise standards.

Second, the dotcom madness led to so much churn, in terms of startups and short-run projects, that many organisations were unable or disinclined to implement proper structure. Just as best practices can be disseminated through labour mobility, so worst practices can spread like a virus in a volatile market.

Third, the crossover of graphic design culture, as program logic and media presentation were interleaved, biased design and decision-making processes towards taste rather than empirical testing. Collaboration was seen more in terms of multi-disciplinary teams, i.e. individuals with specialisms, rather than multiple heads all working on the same problem. This encouraged both prima donnas and hangers-on.

All three remain major issues in the programming economy today, though the worst excesses have moved on to the new(ish) territories of social media and mobile apps. Elsewhere, software development is becoming progressively re-professionalised. This is the key social imperative behind the emerging consensus on the need for change: that we can keep and grow programming jobs in the UK and make them sufficiently high-quality to keep wages up. I suspect this is largely wishful thinking.

John Naughton is right to urge the reform of ICT education, and right to prefer computer science and programming as the basis for this rather than the use of MS-Office, but he is naive to believe that our children will become better citizens as a result or that the positional good of an expensive education (private school, Latin, Oxford PPE) will be diluted.

What kids really need to be taught, or encouraged to learn for themselves, is how to think. That skill will still be valuable when programming as we understand it today is a quaint artisanal pursuit. One of the foundations of heuristics, and one of the key principles to good software development, is Occam's Razor or, as it is known in Latin, the lex parsimoniae.
