It has been fashionable in recent years to claim that modern technological innovation isn't a patch on the past - that the current IT revolution is little more than trivial consumption, such as iPods and cat videos, which compares poorly with steam power and the internal combustion engine. Robert Gordon and Tyler Cowen have been widely quoted in this regard. Some of this is probably generational - i.e. middle-aged men who never got over the non-appearance of jet-packs being underwhelmed by Angry Birds (I imagine Thomas Malthus didn't think much of the potential of steam engines). Some of it is just conservative misanthropy - the assumption that we live in decadent times and everything is a bit shit. And some of it is, I believe, a misunderstanding of the significance of triviality.
It was interesting then to see a report by neoliberal cheerleaders McKinsey outlining the "12 technologies that could drive truly massive economic transformations and disruptions in the coming years". McKinsey, given how they earn their money, focus on near-term technologies - i.e. what they consider to be racing certainties, rather than speculative futurology. They think mobile Internet might be big. There's the usual consensus guff, so 3D printing, "autonomous vehicles" and "advanced materials" (i.e. graphene) make the first division. In the second division ("on the radar") we find fusion power and quantum computing, which would probably have made the top division in years gone by, before we realised how bloody difficult they are, while the third division of the "interesting and often hyped" includes 3D and volumetric displays, which have little potential use outside of sci-fi films.
What is most significant about the top 12 is the extent to which they rely upon information technology, which the report notes is now "pervasive". This is obvious in the case of mobile Internet, cloud technology, the "Internet of things" (i.e. smart devices and RFID), and automation of knowledge work, but it is also true for the other disruptive technologies. Advanced robotics is now more software than hardware, autonomous vehicles depend on real-time processing, genomics depends on massive data-crunching, while advanced oil and gas exploration and recovery has been more about IT than wrenches for decades. 3D printing is emblematic of this. Though it has only broken into public consciousness in recent years, basic 3D printers were built in the early 1980s. The slow march to wider use has been partly due to refinements in the mechanics (additive manufacturing) but mainly due to advances in computing power and software.
The disruptive change that has triggered the most comment has been the automation of knowledge work. McKinsey note that knowledge workers comprise 9% of the global workforce and account for 27% of total labour costs. As that ratio indicates, we're talking about the better-paid, middle-class jobs. This is leading to further fretting about the returns to education: is it worth getting a degree if skilled, white-collar jobs are going to start disappearing? Personally, I suspect that this automation will work its way from the bottom up - i.e. attack the least powerful first. Just as telephone support was quickly offshored, we should expect it to be first in line for intelligent automation (something better than "Press 1 for ..."). The higher management life-forms, like McKinsey consultants, will be at the back of the queue, so degrees from "top" universities will continue to translate into economic power, there'll just be fewer of them. You can also expect a continuation of the tendency for human behaviours that cannot be automated to be formalised as business "values", and for those behaviours to exhibit a class (or "educated") bias. Once empathy and customer-focus are synthesised, a GSOH and charmingly amateurish cake-making skills will become more important. Clubbability has already proved a more long-lived skill than the ability to use a slide-rule.
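The 9%/27% figures quoted above imply the pay premium directly. A back-of-envelope check (the figures are McKinsey's; the arithmetic is just a sanity check):

```python
# McKinsey's figures: knowledge workers are 9% of the global
# workforce but account for 27% of total labour costs.
workforce_share = 0.09
labour_cost_share = 0.27

# Pay of an average knowledge worker relative to the all-worker average.
relative_pay = labour_cost_share / workforce_share  # 3.0

# And relative to the average *non*-knowledge worker.
other_share = (1 - labour_cost_share) / (1 - workforce_share)
vs_others = relative_pay / other_share

print(f"Knowledge worker earns {relative_pay:.1f}x the overall average wage,")
print(f"or about {vs_others:.1f}x the average non-knowledge worker's wage.")
```

In other words, a threefold premium over the average wage - hence "better-paid, middle-class jobs".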
We can also expect rapid inroads by technology in the burgeoning "care industry", which is clearly at the vulnerable end of the labour scale. The example of the grandson who video-cammed his infirm granny in order to check up on her care workers was a microcosm of many current trends: the poor quality of care (often down to crap wages and insufficient time rather than human wickedness), the upside of surveillance, and the growing expectation that technology and "telecare" may have a much larger role to play. A similar revolution is in the offing for healthcare, which explains the relentless claims that the NHS is inadequate and/or insupportable. While you can make money by privatising a labour-intensive public service and cutting staff (e.g. binmen who no longer have time to walk up the drive), the really big money is to be made through automation that takes out swathes of the workforce. The trick is to secure a long-term contract just before massive capital investment, and ideally get the government to part-fund this investment as the rail and water companies did.
Part of the reason why the current technological revolution is under-appreciated is the massive deflation in costs and associated commodification. This leads to the supposition that the technology must be trivial, because we can afford to put it to trivial uses. McKinsey note that the fastest supercomputer in 1975 was the CDC 7600, which cost a princely $5m then, equivalent to $32m in today's prices. An iPhone 4 has the same processing power and costs about $400. If steam engines had experienced comparable deflation, they would literally have cost buttons by 1900. This tendency towards faster commodification and steeper deflation is nothing new. If you follow the Robert Gordon model, the technologies of the 1870-1900 era, such as electricity, the internal combustion engine, central heating, air-con and indoor plumbing, were all more commodified and affordable than those of the 1750-1830 era, such as steam engines, cotton spinning and railways. Ordinary people got the benefit of cheaper clothes and railway travel, but they couldn't afford their own power looms or railway engines. They could (eventually) afford their own indoor toilets and cars.
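The scale of that deflation is worth making explicit. Taking the figures above at face value ($32m in today's prices for the 1975 supercomputer, roughly $400 for an iPhone 4 of comparable processing power):

```python
# Price deflation implied by the comparison above. Both figures are
# from the McKinsey report; this just computes the ratio.
supercomputer_price = 32_000_000  # CDC 7600, in today's dollars
iphone4_price = 400               # approximate retail price

deflation_factor = supercomputer_price / iphone4_price
print(f"Equivalent processing power now costs "
      f"1/{deflation_factor:,.0f} of the 1975 price.")
```

A factor of 80,000 in under four decades - the kind of collapse in price that no nineteenth-century technology came close to.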
This massive drop in the cost of information technology has further ramifications because of the transmission of deflation to secondary technologies, i.e. the many and various applications of IT. A "manufacturing startup" in the 1930s meant a significant investment in machine tools, plant and raw materials. A tech startup today can require little more than a couple of laptops, broadband and a spare bedroom. The point is not that these startups will all produce viable businesses - the failure rate is very high - but that the capital at risk is tiny. If you have ever wondered why venture capitalists threw silly money at barking ideas during the dotcom boom, bear in mind that low entry costs were as much the driver as a desire not to miss the next big thing. The problem for VCs was that they were sitting on a lot of capital, faced with a lot of projects each requiring a relatively small amount. Imagine a 100-horse race, in which every mount has odds of 1,000 to 1, and you have £100. You'd just put a quid on every horse and rake in the winnings.
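The betting arithmetic above can be made concrete. A minimal sketch, assuming standard fractional odds (a winning £1 stake at 1,000/1 returns £1,000 profit plus the stake) and exactly one winner:

```python
# The 100-horse race analogy: back every horse and the payout is
# guaranteed, whichever horse wins.
horses = 100
stake_per_horse = 1   # £1 on each
odds = 1000           # fractional odds of 1,000/1

total_staked = horses * stake_per_horse                     # £100
winning_return = stake_per_horse * odds + stake_per_horse   # £1,001
net_profit = winning_return - total_staked                  # £901

print(f"Stake £{total_staked}, collect £{winning_return}, "
      f"net £{net_profit} regardless of the result.")
```

The point being that when each bet is cheap enough relative to the pot, backing every runner is rational even though almost all of them lose - which is roughly the dotcom VC's position.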
The impact of IT has (I think) been central to the growing dearth of capital investment opportunities, which has in turn led to more and more capital being pumped into property and resource speculation, incidentally expanding financial markets faster than the fixed capital base. This dearth is less a smaller quantum of opportunities (as some have assumed, feeding the "we've stopped innovating" meme) and more a massive fall in the price of opportunities as discrete investments. This is throwing off huge amounts of wealth, but that wealth is sticking with capital rather than being shared with labour. The consequence is increasing wage inequality, increasing asset inequality, and a polarisation of jobs. As Marc Andreessen said rather dramatically: “The spread of computers and the Internet will put jobs in two categories: People who tell computers what to do, and people who are told by computers what to do.” It is important though to note that the first group will be dominated by owners of capital and their "lackeys", to use a once-popular term, rather than software engineers or other techies. The nerd will not inherit the earth - it will be that lovely young intern with the nice manners. That is the real significance of triviality in the modern economy.