
Monday 17 June 2013

The Significance of Triviality

It has been fashionable in recent years to claim that modern technological innovation isn't a patch on the past - that the current IT revolution is little more than trivial consumption, such as iPods and cat videos, which compares poorly with steam power and the internal combustion engine. Robert Gordon and Tyler Cowen have been widely quoted in this regard. Some of this is probably generational - i.e. middle-aged men who never got over the non-appearance of jet-packs being underwhelmed by Angry Birds (I imagine Thomas Malthus didn't think much of the potential of steam engines). Some of it is just conservative misanthropy - the assumption that we live in decadent times and everything is a bit shit. And some of it is, I believe, a misunderstanding of the significance of triviality.

It was interesting then to see a report by neoliberal cheerleaders McKinsey outlining the "12 technologies that could drive truly massive economic transformations and disruptions in the coming years". McKinsey, given how they earn their money, focus on near-term technologies - i.e. what they consider to be racing certainties, rather than speculative futurology. They think mobile Internet might be big. There's the usual consensus guff, so 3D printing, "autonomous vehicles" and "advanced materials" (i.e. graphene) make the first division. In the second division ("on the radar") we find fusion power and quantum computing, which would probably have made the top division in years gone by, before we realised how bloody difficult they are, while the third division of the "interesting and often hyped" includes 3D and volumetric displays, which have little potential use outside of sci-fi films.

What is most significant about the top 12 is the extent to which they rely upon information technology, which the report notes is now "pervasive". This is obvious in the case of mobile Internet, cloud technology, the "Internet of things" (i.e. smart devices and RFID), and the automation of knowledge work, but it is also true for the other disruptive technologies. Advanced robotics is now more software than hardware, autonomous vehicles depend on real-time processing, genomics depends on massive data-crunching, while advanced oil and gas exploration and recovery has been more about IT than wrenches for decades. 3D printing is emblematic of this. Though it has only broken into public consciousness in recent years, basic 3D printers were built in the early 1980s. The slow march to wider use has been partly due to refinements in the mechanics (additive manufacturing), but mainly due to advances in computing power and software.

The disruptive change that has triggered the most comment has been the automation of knowledge work. McKinsey note that knowledge workers comprise 9% of the global workforce and account for 27% of total labour costs. As that ratio indicates - a cost share three times their share of headcount - we're talking about the better-paid, middle-class jobs. This is leading to further fretting about the returns to education: is it worth getting a degree if skilled, white-collar jobs are going to start disappearing? Personally, I suspect that this automation will work its way from the bottom up - i.e. attack the least powerful first. Just as telephone support was quickly offshored, we should expect it to be first in line for intelligent automation (something better than "Press 1 for ..."). The higher management life-forms, like McKinsey consultants, will be at the back of the queue, so degrees from "top" universities will continue to translate into economic power, there'll just be fewer of them. You can also expect a continuation of the tendency for human behaviours that cannot be automated to be formalised as business "values", and for those behaviours to exhibit a class (or "educated") bias. Once empathy and customer-focus are synthesised, a GSOH and charmingly amateurish cake-making skills will become more important. Clubbability has already proved a longer-lived skill than the ability to use a slide-rule.


We can also expect rapid inroads by technology in the burgeoning "care industry", which is clearly at the vulnerable end of the labour scale. The example of the grandson who video-cammed his infirm granny in order to check up on her care workers was a microcosm of many current trends: the poor-quality care (often down to crap wages and insufficient time rather than human wickedness), the upside of surveillance, and the growing expectation that technology and "telecare" may have a much larger role to play. A similar revolution is in the offing for healthcare, which explains the relentless claims that the NHS is inadequate and/or insupportable. While you can make money by privatising a labour-intensive public service and cutting staff (e.g. binmen who no longer have time to walk up the drive), the really big money is to be made through automation that takes out swathes of the workforce. The trick is to secure a long-term contract just before massive capital investment, and ideally get the government to part-fund this investment, as the rail and water companies did.

Part of the reason why the current technological revolution is under-appreciated is the massive deflation in costs and associated commodification. This leads to the supposition that the technology must be trivial, because we can afford to put it to trivial uses. McKinsey note that the fastest supercomputer in 1975 was the CDC 7600, which cost a princely $5m then, equivalent to $32m in today's prices. An iPhone 4 has the same processing power and costs about $400. If steam engines had experienced comparable deflation, they would literally have cost buttons by 1900. This tendency towards faster commodification and steeper deflation is nothing new. If you follow the Robert Gordon model, the technologies of the 1870-1900 era, such as electricity, the internal combustion engine, central heating, air-con and indoor plumbing, were all more commodified and affordable than those of the 1750-1830 era, such as steam engines, cotton spinning and railways. Ordinary people got the benefit of cheaper clothes and railway travel, but they couldn't afford their own power looms or railway engines. They could (eventually) afford their own indoor toilets and cars.
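
To put the scale of that deflation in numbers, here is a rough back-of-the-envelope sketch in Python, using only the figures quoted above (the variable names are just illustrative, and the ratio is a price comparison, not a benchmarked performance claim):

```python
# Deflation in the cost of computing, using the figures quoted above:
# the CDC 7600's 1975 price of $5m is roughly $32m in today's money,
# while an iPhone 4 with comparable processing power costs about $400.
cdc_7600_cost_today = 32_000_000   # inflation-adjusted price, dollars
iphone_4_cost = 400                # dollars

deflation_factor = cdc_7600_cost_today / iphone_4_cost
print(f"Equivalent computing power is roughly {deflation_factor:,.0f} times cheaper")
# -> Equivalent computing power is roughly 80,000 times cheaper
```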

This massive drop in the cost of information technology has further ramifications because of the transmission of deflation to secondary technologies, i.e. the many and various applications of IT. A "manufacturing startup" in the 1930s meant a significant investment in machine tools, plant and raw materials. A tech startup today can require little more than a couple of laptops, broadband and a spare bedroom. The point is not that these startups will all produce viable businesses - the failure rate is very high - but that the capital at risk is tiny. If you have ever wondered why venture capitalists threw silly money at barking ideas during the dotcom boom, bear in mind that low entry costs were as much the driver as a desire not to miss the next big thing. The problem for VCs was that they were sitting on a lot of capital, faced with a lot of projects each requiring a relatively small amount. Imagine a 100-horse race, in which every mount has odds of 1,000 to 1, and you have £100. You'd just put a quid on every horse and rake in the winnings.
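
The arithmetic behind that analogy is worth spelling out - a minimal sketch using the numbers in the paragraph above (real venture portfolios are, of course, nothing like a guaranteed book):

```python
# The 100-horse race: put £1 on every runner at 1,000-to-1 and you are
# guaranteed a profit, because exactly one horse must win and its payout
# dwarfs the total amount staked.
horses = 100
odds = 1000           # 1,000 to 1: a winning £1 bet pays £1,000 plus the stake back
stake_per_horse = 1   # "a quid" on each horse

total_staked = horses * stake_per_horse                    # £100
winning_return = stake_per_horse * odds + stake_per_horse  # £1,001 from the one winner
net_profit = winning_return - total_staked                 # £901, whichever horse wins
print(f"Stake £{total_staked}, collect £{winning_return}, net £{net_profit}")
# -> Stake £100, collect £1001, net £901
```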

The impact of IT has (I think) been central to the growing dearth of capital investment opportunities, which has in turn led to more and more capital being pumped into property and resource speculation, incidentally expanding financial markets faster than the fixed capital base. This dearth is less a smaller quantum of opportunities (as some have assumed, feeding the "we've stopped innovating" meme) and more a massive fall in the price of opportunities as discrete investments. This is throwing off huge amounts of wealth, but that wealth is sticking with capital rather than being shared with labour. The consequence is increasing wage inequality, increasing asset inequality, and a polarisation of jobs. As Marc Andreessen said rather dramatically: “The spread of computers and the Internet will put jobs in two categories: People who tell computers what to do, and people who are told by computers what to do.” It is important though to note that the first group will be dominated by owners of capital and their "lackeys", to use a once-popular term, rather than software engineers or other techies. The nerd will not inherit the earth - it will be that lovely young intern with the nice manners. That is the real significance of triviality in the modern economy.

2 comments:

  1. [Comment originally posted while Google+ comments were turned on. Turning this off, as I have now done, causes the comment to disappear on Blogger, so I've reposted along with my reply]

    -------------------------------

    From ArthurBough:

    David,

    Usual high quality blogging. However, I'm not sure about a couple of points. Firstly, I think there is a good possibility that the use of these new technologies WILL be to replace high paid, high skilled jobs where possible. They have always been the ones Capital has had most incentive to deskill, because they represent the greatest savings for each individual capital. I think Aglietta's view that Neo-Fordism would be about precisely that in relation to a series of service provision is exactly what we see, and is the real material force behind privatisation. After all, Capital can deal with the low paid, low skilled work by a variety of methods - absolute surplus value, low paid immigrant workers, off-shoring of production to low wage economies etc.

    I also don't agree with the view of a "growing dearth of capital investment opportunities, which has in turn led to more and more capital being pumped into property and resource speculation, incidentally expanding financial markets faster than the fixed capital base."

    I think there has been, and continues to be, no end of profitable investment opportunities. When I think about all of the innovations that have occurred over the last 20 years, I think they have been staggering compared even with some of the most vibrant periods of industrial history. That is why in the first 10 years of this century the output of goods and services was equal to 25% of the total for Man's entire history!!! It is not that there has been a shortage of investment opportunities leading to a build-up of huge money hoards that have found their way into blowing up asset price bubbles, but rather that the rate and volume of profit over the last 30 years has been so huge - itself, in part, a function of the massive rise in productivity and the shortening of the turnover period of capital.

    The notion that current innovations are not significant compared to those of the past is driven by the unthinking adherence to what is thought to be Leninist orthodoxy, that Capitalism is in its death throes, and therefore incapable of dynamism, and further revolutionising of the productive forces.

    I prefer to use the method of Marx, Engels and Lenin, and look at the facts first, and then analyse the reality rather than to simply try to make the facts fit some 90 year old mantra.

    -------------------------------

    Reply from Dave Timoney:

    Boffy,

    I agree that technology will replace high-pay/high-skill jobs, I just think it will start with the low-pay/low-skill roles first. As you note, automating the former would actually deliver the greatest benefit to capital, but we need to factor in the political power of that bloc, i.e. the middle class, to delay and divert the process.

    Re the dearth of investment opportunities, this is (paradoxically) the product of the staggering advances in technology and productivity. As profit (the returns to capital) has expanded, it has become more difficult to recycle. Not because there is a decline in the quantum of opportunities - quite the reverse - but because the unit cost has plummeted.

    The property bubble is symptomatic of profit not finding sufficient productive opportunities, as much as it is of relative asset class returns.

  2. As a follow-up ... Paul Krugman has an article in the NYT today that discusses the low level of capital investment in terms of monopoly rents. Those monopolies (Apple, Google etc) are increasingly the product of technology.
