
Tuesday, 25 February 2014

Facebook Buys Phonebook

The sticker price of $19 billion was the lead item in most reports on Facebook's acquisition of WhatsApp last week, but subsequent commentary has tended to focus on the messaging firm's ad-free model and the assurances of Mark Zuckerberg that he has no intention of changing it: "I don’t personally think ads are the right way to monetize messaging". A lot of the commentary is just MBA boilerplate, so there is the usual stuff about market share, growing user numbers, defensive plays etc. Zuckerberg's official line is an apparently Micawberish belief that something will turn up: "Once we get to being a service with 1 billion, 2 billion, 3 billion people, there are many clear ways that we can monetize". In fact, it's not "clear" at all.

WhatsApp currently claim over 400 million active users, a figure that has doubled over the last year (i.e. it's probably a bit soft). There are 7 billion people on Earth, and around the same number of SIM cards (smartphones account for about 1.5 billion). One of the founders, Jan Koum, has talked of reaching 5 billion users, which points to WhatsApp becoming the de facto standard for smartphone messaging (not to mention 3.5 billion handset upgrades), even though it lacks any real technological edge. The attraction of the service has been free or low-cost use and the absence of adverts, and it is crucially dependent on network SMS services being relatively expensive.

What it does have is an application architecture that requires the user to upload all their contacts to a central server for cross-checking to establish who they can communicate with. In other words, Facebook is buying phone numbers, and more specifically numbers organised in interconnecting networks, lots of Rolodexes, rather than just a flat directory. The capability of extending the "social graph" to mobile has obvious attractions for Zuckerberg & co. Even more attractive, many of those numbers belong to people who don't currently use either Facebook or WhatsApp. A self-denying ordinance on the harvesting of users' personal information does not preclude text-spamming their contacts.
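To make the mechanism concrete, here is a minimal sketch of that contact-discovery pattern in Python. It illustrates the general architecture only, not WhatsApp's actual protocol (which isn't documented here); the numbers and the function name are invented for the example.

```python
# A minimal sketch of server-side contact discovery: the client uploads its
# whole address book and the server returns the subset that are already
# registered users. Illustrative only; not WhatsApp's actual implementation.

REGISTERED_USERS = {"+447700900001", "+447700900002", "+14155550100"}

def discover_contacts(uploaded_address_book):
    """Return the uploaded numbers that belong to registered users.

    Note the side effect that matters commercially: the server has now seen
    every number in the address book, registered or not.
    """
    return set(uploaded_address_book) & REGISTERED_USERS

# The user learns who they can message; the server learns everyone they know.
print(discover_contacts(["+447700900001", "+447700900999", "+14155550100"]))
```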


In the popular tech media, the founders of WhatsApp have become almost heroic figures, lauded for their anti-advertising stance and refusal to gather user data (despite actually doing so, but hey, it's just metadata, right?) while simultaneously envied for their sudden great wealth (their earlier opposition to being bought out has been quietly forgotten). The decision by Jan Koum to sign the deal on the door of the federal office where his family collected food stamps is naturally being held up as evidence that anyone can make it in America, while his childhood in Ukraine, "a country where phone lines were often tapped, instilled the importance of privacy in him", apparently. The backlash cannot be long in coming. Perhaps we could get Pussy Riot to flashmob their offices in Mountain View.

Some of this geek indulgence can be traced to the desire to be done with advertising altogether, a counter-culture remnant in a highly commodified society that has itself become ironically commodified (think of Steve Jobs's unadorned turtle-necks). This leads to WhatsApp's ad-free model being optimistically elevated to the ethos of a utility, a quasi-public service in contrast to the naked consumerism of Facebook, despite the total absence of the characteristics of a utility, such as large-scale infrastructure investment, public oversight, price controls etc.

Thinkers like Jaron Lanier long ago progressed beyond this naive trust in the altruism of the tech companies, but they have since succumbed to a Utopian belief that we can best defend the individual (and "middle class" jobs) by re-establishing strict intellectual property rights in a "humanistic information economy". This imagines companies making micropayments in order to exploit our data, while still providing "free stuff" subsidised by advertising. In effect, the massive surplus of the tech companies (e.g. the $4 billion in cash used for the WhatsApp purchase) would be distributed as a proportionate dividend to us digital peons. Micropayments would even extend to surveillance: the NSA are welcome to our data, but they must pay a fee. The more they want, the more they pay. The price mechanism thus encourages responsibility and limits abuse. This exchange-based approach is naturally proving popular in the US, where it chimes with the deep ideology of homesteading and is seen as reliably anti-socialist and consistent with neo-classical economics.

Critics like Evgeny Morozov have noted that Lanier's vision is actually one of ubiquitous surveillance, or more exactly permanent self-exposure as a method of personal monetisation. You are incentivised to over-share. The crossover with the lifelogging meme and the idea of the quantified self is obvious. Perhaps less obvious are the roots of this in neoliberal theory. As Michel Foucault said in The Birth of Biopolitics: "In practice, the stake in all neo-liberal analysis is the replacement every time of homo oeconomicus as a partner of exchange with homo oeconomicus as entrepreneur of himself, being for himself his own capital, being for himself his own producer, being for himself the source of [his] earnings".


Lanier's fundamental error is to believe that a market in personal data and authored content could spontaneously arise given the massive structural asymmetries of the Internet. For all the peer-to-peer capabilities, it remains dominated by broadcast, as well as congenial to monopolies and cartels. His plea, that more crumbs should be brushed off the table for the rest of us, is just trickle-down economics recast as pull rather than push. It is possible that such a market could be imposed through massive state intervention, but this would be unlikely to ever get off the ground, even in China, not least because driving the recalcitrant off-net (or to the darknet) is no longer in the interests of the state (the ironic achievement of the NSA and GCHQ).

Zuckerberg is calculating that governments have no desire to empower users, beyond consumer protection and privacy anodynes, and that the telcos will gradually quit the network application space (bought off by guaranteed rent from network traffic and state subsidies for infrastructure investment). WhatsApp is not currently a disruptive competitor for the phone companies, with which it has an essentially parasitical relationship - i.e. eating into SMS revenues. The post-acquisition announcement that it will provide voice calls is being talked about as game-changing, the telcos "reduced to companies that maintain expensive networks and hand out Sim cards to order", but this ignores the truth that the network carriers have always been rubbish at network applications and have been progressively marginalised since the arrival of 3G. Their core interest remains the pick-n-shovel infrastructure.

The iconic 5 billion figure mentioned by Koum has previously been bandied about by the Facebook CEO in respect of internet.org, the industry lobby intended to extend online access to the two-thirds of the global population that currently lack it, through "new data-compression technologies, network infrastructure, and business models that make it possible to not only get everyone a smartphone, but make the data that powers them affordable". Getting a handle on most of the world's population is clearly not just an inspiring mission statement. The true significance of the $19 billion price tag is probably what it tells us about macroeconomic conditions, namely that we are in the middle of a huge stock bubble inflated by central bank-injected liquidity and low interest rates, but the significance of Zuckerberg's choice of WhatsApp is that he wants to monopolise the de facto directory service of the future Internet. Despite his attempts to distance Facebook from the NSA fallout, the congruence of interests remains: they both want to know who we know.

Wednesday, 19 February 2014

The Magical Properties of Code

The magical properties of code have been much to the fore of late, notably in the car-crash launch of the Year of Code campaign. This industry lobby, to encourage us all to get coding, is fronted by Lottie Dexter, who has a background in business deregulation and rightwing think-tanks. The sight of her trying to out-stupid Jeremy Paxman on Newsnight (the unapologetically dim meets the proudly ignorant) was hilarious. Predictably, people who get po-faced about technology have slammed Dexter as an airhead PR, failing to spot that YoC is simply a PR exercise for which she is admirably qualified.


John Naughton takes a more high-minded line, rejecting the utilitarian justification that ubiquitous coding skills will help our kids find jobs: "This is first and foremost about citizenship. Today's schoolchildren will inherit a world that is largely controlled by computers and software. The choice that faces them is Program or Be Programmed". Naughton and others also noted the obvious commercial interests of the backers, e.g. Saul Klein's interest in Codecademy, and the compromised position of the BBC. More pragmatically, Jack Schofield suspects that what is on offer is little more than the basics of HTML: "how to use some parts of a mark-up language, which is (being generous) only just coding ... YearOfCode is just recycling some currently fashionable but equally fatuous American techno-romanticism".

The defenders of YoC and its ilk are perhaps more unconsciously honest about the economic reality, and thus reveal a prejudice towards techies of a certain age. According to Rory Cellan-Jones at the BBC: "There is a minority of older experienced programmers who see themselves and their craft as an exclusive band of brethren and will always be hostile to an initiative like this. A glance at the comments under a YouTube video of Lottie Dexter's Newsnight interview reveals a murky world of misogyny and coding snobbery". The gratuitous troll callout (almost every video that features a woman attracts misogynistic comments) is intended to paint Dexter as a progressive victim of reactionary forces in the mould of Mary Beard or Caroline Criado-Perez. The implication is that programming has become a 1970s-style closed shop and must be deregulated and modernised. Basically, those nerdy twats earn too much, and probably don't have girlfriends.

YoC joins a growing list of initiatives pushing the idea that everyone must "code". These range from bog-standard neoliberal lobbies to grassroots campaigns. In reality, this is a narrow spectrum of acceptably progressive positions, all of which assume that "coding" is not merely a panacea, but that it may well be on a par with religion, hence the popularity of tales about conversion and hope. The picketing of Google buses in San Francisco may be a response to growing urban inequality, but it resonates more widely as a critique of organised religion (the symbolic importance of the passengers' "separateness" and "self-absorption"), and thus the desire to democratise adept knowledge. If you think that's a stretch, consider how often modern advocates of educational rigour insist, like YoC, that the future is about "the three Rs and a C". Lottie Dexter is not William Tyndale, but she is in a long line of those seeking to retool the people in the name of progress.

What these initiatives also share is the fear that unless you acquire this "skill" you will be socially and economically disadvantaged in the future. It's not quite as bad as facing the rapture unbaptised, but the same millennial anxiety is there. The promoters are often not "coders" themselves (any more than your average "tech entrepreneur" is), but then the loudest advocates for the three Rs are not always numerate or literate beyond the functional minimum. But we shouldn't assume that this is just a lobby for whatever suits the needs of business, not least because "coding" does not objectively exist in the way that maths or written English does: there are many programming languages, their useful lifespan is a few decades at best, the difference between assembler and HTML is vaster than that between any two human languages, etc. Even the argument that it's about "understanding principles" is weak, as most of those principles reduce to algebra, syntax and symbolic logic. With the exception of set theory, most of this would have been familiar to schoolkids a hundred years ago.

The instrumental argument, that the future wealth of the nation depends on everyone being able to code, is demonstrably false. Contrary to popular belief, software development does not create lots of jobs. If it did, it would have done so by now. There were 1.1 million people employed in IT roles across all sectors in the UK in 2012, of which 298,000 were programmers (using the most generous definition). In other words, coding accounts for 1% of the workforce. You'll not lose money betting that this share will increase, but it will be some time before there are more programmers than public sector teachers (442,000 in 2012) or estate agents (562,000 in 2013). In fact, much of the forecast demand for IT skills is actually for intermediary roles (i.e. species of general management), rather than "coders", particularly in hype areas such as cyber-security, risk management, "Green IT", big data and cloud computing.
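As a quick sanity check on that 1% figure, the arithmetic is trivial to reproduce; note that the roughly 30 million total for UK employment in 2012 is my assumption, not a number given above.

```python
# Rough check of the "coding accounts for 1% of the workforce" claim.
# The ~30 million total-employment figure is an assumption, not from the post.
it_roles = 1_100_000        # people in IT roles across all sectors, UK, 2012
programmers = 298_000       # of which classed as programmers (generous definition)
uk_workforce = 30_000_000   # approximate total people in employment, 2012

print(f"Programmers as a share of IT roles:      {programmers / it_roles:.0%}")      # ~27%
print(f"Programmers as a share of the workforce: {programmers / uk_workforce:.1%}")  # ~1.0%
```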


Unlike physical engineering, software engineering benefits from massive economies of scale and a tendency towards monopoly. We need hundreds of thousands of engineered structures across the country, but we don't need thousands of search engines or email clients, while the ubiquity of "apps" is a reflection of their ease and cheapness of supply, not insatiable demand. There will not be a Silicon Roundabout in most towns. It is also worth bearing in mind that software production is subject to the same technological and competitive forces as any production process, so there is a tendency towards deskilling as languages become more "high level" (i.e. closer to simplified English), programs write programs (i.e. AI substitutes for humans), and free or cheap GUI tools lay waste the cottage industry of Website builders.

Entry-level coding (such as Web page markup and scripting, which is what initiatives like YoC mainly push) is poorly-paid, largely because it is easily offshored. This is not "high tech" in any meaningful sense. The "well-paid jobs" in the UK that can legitimately be categorised as programming are largely species of business services, such as inhouse custom development or third-party systems integration, where the market price ultimately depends on proprietary or domain knowledge that can't be taught in schools. Programming a Raspberry Pi may be fun, but it remains less vocationally useful than knowing how to use Microsoft Excel, while the generic technology skills needed for the future - i.e. the ones more appropriately compared to the three Rs - are routinely acquired by kids outside of school via their use of the Internet and social media.

It would be easy to dismiss YoC as a stalking horse for neoliberal sweetheart deals, where privileged providers like Codecademy get to sell over-priced IT course material to the Department for Education or BIS, but I think there is something else at work here. This is the notion of digital citizenship, on which left-of-centre critics like John Naughton find common ground with George Osborne, who says in the YoC launch video: "I want to make sure that kids in our schools are not just consumers of technology ... they understand coding ... they understand more about the world around them, and that is what education is all about".

Underlying this is the idea that citizenship is contingent, dependent on a specific ability or qualification, hence the lazy claim that "coding is the new literacy". No one is suggesting that you should be denied the vote if you are "incomputerate", any more than if you can't read or write, but there is the implication that you will be unable to "fully participate" if you lack the requisite skills. You will definitely be in the "out group", even one of the "digitally disenfranchised" (a phrase that is really just a euphemism for "too poor to have Internet access"). I remember the BBC's 1970s series On The Move, which tried to destigmatise adult illiteracy through the encouraging progress of Bob Hoskins' removal man. That was an exemplary social-democrat tract about what you could gain: your salvation, in other words. The "code or die" trope focuses on what you are at risk of losing, and is another example of the constant striving and performative self-improvement of neoliberal ideology: the low hum of anxiety about "outdated skills" and becoming "unmarketable".


I think there is another layer to this, which is that the tech industry is keen to encourage even more content creation by users. Learning how to create a Web page or edit a stylesheet may not be quite the liberation we imagine, but it does normalise the idea that we are responsible for continuing to "build the Web" at a time when the growth in new users is slowing. The YoC site hints at this: "We use code to build websites and apps, design clothes, publish books, make games and music, and to get the most from technology. Getting to know code is really important. It means you can be creative with computers, start your own business or boost your earning potential. It is really simple to learn and anyone can do it - not just rocket scientists". Code will set you free, but you'll have to work at it.

Whereas the digital economy pre-Web was largely about businesses creating and consuming their own content, the modern digital economy is based on the idea that everyone should create their own and consume each other's. The dominant tech companies are now essentially intermediaries. Volume and monopoly allow them to charge small rents on transactions, typically via advertising and premium offerings. Though they are lionised as creators of new wonders (hence the symbolic importance of driverless cars and delivery drones), the reality is that most monetisable software is a mish-mash of acquisitions and open source (often well-hidden), and heavily dependent on unpaid user testing and support. Tech entrepreneurs remain first and foremost financial engineers.

Saturday, 15 February 2014

Buying Votes

Tom Perkins, the US billionaire who recently claimed that attacks on the rich were akin to the Nazis' persecution of the Jews, has now suggested that the "rich should get more votes", or so says The Daily Telegraph. The note of caution does not reflect my scepticism that he said this, but cynicism about the agenda of the newspaper. Despite the headline, Perkins is actually arguing that the poor should get fewer votes, which is not quite the same thing but still consistent with the implication that votes are commodities that can be bought (if not resold). The bonus for rich guys like himself is incidental. This provocative reform turns out to be a reactionary lament: "Thomas Jefferson, at the beginning of this country thought to vote you had to be a landowner. That didn’t last and the vote was given to everyone. But the basic idea was you had to be a taxpayer or a person of property to vote. That went by the board".

This is another sighting of the "no representation without taxation" meme, which the Torygraph has a particularly soft spot for, but this time with deliberately obnoxious top-notes to confuse and distract the inevitable trolls - like truffle oil drizzled on dodgy scrambled eggs. Perkins has become the poster-boy / laughing-stock for the trending topic of "the demonisation of the rich". I don't think the extensive media coverage is a conscious attempt to distract from the hegemonic trend, which is the relentless demonisation of the poor, so much as a revelling in the perverse and ironic (like the apocryphal Bullingdon Club initiation of burning a £50 note in front of a tramp). The wider UK context is the failure of austerity to produce a self-sustaining recovery and the consequent likelihood that taxes will have to increase after 2015 to avoid the implosion of some public services (Cameron's panicked insistence that "money is no object" for flood relief is a tacit admission of this).

The problem is that 30 years of neoliberal reform have produced a top-heavy economy in which the scope to spread tax increases across the population has been reduced. The reality of fiscal exchanges is that most people are net beneficiaries when total tax is offset by total benefits, and the crossover point has been moving up the population distribution over the last three decades as inequality has grown and tax receipts have become correspondingly more dependent on the rich. The share of income tax paid by the top 1% of earners increased from 11% in 1979 to 28% in 2012, and is predicted to hit 30% this year, according to the IFS. This has been spun on the right as evidence that we should coddle the rich, for fear that higher rates of tax might scare them away. Behind this lurks the suggestion that as wealth is virtue, a willingness to contribute a greater quantum of income tax should be publicly applauded (even if your overall tax rate is lower than that of a homeless alcoholic, who must pay VAT and excise on the booze that dominates his expenditure).


The changing composition of the tax cohorts over recent years has seen millions "taken out of tax altogether" through increased personal allowances, while more middle-earners have found themselves crossing the top-rate threshold as that falls relative to average income. The cohort that pays only basic rate income tax is shrinking, pointing towards a stark tripartite system of a "tax-free" lower class (ignoring the disproportionate burden of VAT etc), a resentfully anti-welfare middle class (who get the lion's share of welfare), and a privileged and disengaged upper class (who routinely avoid tax). In this light, Tom Perkins' comments are not so extreme. He isn't seriously proposing that you should get an extra vote for every million you have in assets, but that the middle class should focus on disenfranchising the irresponsible poor (who, conservative prejudice insists, will always vote themselves "largesse out of the public treasury") rather than expropriating the rich. This is therefore a critique of democracy.

The harsh reality of this challenge is ignored by centrists and progressives who continue to agonise over questions such as "Is Parliament hopelessly out of date?", as if the old dear just needed a makeover. According to Katie Ghose, chief executive of the Electoral Reform Society: "The vote should be given to 16- and 17-year-olds, electoral registration should be seamless and we need a fairer voting system at all levels of government, as well as an elected House of Lords. Politics should be reclaimed as a public service, with a majority aiming to hold public office in their lifetime. Change like that requires not being squeamish about bringing politics into the classroom at the earliest opportunity".

What does "seamless" mean? Why an elected House of Lords, as opposed to simple abolition? Why should a majority hold public office? The latter implies rotation or sortition, in order to be practical, which would be an effective way of preventing new political blocs gaining traction. "Bringing politics into the classroom" is a classic manoeuvre of totalitarianism, which echoes the success that clerical conservatives had bringing religion into it. The transitive verb "bring" implies that politics (legitimate politics) must be injected and cannot arise spontaneously from within. It is clearly not the same as "letting schoolkids take control". The progressive agenda reveals its authoritarian instincts.


The atrophy of popular, participatory politics (i.e. the decline in party membership and voter turnout since the 1950s) is a stylised fact that ignores any evidence to the contrary, such as temporary flowerings of participation or alternative, non-traditional modes of debate and protest. It also tends to ignore the structural and cultural imperatives of different eras: "one of the drivers of membership [in the 1950s] seemed to be the thrill of the cut and thrust of debate within the party ... people would join for entertainment as much as conviction. Parties were part of a cultural bricolage of strong institutions – church, factory, union and so forth – that helped provide identity and give people a sense of place in the world". The assumption that the mass of electors have lost interest (rather than been turned away by cliques and careerists) looks suspiciously like an elite consensus that democracy is ailing and requires "reform" to prevent the rise of demagoguery and the "wrong sort".

You can hear this angst behind a lot of the commentary on UKIP. The failure of the Kippers to break the mould in the recent Wythenshawe by-election was predictable, but this hasn't stopped many insisting that they remain as big a threat to Labour as they do to the Conservatives. This is just wishful thinking, not unlike the centrist excitement that marked the foundation of the SDP (their threat to the Tories was ultimately negligible). Thirty years later the members of the "plague on both their houses" tendency have got Nick Clegg in government and are so disappointed they find Nigel Farage's claim to have "been on benders" longer than the curtailed by-election campaign both horrifying and strangely thrilling.

The popular disillusion with voting, and the indulgence of court jesters like Russell Brand, does not indicate that democracy is "sick", as if failing from within (i.e. the people's fault), but that an otherwise robust democracy is under attack from without. If you are denied an effective choice, which is the modus operandi of neoliberalism, it is perfectly reasonable to lack interest in voting. That doesn't mean that you don't care, or wouldn't relish the opportunity to influence a decisive contest. Ultimately, we are not as far from the farce of democracy seen in Russia (where votes are already highly commodified) as we would like. While Tom Perkins may not be persuasive in his desire to abolish universal suffrage, the fundamental cause of the rich - to deny democracy power over property - remains in good health.

Wednesday, 12 February 2014

The Coriolanus Effect

The Guardian's handy cut-out-and-keep guide to a century of armed conflict involving British forces is being presented as a hopeful sign of that much-delayed "peace in our time" we were once promised, though this is surely the very definition of a triumph of hope over experience. While the planned cuts to the military will certainly restrain their enthusiasm for biffing foreigners, the giddy excitement of drones and teched-up special forces suggests that a full return to barracks may be some way off. The top brass are describing this as a "strategic pause" (the strategy being to identify another reliable enemy who can justify the disproportionate expense). I'm surprised no one has wheeled out "we live to fight another day" yet.

The pause is widely interpreted as the result of British society becoming disillusioned with unwinnable "wars of choice" and increasingly "casualty-averse", with the vote against intervention in Syria being deemed to have caught the popular mood (in reality, the planned budget cuts predated the vote). In other words, it's our fault for being irresolute glory-hunters who have now lost our collective bottle. A variation on this, which is propounded on both the left and right, is that disengagement is the inevitable result of Britain becoming more multi-cultural. In other words, trashing the Middle East doesn't play well with Muslims (mind you, I don't recall politicians or the military moderating policy in Northern Ireland in the 1970s in order to appease the large number of Irish immigrants in Britain). This can be thought of as the Coriolanus effect: a tendency to denigrate the "beastly plebeians" as unworthy of their noble soldiers.


The underlying premise, that overseas military ventures are dependent on popular support, is nonsense. Indeed, one achievement of the Guardian article was to highlight how many conflicts were barely noticed in Britain at the time (an irony of the Syria vote is the elite fear that future decisions on intervention must actually be discussed). Outside of the two world wars, wilful ignorance was the standing order. This traditionally led the military elite to have a low opinion of the "many-headed multitude" - a feeling that was largely mutual and provided fertile ground for the "lions led by donkeys" trope that so irritates Michael Gove - which reinforced their contempt for democracy and their class pride. A clue to the roots of this estrangement was contained in the Guardian's observation that "The timeline of constant combat may stretch even further back, given Britain's imperial engagements, all the way to the creation of the British army in 1707".

The British Army and Navy actually date back beyond the Union with Scotland (see what they did there?) to the Civil War, and specifically the creation of the Parliamentary New Model Army in 1645. Initially radical (the high-water mark being the Putney Debates), the Army was purged of Levellers and other agitators by Cromwell and the gentry, after which it was sent to Ireland in 1649. This combination of political repression and support for colonial expansion would become characteristic of the British military over the centuries. The navy, as a state institution rather than the ad hoc combination of crown ships and privateers under the Tudors, also dates from the Commonwealth, and specifically the desire to control the growing Atlantic trade routes and access to the new colonies in America, which led to the First Anglo-Dutch War of 1652-4.

From Cromwell onwards, the chief strategic purpose of the military has been to advance and defend Britain's commercial interests. The pattern of intervention over the last 100 years reflects this. In the 20s and 30s, the focus was on quelling nationalist movements in order to preserve oil and trade interests in the Middle East and Asia. After a brief period re-establishing colonial control in the late 40s (including aiding the French and Dutch in the Far East), the 50s and 60s became an era of managed withdrawal in the face of the inevitable (and American pressure), with compliant pro-Western regimes being "facilitated" by British arms (or autocrats propped up in more remote corners where the international media were absent, such as the Arabian Gulf). The Falklands Conflict can be seen both as the last knockings of this post-imperial mode and a harbinger of the modern strand of liberal interventionism and "policing" in the Balkans, Iraq and Afghanistan.

In this light, the current lack of military adventure reflects less the waning support of the people back home and more the advances of neoliberal democracy (e.g. Argentina) and the economic power of developing nations (e.g. China) abroad. While the French are still keen on sending Legionnaires into parts of West Africa as a form of political Viagra, it is surely only a matter of time before Nigeria warns them off their backyard. Britain has recognised that the interests of its economic elite are now better served by jaw-jaw than war-war, largely because those interests are no longer identified with the UK so much as the virtual territory of the tax-averse rich, where Belgravia is closer to Dubai than Basildon, and Scotland is just an arts festival and some grouse moors. Similarly, Obama's winding-down of US military involvement abroad is a sign not of circumspection or a loss of nerve but of relative success. When every country is open to global corporations and finance capital is unfettered, there is less need for gunboats and surgical strikes.

Thursday, 6 February 2014

Put a Value on That

It's been claimed that the current Tube strikes will cost the capital's economy £200 million (£50 million a day). This is meaningless, and not just because it's mere rhetoric by a business lobby, or an arbitrary figure with no supporting data (hilariously, the Daily Hate managed to reveal the fictional nature of the number by misquoting it in their headline as £20 million). It's meaningless because any attempt to calculate the net impact of a specific shock to an economy is incredibly difficult, and certainly can't be made with any accuracy in advance (the upside variant of this is the giddy claim about the "boost to the economy" of fracking and the like). In reality, a short and limited strike is a trivial interruption (even if personally inconvenient) as commuters switch to alternative transport, work from home or take a holiday. According to TfL, who are routinely pro-business but here have an obvious interest in showing that the strike's impact is limited, "86% of Londoners who usually paid for travel using their Oyster cards had done so as normal".


£50 million represents about 5% of the average daily value of the London economy (£309 billion in 2012), which equates to an average "loss" of 22.5 minutes for every worker (i.e. 5% of a 7.5 hour working day). The source for the claim, the London Chamber of Commerce and Industry, puts the figure at £48 million (media inflation does the rest) and bases it on an opinion poll of its members from September 2007. The London Overground wasn't launched till November 2007, while the branch between Clapham Junction and Surrey Quays that I used yesterday on my way to Shoreditch (home of a couple of iconic Tube carriages) didn't open until December 2012. 22.5 minutes is coincidentally a quarter of a football game, which reminds me that England played their final 2010 World Cup group game at 4pm on a Wednesday afternoon ("work lunch and you can bugger off early", was the common adaptation). I suspect the notional "loss to the economy" that day was considerably greater.
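For what it's worth, the arithmetic behind those figures can be reproduced in a few lines; the 365-day averaging is my assumption about how a "daily value" of the London economy would be derived.

```python
# Reproducing the strike arithmetic: £50m against the average daily value of
# the London economy, and the equivalent time "lost" per worker.
london_gva_2012 = 309e9      # annual value of the London economy, GBP
claimed_daily_loss = 50e6    # claimed cost per strike day, GBP

daily_output = london_gva_2012 / 365                # ~ £847m per calendar day
share_of_day = claimed_daily_loss / daily_output    # ~ 6%, i.e. "about 5%"

working_day_minutes = 7.5 * 60
print(f"5% of a 7.5-hour day: {0.05 * working_day_minutes:.1f} minutes")   # 22.5
print(f"Claimed loss as a share of daily output: {share_of_day:.1%}")
```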

This sort of specious calculation (which the media usually accept at face value) was also present following the suspension yesterday of the main rail line to Plymouth and Penzance due to storm damage at Dawlish: "Business leaders claim the closure of the line will cost the south-west up to £30m a day in lost business". I'm surprised they didn't predict cannibalism on the English Riviera inside the week. The operative term here has nothing to do with the number, or even the implied authority of "business leaders"; it's the phrase "lost business". The idea that economic activity is a tangible substance, that can be "acquired" or "lost", like buried treasure or loose change, is such a timeworn trope that we lose sight of what it signifies, namely that everything has a precisely calculable market value. This has reached a new level of absurdity in Italy, where the state is considering suing Standard & Poor's for downgrading the country's credit rating in 2011 without taking into account its "history, art or landscape which, as universally recognised, are the basis of its economic strength".

S&P could point out that a credit rating is merely an assessment of a state's likelihood to honour its debts, not the valuation of a balance sheet containing "priceless" assets that might prove difficult to sell. However, Italy could point out that flogging off the landscape (or at least a few islands) was precisely what was demanded of Greece in 2010. Of course, the references to islands and the Acropolis were iconic, with the actual "sell-off" comprising leases on government land and the privatisation of services (and thus economic rents), rather than the ceding of sovereignty or disposal of cultural treasures. But the use of icons like this is (if you'll pardon the meta-pun) emblematic. It represents a world of commodities and property, where everything has a tradeable value and nothing is so fixed (not even the Acropolis) that it cannot be uprooted and sold on.

The economy is a highly complex, dynamic, open-ended network, which is why it is resilient to shocks and highly adaptable, in the same way that a "self-organising network" of millions of commuters will adapt in the event of a transport interruption. The 2008 banking crisis (and associated meta-narratives like Nassim Nicholas Taleb's "black swans" and "anti-fragility") has encouraged us to forget this, to think of the economy as a singular "property" that must be husbanded, defended against further depletion and built up over time, hence we remain suckers for nonsense like a £50 million "loss" or, at the other extreme, a "windfall" like fracking.

Meanwhile, the real action is elsewhere, as shown in this Bloomberg video. Having quickly dispensed with the "cost" of the strike "to London's commuters" (or at least the well-heeled segment that make up their audience), they move on to TfL's "modernisation" plans and the commercial opportunities opened up by demanning ticket offices, from Amazon lockers to station sponsorship, all over the strapline: "Bringing business to public transport". In the neoliberal world, business is always a benefit, never a cost, and always progressive. The irony is that commercial outlets were a traditional feature of Tube stations, from kiosks to vending machines, before they were removed in the 1980s as part of an earlier wave of modernisation. Even station "sponsorship" has been possible, if you didn't treat it as a commercial exchange, as Herbert Chapman found when he persuaded London Underground to rename Gillespie Road station to Arsenal. Of course, rebranding it now as "Emirates Arsenal" would cost a packet. Everything has its price.

Monday, 3 February 2014

Silence, Exile and Cunning

The Coen brothers' Inside Llewyn Davis is a beautifully made film that offers a counterpoint to their O Brother, Where Art Thou?, being another playful interpretation of Homer's Odyssey. The earlier film filtered the tale through references to Preston Sturges's Sullivan's Travels, which echoed Gulliver's Travels, itself a parody of Homer's epic. Their latest also references Groundhog Day (and thus Dante's Inferno), turning the abrasive protagonist's increasingly miserable journey into a circle of Hell, but the chief influence I think is James Joyce's Ulysses, mainly because of the centrality of music and the idea of authenticity (the setting is the New York folk scene in 1961). But if Joyce's classic ends with an affirmative "Yes!", this film delivers a downbeat "Maybe". As in their earlier work, the happy ending is ruthlessly undermined.


Early on, Jean, one of the friends that Llewyn routinely sponges off and whom he may have got pregnant, describes him pithily and repetitively as "shit" and then more poetically as "like King Midas's idiot brother". He is brownfinger (his clothes are largely autumnal shades of brown, contrasting with wintry grey backgrounds). Everything he touches turns to shit: he misses out on a share of the royalties of a hit song, he loses his remaining money in a vain and Kafkaesque attempt to rejoin the merchant marine, he mislays a friend's cat and compounds the error by returning an inadequate looky-likey ("Where's his scrotum?"). When he visits his senile father (an old sailor asked after by Llewyn's tormentors at the sailors' union) and plays him a favourite song (actually the contemporary Shoals of Herring), the old man craps his pants.

Late on, the name of the cat is revealed to be Ulysses, which suggests that far from being a mere McGuffin the brown tabby was the real hero of the tale. At one point Davis stares hard at a poster for the Disney film, The Incredible Journey, which told the "true life" (i.e. faked) story of how three domestic pets returned from the wild to the comfort of suburbia. The cat's name would suggest that Davis, subtly played by Oscar Isaac, is actually a riff on Stephen Dedalus, a young man with father issues seeking escape. Interestingly, there is no reference to Llewyn's mother, and he is repeatedly offensive to the mother substitutes among his friends and fellow performers.

The heart of the film is an extended road trip from New York to Chicago, in which Davis shares the cost with a fat old junkie jazz musician, played by John Goodman, and a wannabe beat poet, played by Garrett Hedlund. Though some will find this overlong to the point of trying, it is a clever essay on the barren stupidity of authenticity, as the two other characters monotonously pursue their stereotypes to the point of torpor. The jazz muso spends much of his time catching flies, when not shooting up in the toilet, dissing Davis's former partner, Mike, for choosing the George Washington Bridge to jump off (Brooklyn Bridge is "traditional"), and cursing Davis with some New Orleans voodoo shit. The beat poet "valet" is taciturn, self-absorbed and selfish. Even the camera seems obsessed with his profile.

Goodman is partly referencing his own one-eyed Bible salesman in O Brother, Where Art Thou?, and here shares the Polyphemus role with F. Murray Abraham's Chicago club owner who turns down Llewyn for a gig, advising him to rejoin his partner Mike ("That's good advice", deadpans Davis) and stating of his music: "I don't see a lot of money here". The meta-irony is that the film's production values are drenched in money, with shots down car-filled New York streets (rather than across) that must have cost a mint to restore to an early-60s look.

In New York, Davis is assailed on all sides by the inauthentic, like an older Holden Caulfield. His friend Jim, played by Justin Timberlake (who started out in Disney's Mickey Mouse Club), gets him a recording gig for an execrable novelty single. He bums a sofa for a few nights off Al Cody, another session performer, who reveals his real name is Arthur Milgrum. Jean, Jim's girlfriend (played by Carey Mulligan), who demands that Davis pay for her abortion, is revealed to have let the Gaslight Café's owner screw her. Four ersatz fishermen in cable-knit sweaters are now on stage there, soon to be replaced by an early Bob Dylan, who Davis literally and figuratively turns his back on. Marcus Mumford is on the soundtrack. How much suffering can a man bear?

Ultimately, Davis soldiers on, rejecting the suicide option of his erstwhile partner, rejecting the turn off the road to Akron, where an old girlfriend may be bringing up his kid (he discovers along the way that she didn't have the abortion he paid for), rejecting the music ("Four micks and Grandma Moses"), and rejecting his past (the remains of his youth were thrown out along with his seaman's licence papers). We have no idea where he goes or what he does, but his final words, "Au revoir", spoken after a beating by a faceless deus ex machina in a stetson, hint at exile.