Wednesday, 30 July 2014

The Opinion of Others

All media are driven by the opinion of others. Factual news is expensive to acquire, unreliable in its supply, and tends towards the dull but worthy. Opinion, on the other hand, is inexhaustible, cheap and reliably contentious. In a variation on the City speculators' mantra, there is always a "greater fool" who will respond to Richard Littlejohn or George Monbiot (I plead guilty on numerous counts, m'lud). We turn gratefully from pictures and testimony of suffering in Gaza to verbal bunfights between Israeli spokesmen and TV news anchors, where the likelihood of a mind being changed is precisely zero (and, surprisingly, not likely to be helped by Mia Farrow and other slebs).

The first newspapers, which were gazettes of court announcements, were of limited interest compared to illustrated chapbooks detailing Catholic atrocities or tracts promising the imminence of God's Kingdom. Though literate snark soon came to the fore (The Spectator was launched in 1711), most newspapers continued to rely on adverts as much as editorial to pique interest until nineteenth-century mass literacy led to the "human interest story" and a realisation of the power of mobilised opinion to sway elected legislatures. The long twentieth century, from the launch of Tit-Bits in 1881 to the Web 2.0 media-moment in 2004, was distinguished by gossip, the privileged opinions of the high and mighty, and messages from "our sponsors".

The Internet has supposedly democratised opinion, providing a variety of platforms for "everyone" to have their say, but this just means a vast increase in content (gossip, opinion, ads) and thus an even greater value accorded to aggregators and filters, which is what Tit-Bits was and arguably Addison and Steele's imaginary spectator was too. Plus ça change. Though some services claim that you, the consumer, are now able to act as your own aggregator (choosing who to follow, specifying your interests etc.), the trope of "content overload" obviously serves the purposes of those who would "pre-curate" your content stream based on algorithms that analyse your history and relationships. Of course, such algorithmic precision is a myth, which even some services are happy to admit.

The churn in technologies and corporate providers obscures the persistence of opinion as the major driver of media. Thus incumbent providers, like the press, lift up their skirts and screech at the thought of Facebook manipulating a user's stream, while music nostalgists bemoan the death of the album (curation by Big Music) under the onslaught of streaming service playlists (dominated by sleb curation, which is the new face of Big Music). The success of Twitter, which is a pretty ropey piece of technology, is down to the simple fact that it provides raw, instant opinion, the slebbier the better (accept it: Rihanna is a better football pundit than you are).

Search engine results can be thought of as a type of playlist, dynamically curated by an algorithm that aggregates and orders relevant pages based on the "opinion" of other pages. The plea by Google that they should not be held responsible for the opinion of others (which has predictably found favour with the House of Lords) depends on a belief that the algorithm is an accurate reflection of that aggregated opinion (impossible to know) and not subject to any manipulation or systemic bias (clearly untrue). The myth of the Internet is that everyone has an opinion (because property is universal) and that the expression of that opinion should be free and unrestricted (because the state should not limit your rights in your property).
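The aggregation of the "opinion" of other pages can be made concrete with a toy PageRank-style calculation. This is a simplified illustration only, not Google's actual algorithm, and the three-page link graph is invented: each page's score is built from the links ("votes") of the pages pointing at it.

```python
# Toy PageRank: a page's rank aggregates the "opinion" (links) of other
# pages. Purely illustrative; real search ranking layers many more
# signals on top, which is where the scope for bias and manipulation lies.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the pages it links to (its 'opinions')."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Invented example: three pages "voting" with their links.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph)
print(sorted(scores, key=scores.get, reverse=True))  # → ['c', 'a', 'b']
```

Note that the ordering is entirely a product of the link structure: page "c" tops the list simply because more aggregated "opinion" flows into it.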

The key change that occurred a decade ago was that we moved from a culture in which being a passive consumer of the opinion of privileged others was deemed sufficient to one in which we must now all express an opinion as well, whether on Gaza or a friend's new cardigan. But that opinion does not need to be a reasoned analysis (we don't seriously seek to dethrone the aristocracy of pundits, merely to vote them up or down by the weight of our comments). A simple preference will do, either through a binary "like" or a top 10 ordering (the modern "listicle" is simply a way to teach us correct practice). Creating lists, and curating them on an ongoing basis, has become a social obligation: performative opinionising as a way of defining who we are. By their public playlists ye shall know them.

Thursday, 24 July 2014

L'Etat C'est Moi

Tony Blair's speech this week, twenty years to the day after he was elected Labour Party leader, prompted a number of retrospectives by his supporters in the press, which in turn prompted the usual venom beneath the fold about duplicitous war-mongering. Both sides largely ignored the speech itself, which was delivered in memory of focus group impresario Philip Gould and was eerily anachronistic in its promotion of a supra-ideological "third way", the false dichotomy of the individual and the collective, and the notion that the "zeitgeist" is some sort of constructed reality amenable to policy. It's as if Blair has barely been in the country of late.

Most of his defenders feel that his progressive record has been unfairly tarnished by the fallout from Iraq, which the former PM mentions precisely once in his speech, in a clause starting with "whatever". John Rentoul in The Independent was typical: "The country has changed, mostly for the better, in 20 years and much of it is because of Tony Blair. Unexpectedly, the change was best summed up by David Cameron in his words on entering No 10 four years ago, when he said that the country he inherited was 'more open at home and more compassionate abroad' than it had been". This is Great Man history, which marginalises the contribution of others (the snub to Gordon Brown is deliberate), wallows in nostalgia ("the sun shone more under the old king"), and equates the national mood with a personal style.

Janan Ganesh in The FT suggests that Blair was more pragmatic than he is given credit for: "He governed with the grain of history, nudging it along from time to time, but never upending a country that was functioning well enough". This is vapid insofar as almost all heads of government do exactly the same. It would serve just as well as a testament to John Major. It might appear paradoxical that a self-styled progressive like Rentoul would laud the impact of the individual on history, while a conservative like Ganesh emphasises structural forces, but the former is merely the neoliberal valorisation of "talent" as a proxy for class, while the latter believes that the exceptional Thatcher changed the course of history, tearing up the postwar settlement and embarking on a social and economic revolution that Blair merely continued (the reality was that she rode the wave of global structural change as much as she vigorously turned the tiller).

Much of Blair's achievement is simply down to longevity. If you cling to office long enough, you will get the credit for all sorts of secular changes and social shifts that simply coincided with your watch, while observers will marvel that you haven't stayed exactly the same ("from Bambi to Bliar"). In The Telegraph, Stephen Bush credits Blair with installing a security door on his childhood block of flats, much as good harvests were once attributed to the beneficence of the distant monarch. Similarly, Rentoul reckons Blair "achieved the near-impossible in Northern Ireland" with the signing of The Good Friday Agreement, ignoring the patient build up since the Downing Street Declaration in 1993, not to mention the obvious readiness of the paramilitaries for a face-saving peace deal long before Blair's accession in 1997.

Ganesh claims that Blair "did not come from anywhere in particular", despite the public school and Oxford background, though his classlessness and cosmopolitan ease were more apparent to journalists than to the average voter. Cameron's belief that he could model himself on Blair sprang from a realisation that he was not a million miles away from the upper middle class lawyer in style and experience. Chris Dillow identifies Blair's "managerialist ideology" as his weakness, leading to over-confidence and poor decision-making. This ideology was part of a wider commitment to neoliberalism at home and abroad, which included the privileging of The City, his mugging by US neocons over Iraq, and his subsequent ascension to the global 0.01%. (In The Guardian, Michael White said: "Someone told me recently he'd brokered an oligarch's yacht sale". The point is not that this tale is true, but that it is credible).

Once they got beyond the noise about the horrors of the Saddam regime (i.e. "he had it coming"), Blair's supporters initially defended the decision to go to war in Iraq on the grounds that it was made in good faith, the absence of WMD notwithstanding. As the cost of failure mounted, that tactical error was subsumed beneath the massive strategic blowback, suggesting that either the plan all along was to trash the region or else the architects of intervention were drunk on their own power. Blair seems oblivious to the irony of his contemporary words: "Third way politics begins with an analysis of the world shaped by reality not ideology, not by delusionary thoughts based on how we want the world to be, but by hardheaded examination of the world as it actually is".

A feature of Blair's delusion, which is being eagerly advanced by his supporters in the press, is that all the structural failings that he ignored or even encouraged, such as the growth of in-work benefits and the indulgence of The City, were the fault of Gordon Brown. This is a monarchical defence, in which the good king is undermined by his bad ministers. For Blairists among the Tories, this allows Ed Miliband to be tainted by association with Brown, though they struggle to square the idea of him as Cardinal Richelieu's éminence grise with the Wallace meme. For Labour Blairists, it holds out the prospect that the king over the water (in Jerusalem, mostly) may one day return, like De Gaulle recalled from Colombey. What odds a Blair-Sarkozy entente by 2020?

Monday, 21 July 2014

Damnatio Memoriae

Each time a celebrity is jailed for sexual abuse, or is definitively found guilty by a posthumous review, you can guarantee a flurry of commentaries on how their memory will now be effaced, their image edited out of the Top of the Pops archive and embarrassing memorials removed from view, thereby enacting symbolic violence against their person. Though we may be uncomfortable with the overlap of this practice with Stalinist airbrushing and the "unperson", there is also a sense that removing the disgraced's representation from the public record is a punishment with an impeccable pedigree. As a political act, this goes back to the Romans' damnatio memoriae and the cartouche tippexing of the Ancient Egyptians. As a social act, this is merely a larger-scale example of the willed forgetting and photo-chopping that follows on from bitter divorces and family estrangements. Rewriting the historical record to excise painful memories is normal.

In the early days of this process, while the evil nature of the disgraced is being established beyond doubt, nothing is allowed that might contradict the reduction of the person to the essence of their evil acts. Once the reputation is sufficiently blackened, contrasting imagery begins to reappear. The bigger the evil, and thus the more secure it is from revision, the more of this hinterland can be accommodated over time. A good example of this is Adolf Hitler, whose documentary appearances in recent decades have featured more incidental scenes with children and animals, not to mention the now clichéd horsing around with Eva Braun at Berchtesgaden, rather than just stock footage of ranting or map-pondering. A Channel 4 documentary about Hitler's dogs is surely in development already.

Presenting a rounded and contextual picture of a person is good history, but it also serves to blur the boundary between the person and their environment, i.e. other people. One reason for the extreme editing of the early years is the fear of contamination: the embarrassment of the other DJ seen grinning as Jimmy Savile leers at a young girl on TOTP, or the variety show host introducing his "very good friend" Rolf Harris. Effacement is always a cover-up, obscuring the connivance and negligence of others as much as the memory of the disgraced. This extends to the common stock of social memory, with associates now claiming they "hardly knew" X or had "little to do with" Y.

This willed oblivion will be much harder in future due to the inexhaustible memory of the Internet. The aftermath of the EU Court of Justice ruling on Google and the misnamed "right to be forgotten" has seen a campaign by the US search company to convert the issue from one of data property to one of "press censorship", at the same time as the UK government has forced through the Data Retention and Investigatory Powers (Drip) Bill, which will provide Google et al with the legal cover to continue supplying GCHQ with data. The framing of the EU judgement as a threat to "free speech" and "fair comment" is in the interests of both Google, which uses "freedom" to enclose the commons, and a print media that is largely anti-Leveson, anti-EU and keen to adopt the moral high ground relative to the search giant.

In practice, the threat of censorship arising directly from the ECJ ruling is non-existent, not least because the obligations on Google and others have yet to be established in the national courts. The US company is deliberately and provocatively jumping the gun in order to trigger an anti-ECJ groundswell. It serves Google's interests for people to assume that it and the Internet are one, rather than it being simply a glorified index that only covers a fraction of the Web and is already biased by the needs of advertising and existing government regulation. Google's difficulty with the ECJ is the product of its ambition to construct a persona for all users, i.e. a single profile encompassing both the individual's behaviour in Google applications and the wider "file" of that individual fragmented across the Web. It cannot admit the trivial impact of the ruling without simultaneously robbing its brand of the mystique of omniscience. There is a sound commercial reason why Google does not even acknowledge the existence of its competitors in the search market.

In contrast, while Google gets agitated about something that its users (mostly) don't care about, Facebook finds itself criticised for treating its users like guinea pigs, even though its now notorious "experiment" was simply a standard A/B test of the sort routinely carried out by commercial websites: just substitute the words "interested" and "bored" for "happy" and "sad" (the infantilism of the experiment is noteworthy). The "emotion contagion" finding is hardly news, and the focus on manipulation by Facebook alone is odd when editing content to stimulate interest or reaction is what all media organisations do every day of the week (the negative reaction by old media may indicate a fear that new media is much more effective at shaping public opinion).
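For the record, the mechanics of such an A/B test are mundane. What follows is a generic sketch of the standard bucketing pattern used by commercial websites, not Facebook's actual implementation; the experiment name and user IDs are invented:

```python
import hashlib

# Generic A/B test assignment of the sort commercial sites run daily:
# users are deterministically split into buckets, each bucket is served
# a different variant of the content, and an engagement metric is then
# compared between buckets. (Illustrative only.)

def bucket(user_id: str, experiment: str,
           variants=("control", "variant")) -> str:
    """Stable assignment: the same user always lands in the same bucket."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

assignment = bucket("user-42", "feed-tone-2014")
assert bucket("user-42", "feed-tone-2014") == assignment  # deterministic
```

Swap "interested"/"bored" content for "happy"/"sad" content in the variant bucket and you have the notorious experiment; the plumbing is identical.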

The basis of the ethical storm is that while users have willingly alienated their data, they did not believe they were agreeing to be experimented on when they signed the terms of service (which they didn't read anyway). The difference this highlights is that Facebook see the users as an extension of the data (which they ultimately own), while the users still believe they are independent, even if they have ceded rights over that data. Google has the same view as Facebook, hence the importance they give to treating the online persona as superior to the offline person. The ECJ ruling embodies a different philosophy, which sees the data as the inalienable property of the citizen. The dispute is thus being framed as a conflict between previously complementary liberal principles: free speech and private property. Of course, free speech, in the particular form of a "free press", is essentially a property right, so the whole affair (including Drip) boils down to a tussle over property.

Because personal data is increasingly the property of others, I suspect that the convention of damnatio memoriae may well be on its way out. The instinct and desire to obliterate will remain at the individual level, but that just means "unfriending" and deleting accounts so that the offensive memory is hidden from your own view. At the societal level, organisations like the BBC, with a perceived responsibility to curate the historical record to reflect public opinion, will find their actions increasingly irrelevant and difficult to defend. That epigraph is the property of Google, that cartouche the property of Facebook.

Wednesday, 16 July 2014

Curb Your Enthusiasm

Paul Krugman, who has long cultivated the fertile lands between Sci-Fi and economics, has been thinking about the implications of "IT-mediated car services" like Uber. He avoids mentioning robot drivers, perhaps to prevent the discussion tail-spinning into angst about capital-labour substitution before the madness of cars gets a proper airing: "when you think about it, for most people owning a car is quite wasteful. It’s an expensive item of equipment that sits idle most of the time; it requires parking (and often a parking structure) both at origin and at destination; it requires maintenance and is a big hassle all around". In other words, it is an inefficient use of capital that involves a significant waste of land.

The promise of Uber et al is that "reliable, quick-response chauffeur services could free many people from the need to tie up all those resources in a consumer durable that they only use now and then. And from a social point of view it would avoid the need to tie up so much capital that sits unused most of the time". The implication is that the capital could be put to more productive use (from a social point of view) elsewhere, though Krugman is coy about recommending increased investment in public transport, which would simply bring the anti-state harpies down upon him. Instead, he seems keen to get marginalists onside by echoing the thoughts of Tyler Cowen, who previously advocated the incentive pricing of parking, in respect of capacity.

"There is, however, an obvious problem: rush hour. Peak car use comes twice a day, and that would seem to dictate that we have nearly as many cars as we do now even if they’re supplied by the likes of Uber. But here’s where surge pricing comes in. If traveling during peak hours is more expensive than off-peak, people will have an incentive to shave off those peaks. People who aren’t commuting to work will avoid travel at peak hours; some people will find other ways to travel; some people (and businesses) will rearrange their schedules to take advantage of cheaper off-peak travel".

As with all incentive-based arguments, the first question to ask is: who are these "people" of whom you speak? If traveling during peak hours is more expensive than off-peak, then peak-time travel will be dominated by the better-off. You could spin this as a form of progressive taxation, but the reality is that peak hours will only command a premium if they are a more attractive good. In other words, this is a system in which the less well-off exchange relative inconvenience for a discount on travel costs (unlike the democratic basis of public transport). While there will always be some individuals whose time preference makes this a win-win, aggregate satisfaction must be reflected in the time-slot price. The most popular slots will cost the most.
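The distributional point can be shown with a toy calculation, in which travellers take the peak slot only if its extra value to them exceeds the surge premium. All prices and willingness-to-pay figures below are invented for illustration:

```python
# Toy illustration: under surge pricing, peak slots go to those with
# the highest willingness to pay. Numbers are invented.

def choose_slot(willingness_to_pay, peak_price, offpeak_price,
                peak_value=10.0):
    """A traveller stays in the peak slot only if the extra value of
    peak travel to them exceeds the surge premium."""
    premium = peak_price - offpeak_price
    return "peak" if willingness_to_pay * peak_value >= premium else "off-peak"

# Travellers with willingness-to-pay proportional to income (invented scale).
travellers = {"low": 0.2, "middle": 0.5, "high": 1.0}
choices = {name: choose_slot(wtp, peak_price=12.0, offpeak_price=5.0)
           for name, wtp in travellers.items()}
print(choices)  # only the better-off remain in the peak slot
```

With a £7 premium, only the high-willingness traveller keeps the convenient slot; the rest trade inconvenience for a discount, which is the exchange described above.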

Though his blog post is provocatively titled "Life without cars", what Krugman actually envisages is "a society that still relies mainly on cars to get around, but manages to do this with significantly fewer cars than we need at present", which is a pretty accurate thumbnail sketch of New York, where the incentives of housing density, land value and loss of utility (i.e. traffic jams), as well as decent public transport, mean that more households are car-less than own a vehicle. It would be easy to suggest that familiarity with limos makes Nobel laureates comfortable with services like Uber, and easier still to suggest that well-paid economics professors like the idea of buying marginal services while investing their capital for yield, but Krugman is clearly in tune with the zeitgeist.

The wider secular trends are the move from ownership to subscription (already seen in many other areas) and consequent concentration of capital; the emergence of a generation that considers ownership of both homes and cars to be unfeasible in the short-term, if only because all their capital went into tuition fees; growing concerns about the environmental impact of cars (which is not unrelated to the decline of car manufacturing in developed nations); and the increased physical control of city centres and public spaces, variously sold on the grounds of security, "traffic management" and pollution reduction (London is well ahead of the field here).

Massively open online car schemes (hey, MOOCS!) may prove to be a stepping-stone to the introduction of robot cars in metropolitan areas - i.e. first get the infrastructure of car pools in place and then substitute the drivers. They promise a brighter and better Sci-Fi future, with minimal discussion about the changing landscape of power. As Krugman himself said (in the Wired interview linked to above), "Evil will come in stylish, Steve Jobs-inspired designs".

Sunday, 13 July 2014

The Coming Terror

The "Westminster Paedophile Scandal" appears to be galloping through the traditional narrative arc of "conspiracy in high places" in record time. As with the "Red Scare" of Joseph McCarthy, the classic modern template, there are three notable features: the suggestion that Westminster is riddled with the guilty; that some parts of government are dodgier than others (McCarthy had it in for the US State Department, his British successors for the Home Office); and that official inquiries are likely to be compromised, if not part of an active cover-up by "the establishment" (a word enjoying a mini-revival). You'll know the scandal is mid-flight over the shark when someone mentions the Freemasons or the Bilderberg Group.

This is not to say that there aren't paedophiles at Westminster or in the Civil Service (statistical probability suggests otherwise and convictions have already occurred), but there is clearly more at issue here than individual crimes, just as Watergate wasn't "about" a break-in. The question is whether the institution can be held responsible for the behaviour of its members, but it is doubtful that culpability goes beyond the willed obliviousness that the BBC exhibited in respect of Savile et al. It seems unlikely that there is anything on a par with the connivance of Richard Nixon, let alone that party whips at Westminster were routinely hushing-up criminal acts in order to strongarm MPs.

A legacy of the 1960s "sexual revolution" is the increased public salience of child abuse in the "doings of the mighty". Unconventional forms of consensual sex (adultery, homosexuality) have lost their power to embarrass or provide political leverage (McCarthy also pursued homosexuals, in what became known as the "Lavender Scare"), while at the same time popular therapy has advanced the idea of childhood as a time of unique vulnerability. The structural origin of the "paedo threat" can be found in the increased tabloidisation of the media in the 1980s (the arrival of Today, the start of breakfast television and the evolution of the "sleb"), which recycled historic cases such as The Moors Murders and provided ample exposure for flimsy claims of "satanic ritual abuse". As these modern witch-hunts collapsed under the weight of their own hyperbole, focus shifted to institutional abuse.

Institutions are sites of power, so abuse of all sorts will always be found there. What I'm interested in is the selectivity of the media: sexual abuse is having a moment, but not in its most prevalent form, which is abuse and coercion in the workplace. Just as satanic ritual abuse was an all-too-obvious opportunity to berate social workers (first for incompetence, then for over-reaction), and by extension local councils, so the current focus on claims of a "ring" involving MPs, civil servants and local authority care homes corrals what might be termed "the usual suspects" in the eyes of the anti-state brigade. But this linkage is tenuous at best. The fact that Cyril Smith was an abuser does not implicate Parliament as a whole, nor should it excuse the media's eliding of child sexual abuse with a wider "loss of trust" in politicians and the establishment. Expenses-fiddling does not segue into kiddy-fiddling.

Andrew Rawnsley in The Observer today bemoans the loss of public trust, but manages to exhibit the commentariat's conflicted instincts within a single paragraph: "I like to think I am always on my guard against hysteria ... but I wouldn't be at all surprised if it is shown that there were cases of offenders among MPs whose criminal depredations were hushed up by whips and other party managers for the usual self-serving reasons". This is prurience masquerading as cynicism, which further fuels the nonsensical idea that whips are omniscient and omnipotent. Francis Urquhart is a work of fiction.

Rawnsley attempts perspective: "What is often called the decline of trust is really an evaporation of deference. Where once there was a reflexive respect for authority and a willingness to give it the benefit of the doubt, there is now a default to distrust". Deference was never about giving the benefit of the doubt, i.e. suppressing scepticism, but about the unquestioning acceptance of authority, often in flat contradiction of the facts. It is the reactionary frame of mind. The 60s are meant to have ushered in an era marked by the "healthy distrust of authority", though the roots of this can be traced back to World War One. In practice, the scepticism of the age soon became the motor of modern capitalism: the "me generation", those "crazy individualists", "because you're worth it" etc.

Rawnsley appears oblivious to the role of capitalism, in its anti-state and pro-market form, in the creation of what David Brooks in The New York Times admiringly refers to as a "personalistic culture". According to Rawnsley, who cites Brooks as an expert on modern manners, "It seems we'd now rather trust an individual we don't know than a big institution that we have come to know much too well". Really? You might like to help me with a small problem I have transferring a billion dollars out of Nigeria.

In an amusing juxtaposition, Rawnsley's piece in the print edition appears opposite an editorial on BBC funding, which promotes an impeccably neoliberal position on the benefits of commoditisation: "The more various and often frankly free market ways of raising money the corporation develops, the more free from sticky political fingers it will become". In other words, only the market can save institutions from the further erosion of trust consequent on their association with the state. The ideological logic of modern capitalism, of rational utility maximisers and transactional calculation, is for trust to become a commodity that can be secured or exchanged for money. Filthy lucre becomes a disinfectant, an idea that would have fascinated Sigmund Freud, who first popularised the psychological trinity of childhood, sex and money.