Monday, 1 September 2014

Taxi Driver

The media coverage of the Jay report into child sexual exploitation in Rotherham has focused on political correctness ("a vile, perverted ideology which is wrecking our society and ruining the lives of the innocent"), and the related evils of anti-racism (a "dogma") and multiculturalism ("we no longer have a universal moral code or national identity, and the consequences can be seen all around us, whether in the rise of home-grown Islamic extremism or in the failure of too many migrant groups to learn even basic English"). The religious flavour of this language, and the apocalyptic image of social breakdown, cannot distract from the all-too-obvious bigotry. The salient fact for most commentators is that the perpetrators were Pakistani and the victims white.

Some, such as Allison Pearson, have tried to obscure this by paying lip-service to other aspects: "Powerless white working-class girls were caught between a hateful, imported culture of vicious misogyny on the one hand, and on the other a culture of chauvinism among the police, who regarded them as worthless slags" (class, immigration, sexism, bingo!). Even professional liberals, like Yasmin Alibhai-Brown, have found themselves talking nonsense: "White experts and officers have for too long been reluctant to confront serious offences committed by black and Asian people" (which presumably explains the pitifully small number of them in prison).

That levels of misogyny vary between different communities is hardly a surprise, nor that it should be more prevalent in a conservative community like British Pakistanis. People in Clacton are more bigoted than people in London, essentially because the big city attracts the unconventional and progressive, and because proximity and variety encourage tolerance. This does not mean that Clacton has "questions to answer", any more than the Pakistani community has. Similarly, that the police belittle crimes against women and the poor is hardly news. The police service is a conservative institution and the structural bias of the legal system means it privileges crimes against property and social order.

Dan Hodges, who passes as a "lefty" in the eyes of Daily Telegraph readers, says "we cannot ignore that race played a part in these crimes", but he never gets round to explaining the part that race played. This is because his reasoning can only lead to either crude racism (Pakistanis have a greater propensity to abuse) or obvious nonsense (the terrified police were intimidated by the all-powerful Pakistani community). As a consequence, his rant explodes under the pressure of its own frustration in a fit of hyperbole: "A major British town was turned into a rape camp". Really?

Organised crime depends on networks of influence and opportunity. It is inescapably social, which means the community is inevitably compromised, if only through a desire to "mind its own business". That said, the dominant factors are usually material and reflect circumstance. A salient yet widely-ignored feature of the Rotherham case (like Rochdale before it) is the involvement of minicabs (the Jay report notes: "One of the common threads running through child sexual exploitation across England has been the prominent role of taxi drivers in being directly linked to children who were abused"). The significance of ethnicity is that Pakistanis are disproportionately represented in this employment sector (providing greater opportunity and scope for collusion), not that Pakistanis are more likely to be sex abusers due to arranged marriages or Islam.


For the right, political correctness is simply "cultural Marxism", which they source to the Frankfurt School, Antonio Gramsci and the "correct party line" tradition of Communism. However, though the left has undoubtedly played its part, modern "PC" is largely an invention of reactionaries appalled at the advance of civil rights (in the US in the 60s) and the emergence of identity politics (in the UK and elsewhere in the 70s). From the early 80s it was routinely attributed to the "loony left", famously in the case of the GLC and often on the basis of nothing more than myth (Baa Baa White Sheep etc). This worked well enough during the Thatcher era, but it gradually lost its credibility after the rise of New Labour (according to Google Ngram, the phrase "loony left" peaked in 1995).

Thereafter the focus shifted from "loony" to "craven", with an emphasis on the assumed cowardice of local government and public corporations in "standing up" to the greedy and arrogant demands of immigrants and special interest groups (mad mullahs gradually took over the role previously played by lippy rastas and boiler-suited lesbians). For Daniel Hannan it is always the happy time of Thatcherism ("Labour's rotten boroughs ... remain stuck in the early Eighties"). Even self-styled "liberal lefties" like Denis MacShane appear to have internalised this narrative, excusing their wilful blindness as the result of brainwashing by The Guardian. Though PC is now assumed to infect all areas of public behaviour, from hands-tied police to conniving councillors, it remains at heart a matter of language: what words are permissible. The deployment of the phrase "Pakistani heritage" in the context of Rotherham clearly hints at a desire to use blunter terms.

Contrary to the myth of a communist conspiracy, political correctness originates in the 18th century idea of politeness (politesse). Like reason, the sublime and the sentimental, this was a key concept of the Enlightenment, amplified through the broader cultural norms of civility and manners. Whereas previous styles of language were deployed as class and status identifiers (courtly love, Renaissance classicism, French imports), now "right language" was seen to express "right behaviour" and "right thinking" and thus to be an ethical aspiration for all rather than merely a badge of membership for the few. This built on the earlier development of "plain speech" through the translation of the Bible, which was a rejection of "fancy" (and implicitly Catholic) aristocratic forms as much as the vulgar demotic.

Politeness meant moderating language. The original Spectator talked of its mission to "enliven morality with wit, and to temper wit with morality". This points up the impeccably bourgeois credentials of the idea that society could be improved by improving its speech. This would in turn lead to the 19th century belief that language is the repository of national spirit (notably among German Romantics) and thus a common endeavour. While this encouraged some to see language as a plastic medium for moulding a national revolution, it also suggested that language itself was a site of political struggle. What was common to both left and right was the Enlightenment idea that language was universal within the polity, which fed the nineteenth century mania for standardised vocabulary, grammatical rectitude and an antipathy towards dialect and "backward" tongues.


In the 20th century, the "linguistic turn" in philosophy and the emergence of structuralism led to the realisation that the control of language was a means to control society, which is where Gramsci, the Frankfurt School and Critical Theory came in. This reached its logical conclusion in Orwell's 1984 with Newspeak. The negative implications of this were an important contribution to the postwar enthusiasm for recovering marginalised languages (Gaelic, Welsh etc) in the 50s and 60s and the new-found vigour of dialect (notably in poetry) during the 60s and 70s. While these developments were often seen as progressive by the left because of their anticolonial and autonomist credentials, they were actually conservative.

The purpose of this trot through the history of political correctness is not just to point out that it has been employed for conservative and reactionary ends as much as progressive ones, but to note that control is central to its practice: it is people in positions of power that promote politically correct language and thinking. The rightwing critique of PC depends on the belief that Marxists and craven fellow-travellers are in control of major cultural institutions, such as the BBC, as well as local government and the bits of education still in state hands. In reality, the local government left (which was never extensive or entrenched) was systematically disempowered after 1979, education was homogenised through the national curriculum and changes to university funding, and the arts and media were infected by managerialism and neoliberal deference. If PC has grown over the last 35 years, it isn't down to the tireless work of Marxist academics or gay social workers.

But this doesn't mean that political correctness is simply a fabrication. It does exist and it exists for a reason. The perp was big business, which recognised as early as the 1960s that discriminating against ethnic minorities, women and gays was counter-productive, mainly because it limited the pool of talent for recruitment and alienated potential customers (Gary Becker's The Economics of Discrimination was the key validation of this change in thinking). This shift led to business jargon absorbing notions such as "diversity" and "equal opportunities" at the same time as it expanded under the dual impact of modern management theory and the spillover of the terminology of financial engineering.

The incursion of business jargon into the public sector after 1979 was mainly driven through privatisation and outsourcing, but even where these didn't occur, public sector managers were under pressure from central government (advised by consultancies) to adopt private sector practices and vocabulary. Left to its own devices, local government (as institutionally conservative as the police) would probably still be as cautious in embracing cultural sensitivity as it was in the 1970s (it is worth remembering that the "loony left" was largely a generational reaction to a fossilised and often intolerant Labourism). As the Jay report makes clear, Rotherham Council was not an enthusiastic champion of multiculturalism, rather it saw the Pakistani community as a problem to be avoided.

The focus on political correctness by the rightwing media is a psychological projection: attributing to their opponents (the nebulous left) their own desire for social engineering, i.e. the creation of the "universal moral code" and "national identity" beloved of The Daily Mail. The problem for the right is that capitalism works to undermine both of these because it is motivated primarily by profit. That is why we have Internet porn and offshoring. As Adam Smith put it, "it is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own interest".

The lesson of Rotherham is that too many people did not see it as being in their interest to enquire into the sexual exploitation of children in local authority care, or to pursue the ample evidence of an organised criminal network centred on minicab firms. The best way to prevent a repetition is not to sack individual local authority workers or members of the police years after the event, or to demand that the Pakistani community explains its "failure", but to make it in the interest of the relevant authorities to give a shit.

Tuesday, 26 August 2014

Empires and Dance

Doctor Who returned to our screens at the weekend with a Glaswegian accent, questions of identity and a variety of strategies for living together: secret marriage, non-boyfriend and parasitical robots to the fore. This made it a more fruitful commentary on Scottish independence than the televised debates between Alex Salmond and Alistair Darling. This is not a cheap crack at the politicians, but an acknowledgement that the focus on practicalities, such as the pound and the BBC, is largely irrelevant until after a yes vote (if it comes) puts negotiating chips on the table. To repurpose a popular metaphor, you don't discuss the theoretical division of the CD collection before deciding on divorce.

Scots themselves have been self-congratulatory at the high-minded and reasonable tone of the debate so far, emphasising the absence (for the most part) of anglophobia and Braveheartery, and the aspiration for Scotland to "punch above its weight" internationally. Of course, this decorum is ideological, promoting the myth of a country that is uniformly social democratic and tolerant (anti-Tory, not anti-English), while open to global capital (pro-EU and willing to cut corporation tax). This is progressive cant that ignores the conservative nature of Scottish politics (the SNP remain at heart Tartan Tories) and the residual bigotry beneath the veneer of post-Thatcher modernity.

The focus on the pragmatic over the emotional is not merely a tactic; it reveals a fundamental truth about the way that the Scots view the union, which in turn does much to explain English indifference. The Act of Union of 1707 was a deal brokered between English and Scottish elites long before the birth of ideas such as a "national interest". The English elites' motives were a mix of regime security (extinguishing the Stuart threat) and a desire to absorb an economic competitor. Compensation for the Scottish investors in the Darien scheme secured votes in the Scottish Parliament for union, but it also secured agreement that Scottish trade and finance would be subservient to the City of London. The Scottish elites' motives were not merely to avoid bankruptcy in the face of English competition, but to import the Protestant Whig gains of 1688 while retaining their own legal and ecclesiastical authority (together with the removal of aristocrats to London, this would help produce the Scottish Enlightenment).


The Scots did not integrate into a Greater England after 1707, but nor did they immediately become British. The idea of "Britain" was originally developed as a state identity (the supposed legacy of King Arthur) to justify English territorial expansion and ethnic cleansing in Ireland during the Elizabethan era. It would be reinvented during the creation of the "second empire" (after the loss of the American colonies in 1783) as a portmanteau that could accommodate multiple political and cultural identities, both real and imagined (e.g. Walter Scott). This neutralised the political threat of romantic nationalism and provided the means by which the ambitious Scottish landowning and commercial classes could access the single market of the empire on equal terms with the English, while retaining a nominal national identity that in turn smoothed over the reality of the cultural and religious divisions of Scotland.

As these dominant classes became more anglicised in the nineteenth century, the "North British" identity began to evaporate. British increasingly became a synonym for English while Scottish was increasingly reserved for backward (often sentimentalised) rural and proletarian habits, much as "Northern" was in industrial Britain. By the twentieth century, this also reflected the growing belief among English elites that Scotland (and the periphery more generally) was of dwindling value, as its traditional economic resources (coal, shipyards and manpower for empire) declined. The gradual concession of devolution after 1974 and contemporary insouciance about the end of union are corollaries of the globalised turn of the City of London since the 1960s (the current Lord Mayor, Fiona Woolf, is a Scot: "I am quite often extolling the virtues of the global experience that we have, because the UK is a small domestic market and is something of a trampoline for bouncing out all over the rest of the world").

In 2005, the Scottish academics Iain McLean and Alistair McMillan noted that: "Primordial Unionism (the belief that the union is good in and for itself) now survives only in Northern Ireland. Instrumental Unionism supported the Union as a means to other ends, such as the Empire and the Welfare State; but the first is gone and the second is now evolving differently in the four territories of the UK. Representation and finance are the unsolved, and arguably insoluble problems of the post-1997 devolution settlement".


British identity reached its apogee in the post-WW2 era due to three developments. First, the welfare state required a common identification - now eulogised as the "spirit of '45" - to underpin collective provision (significantly this British identity excluded Northern Ireland, where much welfare, such as council housing, remained sectarian). This meant that the NHS and the BBC became more British than the monarchy in the eyes of many progressives. Second, "British" increasingly became the self-identifier of immigrants and their descendants (thus London is the most British region of England), and was thereby hitched to the idea of multiculturalism from the 60s onwards (the NF and BNP's focus on British as a contested identity - "there ain't no black in the Union Jack" - ironically helped this). Third, it also correlated with socio-economic status, with the professional and executive classes more likely to both identify as British and be pro-EU from the 80s onwards (and to consequently consider English identity, outside sport, as reactionary).

The decline of British identity and its substitution by a "modern" national identity is presented in Scotland as progressive (and has been since Tom Nairn's 1977 The Break-Up of Britain), but its agenda is fundamentally conservative. As the historian Tom Devine sees it, "The Scottish parliament has demonstrated competent government and it represents a Scottish people who are wedded to a social democratic agenda and the kind of political values which sustained and were embedded in the welfare state of the late 1940s and 1950s. ... It is the Scots who have succeeded most in preserving the British idea of fairness and compassion in terms of state support and intervention. Ironically, it is England, since the 1980s, which has embarked on a separate journey." As Gerry Hassan noted in response, "The case that the welfare state of the 1940s and 1950s is the pinnacle of human ingenuity and the best we can do is profoundly pessimistic".

England's "separate journey" has seen the political right increasingly associate the word British with "failed" multiculturalism, the nanny state, metropolitan elites, and a belief that the Scots and other peripheral nations are getting more than their fair share of public expenditure. Though there is clearly a background hum of xenophobia, this indicates that English nationalism remains instrumental rather than primordial, i.e. concerned with the economic and social interests of particular classes (small capitalists more than workers) rather than a culturally homogeneous community. The ambivalence over the flag of St George is as much about "vulgar jingoism" as racism.


As postwar British identity had been insubstantial beyond the welfare state, public corporations and archaic relics like the monarchy, the "national revival" of Thatcher found itself resorting by default to the forms of empire, where British practice actually meant something. Thus local government was eroded by central diktat, public assets were transferred into private hands, and recalcitrant communities were ostracised. From "partner in empire" Scotland had become a subaltern. This neocolonial treatment, most famously in the form of the Poll Tax, convinced many Scots that England wanted out of the marriage as much as they did. The resulting migration of votes from the Conservative Party to the SNP was entirely pragmatic.

New Labour failed to reboot the welfare state as a common endeavour and instead approached it as a managerialist challenge (internal markets, targets, PFI etc). Lacking this original buttress, and with no interest in imperial remnants like the Commonwealth, it preferred to redefine the identity of Britain as a "young country" and a "global player", from The City to Iraq. From the concrete invention of tradition (tartan), we had moved to the performative aspiration of neoliberalism (Cool Britannia). The modest nature of New Labour's commitment to devolution, and the modest achievements of the resulting assemblies, is testament to the instrumental attitude that continues to prevail on both sides of the border. This means that the union is a dead letter, whatever the result in September. Indeed, the quicker the social infrastructure of Britain is fragmented through privatisation and austerity, the quicker the break-up of Britain (to use Tom Nairn's phrase) will occur. Devo-max is not an end point but just another turn in the dance. The music will only stop when Scottish MPs quit Westminster for good.

The driving force behind this is the political dominance of the City of London, which has always regarded the hinterland of Britain as it would come to regard empire, in terms of either its potential for economic exploitation or the threat it posed to the security of existing commerce (consider the foundation of Londonderry). Geography made London a major entrepot and trading nexus, but it was mercantilism that turned Elizabethan "Britain" into the British Empire. Through the Sterling area, empire bequeathed an outsized financial centre with minimal scruples and an allegiance to a narrow interest rather than a national community. This turned out to be the ideal profile to exploit the opportunities of modern finance, from unregulated trading through privatisation to tax avoidance. For a variety of political and personal reasons, David Cameron is not the Tory Prime Minister who wants to oversee the end of the union, but Boris Johnson may well be.

Wednesday, 20 August 2014

Crime and Project Management

Project management and politics are once more in the news. This is partly due to it being August and a slow news week, hence the appearance of thinly-disguised listicles and boilerplate about why government projects tend to fail. Plus ça change. Ahead of the 2010 general election, and on the back of another list of government IT failures, David Cameron "signalled a move away from big IT projects, suggesting he will use technology to increase the transparency of government". In light of the NSA and GCHQ revelations, as much as the Universal Credit debacle, this is ironic.

IT projects regularly fail in both the private and public sectors. The supposed higher rate of failure in the latter is attributed to the extra complication of regime change (at both government and ministerial level) and the conflicting interests of politicians and civil servants (the Yes Minister trope). In fact, there is no good evidence that there is a higher rate of failure in the public sector. What is known is that the private sector is better at obscuring failure and redefining success. Conversely, public sector projects are (with a few exceptions) publicly audited and their success or failure is often a matter of political judgement, so there is usually an opposition willing to challenge the government's verdict. The Obamacare rollout, which for a while looked like the epitome of big government project failure, has dropped out of the headlines in the US and lost its political charge due to its gradual success, as could have been predicted.


The main reasons why IT projects fail are also well known, because they are common to most projects and have nothing to do with technology. It's mostly about people, and a little bit about process: a lack of genuine commitment to change, poor leadership, not involving the right people, poor supplier management, a failure to meaningfully define success, a failure to know when to stop or change course. The mitigations and contingencies for these weaknesses are also well known, and widely ignored. The root problem is that many projects are faith-based. The earliest records of project management we have are religious works. For example, the Old Testament is one project after another: build an ark, find the promised land, build a temple.

Project management is an attempt to provide a rational framework for goals that are too often inspired and driven by emotion. It is no accident that words such as "mission" and "vision" feature prominently in the language. The problem that public sector projects face is not that government is incompetent, but that these projects tend to be more emotionally charged, both in terms of their impacts and ambition. Replacing payroll system A with payroll system B may lead to tears and tantrums in one company, but this is trivial in comparison with the change to a national benefits system that will supposedly transform shirkers into strivers.

Many government IT projects are initiated for gestural reasons. This is not just about ministers being seen to be doing something, but about articulating and concretising a political worldview, a practice as old as the pyramids. This truth is usually ignored by project management "experts". For example, suggesting that UC could be better implemented using an incremental approach, avoiding a risky big bang and allowing for gradual adaptation, makes perfect sense technically but ignores the political importance of the scheme, which goes well beyond "making work pay". The commonly-understood goal is to cut the welfare bill (officially through greater efficiency and reduced fraud, unofficially through increased harrying of claimants). The gestural purpose is to introduce the idea of a citizens' basic income. It is the all-or-nothing approach, and IDS's associated zealotry, that constitute the symbolic message to the electorate: everyone must work, the poor must be subsidised (due to the incontestable needs of the market), and complexity leads to fraud. A low-key, gradual improvement in process efficiency is politically irrelevant.


This emotivism is a double-edged sword. Just as politicians will happily lead a project to failure so long as it continues to reflect the desired policy stance, so public opinion (or at least the media) may deem a project a failure because of unreasonable expectations. The e-Borders system has "failed" because it is popularly assumed to be a tool for "controlling immigration", which is not something that can be effectively done at passport control (and has arguably been a non-issue for some time now). As a policing tool - i.e. intercepting criminal suspects - it appeared to work fine. The downgrade to the "Go home or face arrest" vans earlier this year was an equally pointless gesture aimed squarely at xenophobes attracted by UKIP and a demanding press, not at people who had overstayed their visas.

Government also has a role as an advocate of technology and (increasingly) the "power of information" to both business and the wider population, hence the e-government frenzy that started in the late 90s. This partly explains its reluctance to consider non or low-tech solutions for public services (e.g. the insistence that job seekers must do online job searches, despite the well-known problem of fake jobs on job boards) and also partly explains the attractiveness of government as a client for large solution providers. But the higher risk of perceived failure (or at least the higher profile given to failures in the media as evidence of ministerial incompetence, big government ineptitude and supplier abuse) makes this a costly strategy.

Clearly then, the subjective rewards (or "soft benefits" in project management-speak) must be of considerable value to politicians. I think there are three worth noting. First is decisiveness. Launching a project (or any sort of initiative) is the end-game for many politicians who fear their ministerial tenure may be short. Unless you can arrive late and take credit for a successful implementation (e.g. Boris Johnson and the London Bike Scheme), few want or expect to be around at project end (IDS may have had a stay of execution till 2015, but he is likely to quit the scene long before UC exits the pilot stage, if it ever does).

Another attraction is that government IT projects encapsulate the neoliberal idea of managerialism: the belief that for every problem there is a solution that is amenable to generic management skills. The project management process itself encapsulates the tropes of measurement and monitoring, with reality easily giving way to the figment of targets and progress (the "reset" of Universal Credit and IDS's insistence that it is "on target" are examples of the lunacy this gives rise to). Similarly, e-Borders might appear a colossal waste of money, but it serves to further normalise the idea that society should be constantly surveilled and that we should be able to accurately quantify classes of citizen or resident. Even the growing role of Parliamentary committees acting as project auditors, rather than just spending watchdogs, reinforces the neoliberal trope of constant inspection.


The third benefit is the normalisation of failure, which is central to capitalism. One of the mantras of Silicon Valley, which was adopted from agile software development, is the Beckettian "fail fast, fail better". In reality, this is often little more than cant in commercial organisations, where the empirical method remains alien, while most media evangelists turn out to be self-serving "serial entrepreneurs". Ironically, the history of "the entrepreneurial state" encourages the belief that failure is intrinsic to government projects, albeit in an ultimately beneficial way. You can hear this in the words of Tony Hall, the Director General of the BBC (lumped with government when it comes to project management critiques) when he talks of the corporation's role: "We must be the risk capital for the UK. We have got to be the people who have enough confidence to be able to say that we are going to back things that may not work".

Government is unlikely to wean itself off gestural politics so long as it depends on elections, so a certain amount of project failure is an unavoidable cost of democracy. The state's role as the risk-taker and experimenter of last resort also makes it unlikely that it will ever become (or be seen to be) a better project manager than a private sector with selective memory. Finally, managerialism is bigger than neoliberalism and central to all flavours of government. Even the "night-watchman" state of right-libertarian fantasy (which is by definition a police state) will waste money on projects equipping the military and building courthouses.

This intractable reality leads the critics of government project mismanagement down the blind alley of personal responsibility: "we should guarantee that the ministers, senior civil servants and corporate CEOs involved will all be publicly sacked if the project fails". This is close to Stalin's approach to project management (JFDI) and project failure (the gulags), but as we have seen with the banks, pinning blame in complex organisations for activities that spanned many years is difficult, and project failures rarely produce a smoking gun like an email requesting that LIBOR be rigged. The dirty truth is that big IT projects, in both the public and private sectors, carry an implicit indemnity for the participants, much as banking does for executives who stop short of actual criminality. If this weren't the case, no one of any real talent would take the job on. Despite the ample scope for fraud and abuse, big IT projects remain largely crime-free zones.

Monday, 4 August 2014

Lights Out for the Territory

The suggestion that we should all turn the lights out between 10 and 11 this evening, as a commemoration of the centenary of the start of World War One, is annoyingly sentimental and likely to be largely ignored. A cynic might suggest that it is merely an encouragement to gather round the glow of the TV, as the BBC broadcasts a candlelit service from Westminster Abbey, or to fire up Jeremy Deller's specially-commissioned app. I'm sure the latter will be worth watching (social commemoration is Deller's forté), but the former looks way too earnest (over two hours of Huw Edwards' downbeat tones as po-faced clerics snuff out candles) and likely to prove poor competition against Death in Paradise and Tulisa: The Price of Fame.


The lights out theme (which also made me think of Iain Sinclair and Mark Twain) is meant to evoke Edward Grey's famous words, spoken on the eve of war, that "The lamps are going out all over Europe, we shall not see them lit again in our life-time". This is one of those portentous sayings that has acquired resonance due to subsequent history (not least that he died in 1933). Had the war really all been over by Christmas, as many politicians thought, we'd probably never have heard of the phrase (Grey only published it, and then on the basis of a friend's memory, in 1925). It also hints at the coming of aerial warfare and the blackouts of WW2, not to mention the destructive impact on Europe of what would eventually amount to over three decades of conflict. Prescient stuff.

The centenary has had the effect of obscuring and confusing current conflicts, as we look for echoes of the past and worry about an incipient global crisis. The conflicts in the Middle East (Syria, Iraq and Palestine) are all refracted through vague memories of the collapse of the Ottoman Empire (in the commentariat market the Sykes-Picot Agreement is up, the Balfour Declaration down), while Ukraine is increasingly cast as a murderous Balkan affair that risks dragging in the Great Powers. Even victory in the World Cup is seen as evidence of the final rehabilitation of Germany (for Germans that meant 1954, for Brits it appears we've only just got used to the idea).

Grey's words, dripping elite nostalgia (you can picture the scene, as the Foreign Secretary in his frockcoat gazes from his office out over St James's Park), have naturally appealed to American conservatives fearful that the US is losing its dominant role in the face of rising powers abroad and perceived weakness at home. This is a continuation of the pro-empire polemic undertaken by neocons since 2001, where the assumed errors of British policy (insufficiently interventionist before 1914 and too entangled in Europe afterwards) and the consequent cycle of decline are held up as warnings to the current global hegemon. In essence, the US should whack its assumed enemies wherever and whenever they appear, regardless of territorial integrity and collateral damage, and should treat multilateralism with disdain ("We're an empire now, and when we act, we create our own reality"). The plot of pretty much every action movie since Vietnam.

Despite the caution of the Obama administration, which is little more than an oscillation back from the over-reach of Bush Jr, US foreign policy has not turned to peace and love, and there is no prospect of it doing so any time soon. Hillary Clinton's defence of Israel over Gaza is not about appeasing the "Jewish lobby" ahead of a Presidential campaign, but a simple articulation of State Department policy in which Israel functions as a proxy for the US. If you harm us, we will whack you. Disproportionately. The non-intervention in Syria is not appeasement, it is acceptance that regional interests are best served by a grinding conflict that largely takes that country out of the game. Similarly, the "loss" of Crimea is of little consequence compared to the advance of NATO to Russia's borders.

Such pragmatic calculation is the way of the world, but as Iraq and Afghanistan have shown, it is easy for delusion and prejudice to cause those calculations to backfire. The two fundamental errors the British made in 1914 were thinking that they could restrict their military contribution to naval domination and a small expeditionary force, while France and Russia provided the land armies (and the bulk of casualties), and that the cost of financing the war (including loans to their allies) would be manageable because it would be short. In effect, a repeat of the episodic conflicts of the Napoleonic era, with a similar byproduct of additional imperial possessions picked up on the cheap (I suspect the bicentenary of the start of the Congress of Vienna will not be marked).

What US foreign policy shares with that of its British analogue of one hundred years ago is the strongly defined division between here and there, between home and abroad. Not just in the quotidian sense that foreign is a different country, but in the belief that normal laws and norms of behaviour do not apply "there", nor do they apply to "them" when they are here. The US-UK cooperation over torture and intelligence-gathering since the millennium is a case in point, as is the tolerance of Israel "mowing the lawn" in Gaza. This is the wider truth: despite the best attempts of apologists like Niall Ferguson, empire corrupts both ruler and ruled, and the lesson from Britain today is that the stink hangs around for a very long time. The "unsivilized" territory beyond the horizon that entices Huck Finn is also a site of genocide, but that crime and many others can be traced back to offices overlooking Foggy Bottom and St James's Park. What we need is more light, not less.

Wednesday, 30 July 2014

The Opinion of Others

All media are driven by the opinion of others. Factual news is expensive to acquire, unreliable in its supply, and tends towards the dull but worthy. Opinion, on the other hand, is inexhaustible, cheap and reliably contentious. In a variation on the City speculators' mantra, there is always a "greater fool" who will respond to Richard Littlejohn or George Monbiot (I plead guilty on numerous counts, m'lud). We turn gratefully from pictures and testimony of suffering in Gaza to verbal bunfights between Israeli spokesmen and TV news anchors, where the likelihood of a mind being changed is precisely zero (and, surprisingly, not likely to be helped by Mia Farrow and other slebs).

The first newspapers, which were gazettes of court announcements, were of limited interest compared to illustrated chapbooks detailing Catholic atrocities or tracts promising the imminence of God's Kingdom. Though literate snark soon came to the fore (The Spectator was launched in 1711), most newspapers continued to rely on adverts as much as editorial to pique interest until nineteenth century mass-literacy led to the "human interest story" and a realisation of the power of mobilised opinion to sway elected legislatures. The long twentieth century, from the launch of Tit-Bits in 1881 to the Web 2.0 media-moment in 2004, was distinguished by gossip, the privileged opinions of the high and mighty, and messages from "our sponsors".

The Internet has supposedly democratised opinion, providing a variety of platforms for "everyone" to have their say, but this just means a vast increase in content (gossip, opinion, ads) and thus an even greater value accorded to aggregators and filters, which is what Tit-Bits was and arguably Addison and Steele's imaginary spectator was too. Plus ça change. Though some services claim that you, the consumer, are now able to act as your own aggregator (choosing who to follow, specifying your interests etc), the trope of "content overload" obviously serves the purposes of those who would "pre-curate" your content stream based on algorithms that analyse your history and relationships. Of course, such algorithmic precision is a myth, which even some services are happy to admit.


The churn in technologies and corporate providers obscures the persistence of opinion as the major driver of media. Thus incumbent providers, like the press, lift up their skirts and screech at the thought of Facebook manipulating a user's stream, while music nostalgists bemoan the death of the album (curation by Big Music) under the onslaught of streaming service playlists (dominated by sleb curation, which is the new face of Big Music). The success of Twitter, which is a pretty ropey piece of technology, is down to the simple fact that it provides raw, instant opinion, the slebbier the better (accept it: Rihanna is a better football pundit than you are).

Search engine results can be thought of as a type of playlist, dynamically curated by an algorithm that aggregates and orders relevant pages based on the "opinion" of other pages. The plea by Google that they should not be held responsible for the opinion of others (which has predictably found favour with the House of Lords) depends on a belief that the algorithm is an accurate reflection of that aggregated opinion (impossible to know) and not subject to any manipulation or systemic bias (clearly untrue). The myth of the Internet is that everyone has an opinion (because property is universal) and that the expression of that opinion should be free and unrestricted (because the state should not limit your rights in your property).

The key change that occurred a decade ago was that we moved from a culture in which being a passive consumer of the opinion of privileged others was deemed sufficient to one in which we must now all express an opinion as well, whether on Gaza or a friend's new cardigan. But that opinion does not need to be a reasoned analysis (we don't seriously seek to dethrone the aristocracy of pundits, merely to vote them up or down by the weight of our comments). A simple preference will do, either through a binary "like" or a top 10 ordering (the modern "listicle" is simply a way to teach us correct practice). Creating lists, and curating them on an ongoing basis, has become a social obligation: performative opinionising as a way of defining who we are. By their public playlists ye shall know them.