Tuesday, 26 August 2014

Empires and Dance

Doctor Who returned to our screens at the weekend with a Glaswegian accent, questions of identity and a variety of strategies for living together: secret marriage, non-boyfriend and parasitical robots to the fore. This made it a more fruitful commentary on Scottish independence than the televised debates between Alex Salmond and Alistair Darling. This is not a cheap crack at the politicians, but an acknowledgement that the focus on practicalities, such as the pound and the BBC, is largely irrelevant until after (and if) a yes vote puts negotiating chips on the table. To repurpose a popular metaphor, you don't discuss the theoretical division of the CD collection before deciding on divorce.

Scots themselves have been self-congratulatory at the high-minded and reasonable tone of the debate so far, emphasising the absence (for the most part) of anglophobia and Braveheartery, and the aspiration for Scotland to "punch above its weight" internationally. Of course, this decorum is ideological, promoting the myth of a country that is uniformly social democratic and tolerant (anti-Tory, not anti-English), while open to global capital (pro-EU and willing to cut corporation tax). This is progressive cant that ignores the conservative nature of Scottish politics (the SNP remain at heart Tartan Tories) and the residual bigotry beneath the veneer of post-Thatcher modernity.

The focus on the pragmatic over the emotional is not merely a tactic; it reveals a fundamental truth about the way that the Scots view the union, which in turn does much to explain English indifference. The Act of Union of 1707 was a deal brokered between English and Scottish elites long before the birth of ideas such as a "national interest". The English elites' motives were a mix of regime security (extinguishing the Stuart threat) and a desire to absorb an economic competitor. Compensation for the Scottish investors in the Darien scheme secured votes in the Scottish Parliament for union, but it also secured agreement that Scottish trade and finance would be subservient to the City of London. The Scottish elites' motives were not merely to avoid bankruptcy in the face of English competition, but to import the Protestant Whig gains of 1688 while retaining their own legal and ecclesiastical authority (together with the removal of aristocrats to London, this would help produce the Scottish Enlightenment).

The Scots did not integrate into a Greater England after 1707, but nor did they immediately become British. The idea of "Britain" was originally developed as a state identity (the supposed legacy of King Arthur) to justify English territorial expansion and ethnic cleansing in Ireland during the Elizabethan era. It would be reinvented during the creation of the "second empire" (after the loss of the American colonies in 1783) as a portmanteau that could accommodate multiple political and cultural identities, both real and imagined (e.g. Walter Scott). This neutralised the political threat of romantic nationalism and provided the means by which the ambitious Scottish landowning and commercial classes could access the single market of the empire on equal terms with the English, while retaining a nominal national identity that in turn smoothed over the reality of the cultural and religious divisions of Scotland.

As these dominant classes became more anglicised in the nineteenth century, the "North British" identity began to evaporate. British increasingly became a synonym for English while Scottish was increasingly reserved for backward (often sentimentalised) rural and proletarian habits, much as "Northern" was in industrial Britain. By the twentieth century, this also reflected the growing belief among English elites that Scotland (and the periphery more generally) was of dwindling value, as its traditional economic resources (coal, shipyards and manpower for empire) declined. The gradual concession of devolution after 1974 and contemporary insouciance about the end of union are corollaries of the globalised turn of the City of London since the 1960s (the current Lord Mayor, Fiona Woolf, is a Scot: "I am quite often extolling the virtues of the global experience that we have, because the UK is a small domestic market and is something of a trampoline for bouncing out all over the rest of the world").

In 2005, the Scottish academics Iain McLean and Alistair McMillan noted that: "Primordial Unionism (the belief that the union is good in and for itself) now survives only in Northern Ireland. Instrumental Unionism supported the Union as a means to other ends, such as the Empire and the Welfare State; but the first is gone and the second is now evolving differently in the four territories of the UK. Representation and finance are the unsolved, and arguably insoluble problems of the post-1997 devolution settlement".

British identity reached its apogee in the post-WW2 era due to three developments. First, the welfare state required a common identification - now eulogised as the "spirit of '45" - to underpin collective provision (significantly this British identity excluded Northern Ireland, where much welfare, such as council housing, remained sectarian). This meant that the NHS and the BBC became more British than the monarchy in the eyes of many progressives. Second, "British" increasingly became the self-identifier of immigrants and their descendants (thus London is the most British region of England), and was thereby hitched to the idea of multiculturalism from the 60s onwards (the NF and BNP's focus on British as a contested identity - "there ain't no black in the Union Jack" - ironically helped this). Third, it also correlated with socio-economic status, with the professional and executive classes more likely to both identify as British and be pro-EU from the 80s onwards (and to consequently consider English identity, outside sport, as reactionary).

The decline of British identity and its substitution by a "modern" national identity is presented in Scotland as progressive (and has been since Tom Nairn's 1977 The Break-Up of Britain), but its agenda is fundamentally conservative. As the historian Tom Devine sees it, "The Scottish parliament has demonstrated competent government and it represents a Scottish people who are wedded to a social democratic agenda and the kind of political values which sustained and were embedded in the welfare state of the late 1940s and 1950s. ... It is the Scots who have succeeded most in preserving the British idea of fairness and compassion in terms of state support and intervention. Ironically, it is England, since the 1980s, which has embarked on a separate journey." As Gerry Hassan noted in response, "The case that the welfare state of the 1940s and 1950s is the pinnacle of human ingenuity and the best we can do is profoundly pessimistic".

England's "separate journey" has seen the political right increasingly associate the word British with "failed" multiculturalism, the nanny state, metropolitan elites, and a belief that the Scots and other peripheral nations are getting more than their fair share of public expenditure. Though there is clearly a background hum of xenophobia, this indicates that English nationalism remains instrumental rather than primordial, i.e. concerned with the economic and social interests of particular classes (small capitalists more than workers) rather than a culturally homogeneous community. The ambivalence over the flag of St George is as much about "vulgar jingoism" as racism.

As postwar British identity had been insubstantial beyond the welfare state, public corporations and archaic relics like the monarchy, the "national revival" of Thatcher found itself resorting by default to the forms of empire, where British practice actually meant something. Thus local government was eroded by central diktat, public assets were transferred into private hands, and recalcitrant communities were ostracised. From "partner in empire", Scotland had become a subaltern. This neocolonial treatment, most famously in the form of the Poll Tax, convinced many Scots that England wanted out of the marriage as much as they did. The resulting migration of votes from the Conservative Party to the SNP was entirely pragmatic.

New Labour failed to reboot the welfare state as a common endeavour and instead approached it as a managerialist challenge (internal markets, targets, PFI etc). Lacking this original buttress, and with no interest in imperial remnants like the Commonwealth, it preferred to redefine the identity of Britain as a "young country" and a "global player", from The City to Iraq. From the concrete invention of tradition (tartan), we had moved to the performative aspiration of neoliberalism (Cool Britannia). The modest nature of New Labour's commitment to devolution, and the modest achievements of the resulting assemblies, are testament to the instrumental attitude that continues to prevail on both sides of the border. This means that the union is a dead letter, whatever the result in September. Indeed, the quicker the social infrastructure of Britain is fragmented through privatisation and austerity, the quicker the break-up of Britain (to use Tom Nairn's phrase) will occur. Devo-max is not an end point but just another turn in the dance. The music will only stop when Scottish MPs quit Westminster for good.

The driving force behind this is the political dominance of the City of London, which has always regarded the hinterland of Britain as it would come to regard empire, in terms of either its potential for economic exploitation or the threat it posed to the security of existing commerce (consider the foundation of Londonderry). Geography made London a major entrepot and trading nexus, but it was mercantilism that turned Elizabethan "Britain" into the British Empire. Through the Sterling area, empire bequeathed an outsized financial centre with minimal scruples and an allegiance to a narrow interest rather than a national community. This turned out to be the ideal profile to exploit the opportunities of modern finance, from unregulated trading through privatisation to tax avoidance. For a variety of political and personal reasons, David Cameron is not the Tory Prime Minister who wants to oversee the end of the union, but Boris Johnson may well be.

Wednesday, 20 August 2014

Crime and Project Management

Project management and politics are once more in the news. This is partly due to it being August and a slow news week, hence the appearance of thinly-disguised listicles and boilerplate about why government projects tend to fail. Plus ça change. Ahead of the 2010 general election, and on the back of another list of government IT failures, David Cameron "signalled a move away from big IT projects, suggesting he will use technology to increase the transparency of government". In light of the NSA and GCHQ revelations, as much as the Universal Credit debacle, this is ironic.

IT projects regularly fail in both the private and public sectors. The supposed higher rate of failure in the latter is attributed to the extra complication of regime change (at both government and ministerial level) and the conflicting interests of politicians and civil servants (the Yes Minister trope). In fact, there is no good evidence that there is a higher rate of failure in the public sector. What is known is that the private sector is better at obscuring failure and redefining success. Conversely, public sector projects are (with a few exceptions) publicly audited and their success or failure is often a matter of political judgement, so there is usually an opposition willing to challenge the government's verdict. The Obamacare rollout, which for a while looked like the epitome of big government project failure, has dropped out of the headlines in the US and lost its political charge due to its gradual success, as could have been predicted.

The main reasons why IT projects fail are also well known, because they are common to most projects and have nothing to do with technology. It's mostly about people, and a little bit about process: a lack of genuine commitment to change, poor leadership, not involving the right people, poor supplier management, a failure to meaningfully define success, a failure to know when to stop or change course. The mitigations and contingencies for these weaknesses are also well known, and widely ignored. The root problem is that many projects are faith-based. The earliest records of project management we have are religious works. For example, the Old Testament is one project after another: build an ark, find the promised land, build a temple.

Project management is an attempt to provide a rational framework for goals that are too often inspired and driven by emotion. It is no accident that words such as "mission" and "vision" feature prominently in the language. The problem that public sector projects face is not that government is incompetent, but that these projects tend to be more emotionally charged, both in terms of their impacts and ambition. Replacing payroll system A with payroll system B may lead to tears and tantrums in one company, but this is trivial in comparison with the change to a national benefits system that will supposedly transform shirkers into strivers.

Many government IT projects are initiated for gestural reasons. This is not just about ministers being seen to be doing something, but about articulating and concretising a political worldview, a practice as old as the pyramids. This truth is usually ignored by project management "experts". For example, suggesting that UC could be better implemented using an incremental approach, avoiding a risky big bang and allowing for gradual adaptation, makes perfect sense technically but ignores the political importance of the scheme, which goes well beyond "making work pay". The commonly-understood goal is to cut the welfare bill (officially through greater efficiency and reduced fraud, unofficially through increased harrying of claimants). The gestural purpose is to introduce the idea of a citizens' basic income. It is the all-or-nothing approach, and IDS's associated zealotry, that constitute the symbolic message to the electorate: everyone must work, the poor must be subsidised (due to the incontestable needs of the market), and complexity leads to fraud. A low-key, gradual improvement in process efficiency is politically irrelevant.

This emotivism is a double-edged sword. Just as politicians will happily lead a project to failure so long as it continues to reflect the desired policy stance, so public opinion (or at least the media) may deem a project a failure because of unreasonable expectations. The e-Borders system has "failed" because it is popularly assumed to be a tool for "controlling immigration", which is not something that can be effectively done at passport control (and has arguably been a non-issue for some time now). As a policing tool - i.e. intercepting criminal suspects - it appeared to work fine. The downgrade to the "Go home or face arrest" vans earlier this year was an equally pointless gesture aimed squarely at xenophobes attracted by UKIP and a demanding press, not at people who had overstayed their visas.

Government also has a role as an advocate of technology and (increasingly) the "power of information" to both business and the wider population, hence the e-government frenzy that started in the late 90s. This partly explains its reluctance to consider non- or low-tech solutions for public services (e.g. the insistence that job seekers must do online job searches, despite the well-known problem of fake jobs on job boards) and also partly explains the attractiveness of government as a client for large solution providers. But the higher risk of perceived failure (or at least the higher profile given to failures in the media as evidence of ministerial incompetence, big government ineptitude and supplier abuse) makes this a costly strategy.

Clearly then, the subjective rewards (or "soft benefits" in project management-speak) must be of considerable value to politicians. I think there are three worth noting. First is decisiveness. Launching a project (or any sort of initiative) is the end-game for many politicians who fear their ministerial tenure may be short. Unless you can arrive late and take credit for a successful implementation (e.g. Boris Johnson and the London Bike Scheme), few want or expect to be around at project end (IDS may have had a stay of execution till 2015, but he is likely to quit the scene long before UC exits the pilot stage, if it ever does).

Another attraction is that government IT projects encapsulate the neoliberal idea of managerialism: the belief that for every problem there is a solution that is amenable to generic management skills. The project management process itself encapsulates the tropes of measurement and monitoring, with reality easily giving way to the figment of targets and progress (the "reset" of Universal Credit and IDS's insistence that it is "on target" are examples of the lunacy this gives rise to). Similarly, e-Borders might appear a colossal waste of money, but it serves to further normalise the idea that society should be constantly surveilled and that we should be able to accurately quantify classes of citizen or resident. Even the growing role of Parliamentary committees acting as project auditors, rather than just spending watchdogs, reinforces the neoliberal trope of constant inspection.

The third benefit is the normalisation of failure, which is central to capitalism. One of the mantras of Silicon Valley, which was adopted from agile software development, is the Beckettian "fail fast, fail better". In reality, this is often little more than cant in commercial organisations, where the empirical method remains alien, while most media evangelists turn out to be self-serving "serial entrepreneurs". Ironically, the history of "the entrepreneurial state" encourages the belief that failure is intrinsic to government projects, albeit in an ultimately beneficial way. You can hear this in the words of Tony Hall, the Director General of the BBC (lumped in with government when it comes to project management critiques) when he talks of the corporation's role: "We must be the risk capital for the UK. We have got to be the people who have enough confidence to be able to say that we are going to back things that may not work".

Government is unlikely to wean itself off gestural politics so long as it depends on elections, so a certain amount of project failure is an unavoidable cost of democracy. The state's role as the risk-taker and experimenter of last resort also makes it unlikely that it will ever become (or be seen to be) a better project manager than a private sector with selective memory. Finally, managerialism is bigger than neoliberalism and central to all flavours of government. Even the "night-watchman" state of right-libertarian fantasy (which is by definition a police state) will waste money on projects equipping the military and building courthouses.

This intractable reality leads the critics of government project mismanagement down the blind alley of personal responsibility: "we should guarantee that the ministers, senior civil servants and corporate CEOs involved will all be publicly sacked if the project fails". This is close to Stalin's approach to project management (JFDI) and project failure (the gulags), but as we have seen with the banks, pinning blame in complex organisations for activities that spanned many years is difficult, and project failures rarely produce a smoking gun like an email requesting that LIBOR be rigged. The dirty truth is that big IT projects, in both the public and private sectors, carry an implicit indemnity for the participants, much as banking does for executives who stop short of actual criminality. If this weren't the case, no one of any real talent would take the job on. Despite the ample scope for fraud and abuse, big IT projects remain largely crime-free zones.

Monday, 4 August 2014

Lights Out for the Territory

The suggestion that we should all turn the lights out between 10 and 11 this evening, as a commemoration of the centenary of the start of World War One, is annoyingly sentimental and likely to be largely ignored. A cynic might suggest that it is merely an encouragement to gather round the glow of the TV, as the BBC broadcasts a candlelit service from Westminster Abbey, or to fire up Jeremy Deller's specially-commissioned app. I'm sure the latter will be worth watching (social commemoration is Deller's forté), but the former looks way too earnest (over two hours of Huw Edwards' downbeat tones as po-faced clerics snuff out candles) and likely to prove poor competition against Death in Paradise and Tulisa: The Price of Fame.

The lights out theme (which also made me think of Iain Sinclair and Mark Twain) is meant to evoke Edward Grey's famous words, spoken on the eve of war, that "The lamps are going out all over Europe, we shall not see them lit again in our life-time". This is one of those portentous sayings that has acquired resonance due to subsequent history (not least that he died in 1933). Had the war really all been over by Christmas, as many politicians thought, we'd probably never have heard of the phrase (Grey only published it, and then on the basis of a friend's memory, in 1925). It also hints at the coming of aerial warfare and the blackouts of WW2, not to mention the destructive impact on Europe of what would eventually amount to over three decades of conflict. Prescient stuff.

The centenary has had the effect of obscuring and confusing current conflicts, as we look for echoes of the past and worry about an incipient global crisis. The conflicts in the Middle East (Syria, Iraq and Palestine) are all refracted through vague memories of the collapse of the Ottoman Empire (in the commentariat market the Sykes-Picot Agreement is up, the Balfour Declaration down), while Ukraine is increasingly cast as a murderous Balkan affair that risks dragging in the Great Powers. Even victory in the World Cup is seen as evidence of the final rehabilitation of Germany (for Germans that meant 1954, for Brits it appears we've only just got used to the idea).

Grey's words, dripping elite nostalgia (you can picture the scene, as the Foreign Secretary in his frockcoat gazes from his office out over St James' Park), have naturally appealed to American conservatives fearful that the US is losing its dominant role in the face of rising powers abroad and perceived weakness at home. This is a continuation of the pro-empire polemic undertaken by neocons since 2001, where the assumed errors of British policy (insufficiently interventionist before 1914 and too entangled in Europe afterwards) and the consequent cycle of decline are held up as warnings to the current global hegemon. In essence, the US should whack its assumed enemies wherever and whenever they appear, regardless of territorial integrity and collateral damage, and should treat multilateralism with disdain ("We're an empire now, and when we act, we create our own reality"). The plot of pretty much every action movie since Vietnam.

Despite the caution of the Obama administration, which is little more than an oscillation back from the over-reach of Bush Jr, US foreign policy has not turned to peace and love, and there is no prospect of it doing so any time soon. Hillary Clinton's defence of Israel over Gaza is not about appeasing the "Jewish lobby" ahead of a Presidential campaign, but a simple articulation of State Department policy in which Israel functions as a proxy for the US. If you harm us, we will whack you. Disproportionately. The non-intervention in Syria is not appeasement, it is acceptance that regional interests are best served by a grinding conflict that largely takes that country out of the game. Similarly, the "loss" of Crimea is of little consequence compared to the advance of NATO to Russia's borders.

Such pragmatic calculation is the way of the world, but as Iraq and Afghanistan have shown, it is easy for delusion and prejudice to cause those calculations to backfire. The two fundamental errors the British made in 1914 were thinking that they could restrict their military contribution to naval domination and a small expeditionary force, while France and Russia provided the land armies (and the bulk of casualties), and that the cost of financing the war (including loans to their allies) would be manageable because it would be short. In effect, a repeat of the episodic conflicts of the Napoleonic era, with a similar byproduct of additional imperial possessions picked up on the cheap (I suspect the bicentenary of the start of the Congress of Vienna will not be marked).

What US foreign policy shares with that of its British analogue of one hundred years ago is the strongly defined division between here and there, between home and abroad. Not just in the quotidian sense that foreign is a different country, but in the belief that normal laws and norms of behaviour do not apply "there", nor do they apply to "them" when they are here. US-UK cooperation over torture and intelligence-gathering since the millennium is a case in point, as is the tolerance of Israel "mowing the lawn" in Gaza. This is the wider truth: despite the best attempts of apologists like Niall Ferguson, empire corrupts both ruler and ruled, and the lesson from Britain today is that the stink hangs around for a very long time. The "unsivilized" territory beyond the horizon that entices Huck Finn is also a site of genocide, but that crime and many others can be traced back to offices overlooking Foggy Bottom and St James' Park. What we need is more light, not less.