The news that crisis loans for the poor may soon be provided through payment cards (aka food stamps) has been almost exclusively discussed in ethical terms. The right praises the scheme as a way of stopping the feckless from spending money on booze and fags, and would like it extended to all benefits, while the left points out the stupidity of trying to address a "dependency culture" (real or imagined) by making people less independent. As ever, when morality rears its head, you can be sure the real issue is being obscured.
In the Guardian, Zoe Williams talks about "expelling people from the sphere of money". By denying them the simple pleasure of spending, "the fillip of power in the process", you are making them a second-class citizen. "When you relegate people to a world outside money, you create a true underclass: a group of people whose privacy and autonomy are worth less than everyone else's, who are stateless in a world made of shops". Well, only if you consider shopping to be a human right, and Harrods to be some sort of sovereign state (old Mo' probably agrees).
This moralistic interpretation leads to the conclusion that the changes to the social fund are the result of an ideological government with a punitive attitude towards the poor. Yah boo. While I have no doubt the government (and their neoliberal confreres in Labour) are relaxed about the change, this ignores the fact that individual councils have been given latitude to decide how they will administer the new local funds. The introduction of vouchers is not the result of central government diktat, hence the decision of some councils to issue food parcels, fund food banks, or facilitate credit union loans instead.
The decision of others (probably many) to use prepaid cards should be seen as a default response: unthinking rather than malicious. When faced with a new responsibility, councils now instinctively look to outsource delivery. As a proven relationship is valuable, this gives an advantage to incumbent suppliers. It just so happens that Sodexo, an on-site outsourcing firm with numerous local government contracts, currently manages the Azure prepaid card for asylum seekers on behalf of the UK Border Agency (as was).
Being a large corporate, Sodexo has an interest in dealing primarily with other large suppliers - i.e. the merchants who will accept the cards. This strategy helps ensure wide service coverage and lower transaction costs through economies of scale. Unsurprisingly, the Azure card is currently restricted to large national supermarket chains like Sainsbury's and Morrisons. You can't buy bananas with it at your local market or convenience store, let alone a scratchcard or a can of White Lightning. These large chains also have sophisticated till systems that can automatically reject invalid purchases, regardless of the personal sympathies of the checkout cashier.
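The mechanics of such a restriction are mundane: the till checks each scanned item's category against a whitelist attached to the card scheme. A minimal sketch in Python (the categories and basket here are invented for illustration; this is not Sodexo's actual system):

```python
# Illustrative till-side eligibility check for a restricted prepaid card.
# Category names are hypothetical; a real till would key off standardised
# product codes in its stock database.

PERMITTED_CATEGORIES = {"food", "toiletries", "baby-goods"}

def authorise_item(product_category: str) -> bool:
    """Return True if the scanned item may be paid for with the card."""
    return product_category in PERMITTED_CATEGORIES

basket = [("bananas", "food"), ("scratchcard", "gambling"), ("cider", "alcohol")]
for name, category in basket:
    verdict = "accepted" if authorise_item(category) else "rejected"
    print(f"{name}: {verdict}")
```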
The inexorable logic of capitalism turns the crisis loan into a commodity in its own right. Some commentators have drawn a parallel with nineteenth-century attitudes to pauperism, but the potentially more significant parallel is with that era's truck system, whereby part of a worker's wage was paid in tokens that could only be redeemed in a company store. In contrast to industrial capitalists seeking vertical monopolies over their workers, this modern variant is an alliance between government, the supermarket cartel and privileged intermediaries. It's easy to see how it could be further extended, not just to unemployment and disability benefits, but conceivably even to in-work benefits such as tax credits.
To make this work, it would be necessary to destigmatise food vouchers. This isn't necessarily that big a challenge. After all, Luncheon Vouchers weren't considered demeaning as an employment perk. If the government simply added the tax credit to a loyalty card in the form of points, it would be home and dry.
Wednesday, 27 March 2013
A Climate of Fear
Before the NHS can be opened up to the market, popular support must be undermined. This is problematic as the defence of established ways naturally appeals to the conservative temperament. To get conservatives on board, the ideological focus initially shifts towards external threats. In other words, the NHS is being battered by forces beyond its control, so we must change it to save it. These external threats include genuine secular trends, such as advances in medical science and demographic change, but with a relentlessly negative spin (too costly, too old). Other threats include the staple of all things "foreign", from incompetent locum doctors to "health tourists". Even government reform is shamelessly held up as a threat, as politicians decry the destabilising effect of yet another restructure before proposing their own bureaucratic cure.
As pessimism becomes entrenched, the narrative gradually shifts towards the claim that the NHS is intrinsically incompetent. A major trope is the idea that hospitals are a danger to our health. Popular anxiety has been fuelled both by legitimate concerns, such as infections and neglect, and by the irresponsible indulgence of quackery, which has an interest in promoting the risks of conventional treatments under the cover of "choice". When the language shifts up another gear, to imply that the NHS is comprehensively rotten (a culture based on the "normalisation of cruelty"), then you know we are fast approaching the end-game. This has been reinforced this week by Jeremy Hunt explicitly claiming that hospitals are "failing" and that they have "betrayed" patients. The rhetorical equivalence of the hospital and the charnel house can't be far off.
The Health Secretary's proposals in response to the Francis report include the usual managerialist mechanisms of ratings and sanctions. These will institutionalise failure as a feature of the health service, much as the same mechanisms have already done in education. Just as "failing" schools have been handed over to the private sector, so failing hospitals will be handed over to private providers. Health care staff are to be inspected as vigilantly as teachers, while professional status is to be eroded at the bottom end of the pay scale. Nurses are to be obliged to spend a year learning how to care (i.e. wash patients and change bedpans), though this appears to be considered unnecessary for doctors and hospital managers. Given that there will be no extra money, this could mean nurses displacing cheaper ancillary workers, leading to either lower wages or fewer staff. In tone, this sounds punitive: teach nurses their place in the class hierarchy of health.
Compare and contrast with the revelation that job centre staff are expected to meet targets for benefit sanctions, and that offices are being judged in league tables, with underachievers threatened with "performance management". Iain Duncan Smith has sought to deny that this is departmental policy, since it clearly shows that failure (i.e. the sanctioning of a set number of claimants) is being deliberately engineered. He doesn't appear to be trying too hard, and I don't get the feeling that Liam Byrne, his Labour shadow and fellow workfare enthusiast, is really as appalled as he claims to be. Before long, we'll simply accept that the arbitrary decimation of benefits is a necessary discipline for the jobless. Having it decided by the drawing of lots will probably be advocated as fairer.
It has been an interesting couple of weeks for Jeremy Hunt. Last Tuesday came the news that the three main political parties had agreed a deal to set up a new independent press watchdog. This brought to a close the political strand of the phone-hacking affair, which blew up on Hunt's watch as Culture and Media Secretary just as he was trying to facilitate News Corp's takeover of BSkyB. It is widely recognised that both the Leveson inquiry and the subsequent political response failed to address the key issue, namely the concentration of media power in the hands of Rupert Murdoch and a few other rich men. Alan Rusbridger, the editor of the Guardian, has produced many bland words advocating reasoned compromise between press and politicians over the issue of regulation, but even he couldn't avoid admitting the stark truth: "The most powerful newspaper group in the country was – on the kindest interpretation – out of control. The police and parliament were cowed". In the case of Jeremy Hunt, "enthusiastically supportive" would be more accurate.
Given the role that the press has played in denigrating and undermining the NHS over the years, it is distasteful to watch their hysterical over-reaction to the supposed threat to free speech represented by the new regulatory regime. Nick Cohen even went so far as to claim that "a great chill will descend on the free republic of online writing, which until now has been a liberating and democratic force in modern British life". While we should never underestimate the ability of the state to blunder into repression, the idea that any regulator would have the resources, let alone the inclination, to proactively monitor the citizens of "the free republic of online writing" (i.e. obscure bloggers) is laughable.
This is the hyperbole of fear. A weak press regulator is decried as the end of 300 years of press freedom, thus avoiding the need to address the odious privilege of newspaper barons, while the NHS is painted as if it were a murderous conspiracy, the better to justify its dismemberment.
Wednesday, 20 March 2013
Bitter Lemons of Cyprus
Paul Krugman, apropos the Cyprus kerfuffle, wonders what it is about European islands that causes them to turn into havens for runaway banks. He mentions Iceland and Ireland, though the description also fits Britain. Cyprus is actually the odd one out here. Unlike the Atlantic islands, the isle of Aphrodite has traditionally been a gateway between Europe and the Near East. It was a logistical hub during the Crusades and, acting as a shield against Ottoman Turkey to the north, it was of strategic importance for the sea-trade routes from Genoa and Venice to the Levant. The title of this post, by the way, comes from the book of the same name by Lawrence Durrell, more famous for that other Levantine work, The Alexandria Quartet.
Though Cypriot banks were heavily exposed to Greece, this should be seen in the context of traditional enosis, the desire for union with Greece that motivated the independence movement in the 1950s. On an island still divided between Greeks and Turks, investment in Greece was a strategic imperative and a way of connecting with the EU prior to accession. While commentators have noted the large amount of Cypriot investment in Russia (essentially laundering of illicit Russian money), few have considered the knock-on effect that a bank crash would have on the Middle East. I have no particular insight into how this will all pan out, but I suspect an injection of funds from the Gulf is as likely as one from Russia. Assuming that the shadowy investors of Qatar aren't going to buy Arsenal, they might care to snap up a Cypriot bank or two and get an option on the much-touted gas field.
There are claims that the Troika expected the Cypriot government to honour its existing (and EU-standard) depositor guarantee of €100k and limit the levy to larger deposits (i.e. Russian and mainland Greek tax-dodgers), but the government feared losing Russian goodwill and decided to dun its own citizens, claiming in turn that the 6.75% levy on smaller deposits was imposed upon it. This sounds like simple buck-passing, but what it serves to do is highlight the sensitivity of the issue of depositor safety, which is as much an issue for us in the UK (or anywhere else in the EU) as it is for Cypriots, particularly if this move is the harbinger of wider financial repression - i.e. paying down debt by a de facto tax on savings.
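For the arithmetic, the scheme as reported was two-tiered. A toy calculation in Python (the 9.9% rate on balances above the guarantee threshold comes from contemporary press reports, so treat it as an assumption):

```python
# Sketch of the two-tier Cypriot deposit levy as reported at the time:
# 6.75% on balances up to the EUR 100k guarantee threshold and 9.9% on
# the portion above. The upper rate is an assumption from press reports.

GUARANTEE = 100_000  # EU-standard depositor guarantee (EUR)

def levy(deposit: float, low_rate: float = 0.0675, high_rate: float = 0.099) -> float:
    """Levy due on a single deposit under the proposed scheme."""
    protected = min(deposit, GUARANTEE)
    excess = max(deposit - GUARANTEE, 0.0)
    return protected * low_rate + excess * high_rate

for d in (20_000, 100_000, 500_000):
    print(f"EUR {d:,} deposit -> levy of EUR {levy(d):,.2f}")
```

The point of contention is the first tier: had the government honoured the guarantee, the rate on protected balances would have been zero.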
Unless you go overdrawn, your bog-standard UK bank account is unlikely to incur a fee - current accounts have been free of charge at high-street banks for many years now. If you have a savings account, you probably earn a small amount of interest. How does this work commercially, given that the bank has to pay the costs of the branch, the ATMs, the staff etc? The answer is that the bank puts your money at risk by using it to fund loans to other customers or for proprietary trading. The bank will take the lion's share of any profits arising, hence the nugatory interest on your savings. It is able to do this (and thereby pay its management and traders big bonuses) because you, the depositor, don't think that you have any skin in a game of speculation.
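To put rough numbers on that (all figures invented for scale, not any real bank's accounts):

```python
# Toy arithmetic for how "free" banking pays for itself: the bank lends
# deposits on at a higher rate than it pays savers, and the spread covers
# branches, ATMs and staff. Every figure below is invented for scale.

deposits = 100_000_000        # customer deposits (GBP)
loan_rate = 0.06              # rate earned by lending the deposits on
savings_rate = 0.005          # nugatory interest paid to depositors
operating_costs = 3_000_000   # branches, ATMs, staff, etc.

income = deposits * loan_rate
interest_paid = deposits * savings_rate
profit = income - interest_paid - operating_costs

print(f"Gross interest income:       £{income:,.0f}")
print(f"Interest paid to depositors: £{interest_paid:,.0f}")
print(f"Profit after running costs:  £{profit:,.0f}")  # the lion's share
```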
Your innocence (and peace of mind) is ensured in part by deposit insurance. In the UK, the government-backed Financial Services Compensation Scheme provides a guarantee up to £85k (roughly equivalent to €100k). In theory this is funded by a levy on banks and other deposit-taking institutions; however, the backstop is the government via the Bank of England, which simply means that, should your bank go bust and funds be insufficient, they will organise a non-voluntary whip-round among all taxpayers to compensate you (another de facto tax, and one which is inevitably regressive, as any compensatory spending cuts will hit the poorest hardest).
While I think we can all sympathise at the individual level with small depositors in Cypriot banks, the idea that their money is being "stolen" by the government, or the Troika, is absurd. Their money was taken by the banks, who then misused and lost it. Caveat emptor. That said, it is legitimate to criticise the Cypriot government for reneging on their deposit guarantee. The political fallout of this is to leave depositors across the EU wary, which is not a clever move on anybody's part.
The significance of this event is that it lifts the veil and makes clear what should be an obvious truth: your money cannot be safe in a bank unless it operates on a full reserve basis - i.e. it does not speculate with your cash and charges you a fee for banking it. Unfortunately, such a system would severely constrain funds for investment (fractional reserve banking means investment is a multiple of deposits) and thus leave us all the poorer. It is therefore necessary for banking to operate a massive confidence trick. This is not a trick in the sense that we are hoodwinked into parting with our money for no return, but that we allow ourselves to be deluded, by the opacity of finance, into the belief that we can make our money available to others at no risk and at some reward, which is plainly illogical. The truth is bitter.
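A footnote on that multiplier claim: under fractional reserve banking each deposit is partly re-lent, the loan is re-deposited, and so on, so total new lending converges on (1 - r)/r times the original deposit, where r is the reserve ratio. A quick sketch (illustrative numbers only):

```python
# Why fractional reserve banking makes lending a multiple of deposits:
# each deposit is partly re-lent and the loan re-deposited, a geometric
# series summing to deposit * (1 - r) / r for reserve ratio r.

def total_lending(initial_deposit: float, reserve_ratio: float, rounds: int = 100) -> float:
    """Cumulative lending generated by one deposit after repeated re-lending."""
    total, deposit = 0.0, initial_deposit
    for _ in range(rounds):
        loan = deposit * (1 - reserve_ratio)  # the remainder is held in reserve
        total += loan
        deposit = loan                        # the loan comes back as a new deposit
    return total

print(total_lending(1_000, 0.10))  # ~9,000 of new lending on top of the original 1,000
print(total_lending(1_000, 1.00))  # full reserve: 0.0 - no deposit-funded lending at all
```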
Saturday, 16 March 2013
This is what we have lost
Ken Loach's new film, The Spirit of '45, a documentary celebrating the achievements of the Attlee government, has predictably been monstered by the right. According to the Daily Mail, it is a "Marxist fantasy", while on last night's edition of The Review Show, the Tory MP Chris Skidmore called it "propaganda", "a whitewash of the 70s", and "like Triumph of the Will" (he's obviously still smarting over Danny Boyle's Olympic triumph). Skidmore's clincher was to ask why, if 1945 was so wonderful, the country rejected the New Jerusalem and voted Attlee out in 1951. Labour actually won the popular vote in that year (as it had in 1945 and 1950), so if "the country" spoke, it certainly wasn't expressing a preference for Churchill. The Tories won the 1951 election thanks to the structural bias in seats (i.e. a Tory vote counted more than a Labour one) and the collapse in the Liberal vote (down from 9.1% in 1950 to 2.5%), most of which went to the Tories (they'll be hoping for the same in 2015).
I've not seen The Spirit of '45 yet, so I can't comment on its merits as a film, but my interest for the moment is in the way it has been received. Leaving aside the frothing of the loons, there are two chief criticisms being made: the lack of a balancing view and its selective use of time. The mainstream view was articulated on The Review Show by Martha Kearney. She introduced the film by noting it contained "no opposing views" and was "an attack on modern conservatism through the prism of the past". She later opined that it was "a party-political broadcast for an outdated ideology", while Maureen Lipman said the film should have included "reasonable opposing voices". This appeal to "balance" is central to the BBC's ideology. Criticising Ken Loach for being polemical is as redundant as criticising Richard Littlejohn for being opinionated. When even the Telegraph can see the fatuity of attacking Loach on these grounds ("like wanting a goat to do a handstand"), you sense that paranoid intolerance may now have completely infected the Beeb.
Kirsty Wark set the tone earlier in the week on Newsnight (12 March) when she kicked off her interview by asking "is this a call to arms or left-wing agitprop?". "Neither" replied Loach, with commendable patience. Tristram Hunt was on hand to provide the balance. This largely came down to claiming that the 50s and 60s were a period of "successful capitalism" in contrast to the less successful versions of the 30s and the post-Thatcher era. This is the classic social democrat position: that 1950 (the end of the first Attlee administration) marked the limit of socialism, ushering in the Butskellite consensus of the mixed economy and the welfare state. We didn't need any more fundamental change; we had reached the promised land and simply needed to manage it well thereafter.
What seemed to particularly bother the critics was the way the film jumped from 1945 to 1979, "ignoring" the years in between. The social democrats wanted the successes of the 50s and 60s acknowledged, while the right wanted the horrors of the 70s (i.e. the unions) wheeled out as a justification for Thatcher. This temporal elision is a classic example of montage, the compression of time to juxtapose contrasting or contending images: thus happy working class people in the late 40s give way to Thatcher doing her Saint Francis of Assisi number thirty years on. None of the supposed "critics" bothered to mention this, despite it being a technique famously employed by early Soviet film-makers like Sergei Eisenstein. If you wanted to make the case that the film was propaganda, this would appear to be a gilt-edged opportunity passed up.
Pivotal moments are the result of long-term trends, not sudden shifts. One revisionist approach to British history claims that Labour benefited from the normative experience of war socialism and that but for the conflict they would never have won in 1945, but as the strong vote share (and increased number of votes) in 1950 and 1951 shows, this is not convincing. The historiographical confusion over 1945 is in part the failing of the great man theory of history (i.e. bemusement that voters should reject the war leader Churchill) and the triumph of social trends history. As I understand it, Loach cleaves to the latter by setting the victory in the context of the 30s. In other words, the result of a long-term trend which, as a glance at the popular vote share in general elections would show, started in 1900. It should hardly have been a surprise. By the same token, the seeds of 1979 were planted by 1950. Though the Tory share of the vote followed a gradual decline after 1955 (in fact, a trend decline since 1935), the Labour share saw a steeper decline, accelerating in the 1970s and reaching a nadir in the mid-80s. Thus was Thatcher foretold.
Loach clearly intends to present 1945 and 1979 respectively as the peak and trough for a particular spirit, which is probably more accurately described as "communitarian" rather than "socialist". He doesn't dwell on the intervening years because the aim is (I presume) to show how shockingly different the two points in time were in terms of that spirit. A pedantic criticism would be that the peak was actually 1950 and the trough was nearer 1986, but that would sacrifice the easy recognition of the iconic years. A more profound criticism is that a celebration of the spirit of any age risks encouraging us to daydream that it can be recaptured, which is by definition a forlorn exercise. There is an obvious echo of maudlin Jacobite nostalgia in the title's use of "45".
The aggressive criticism directed at Loach springs from the advocacy of constant change and the disruptive nature of market forces - i.e. the neoliberal consensus. It accepts that there is a "spirit of history", that there is no alternative, and certainly no option to turn back or arrest this unstoppable force. Thus ostensibly conservative critics are reduced to partisan name-calling ("Marxist", "like Nazi propaganda") because there is no truly conservative critique they can advance. And here you have the real irony. Loach is making an essentially conservative point: this is what we have lost.
Bonnie Greer was the lone voice on The Review Show who correctly saw Loach's film as a work of archaeology, uncovering the original spirit from beneath the layers of subsequent history and contemporary ideology. Trying to get across "what it was like then" and connect with a modern audience looks to be the aim of the film. Again, Loach here stands in a familiar (and sentimental) tradition of British documentary film-making: giving society's unrepresented a voice; allowing them to tell their own story (though inevitably mediated by well-meaning people like Loach himself).
What gave the entire programme a surreal air was that it also included a review of a new TV series, In The Flesh. This concerns zombies whose condition ("Partially Deceased Syndrome") is being managed by drugs as they are gradually reintegrated into society. Most of the critics confessed that they didn't really get it. They noted various themes such as the challenges of change, alienation, integration, vigilantism and so on, but seemed unable (or unwilling) to discuss the mutation of the zombie horde over the years from a threatening "them" to a metaphorical "us". This was disappointing (though not unexpected) as the zombie trope has a resonant connection with Loach's film.
The original Caribbean zombie of voodoo lore combines two forms: the animated corpse, the undead, and the drugged worker, a parody of slavery that also echoes the European idea of the robot (originally a Czech word implying unfree labour). The introduction of the zombie into US culture in the early 30s (the Bela Lugosi film White Zombie) coincides with the Great Depression, with long queues of shuffling unemployed men and itinerant labourers told to move on by small towns. The next major evolution in the trope is arguably Richard Matheson's book I Am Legend (1954), which reflects the Red Scare paranoia of the time (infection, symptoms not always obvious etc), not unlike the parallel alien invasion trope of Invasion of the Body Snatchers. Though the infected are notionally vampires, their behaviour is a clear harbinger of the modern zombie apocalypse trope.
The 1968 film Night of the Living Dead gave us zombies whose pathology is unknown. They just are. They could be anyone. It's as if something has erupted in society, as if norms have broken down. This is a metaphor of social change and all the anxieties associated with it, the Hobbesian war of all against all, in which the threats to the heroes are not just hostile zombies but other intolerant and amoral humans. From the late 70s, zombies became rabid consumers (1978's Dawn of the Dead famously has them trying to break into a shopping mall), with a hankering for delicacies (human brains). They are biddable, obsessive idiots. They are a projection not of our bestial natures, like Mister Hyde, but of our fear that we are already brain-dead. The modern zombie is often seen as an analogue of the dehumanised worker, while economics now happily provides house-room to such concepts as the "zombie firm".
The evolution of the trope in recent years has seen a reconciliation of the zombie and the human. We don't immediately recognise them as different (often at our peril, usually to humorous effect). The low affect, dishevelment and shambling gait of the stereotypical zombie make it look like a typical teenager, or at least one on drugs (edgy but fundamentally adorable). The zombie may be more victim than threat, deserving of our sympathy and tolerance, and there is hope we may be able to manage their condition and reintegrate them as useful members of society, while keeping a close eye on them. But given that zombies don't actually exist in the real world, who then are the metaphorical zombies? My own suspicion is that the "socialised zombie" trope reflects our recognition that we are becoming a divided society in which a lumpen class of the low-paid and barely employed youth must be both tolerated and subdued (carrot and stick), lest they run amok. The now iconic pictures of the 2011 riots, with the hooded hordes attacking shops, could have been a zombie film re-enactment. Do we now take our cue from Dawn of the Dead rather than Les Misérables? Storm the mall, rather than build barricades?
Another dimension of this trope relates, I think, to the dilution of class consciousness. The real threat to democracy is not fringe neo-fascists, or populist revulsion with "the caste", but a loss of identity for the very concept of "the people", which has historically been embodied in contending classes and socio-economic groups. This can be seen in the vagueness of such constructions as the "squeezed middle". In a review of the difficulties that the idea of "the people" and its relationship to democracy presents to modern political thinkers, Boyan Znepolski offers this insight into the negative turn this has given rise to: "In contrast to Marx's proletariat, the people is not the carrier of a new project for the world, it is an embodiment of a destructive rage that must punish an unjust social order by destroying it". We, the people, lacking confidence in democracy and the established parties, become zombies. The bacillus we "carry" is "rage", which was precisely the name used for the pathogen in 28 Days Later, the 2002 British zombie film.
Insofar as it is possible to interpret Loach's intentions without having seen the film, I don't think The Spirit of '45 is an expression of rage so much as of hurt, but it does appear to sound a lament for the loss of "the people" as an actor on the political stage. The reality of that time, of multiple desires and self-interests, may make a mockery of the retrospective projection of solidarity, but there is truth in the claim that most people had a common ambition in 1945 and that the Attlee government largely met it. Now, Ken Loach, barricaded in the shopping mall, looks out at a society driven mad by selfishness. But, the funny thing is, a characteristic of the zombie horde is their common purpose. They never attack each other, or other species, they relentlessly come after us remaining humans, almost as if they are desperate that we should join them. Was the spirit of '45 the zombie plague, which we successfully resisted and can now sentimentally patronise, or did we start to become zombies in 1979?
Thursday, 14 March 2013
Where should we plant the Money Tree?
Ed Miliband has announced that a future Labour government would create a network of regional banks, similar to the long-established Sparkassen of Germany, to boost loans to SMEs. This would be a "local deposits for local business loans" model, with the banks prohibited from lending outside their region or speculating with deposits via international markets (probably). This is a cute manoeuvre that serves to push the Party's regionalist credentials while implicitly criticising the under-performing Funding for Lending Scheme and the failure of quantitative easing (QE) to get investment away from property and FTSE shares and into startups and growth-constrained manufacturers. The problem is that this may be little more than a cute manoeuvre. There are solid arguments in favour of a more balanced banking system, but the very problems this is expected to address, the regional imbalances of the economy, are simultaneously impediments to its success.
It's worth remembering that the UK once had a thriving regional banking system, around 150 years ago, providing funding for industry and commerce. The gradual consolidation of these banks into large national chains saw funds increasingly diverted via the City, where they could access a mix of high-yield investments abroad and low-risk gilts. This process continued through the twentieth century, culminating in the merging of retail and investment banks after deregulation in the 1980s. The consolidation, coinciding with the dismantling of much of the industrial base, exacerbated the pull of capital from the regions to the City. Parallel to this, the growth of the housing market (which shifted investment away from industry to property), and the uneven growth in property values (which favoured London and the South East), further sucked capital out of the regions and led to the corruption of purpose of regional lenders like Northern Rock.
In this context, newly-minted regional banks will be trying to arrest the tide of history. Indeed, one obvious criticism is that this is shutting the stable door long after the horse has bolted and enjoyed many years of retirement at stud. Sparkassen are essentially savings banks, so their loan book is largely dependent on their ability to raise funds locally. Unsurprisingly, there are many more of them in what used to be West Germany. This isn't a consequence of the different political systems prior to unification, but of the availability of savings today. A UK equivalent would face the structural problem that it is more difficult to raise capital in the North East than in the South East.
Even when capital can be raised, the attractiveness of property as an asset class means that a lot of the available investment will be diverted to bricks and mortar, which means that building societies (which are marginal in Germany) will be strong competitors for savings with the regional banks in the UK. Of course, building new houses in Cumbernauld is not as attractive an investment as building them in Crawley, so despite the best intentions of government, capital is likely to leach out of the regions indirectly. For example, loans to building firms based in the North that then bid for contracts in the South. No doubt some bright spark will realise the benefits of relocating his property development business headquarters from Leyton to Lichfield, while continuing to focus on riverside flats in Lambeth.
In theory, regional banks committed to investing in local business could have a dramatic effect if they were to act as a conduit for massive capital injections. The sketchy Labour plan talks about these banks operating in tandem with a national investment bank, much as the Sparkassen coordinate with the KfW national development bank in Germany, so this is a real possibility, but the key question is: just how much money will they be able to channel this way? In some respects, this is a recasting of New Labour's strategy of pumping money into the regions via public expenditure - i.e. public sector jobs and big infrastructure projects. The difference is that the money will be distributed via the private sector. According to Miliband: "I am determined that One Nation Labour becomes the party of the small business and the entrepreneur". If New Labour was an out-and-out neoliberal cheerleader, supporting big capital while recycling tax receipts to the regions, then One Nation Labour is being positioned as (among other things) the champion of small capital. It shows how corrupting "triangulation" is when you end up sounding like a latter-day Pierre Poujade.
David Cameron recently claimed that there is no Magic Money Tree, but as many have pointed out, that is precisely what QE is. The money created to fund the gilt purchase has no material origin - i.e. it doesn't represent actual value in the form of labour or a natural resource - so this is as close to ex nihilo (out of nothing) as sophistry allows. As we all now know, this "new money" has largely gone into property, shares and other assets beloved of the already wealthy. It hasn't turned into loans to SMEs, and certainly hasn't been biased more towards the regions than London. While the regions have picked up a disproportionate share of the austerity tab, in terms of public expenditure cuts and depressed standards of living, QE has served to insulate London and the South East to a degree, though obviously more in Kensington than Kennington.
When the economic history of this era comes to be written, QE will probably be seen as a massive failure of nerve more than a failure of imagination. The desire to prop up an unbalanced economy and prevent catastrophic collapse led to the very causes of that imbalance being further subsidised: property and financial services. This wasn't inevitable. The £375bn of gilts purchased since 2009 could have been used as collateral to fund a national investment bank, with explicit rules set in respect of the level of funding to go to the regions (and even to specific sectors). This would have been the opportune time to set up regional banks, both to act as conduits for that primary investment and to recycle the increase in savings that would have occurred as those investments boosted local employment. For the record, Labour were in government in 2009.
Monday, 11 March 2013
A New Map of the World
Big Data started its technology hype cycle some years ago, driven by the usual combination of novelty (lots of data, care of the Internet), Moore's Law (the processing power to analyse it), and hucksterism (this is the new competitive edge). Classic public examples are Google's ability to predict flu outbreaks by analysing searches for treatments by geographical area, or the same company's analysis of the historical distribution of words in texts scanned by Google Books. These are largely benign uses, but more questionable ones, such as identifying people who are good bets for usurious loans, are being dreamt up all the time.
The chief delusion of Big Data is that it can tell you anything (which could mean it tells you either nothing or everything - it doesn't handle ambiguity terribly well). It isn't just a species of techno-nonsense, and the hyperbole isn't just restricted to the usual gaggle of utopians and commercial boosters. Like all panaceas and cults, from plastic to Scientology, it has ambitions to be universal and totalising: this will change your life, it is inescapable, resistance is useless. It is this claim to universality that has seen it gradually seep into the realm of politics, partly as a result of the managerialist ideology of consultancies like McKinsey, and partly as a result of the Nate Silver effect - i.e. the success of the data-nerds in doing a better job of predicting the outcome of last year's US Presidential election than a bunch of right-wing fantasists with a sixth sense.
In June 2008, three months before Lehman Brothers imploded, Chris Anderson, the editor-in-chief of Wired and author of the Long Tail meme, published a piece on Big Data in which he suggested that the sheer volume and extent of data available through the Internet would make theory and the scientific method essentially redundant. That's a big punt. His thinking was that the method of traditional science was based on the difficulty of gathering and analysing data. Because we could not directly study or comprehend everything, we relied on representative sampling, which gave rise to certain structural practices such as theoretical models, randomised controlled trials, and meta-analyses. In the "Petabyte Age" ("sensors everywhere, infinite storage and clouds of processors"), the "data deluge makes the scientific method obsolete". We don't need a theory, we don't need to test that theory, we can deduce a single and authoritative answer to any new question from existing data.
There is an obvious flaw. As Will Davies notes, the cheerleaders of Big Data believe that: "it is collected with no particular purpose or theory in mind; it arises as a side-effect of other transactions and activities". But of course all data are gathered with a purpose in mind, or at least with the potential of a purpose, which is why Google were criticised for picking up unsecured wi-fi data via Street View. Datasets are not value-neutral. They are the result of discrimination, even if arbitrary or accidental, and ever-larger datasets do not necessarily mitigate this as they may have low signal-to-noise ratios. And all that is before you even consider the erroneous assumptions built into the analytical algorithms. Of course, this observation, that data are biased in their collection and representation, can in turn be used as an argument for elite control. Though Big Data may make some expertise redundant, it opens up the field for new data specialists who can "correctly" weight and interpret the data.
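To make the point concrete, here is a small simulation (a toy setup with invented numbers, not anyone's real methodology): when the collection process is biased, piling up more data merely narrows the error bars around the wrong answer.

```python
import random

random.seed(42)
TRUE_MEAN = 50.0

def biased_sample(n):
    """Draw from a population with true mean 50, but under-collect high
    values (standing in for a demographic or platform the sensors miss)."""
    out = []
    while len(out) < n:
        x = random.gauss(TRUE_MEAN, 15)
        if x > 60 and random.random() < 0.7:
            continue  # 70% of high values never enter the dataset
        out.append(x)
    return out

for n in (100, 10_000, 1_000_000):
    s = biased_sample(n)
    print(f"n={n:>9,}: estimated mean = {sum(s) / len(s):.2f} (true {TRUE_MEAN})")
```

However large n gets, the estimate settles a few points below the truth: the dataset becomes ever more consistent, comprehensive and wrong.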
Politics has become increasingly prone to the demand for evidence-based policy-making as an alternative to crude belief, but the very idea that everything can be measured and thereby managed is itself an ideological construct. It should be no surprise that Big Data is increasingly being positioned as tomorrow's solution to today's problems. This is the classic strategy of building ideologies that are above politics, like religion once was and neoliberal economics has become. Though there is limited tolerance for technocracy as a solution (witness Mario Monti), the principle itself has yet to be widely accepted as intrinsically repellent. We want to believe that scientists and technocrats act in good faith, and are thus happy to accept that any flaws in their actions are likely to be the result of inadequate data rather than prejudice or self-interest. Big Data furthers this ideology by holding out the prospect that complete data can provide perfect knowledge.
The social sciences have been dominated by two hegemonic disciplines in recent decades. Economics has seen humanity as a common tabula rasa on which individual preferences are written. Biology has seen individuals as instances of a common class, with our "truths" (illness, heritage) written in DNA. These are apparently polar opposites - the self-actualiser and the prisoner of genetic destiny - but they share a common view that we, both as individuals and as a society, are simply the aggregation of data. Big Data promises to fully reconcile the two: making sense of our preferences in aggregate, and finding the personal needle in the social haystack. What's not to like?
What this utopian vision relegates to the background is the way that Big Data creates a whole new asset class, though this is the central proposition of Big Data in its commercial context. Political reservations about it tend to focus on privacy and personal liberty, i.e. the fear that one's own property will be scrutinised or alienated, with less discussion about the "commons" or public goods. Big Data presumes universal datasets - data that is complete, comprehensive and consistent. What this means in practice is a preference for monopolies, though we blithely accept that these will be private rather than public, Google and Amazon rather than the state. Big Data = big corporations. The ultimate logic of this is a future Mark Zuckerberg being commissioned by the government to determine your healthcare eligibility based on your lifestyle preferences. And unlike an ATOS assessment, there will be no appeal.
The absurdity of Big Data's totalising ambitions was made beautifully clear many decades ago by the Argentinian writer, Jorge Luis Borges, in his fable On Exactitude in Science.
In that Empire, the Art of Cartography attained such Perfection that the map of a single Province occupied the entirety of a City, and the map of the Empire, the entirety of a Province. In time, those Unconscionable Maps no longer satisfied, and the Cartographers Guilds struck a Map of the Empire whose size was that of the Empire, and which coincided point for point with it. The following Generations, who were not so fond of the Study of Cartography as their Forebears had been, saw that that vast Map was Useless, and not without some Pitilessness was it, that they delivered it up to the Inclemencies of Sun and Winters. In the Deserts of the West, still today, there are Tattered Ruins of that Map, inhabited by Animals and Beggars; in all the Land there is no other Relic of the Disciplines of Geography.

This is a variant on the principle of "the map is not the territory", the confusion of a model with reality. The evangelists argue that this is precisely the problem that Big Data circumvents, in that it isn't a model but reality itself - the data is the one truth. But it's interesting to note how Big Data and visualisation so often go hand-in-hand, as if we need the raw truth to be mediated by a slick interface. We want to believe that the data is authoritative, but we also want it presented like an unthreatening cartoon. As Borges's near-contemporary, Paul Valéry, said: "Everything simple is false. Everything which is complex is unusable."
Wednesday, 6 March 2013
Bonus Culture
One of the legacies of Postmodernism in design is the conformist nonconformism of the modern office. Not everywhere is as wacky as the Googleplex, with its bean bags and micro-scooters, but even law firms now indulge in funky colour schemes and perplexing art-prints, while ironic toys and mugs clutter desks. The epitome of this is found in high-tech, where even brand names are self-consciously PoMo, with lower-case letters and exclamation marks. I've long thought that homeworking could be seen in this light - as a sort of ironic form of working. This perhaps explains why homeworking is often discussed as if it were a fashion accoutrement. Though there are practical arguments both for and against, its popularity seems to wax and wane for other reasons.
Recently, Yahoo! decided to end homeworking for its employees. The move is clearly defensive - a circling of the wagons. According to Marissa Mayer, the CEO: "We need to be one Yahoo!, and that starts with physically being together". If staff at Yahoo (enough with the exclamation mark) are demoralised, this will be incidental to their place of work. The business has been failing for years. The suspicion is that the ultimatum is a way of getting rid of some dead wood, though the move has led to think-pieces (devoid of much thought) questioning whether homeworking is in retreat. The issue is not whether homeworking is appropriate or effective (clearly it is in many cases), but what governs the decision to implement or abandon it.
Among the witless blather on the subject, there is an inadvertent clue in a piece by the relentlessly self-promoting Heather McGregor: "In times of crisis, I may need more face-time with my team, not less; when the recession hit us in 2008/9, one of the first things I asked was for people who had been home-working to return to the office". She doesn't explain why crises require more face-time. A crisis management meeting may be necessary to quickly review a situation and refocus resources, but you'd want to keep this to a minimum so that people then get on with addressing the crisis. Talking can quickly become an alternative to doing.
What these two statements show is that CEOs look to reassert their power when a business becomes vulnerable. Like reinforcing the dress code or being strict about arrival times, it's about showing who's in charge. This is a tactic with a long pedigree - witness the stock tales about new military commanders who re-establish discipline - and may well prove effective. My point is that it reveals something about the nature of homeworking, namely that it is a power commodity, a means by which power is exchanged within the employer-employee relationship. In good times, some power (a privilege or exemption to a rule) is given to employees. In tough times, that power is taken back.
In some cases it is clear that the absence of homeworking as a privilege relates more to the employee's own assessment of its value. According to the Head of British Vogue: "We have come to believe that working at home is a completely adequate alternative to showing our face in the office. But it's not". Allowing that Ugly Betty is a caricature, I don't think it should be a surprise that in an environment where visible networking is crucial, "showing your face" is an important political act. If you find (as many do) that working from home allows you to "get things done", then this implies that what happens in the office is often inimical to productive work, even allowing for tacit exchange and the other benefits of face-time. The flip-side of homeworking, and the PoMo influence on the office environment, is the growth of the "home as a factory, office as a club" trope. This shows that both homeworking and office time can become forms of privilege.
It is also clear from the descriptions of homeworking and the advice on best practice that this is assumed to be a largely middle-class thing, where a dedicated space (a home office) is possible and human contact is maintained by popping out for lunch and coffee. Most personal testimony in the media is from self-absorbed journos, which is hardly representative. I've even seen claims that call-centre staff (i.e. working-class jobs) cannot easily be allowed to work from home as "the cost of linking secure databases to thousands of houses stands as a considerable obstacle", which is technological nonsense. The issue is clearly one of discipline and control, which is a feature of the job not of the work itself.
The chief value of homeworking is not the lack of distraction, or being able to work in pyjamas, but the avoidance of the daily commute. Commuting is a key exchange between employer and employee. Businesses cluster together because of the efficiencies and economies of scale this provides. The downside is that it drives up the cost of labour in the vicinity. Commuter transport systems expand the area from which labour can be recruited, so pushing down wages. The quid pro quo for the employee is access to cheaper housing - i.e. the further you commute, the bigger a house you can afford.
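As a back-of-envelope sketch of this exchange (the salary and commute figures below are invented for illustration), the commute can be priced into the hourly rate:

```python
def effective_hourly_rate(annual_salary, commute_mins_each_way,
                          workdays=230, hours_per_day=7.5):
    """Salary divided by total hours committed, commute included."""
    commute_hours = 2 * commute_mins_each_way / 60 * workdays
    worked_hours = hours_per_day * workdays
    return annual_salary / (worked_hours + commute_hours)

office = effective_hourly_rate(40_000, commute_mins_each_way=60)
home = effective_hourly_rate(40_000, commute_mins_each_way=0)
print(f"Office: £{office:.2f}/hr; home: £{home:.2f}/hr "
      f"(an implicit rise of {home / office - 1:.0%})")
```

On these numbers, ending a two-hour daily commute is worth roughly a quarter more per hour committed - a substantial bonus, paid entirely in time.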
Homeworking represents a shift in the balance of power. When middle-class jobs are already well-paid, it becomes advantageous to some workers to "commute" extra pay into extra time (though there may also be cash savings on transport costs). Homeworking is a bonus paid in time. When the terms of trade change, and business has the greater advantage, homeworking is reduced without any compensation in higher wages.
Sunday, 3 March 2013
The Eastleigh Paradox
The Easterlin Paradox (which sounds like what used to be known as a fat airport book) is the observation that average happiness levels in a society do not correlate with GDP. Rich people tend to be happier than poor people, but rich countries are not happier in aggregate than poor ones, assuming the latter are not failed states incapable of meeting basic needs. The standard explanation for this is that happiness is broadly a function of relative status. Apart from the transnational elite, this means we assess our happiness by comparison to our fellow citizens. It has obvious ideological ramifications, hence its general acceptance on the left (there's more to life than growth) and rejection on the right (wealth creation benefits everyone).
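The relative-status explanation is easy to demonstrate with a toy model (illustrative only, not Easterlin's own data): if happiness tracks income rank within a society, richer individuals are always happier than poorer ones, yet doubling everyone's income leaves aggregate happiness untouched.

```python
import random

def rank_happiness(incomes):
    """Each person's happiness is their percentile rank by income."""
    order = sorted(range(len(incomes)), key=lambda i: incomes[i])
    ranks = [0.0] * len(incomes)
    for pos, i in enumerate(order):
        ranks[i] = pos / (len(incomes) - 1)
    return ranks

random.seed(1)
society = [random.lognormvariate(10, 0.5) for _ in range(1000)]
before = rank_happiness(society)
after = rank_happiness([2 * y for y in society])  # GDP doubles overnight

print(sum(before) / len(before))  # mean happiness before growth: 0.5
print(sum(after) / len(after))    # and after: identical
print(before == after)            # True - only relative position matters
```

Within the society, income and happiness correlate perfectly; between the "before" and "after" societies, growth buys nothing.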
There are two apparent paradoxes revealed in the analysis of the Eastleigh by-election, a near-homophone that got me thinking about Easterlin. The first is that UKIP's relative success depended as much on attracting ostensibly pro-EU LibDems to their anti-EU platform as on Tory eurosceptics. Both coalition parties lost 14% of the vote, relative to the 2010 general election, while UKIP gained 24%. The suspicion that the LibDem defectors went to the Tories (a rather improbable protest given that both parties share government, though there was some movement in both directions) and that the Tories then passed on 1 in 4 voters to UKIP is allayed by the post-vote polling. UKIP actually gained slightly more defectors directly from the LibDems. This hints that beneath the LibDem beard and cable-knit sweater there lurks a less attractive underbelly.
The suggestion that this was an anti-coalition protest vote doesn't stack up unless you can explain why UKIP was preferred over Labour, whose vote share didn't change. "A plague on all their houses" is a possible explanation, but the idea that principled LibDem voters, appalled at the coalition government's track record, decided to throw votes at an anti-EU and xenophobic party rather than anodyne "One Nation" Labour seems unlikely. It also needs to be noted that the constituency is nowhere near the front line of those bearing the brunt of austerity, and that the coalition parties together polled 57% of the vote, so "kick the bums out" doesn't really add up either. A more credible explanation is that the LibDems' pavement politics have been successful in mobilising centre-right voters over the years, which is certainly supported by their dominance of the local council in an affluent part of the country, and that the protest is more a generalised one mixing mild dissatisfaction with the council (planning blight), mild dissatisfaction with the coalition (Osborne's budget), and a general irritation with the established parties (expenses, lying, groping).
This feeds into my own pet theory that the LibDems are actually a more fragile coalition than either Labour or the Tories. Signature policies, such as localism, the environment and civil liberties, have served to paper over the cracks by allowing Orange-bookers, nonconformists and the parochial to find common ground. I fully expect the LibDems to perform reasonably well in the South in 2015 but be slaughtered in the North where the government's obvious pro-South bias (untempered by the LibDems) won't be forgotten. The result may be a shift towards a more neoliberal and libertarian LibDem party, along the lines typically found on the continent. Lloyd George will finally be buried.
The second paradox is that immigration was the single most important issue for a majority of UKIP voters (59%), well ahead of the EU (33%), and was one of the top issues for voters overall (23%), despite the constituency being 97% native English speaking and not having any significant local tensions that could be related to immigration. Of course, "immigration" here is just a code, but a code for what? I doubt this is simple racism for most voters. While it is true that racists tend to have very little interaction with people from a different ethnic background (they avoid them, after all), and that their worldview tends to be heavily influenced by second-hand prejudice, from the snidery of the Daily Mail and Express to the bigotry of online forums, it seems unlikely that Eastleigh has an unusually high concentration of them. I suspect that "immigration" is a handy catch-all for anything and everything that threatens the settled order, from teenage Goths to Polish shops and from married gays to Romanian Gypsies. This is Chesterton's people of England speaking.
The connection between Eastleigh and Easterlin is the point that relative prosperity has little bearing on happiness. In a relatively well-off constituency, with an apparently effective local council, people still want to tell you how bloody angry they are about ... oh, I dunno, something or other.
Friday, 1 March 2013
Remembering Tony and Chris
The looming ten-year anniversary of the 2003 Iraq War has seen a flurry of memorialising, though more for the political participants than the soldiers or civilians who died in the conflict. A fortnight ago, various anti-war worthies looked back on the Stop the War march, pinpointing it as the moment when the scales fell from their eyes. Last week saw a hostile reception for Richard Seymour's Unhitched, an unsympathetic review of Christopher Hitchens's career, and in particular his pro-war-on-terror stance. This week saw Tony Blair resurfacing on Newsnight to parade his angst and unrepentance. Meanwhile, the carnage grinds on in Iraq.
The right (and neoliberal fellow-travellers) have understandably criticised the anti-war demo memorialisers as narcissistic, but in so doing they avoid the need to admit that the anti-war case was bang-on in every major respect: the WMD claim was confected, the US had already decided on regime change regardless of Saddam's "threat level", and the consequences of invasion were likely to be very bad. This cannot be blithely dismissed by pro-war apologists, like John Rentoul, on the grounds that contemporary UK opinion polls supported the war, or even by quibbling about the number of civilian dead or whether Iraqis today are hopeful about their prospects. A protest does not need to mobilise 50% of the population to be right, and the claim that things are better now in Iraq than they were 10 years ago is a bit like celebrating the decline of anti-semitism in Eastern Europe. Ends do not justify means.
The Newsnight programme aired the now common belief that the US bungled the invasion by stirring up the hornet's nest of sectarian and ethnic rivalries, and that it lacked a credible plan for state-building in the aftermath. This requires you to believe that the US State Department knew next to nothing about the country in advance and that the occupation authorities were incompetent thereafter. While cock-up is more prevalent than conspiracy, this tragic error trope seems a little too convenient, much as the way that the absence of WMD has been cast as "an assertion made in good faith that turned out not to be true". Christopher Meyer, the former UK ambassador to the US, noted that democracy cannot be parachuted in like humanitarian aid, at which heads nodded sagely. Again, this suggests that exporting democracy was actually a war aim that regrettably failed.
I think a more useful way of looking at the Iraq War is to assume that it was a success, rather than a failure. Given the outcome, this would imply that the objective was to emasculate Iraq as a regional power. This is consistent with the proposition that the root cause of the war was Saddam's unwillingness to be limited to his role as a buffer against Iran in the 80s; that once he'd decided to upset the regional balance of power with the invasion of Kuwait, and had proven unrepentant after defeat in 1991, it became necessary for Iraq to be neutralised. A no-man's land is the next best option to a buffer. WMD and the spurious links to al-Qaeda and 9/11 were mere opportunism. The UK suggestion that this was a continuation of the humanitarian policing operations in Kosovo and Sierra Leone is plainly absurd. As for the spread of democracy, no one ever suggested we invade Saudi Arabia in that cause.
Martin Kettle feels that the toxic legacy of Iraq must not blind us to the era of Blair (and Brown), which was "more an attempt to reassert social values and new forms of solidarity in the aftermath of Thatcher than an attempt simply to embrace Thatcher's possessive individualism". He even describes it in terms of there being "no alternative". As far as the left is concerned, "There wasn't a coherent alternative on offer and, even if there had been, not enough people would have voted for it." This is weaselly. The inclusion of the judgemental "coherent" is meant to rule out the actual alternatives on offer. The assumption that had there been a "coherent alternative" it would still not have commanded popular support is obviously illogical. If it never existed, how can we say what the electorate's view would have been?
Kettle dislikes the vulgar idea that Blair was just the continuation of Thatcher, but that, with suitable caveats about amelioration, is what he was. The emotionalism of the 2003 anti-war movement, and the virulence of the subsequent attacks on Blair, were in part a transference of the broader frustration caused by the neoliberal turn. After the hopes raised in 1997, it was bad enough to discover by 2003 that Blair was a neoliberal true believer, but worse to realise that he'd drunk the neocon Kool-Aid as well. "There is no alternative" had ceased to be the personal motto of Margaret Thatcher and had become the mantra of the entire political class. The failure of the protest to change government policy simply reinforced the feeling of powerlessness.
Where Blair had WMD as a precisely imagined threat, Christopher Hitchens had the hydra-headed monster of Islamofascism. Since 1945, "Fascism", when appended to any type of movement or activity, simply means something that we will not tolerate - i.e. we cannot co-exist or negotiate with it, like "heavy" parents. We are ironically using a totalitarian bogey to adopt a totalitarian attitude. Hitchens got into knots trying to define Fascism in such a way as to allow an equivalence with the mad mullahs. For example, being hostile to modernity, nostalgic for empire, obsessed with humiliations, prone to anti-semitism, leader-worship, sexual repression, anti-intellectualism etc. Unfortunately, these would all equally apply to your average right-wing nut job. In Hungary, these would be qualifications for a cabinet post.
Hitchens's catalogue of sins was the description of a reactionary, which showed his ongoing attempt to square his conflicting progressive and neocon impulses. While all these characteristics can be found among fascists, they do not cohere as Fascism, which is the ideology of the totalitarian state. al-Qaeda, as a supra-state network obsessed by the propaganda of the deed, is closer to the violent Anarchism of the late 19th century, but "Islamoanarchy" doesn't work so well as a bogey-word. Hitchens's scatter-gun justification for the term is evidence that he was himself at work on propaganda, the imaginative realisation of a repellent concept, not a lucid analysis of a real-world movement. His strength as a polemicist was in the invention of a caricature of the enemy, from Mother Teresa to Jerry Falwell. It is perhaps no surprise that the term seems to be rapidly falling out of fashion following the death of Osama Bin Laden.
One of Seymour's chief criticisms of Hitchens is that he always wanted to be on the right side of history, not unlike the Stalinist fellow-travellers that his hero Orwell criticised. He could not fully accept the neoliberal case, so he diverted his commitment to the safer strand of the anti-clerical Enlightenment, much as Orwell escaped both Eton and Stalin by dallying with Trotsky. Blair clearly thought that neoliberalism was the spirit of history incarnate, for good and ill. His personal tragedy was to be seduced by more cynical Americans into accepting that the neocon agenda was part and parcel of the zeitgeist. Like Hitchens, he was a little bit in love with the US and the ideal of the noble state. The UK proved too small for both their ambitions.