
Tuesday, 16 January 2018

The Legacy of George Osborne

The collapse of Carillion has been heralded as a watershed moment for privatisation. I doubt that's true. The deficiencies of public sector outsourcing and PFI deals have been known for decades, while the prospect of a major change in public sector management is probably only going to occur with a change of government, which may not happen before 2022. The refusal to bail out the business, even though it means more write-offs for state-owned RBS, suggests that the government is keen to avoid the distraction of both a specific inquest into the company and a general debate about public services and infrastructure investment, as much as to protect taxpayers' money. Given the competing demands of Brexit, that is hardly surprising. While the Labour Party's robust insistence to the contrary has been predictable (imagine how this would have been handled pre-Corbyn), the failure of George Osborne to make his now habitual dig at Theresa May has been less so.


The former Chancellor's editorial in yesterday's Evening Standard, which relegated the topic below the scourge of plastic waste, suggested the root cause was a combination of the excessive size of Carillion and the tendency of civil servants to prefer the devil they know: "The failure to use a variety of smaller, mid-size companies undermines innovation and leaves services hostage when things go wrong". What is missing from this timid analysis is political accountability. The push for government contracts to go to more and smaller firms was a cosmetic gesture during Osborne's tenure at the Treasury that has resulted in minimal change for pretty obvious structural reasons. It has little to do with civil servants' aversion to novelty. His claim also obscures the fact that the material change in circumstances in 2010 was the imposition of austerity. This led to Whitehall putting the screws on suppliers to reduce contract costs, which was one of the contributory factors in Carillion's eventual demise. As the old Ernest Hemingway joke has it, "How did you go bankrupt? Two ways. Gradually, then suddenly".

Osborne, not for the first time, is being disingenuous. Dealing with a single big supplier is easier for the client, particularly when cuts erode the public sector's capacity for supplier management, while big companies with extensive portfolios of public revenues are more likely to secure the financing necessary to initiate large contracts without the need for the state to make formal guarantees that would appear on the national accounts. This tendency towards the big also encourages suppliers to operate essentially as intermediaries (hence the way they morph into empty brands with meaningless names), willing to facilitate any new service required by the client regardless of their actual competence or history, if only to maintain the relationship and the advantages of incumbency. The result is the consolidation of companies handling government contracts, not a flowering of supplier variety, and an ever more desperate search for low-cost sub-contractors and cuttable corners rather than a commitment to innovative service delivery.

What was being outsourced to Carillion was financial and project management, not innovation or risk. For this reason it is pointless to talk of nationalising the company, or to wonder why it is being liquidated rather than being put into administration. It isn't a going concern and it has little in the way of tangible assets. There aren't any means of production to put into public ownership and the company's institutional capital centres on the ability to win public sector contracts, which would be redundant if it were nationalised. Of the £4.43bn total assets it reported in 2016, only £144m were fixed assets (property, plant, equipment etc). Cash, receivables, inventory and investments made up £2.4bn and goodwill (representing the intangible cost of historic acquisitions) a further £1.6bn. In other words, its tangible assets were largely money or near-money, most of which appears to have been haemorrhaged over the last year. That creditors may receive only a few pennies in the pound tells you not only that the cash has gone but that the contracts represented by the receivables aren't sufficient to meet the liabilities, confirming that the proximate cause of collapse was evaporating margins and an inability to secure further financing.
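To put those figures in proportion, here is a minimal sketch (in Python, using only the numbers quoted above; the categories listed don't sum to the full £4.43bn, the remainder being items not itemised here) of how little of the balance sheet was bricks and mortar:

```python
# Carillion's 2016 balance sheet, using only the figures quoted above (GBP, billions).
total_assets = 4.43
fixed_assets = 0.144       # property, plant, equipment etc
monetary_assets = 2.4      # cash, receivables, inventory and investments
goodwill = 1.6             # intangible cost of historic acquisitions

tangible = fixed_assets + monetary_assets
print(f"Fixed assets:     {fixed_assets / total_assets:.1%} of total")     # ~3%
print(f"Money/near-money: {monetary_assets / total_assets:.1%} of total")  # ~54%
print(f"Goodwill:         {goodwill / total_assets:.1%} of total")         # ~36%
print(f"All tangible:     {tangible / total_assets:.1%} of total")         # ~57%
```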

Both of these will have been evident for months, if not years, which is why it is entirely legitimate to demand scrutiny of the apparent under-estimation of the risk of collapse by both Carillion's board and government ministers. But we shouldn't lose sight of the fundamental issue here, namely that the outsourcing sector and PFI contracts have both been pushed by structural forces towards financial arbitrage rather than competitive service delivery, a situation exacerbated by the contingent demands of austerity, which has reduced both the number of contracts and their margins. The result is a "market" made up of fragile suppliers who are obliged to grow or die; or at least keep winning new contracts (no matter how small the profit) to keep the life-support machine turned on. When even the FT describes Carillion as "a lawful sort of Ponzi scheme - using new or expected revenues to cover more pressing demands for payment", then the cat is well and truly out of the bag. This was an unsustainable model and plenty of people must have known that for a long time.

Carillion's executive remuneration was geared to winning contracts and growing, which meant that there would always be a reluctance to retrench, even when the warning signs started to flash, such as in 2014 when hedge funds began short-selling the company's shares. Other obvious signs of distress were the growing deficit in the pension fund and the imposition of abusive supplier payment terms, which were increased to 120 days in 2013 (another example of financial arbitrage - client terms remained at 30 days in most cases). This was evidence in the public domain. The apparent failure to raise any of these issues at board level points to the uselessness of non-executive directors who share the same cultural values (growth at all costs, the primacy of shareholder value, the need for superior executive rewards etc). This failure also extends to ministers, who in respect of the governance of public sector outsourcing and PFI deals have a de facto non-executive role. Their inability to read the signs, and the suggestion that they may have deliberately ignored some in order to cut the business more slack, points to a similar cultural weakness.
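To make the arbitrage in those payment terms concrete, here is a minimal sketch with invented numbers (the £1bn annual subcontract spend is purely an assumption for the example, not a figure from Carillion's accounts): paying suppliers on 120-day terms while being paid by clients on 30-day terms leaves roughly 90 days of subcontract spend sitting in the company's hands as an interest-free float.

```python
# Hypothetical illustration of the payment-terms arbitrage described above.
# The spend figure is invented for the example, not taken from Carillion's accounts.
annual_subcontract_spend = 1_000_000_000   # assume £1bn paid to suppliers per year
client_terms_days = 30                     # the client pays the company after 30 days
supplier_terms_days = 120                  # the company pays its suppliers after 120 days

# Cash received from clients but not yet passed on to suppliers.
float_days = supplier_terms_days - client_terms_days
working_capital_float = annual_subcontract_spend * float_days / 365
print(f"Interest-free float: £{working_capital_float:,.0f}")   # roughly £250m
```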

The centrist debate on outsourcing revolves around the concept of "drawing the line", which is a mixture of Coasian theory and liberal propriety. This suggests that getting it right is a matter of refinement and judgement. As Simon Jenkins harrumphs, "How to find the ideal mix of public sector loyalty and private sector incentive has bedevilled state procurement since the dawn of time. But there should be clear rules, as with monopoly regulation, such as limits on market dominance, debt and remuneration". The suggestion that we lack clear rules on all these matters is absurd. Public procurement is highly regulated and rule-bound, not just because of political sensitivity but because suppliers themselves appreciate the advantages that accrue to incumbents through high barriers to entry. The flaw in this abstract thinking is that in the real world public sector loyalty and private sector incentive constitute a volatile mix. There is no stable combination; no clear line. This is not necessarily a problem if one is dominant. The problem arises when a dynamic like austerity erodes both loyalty and incentives. As the public sector struggles to recruit and retain staff, and as outsourcers collapse, the government is waking up to the legacy of George Osborne.

Wednesday, 10 January 2018

Privilege and Merit

The news that Virgin Trains is to stop selling the Daily Mail is a pretty obvious distraction from season ticket price rises and the scandal of the East Coast franchise, but it does incidentally highlight the class association of the railways in popular discourse. It is important to note that while Virgin Trains offered the Daily Mail for sale to standard class passengers, it provided complimentary copies to first class passengers. The howls from Northcliffe House about "censorship" do not reflect the loss of the paltry revenue stream arising from the former, but the ending of the tacit endorsement entailed by the latter. In other words, this is about the loss of privilege rather than the denial of a human right, and one that challenges the newspaper's claim to pre-eminence among the UK's middle-class readership. In claiming that its staff objected to the Daily Mail's content, Virgin Trains has endorsed the liberal activism of Stop Funding Hate (changing the world one informed purchase at a time), while stopping short of suggesting that its employees might have a say over its own offerings, such as its rip-off "free" Wi-Fi. In so doing, it implies that the Daily Mail may be less representative of "decent" opinion in Britain, which is an affront to the paper's self-image.

One point made with dull regularity when the annual price rises come round is that railways are a middle-class subsidy. This is true but irrelevant. So are roads and universities. Pretty much any public good tends to be exploited more effectively by the middle class, from the NHS to national parks, but that's not a sufficient reason to argue against public investment. The question should be whether the subsidy produces outcomes that benefit all. Railways bring wider gains for the economy and society than those that accrue to actual travellers. They allow for city agglomerations that enable larger and more specialised industries, which produces more jobs and higher average pay. The increased demand for rail travel over recent decades reflects the increasing proportion of service and professional jobs within the economy plus the related rise in property values in city centres. The result is more and more people needing to travel in from the suburbs or satellite towns. This is exacerbated in cities like London by the Green Belt, which obliges commuters to live further out, and by premium services like HS2 that seek to extend the outer limit of the commuter zone. Compared to the subsidy of the Green Belt, public spending on suburban rail travel is relatively modest.

One reason for emphasising the middle-class nature of rail travel is to suggest that the cost of nationalisation would disproportionately fall on the working class (and emblematic white-van drivers in particular) in the form of higher taxes, though as ever this avoids the broader question of incidence and the shift from wealth and income to consumption taxes. Public ownership of the railways is not a panacea, though it still has significant advantages over the use of private operators. What ultimately matters, as we saw in the 1970s and 80s, is the level of investment. The attraction of rail for both operators and financiers is the same: a natural monopoly with a guaranteed income stream. But a monopoly almost always under-invests, hence rail privatisation leads to a de facto state railway parasitised by rent-extractors focused on branding and public relations (hello, Richard Branson). The franchise model is particularly unsuited to railways because few operators can realistically bid, so there isn't a competitive market, and the strategic importance of rail transport means the state will always be on the hook, which allows private operators to bale out when their profits are threatened, as we have repeatedly seen in the UK.


The argument for introducing market discipline - that competition between suppliers will lead consumers to reveal their preferences - simply does not apply to railways. By dropping the Daily Mail, Virgin Trains is cannily giving the impression of a working market in which low sales have revealed its customers' distaste for the paper: it has been judged on its merits. The same sort of thinking appears to lie behind the creation of the Office for Students, which has been in the news lately. The introduction of variable tuition fees has not led to the appearance of a dynamic price signal, largely because guaranteed student loans and earnings-related repayment mean that buyers aren't actually price-conscious (and accelerated two-year degree courses won't change that). Likewise, competition has not led to market exit by poor suppliers because supply, at the level of the individual college and within the limitations created by an annual buying round, is not elastic. University access remains a game of musical chairs in which some of the chairs are in a hard-to-reach annex. Like many other regulators, the job of the OfS will be to simulate competition in order to justify rent-extraction and to institutionally favour privileged suppliers such as Oxford and Cambridge.

The furore over Toby Young has been hitched to a variety of current phenomena, from Harvey Weinstein to campus free-speech, but the central argument concerns merit. To his liberal detractors, Young lacked both specific qualifications for the job and a reputation for probity and public service sufficient to be seen as one of "the great and good". This criticism not only elided the antagonistic purpose of the OfS but ignored the lesson of his father Michael Young's The Rise of the Meritocracy: that all elites are self-perpetuating and merit is consequently less objective than we imagine. The younger Young's defenders (wisely) did not try to big-up his unimpressive work with free schools but instead lauded his free-thinking and "caustic wit". His supposed merit was his iconoclasm, which shows how merit can easily be interpreted to suit any agenda. Young's role as the little boy confronted by a naked emperor was intended to help improve the higher education sector by forcing it to "think outside the box", but that necessarily entailed criticising the academic and public service elite who have defined the boundaries of that box up till now.

Coincidentally, Young's appointment came shortly after the resignation as a government advisor of Andrew Adonis, the meritocrat's meritocrat, which perhaps created a contrast that was a little too stark for (liberal) public taste. Indeed, you have to wonder why the suddenly-available Adonis wasn't considered as a last-minute candidate for the OfS board role, even if anything other than chair (a role he allegedly lobbied for) might be beneath his dignity. Not only is he impeccably neoliberal and well-versed in the sector but he went out of his way to criticise "fat cat" vice-chancellors last year. I suspect the answer is that Adonis might take the brief to introduce competition, as opposed to merely simulate it, too seriously. His claims that university expansion was a mistake and that the sector operates as a price cartel were perhaps a little too iconoclastic (not to mention questionable) at a time when the government was more interested in beasting "snowflake" students as part of a misbegotten culture war. Both Young and Adonis have been left without plum jobs, but I doubt they will lose the privilege of having their fervid opinions relayed by the UK press, from the Guardian to the Daily Mail.

Friday, 5 January 2018

The Office for Fuck's Sake

There are many reasons for criticising the odious Toby Young, but the crudity of his language (aka "caustic wit") is not one of them. He is, after all, a paid entertainer. If Frankie Boyle had got the gig with the new Office for Students, rightwing criticism of the Scottish comic's language would be equally beside the point. Likewise, Young's history of making derogatory comments about women or working class kids isn't a good look for someone who is supposed to further the interests of all students, but it isn't particularly germane to his role on a body that will be acting as a competition regulator for universities. As should be obvious from the makeup of the rest of the board, which has been successfully obscured by the focus on Young, this is a typical neoliberal agency intent on the further marketisation of higher education. There will be gender and class bias, but this will be structural rather than the result of Young's powers of persuasion. The reason why he is unsuitable for a position on the board of the OfS in its own terms is that he lacks any relevant experience of business or competition regulation. Being a journalist or setting up a free school are irrelevant achievements in this context.

Young is dim enough to perhaps not realise when he is being played. The timing of the announcement of the appointment, one minute into the New Year, looks less like an attempt to bury bad news and more like a calculation that booze and the bank holiday would quickly drive the debate onto Twitter, which in turn meant that wider press coverage tended to be about the ensuing outrage rather than the remit of the OfS or the business-heavy composition of its board. If Young were to step down, this would be treated by many as a moral victory, while the OfS itself would get a free pass. The battle won, the war lost. You can see a similar dynamic at work when a Berkeley professor uses Twitter to explain how Donald Trump manipulates the media agenda through Twitter, thereby ignoring the more substantive damage being done in Congress. The Tories appear to have decided some months ago (I'm thinking specifically of the spat between Jeremy Hunt and Stephen Hawking) that they have little to lose in adopting an antagonistic media stance and much to gain in terms of distraction. The elevation of Toby Young is clearly part of a wider strategy.

The root problem here is liberal morality. A good example of this came on the same day as the OfS board announcement when John Harris penned a piece in the Guardian titled "Take it from the insiders: Silicon Valley is eating your soul". Harris imagines that the tendency of tech titans to restrict their own children's exposure to social media is a telling admission of their product's toxicity, like a beer baron insisting that his kids be shielded from alcohol, but all it really indicates is a retreat from a public social network to a reserved space of privilege (I doubt the kids are sitting bored at home). That liberal critics emphasise the supposed addictive properties of social media (Harris wheels out the trusty "dopamine hit") is just the usual criticism of the mass as weak-willed and incapable of self-control, and thus of a piece with articles on New Year resolutions and detoxing. Thirty years ago the same articles were being written about the malign effects of television and the need to restrict children's viewing time. A hundred years ago children were being criticised for always having their noses in books.

Having diagnosed the problem to his own satisfaction, Harris espies a solution: "There is a possible way out of this, of course. It resides not in some luddite fantasy of an army of people carrying old Nokia phones and writing each other letters, but the possibility of a culture that actually embraces the idea of navigating the internet with a discriminating sensibility and an emphasis on basic moderation". That last word is being used in a dual sense: as an act of self-censorship and as advocacy of a middle way. What Harris doesn't explain is how we are to acquire a "discriminating sensibility". Playing Snake on a Nokia 3310, perhaps. If Facebook is increasingly creating a walled garden that shuts out the world ("let us discriminate for you"), Twitter remains much more of a public space open to riotous assembly, but it is one in which propriety and judgement are becoming ever more prominent, hence the increasing number of users who feel the need to broadcast their decisions to block or report others. This performative curation of one's own filter bubble looks suspiciously like a bigger dopamine hit than making a sarky comment beneath a blue tick mark's tweet.


Eric Posner of the University of Chicago Law School published Twenty Theses About Twitter last July. He starts by suggesting that "People sign up for Twitter for two reasons: to obtain information and to exert influence", but then goes on to show why this is a false prospectus. There are better sources of information and Twitter is a poor medium for making a case sufficient to change someone's mind. Posner's central thesis is one of self-indulgence:

7. Twitter’s real function is to enable people to obtain validation for their beliefs.
8. People send tweets with a single overriding purpose: to get the tweet "liked" or retweeted.
9. When your tweet is liked or retweeted, you enjoy a dopamine surge.

This may sound like a reduction of humanity to the status of lab rats, but Posner is no behaviourist. What he is suggesting is that the risk/reward dynamic of Twitter is so different to the "real world" of calculating utility maximisers that the online persona can undermine the offline:

16. In the non-virtual world, successful people take care to keep up impressions, for example, they avoid making controversial statements to friends, colleagues, and strangers except when unavoidable, and even then do so in a carefully respectful way.
17. In Twitter, the same people act as if their audience consisted of a few like-minded friends and forget that it actually consists of a diverse group of people who may not agree with them in every particular on politics, religion, morality, metaphysics, and personal hygiene.
18. Without realizing it, people who use Twitter damage the image of themselves that they cultivate in the non-virtual world.

In effect, Twitter is a drug that corrodes human capital through the collateral damage it does to social relations, much like cocaine. Of course this theory only holds for those people whose Twitter ID reflects their public persona and whose real world human capital is significant. While economic liberals like Posner stress the dopamine hit, which is essentially a transactional model that leaves the idea of preference intact, cultural liberals are more likely to focus on the corrosive effect of anonymity, hence the lasting popularity of the Ring of Gyges as a trope in modern analyses of the disinhibition that the Internet enables. The one focuses on a lack of self-restraint, the other on the circumvention of social constraint. To put it another way, Posner thinks the reward (the hit) is too cheap in the short-term and the delayed cost (damage to reputation) is being excessively discounted, while cultural liberals see anonymity as a form of free-riding. What they share is a belief that the price of social media should be higher, even though its nature (the need for data at scale and the reliance on advertising) demands that the service be provided free. Twitter is a pure market because it enables the expression and ranking of almost any individual preference, but it is "irresponsible" because it does so by undermining the price mechanism.

The low cost of entry of social media is a leveller in the sense that a "speech-act" of a nobody has the same form as that of a celebrity or professional commentator. In the offline world this is rare outside of public meetings - i.e. heckles - which is why such events are carefully managed, if not avoided. But just as the speaker at a lectern or in a pulpit has a structural advantage, so Twitter creates a hierarchy of regard through the number of followers and likes. A consequence of this is that holding others up to ridicule becomes a powerful tool for the powerful, on a par with having a column in the Spectator, but as the case of Toby Young shows, Twitter still retains the potential to empower "the mob". While many "nobodies" have helped excavate Young's more objectionable tweets, the press coverage of the outrage has inevitably concentrated on disobliging comments by other "slebs" or credentialled experts. And so the game goes on. Toby Young is obviously a charlatan and a fool, but that makes him all the more suitable for the role of a patsy. He must by now have twigged that he is the sacrificial offering for the inauguration of the OfS, so presumably he has also secured a suitable pay-off. I expect him to feature in a future honours list.

Friday, 29 December 2017

The Case Against a Basic Income

The case against a universal basic income was put yesterday by two very different thinkers. Nick Boles is a well-connected Tory who since his election to Parliament in 2010 has served as Minister for Planning and now Minister for Skills (his achievements have been negligible but he remains a fixture of various think-tanks). Daniel Zamora is an academic sociologist who has made a name for himself with a critique of Michel Foucault's attitude towards neoliberalism and the welfare state (in a nutshell: too admiring of the one and too dismissive of the other). Both are against a basic income, but for different reasons. Apart from the simple coincidence of publication, I have decided to contrast their positions because of the difference in approach it highlights between UBI sceptics of the right and the left. Beyond the issue of affordability, the right tends to resort to arguments that mix old notions of the sin of idleness with newer psychological theories of self-actualisation. In contrast, left sceptics question whether UBI is truly transformative, suggesting that it may simply entrench capitalism. The one is concerned with individual salvation, the other with social revolution.

Here is Boles's case in its (apparent) entirety: "The main objection to the idea of a universal basic income is not practical but moral ... Its enthusiasts suggest that when intelligent machines make most of us redundant, we will all dispense with the idea of earning a living and find true fulfilment in writing poetry, playing music and nurturing plants. That is dangerous nonsense. Mankind is hard-wired to work. We gain satisfaction from it. It gives us a sense of identity, purpose and belonging … we should not be trying to create a world in which most people do not feel the need to work". This isn't a robust argument, not least because it fails to explain why a poet, a musician or a gardener is incapable of earning a living, or why society should tolerate the idle rich. Despite its opening assertion, it fails to explain how work is moral, probably because the traditional arguments for this (essentially biblical homilies) sound ridiculous today.

Even the argument for self-actualisation is dubious. I can achieve a sense of identity, purpose and belonging by joining a criminal gang, which would hardly be moral in the conventional sense. I can also achieve those same ends by devoting myself to unpaid charitable work, which would be conventionally moral but would require financial subsidy by the community were I not one of the otherwise idle rich. The essentialist argument, that work is part of human nature, is also demonstrably untrue. We are not hard-wired to work, we are hard-wired to survive, which is why levels of labour in pre-industrial societies tended to differ across geographies, reflecting the varying demands of local environments. Beyond achieving survival and social reproduction, we work (if under no external pressure) out of self-interest and for self-satisfaction, at which point work can look a lot like play. This means that there are always two types of work: necessary labour and surplus labour. As Marx pointed out, it is the struggle for control of the latter that drives society.


Daniel Zamora makes a more coherent case in Jacobin, first of all putting the growing interest in UBI in a historical context: "Paradoxically, then, UBI seems to be a crisis demand, brandished in moments of social retreat and austerity. As politics moves to the right and social movements go on the defensive, UBI gains ground. The more social gains seem unreachable, the more UBI makes sense. It’s what botanists would call a 'bioindicator': it indexes neoliberalism’s progress. Support for basic incomes proliferates where neoliberal reforms have been the most devastating." This is interesting because it suggests an interdependency between UBI and neoliberalism. Though the idea predates both that particular style of capitalist reasoning and its political hegemony from the 1980s onwards, it is true that modern UBI advocacy often adopts neoliberal tropes, such as personal utility and human capital (the "entrepreneur of himself", as Foucault put it). That, however, should make us cautious of accepting descriptions of UBI that ignore its history (e.g. the regular misrepresentation of Thomas Paine's concept of an endowment) or claim a special relevance because of recent technological developments (them robots).

Before making his central argument, Zamora rehearses a few of the usual criticisms of UBI. First, that "No existing economy can pay for a generous basic income without defunding everything else." This doesn't automatically follow, but the claim highlights how thinking about the funding of a UBI tends to focus on the state's fiscal position (current expenditure versus revenue) rather than the nation's accumulated wealth, and thereby ignores the power of the sovereign to radically reapportion that wealth (something it can easily do, as shown by the bank bailouts and QE after 2008). A meaningful UBI would be in addition to, not instead of, the welfare state, and that necessarily means it would be funded from national wealth rather than exclusively from current income. In effect, the state would expropriate wealth now held in land, property and equities. That might sound extreme, but it is merely retroactively implementing a sovereign wealth fund along the lines of Norway or Alaska. Alternatively, a UBI can be funded out of current taxation, but just as it should be additional to current expenditure, so that taxation should come from capital assets currently privileged by the tax regime, such as land, offshore holdings and equities (this can build a fund quite quickly - consider the rapid growth of China's SWFs).

Zamora's second conventional argument is more credible: "If UBI does take shape, current power relations will favor those who have economic power and want to profit by weakening the existing system of social protection and labor market regulations. Who will decide the monthly amount and who will dictate its terms and condition? Who do today’s power relations favor? Certainly not the worker." This is very true, but that is why the question that should be asked of a UBI scheme is not "Is this affordable?" but "How will it be initially calculated and then uprated in future?", because that will tell you where the power lies. If a state adopts a sovereign wealth fund approach, then the dividend would probably be set through a transparent calculation under democratic control. As GDP grows, the fund dividend should grow likewise (if a portion were invested abroad and produced yields higher than UK GDP, the excess could be reinvested in the fund without affecting the dividend). A tax-based approach could also be geared to GDP, which in turn would encourage tax to shift from below-average growth revenue such as income to above-average growth revenue such as property and equities, which would also be more progressive.
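For what it's worth, a minimal sketch of how such an uprating rule could work is below; the fund size, yield and growth figures are purely illustrative assumptions, not estimates. The dividend is indexed to GDP growth while any excess return is reinvested, so the fund grows without the payout being cut.

```python
# Illustrative sketch of a sovereign-wealth-fund dividend geared to GDP.
# All starting values and rates are assumptions for the example only.
fund = 500.0          # fund value, £bn
dividend = 20.0       # annual dividend paid out, £bn
gdp_growth = 0.02     # assumed nominal GDP growth rate
fund_yield = 0.05     # assumed return on the fund's investments

for year in range(1, 6):
    returns = fund * fund_yield
    dividend *= (1 + gdp_growth)     # dividend uprated in line with GDP
    fund += returns - dividend       # excess return over the payout is reinvested
    print(f"Year {year}: fund £{fund:,.0f}bn, dividend £{dividend:.1f}bn")
```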


Unlike Boles, Zamora is alert to the issue of surplus labour and thus the importance of work time, though he sees it in terms of the distributive justice of work as much as the distributive justice of income: "That’s why a universal job guarantee and a reduction in work hours still represent the most important objectives for any left politics. Collectively reducing work time is politically and socially preferable to creating a socially segmented pool of unemployed workers, a situation that would have serious consequences for the employed. It’s not hard to imagine how this situation could foster divisions within the working class — as it already has over the last several decades". A UBI and a reduction in work hours are not mutually exclusive; indeed, the best scheme would see a portion of growth remitted as reduced time, which incidentally would be a spur to productivity. Keynes saw a progressive reduction of work hours (and by implication a UBI) as the product of growth and "the power of compound interest". Zamora is right to highlight the potential for friction between the employed and the unemployed, but in doing so he omits another character in the story.
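As a rough illustration of what "the power of compound interest" implies for work time (the 2% productivity growth rate below is an assumption for the example, not a forecast):

```python
import math

# Rough illustration of compound productivity growth applied to work time.
# The 2% annual growth rate is an assumption for the example, not a forecast.
growth = 0.02
doubling_time = math.log(2) / math.log(1 + growth)
print(f"Output per hour doubles in about {doubling_time:.0f} years")   # ~35 years

# If half of that gain were taken as shorter hours rather than higher output,
# a 40-hour week could fall to about 40 / sqrt(2) ≈ 28 hours with output still rising.
print(f"A 40-hour week could fall to roughly {40 / math.sqrt(2):.0f} hours")
```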

The modern dichotomy of the working class - strivers versus skivers - is not the same as the old dichotomy of the deserving and undeserving poor, even if the characters involved appear to be the same. The latter assumed a third party, the rich, whose largesse was to be apportioned sparingly, whether through charity or taxation, as a moral judgement on the behaviour of its recipients. The newer dichotomy is a product of the welfare state and the idea that the working class, as the largest part of the population, must meet its welfare needs through the collective insurance of social security. This essentially liberal design succeeded in removing the third party from the equation, leaving an increasingly bitter struggle between workers suffering compassion fatigue and a "welfare class" (or underclass), that came into sharp focus in the 80s. It was no coincidence that this period saw the emergence of the ideological demand that insurance, for loss of income as much as health, should be optional and subject to market provision. That was the rich figuratively slamming the door as they went out.

The heart of Zamora's case against UBI is the deleterious effect it would have on the welfare state and thus working class solidarity: "The institutions workers established after World War II did more than stabilize or buffer capitalism. They constituted, in embryonic form, the elements of a truly democratic and egalitarian society, where the market would not have the central place it now occupies". There are obvious echoes of Karl Polanyi in this broad historical sweep, which Zamora further elaborates: "Isn’t the best way to fight against capitalism to limit the sphere in which it operates? Establishing a base income, by contrast, merely allows everyone to participate in the market". This is true, and it highlights why talk of UBI as "the consumer of last resort" (or its attractiveness to Internet companies seeking a universal monopoly) is essentially neoliberal, but the implied alternative of non-market exchange is both utopian and potentially even more socially divisive. Restricting some people's access to the market is not the same as restricting the scope of the market in all our lives.


Zamora's final argument is oddly Hayekian, imagining a new coordination problem arising because a UBI would undermine the price signal of wages: "Under capitalism, the division of labor is set in a brutal fashion, relegating large sectors of the population to jobs that are difficult and badly paid, but often of great value to society. A 'utopian' UBI, by contrast, simply assumes that in a society liberated from the work imperative, the desires of individuals newly freed to choose what they wish to do would spontaneously yield a division of labor conducive to a properly functioning society. But this expectation is assumed rather than demonstrated." The assumption behind a UBI is that the increased power of workers to refuse shit jobs would increase wages for those jobs, achieving either a new equilibrium (closer to the work's true social value) or increased capital investment in automation. The point is less that people can choose what they wish to do (they would still be constrained by their ability and qualifications) than that they can more easily choose what they refuse to do.

Zamora is wise to reject the moralising that accompanies arguments both for and against a UBI and to concentrate on the way that New Labour-style poverty alleviation and the isolation of a segment of the working class serve capitalist interests: "Combatting poverty thus permits the inclusion of social questions on the political agenda without having to fight against inequality and the structural mechanisms that produce it ... both the Left and the Right want the 'surplus population' to be the problem, thereby supplanting those old, out-of-date, dogmatic ideas that placed exploitation at the heart of the social critique". However, this leads to a pessimism about the potential of UBI to directly address accumulated wealth and capitalist power: "It is much easier to imagine what a different form of social organization could look like on the basis of the more progressive elements within the welfare state than it is to start from abstract ideas that are quite often disconnected from the reality of workers. It’s always easy to imagine different worlds and communist societies in a theoretical and abstract way." This apparently pragmatic view puts a lot of weight on the persistence of those "progressive elements", but that seems equally abstract when faced with the reality of Universal Credit.

Ultimately, Zamora's case doesn't convince because he starts from a narrow, neoliberal definition of UBI, essentially because his professional interest is the neoliberal critique of welfare. This produces a rhetorical UBI that is parsimonious, funded from income tax, and that seeks to reduce the scope of the traditional welfare state. None of these are actually givens. A better UBI would be generous enough to effectively empower labour in negotiation with capital, would be funded from accumulated wealth, and would be additional to existing conditional welfare (it would also require a major investment in social housing). If Zamora exhibits a left pessimism, Nick Boles displays a conservative lack of imagination. The news that Silvio Berlusconi is trying to resurrect his political career by proposing a "dignity income" (a UBI of €1000 a month) does not mean that the Italian right is more developed in its thinking, but rather that it too is fitting the concept into existing political forms, in its case clientelism. The political problem for UBI in the UK is the right's continued reliance on moralising, the centre's antipathy to the welfare state, and the left's residual labourism. That the Labour party is showing an interest owes much to Corbyn and McDonnell's grounding in the marginalised, post-labourist socialism of the 1970s.

Friday, 22 December 2017

English Eerie

The recently finished TV series Detectorists is a story of ambition, often thwarted and often ridiculed, of malevolent outside influences, of being true to one's nature, of the community of the living and the dead, and of the consolations of fellowship. It is both dynamically conservative and organically socialist. In its material concerns, it is of an ilk with Cash in the Attic and Escape to the Country, but it rises above cupidity and entitlement not simply through the dramatisation of solidarity and the deployment of wit but by knitting its themes into the tradition of the English eerie (historical echoes, revelatory nature, symbolic animals), which is in essence a meditation on historic property rights and dispossession. For all its bucolic charm, there are links here with the rural horror of Ben Wheatley's A Field in England and the desolation of Iain Sinclair's London Orbital. It is, I'd argue, not just one of the cleverest comedies to have appeared on British television in many years, but an essential Brexit narrative, combining self-absorption, resentment and a stoic acceptance of fate. This doesn't mean that it comes down on one side or another of the debate, but that it recognises the common deficiencies as much as the common decencies.

Though there are occasional villains (Lance's ex-wife) and moments of farce (the mayor's chain lost while dogging), the central thread is the ongoing contest between the two detecting pairs, Andy and Lance and "Simon and Garfunkel". This is not merely a struggle over the control of land (detecting permissions) but a clash of language. The latter two are not only uncertain in their own names (revealed to be Peters and Lee near the end of the second series), but they frequently change their club name, employing ever more pretentious and latinate forms, from AntiquiSearchers to Terra Firma. Peters is condescending, pedantic and given to florid terminology, not to mention the flourishing of legalistic pieces of paper. Andy and Lance are sarcastic, often mock-obtuse, and prone to laying verbal traps. This is Norman meets Saxon, which has an inevitable echo in remainer versus leaver. The end of the third series sees reconciliation against a backdrop in which Andy and Lance's separate homes are secured, suggesting perhaps that we can all get over Brexit so long as we fix the housing problem.


So what exactly is the eerie and why is it particularly English? It isn't the same as horror because it isn't directly threatening. The overriding sense is of something that has disappeared but left a reproachful memory. It is a haunting without a ghost. The eerie is also different to the uncanny. The latter is disconcerting because it is ambivalent - Freud defined the Unheimlich as something that is at once familiar and frightening, like meeting your double (the quintessential modern form of the uncanny is the android). If horror is about threats to the person from without, and the uncanny speaks of the uncertainty of personhood, then the eerie is about an absent, historical other. It is also necessary to distinguish the eerie from the psychogeographical, which is part of a different (if detourned) tradition of the ruminative traveller in an antique land. Psychogeography is about the difficulty of holding onto the past: an aestheticisation of decay and the redundancy of the built environment. The eerie is about a past that is unwilling to let go of us, but which never gets round to actually making a demand on us. If the uncanny is an expression of an imprecise anxiety, the eerie is an expression of an imprecise guilt.

Some see English eeriness, particularly when it harks back to Saxon or Roman times, as an acknowledgement that the English are a nation formed by invasions and intermingling. Others see its sensibility of place as arising from a Romantic reaction to industrialisation and the emergence of an ecological consciousness: as people became aware of their own feelings independent of traditional hierarchies and communities, so they began to think of themselves as situated in nature. In fact, all these interpretations are attempts to project a modern sensibility backwards. For example, there is a big difference between the layered history of Rudyard Kipling's Puck of Pook's Hill, which is essentially a defence of seizure and thus empire, and the modern "conquest" trope that seeks to claim the common victimhood of colonialism. Likewise, few Romantic painters or poets had any real exposure to the geographically limited early industrialisation of the late 18th century. This desire to conjure an ancient lineage is actually a postmodern technique, as seen in the inspiration that the Urn Burial of Thomas Browne (the antiquarian's antiquarian) provided for writers attuned to the eerie, such as Borges and Sebald. That both were foreigners is significant: the English eerie is about alienation, not nativist nostalgia.

Though eeriness has a long literary heritage, such as the ghost stories of M R James, the role of Pan in Edwardian art and literature (The Wind in the Willows, the stories of Saki etc), and the magical realism of G K Chesterton, the English variety (and it is very much Anglo-Saxon, the Celtic fringe offering a wholly different mythos and sense of place) really comes into its own during the postwar years, and is very much a product of location filming. True eeriness is audio-visual rather than textual. The foundation for this was the upsurge in sentimental interpretations of an England (and specifically the southern and eastern counties) facing imminent invasion that appeared in the early 1940s. This was a revival of the earlier invasion literature of the Edwardian era, filtered through the neo-romantic British art of the late-1930s. Though these were propaganda works directed at emphasising "what we're fighting for" (e.g. Listen to Britain) and encouraging US support by an appeal to shared heritage (e.g. A Canterbury Tale), the sense of both a demanding past and scores to be settled clearly owed much to the new salience of class (e.g. the squire's treachery in Went the Day Well?) and property (e.g. the wild meadow threatened by development in Tawny Pipit).


The eerie would be given a fresh turn in the postwar years as the rebuilding of Britain encroached on the countryside in a manner far more systematic and intrusive than had occurred in Victorian times. New towns, motorways and airports radically shifted the boundaries of urban life at the same time that mechanisation depopulated the rural economy. We were simultaneously living in a collapsed past (the popularity of pageants and murals that mixed historical eras was noticeable over the century between the 1870s and 1970s, suggesting that the ahistoricity and pastiche of postmodernism had long modernist roots) and in a collapsed future (the competing claims of SF and apocalyptic literature). If the eeriness of the 40s had been nostalgic and conservative, but with an egalitarian edge, the eeriness of the 50s and 60s was uncertain and increasingly characterised by foreboding, reflecting wider social changes and a gloomy geopolitics. The tension between a nostalgic turning away from the wider world (Tolkien, White and Peake, in their varied ways) and the promise-cum-threat of high-tech modernity (from the Festival of Britain to the New Wave of SF) would be a major input to the cultural ferment of the 60s, but it also drove the reactionary and pessimistic cultural currents that would appear in a variety of forms in the 70s, from the ecological movement to the National Front.

While this tension produced much that was essentially decadent (from Dan Dare to The Silmarillion), it would also produce some genuinely memorable eerie cinema that explored the themes of social change and communal revenge, such as Quatermass and the Pit and Village of the Damned (the latter based on John Wyndham's The Midwich Cuckoos). Though this was hardly avant garde (Brian Aldiss memorably referred to Wyndham's books like The Day of the Triffids as "cosy catastrophes"), it is important to remember that the English eerie is an essentially middle-brow form concerned with the eruption of ancient antagonisms in a conventional setting. Despite being about an alien invasion, and thus perhaps playing to contemporary worries over immigration, Village of the Damned centres on a generational threat and the necessity of extermination (and incidentally the destruction of the big house) to secure order. Michael Reeves's 1968 film Witchfinder General is often cited as an eerie classic, but it also marks the beginning of a turn towards an older horror tradition in which the rural is equated with ignorance and disruptions in the social order are due to a feral underclass and outside agitators. Even in a contemporary setting, "folk horror" doesn't usually speak to our times beyond a banal projection of class anxiety. The territorial concerns of the eerie are lacking.

Robin Hardy's 1973 film The Wicker Man is perhaps the last major cinematic work in the tradition of the 1940s, though it achieves its effect by going so over the top that it's impossible to imagine a credible sequel. Its power lies not just in the dramatic tension that arises from delaying the key revelation, but in the way the utter illogicality and implausibility of the plot reinforces the eeriness of the setting. It transplants an English village and squire that would not have been out of place in an Agatha Christie novel to a Hebridean island that, in its culture and climate, is as far away from the Todday of Whisky Galore as you could imagine. During the 1970s, the themes of a vengeful countryside and an increasingly dilapidated city would become little more than a parody of older Edwardian tropes about hostile natives and metropolitan decadence, such as in Straw Dogs. By the early 80s this had been recycled into both the neo-Jacobean camp of Derek Jarman's Jubilee and the gory humour of An American Werewolf in London (which owed as much to Cold Comfort Farm as Lon Chaney Jr). Eeriness in English-set films had been reduced to little more than a stylistic affectation.


Where eeriness lived on during the 70s was on British TV, a medium that was now able to technically compete with film in its presentation of the outdoors. If M R James is today considered eerie, this is largely because of the impact of A Ghost Story for Christmas, not because of an independent revival of interest in his books. While much of this output employed the usual tropes of rural mystery and an unquiet past, such as in Penda's Fen or Children of the Stones, eeriness was as likely to be found in the brutalist architecture of new towns and on apparently deserted suburban estates as within the vicinity of Glastonbury. As heavy industry started to visibly decay, rusting plant and machinery became as striking as long barrows and gnarled trees, proving that eeriness is about loss of purpose and changes in ownership (all empty houses, no matter their age, are eerie). Today, the 1970s have come to be characterised as a second Dark Age for some conservatives: an era not only of accelerated decline but of existential dread, which in turn provides a template for the pessimism of contemporary liberals. This is to misunderstand that the eerie tone of that decade reflected accelerated change: the number of kids who woke in a new bedroom in an out-of-town estate and saw alien fields outside their window.

During the 80s, the older tropes of eeriness tended to be subsumed into an increasingly commodified neo-paganism that would eventually turn into little more than a marketing channel for the music festival industry. The eeriness of deindustrialised areas was gradually over-written by retail sheds and call-centres, though these were soon to be identified as a new form of eerie in their own right via the concept of "edgelands". Beyond these, little of the old eerie remained as the 90s and 00s saw the physical and cultural distinction between town and country steadily eroded. As Nick Groom puts it: "Country towns, villages, and farming are being colonized by urban economies that create clone towns and clone countryside. The high streets of market towns are homogenized, and rural England disappears under out-of-town developments and industrialized agri-business". This artificiality has something of the uncanny about it, but it has bleached eeriness out of the landscape. Increasingly, the countryside is just an area with a lower density of housing (hence the intellectually dishonest equivalence of the Green Belt with the rural). Detectorists may herald a revival of the English eerie as a dramatic approach to the issues of property, or it might just encourage more people to move to Essex. Whichever, it was a joy to watch.