
Friday 24 September 2021

Binary Rigidity

I made the point on Twitter recently that the debate around trans rights in the Labour Party has started to display parallels with the famous antisemitism flap of 2016-19. For example, the claim that MPs with broad access to the media were being shouted down and even physically threatened by the feral left; the suggestion that there was a particular strain of misogyny that arose from a leftwing milieu and expressed itself through intolerant criticism of the gender-critical (GC); and the demand that the party leadership condemn this misogyny unreservedly and take steps to discipline members guilty of it. There are some differences though. In this scenario, the target of the manoeuvre is not the leadership but its support for trans rights, and in particular support for the reform of the Gender Recognition Act (2004) to allow self-identification and for bringing the Equality Act (2010) into line with that, which could affect the single-sex exemptions of the latter. Behind the lurid GC tales of men claiming to be women so they can hang around female changing rooms is a more fundamental struggle over who gets to define the interests of women.

Given that trans women are only ever likely to be a tiny proportion of all women, this might appear inconsequential, but for the middle-class women who have long dominated the discourse over women's rights, often to the disadvantage of working-class or minority ethnic women, this is the thin end of a wedge that potentially undermines their privilege. While the majority of feminists (indeed, the majority of people) are supportive of trans rights and self-identification, the gender-critical enjoy significant press coverage and sympathy, reflecting their disproportionate presence among newspaper columnists and politicians, which in turn reflects the professional, middle-class milieu and the dominance of second wave feminism within it (trans rights being a dividing line for many third wave feminists). That this does not reflect the views of wider society does not cause any pause for thought. As with the Jewish community's widely-reported fear of a Corbyn government, the mere existence of the concern is sufficient to justify its salience in the media. But this is where another difference arises.

Most people were unable to gauge how credible the Corbyn "threat" to Jews was, not because they suspected he was an antisemite but because they weren't Jews themselves. To dismiss the fears of the Jewish community as overblown or paranoid was to risk displaying a callous disregard for the feelings of others. In contrast, the claim that trans women are a threat to natal women is one that half the population can judge to its own satisfaction. As a result, the "trans threat" has increasingly been framed as an issue of misogyny, which in turn has escalated gender-critical language into blunt transphobia and led to the bizarre sight of reactionary men "GC-washing" their actual misogyny by claiming to be defending women's rights. As the modern (for which read: young, third wave, intersectional) left tends to be pro-trans rights, that cause has come to be associated not only with leftism but with a particularly disrespectful strain whose emblematic form is a shitposter taking the piss out of Suzanne Moore on Twitter. More troubling than this parallel with the antisemitism flap is the tendency of some on the left to take gender-critical terminology at face-value.


For example, the Labour MP Rosie Duffield claimed she was avoiding the Labour Party conference this year because of threats. Sonia Sodha interpreted that as "left misogyny", despite Duffield having offered no evidence that those threats were both misogynistic and from the left (none were detailed in the Sunday Times article). My observation that Sodha's use of "left misogyny" was a strawman resulted in some people upbraiding me for downplaying misogyny or claiming that it cannot be found on the left. This response echoed the charge of denial when anybody questioned the prevalence of antisemitism on the left during the Corbyn years. Many left commentators - keen to avoid the binary rigidity of "it's a problem" / "it's not a problem" - tied themselves in knots in their attempts to balance the acknowledgment of the seriousness of antisemitism with criticism of the amplification of it for factional reasons. As no balance is possible, because the other side is simply not arguing in good faith, this approach proved futile. It is now an article of (bad) faith that there was no exaggeration, hence the continued suspension of the whip from Jeremy Corbyn, while all the comment pieces by Owen Jones and others will be exhumed solely to confirm that the left acknowledges it has a problem with antisemitism.

This is what happens when you allow the debate to take place within the framework of liberal virtue. Jones's behaviour can be excused as the result of being a liberal media careerist, but it is still striking that many on the left appear to be ignorant of the history of this manoeuvre, with the result that they fall for it repeatedly. If you think the reason why Rosie Duffield has decided to skip this year's party conference is leftwing misogyny then you are being played, both by her and the Murdoch press. If you take the bait and seek to bracket left misogyny ("No one is denying there is a problem, of course misogyny is bad" etc) in order to shift the debate towards the substantive issue of her transphobia, you will not only be permanently apologising for left misogyny but you will have accepted the premise that there is a misogyny peculiar to the left. Before you know it, you are conceding "the left has a problem with misogyny" which eventually morphs into "leftism is inherently misogynistic because it valorises a male working class", which is the functional equivalent of "leftism is inherently antisemitic because it arises from vulgar anticapitalism". The only way out of this thicket is to reject the binary of "You're either vocally against X or you are secretly in favour of X".

Though the gender-critical movement is borrowing much from the anti-Corbyn movement's strategy, there are two important tactical differences. First, there is an exclusive focus on the politico-media bubble rather than street politics. There won't be any high-profile demonstrations as there isn't a willing resource such as the Jewish community to provide numbers, and because any counter-demonstration would probably not only be larger but would have the advantage of being positive ("Trans women are women") rather than negative ("It's a scam"). Second, the aim is not to use control of the party bureaucracy to undermine the leadership but rather to leverage the leadership's determination to minimise the role of party members in formulating policy to advance gender-critical positions. Which brings us neatly on to the question: what is Starmer up to? The proposed changes to party rules, including the return of the electoral college in place of one member one vote (OMOV), are clearly intended to insulate Labour against the possibility of another leftwing candidate securing the leadership, but this is clearly just one part of a wider programme.

Starmer's claim that the electoral college is needed to reinforce the influence of trade unions is simple dissembling (were he a more subtle operator, I'd even think it evidence of a sense of humour), coming only weeks after his office celebrated the election of Sharon Graham as Unite General Secretary on the grounds that she would focus on industrial matters rather than the internal politics of the Labour Party. In the event, it looks like the left-leaning unions aren't going to support this, which means that Starmer will need to get his way by first dividing the unions on the matter and hoping that the purges have swung the constituency representatives in his favour. Whatever else this initiative indicates, the manner of its launch and the casual disregard for the unions does not suggest a leader's office in touch with the wider party, and has even dismayed erstwhile Starmer supporters. It's possible this is simply a dead cat manoeuvre, intended to distract from other rule changes, notably making deselection more difficult and minimising conference's say in policy, but that still seems a crass approach. The changes to the thresholds for trigger ballots will please the right of the PLP, but even this looks an unnecessary over-reaction given their limited use in 2019 and the subsequent decline of the left, whether purged or simply disheartened, in many CLPs.


It is sometimes argued that MPs should not be subject to deselection by party members because they are accountable to their constituents: if the electorate think the MP is doing a bad job, they'll be voted out at the next election. This is obviously disingenuous as most electors vote along party lines. Only in exceptional cases will they turf out an underperforming MP, thereby risking damage to their preferred party's chances nationally in order to punish the individual, just as they'll rarely vote in an independent. Oddly enough, people who advocate this line - that the MP is a delegate of the constituency - also tend to echo the contradictory Burkean line that an MP should in fact be a trustee, with autonomy of action. They also seem remarkably sanguine about MPs who have failed at the ballot box being promoted to the House of Lords. Unsurprisingly, they tend to be particularly keen on the idea that MPs should have absolute authority when it comes to the selection of party leaders. In reality, MPs can only be made accountable in one of two ways: either by the threat of the withdrawal of the whip in Parliament, or by party members in the constituency deselecting them. The one is accountability to the leadership, the other to the membership. It should be obvious from his actions which Starmer prefers.

The language employed in Starmer's proposal for the NEC refers to MPs as "representatives of the public in Parliament", as distinct from members who are "fee paying supporters of the party". This diminution of the member to a mere supporter is perhaps more significant in revealing Starmer's thinking than the traditional defence of MPs against the membership by appeal to the democratic legitimacy bestowed by the wider electorate. It once more suggests a determination to redefine the membership as a passive resource (here essentially just a financial one) and to rule activism illegitimate. This is characteristic of the cartel party in which the membership is subordinate to the bureaucracy and that in turn is subordinate to the leader. While the opposition to this has coalesced around the vote to confirm David Evans as General Secretary, the chief architect of this shift is clearly Starmer himself. This should give the lie to the somewhat conspiratorial claims that he is a political naif who has been captured by more cunning operators on the party right, and that the reversion to an electoral college is in anticipation of a leadership challenge ahead of the next general election. 

This looks far more like the latest stage of a systematic programme to reverse the halting steps towards greater party democracy undertaken since 2010. But the aim is not to return to the horse-trading between unions, party members and different factions of the PLP that characterised the postwar party, and which was briefly revived under Ed Miliband. Nor is it simply to return to the Praetorian party of the New Labour years, in which the rank and file grew increasingly disillusioned as debate was limited to a gilded circle of advisors and the inner court. More profoundly, Starmer appears determined to end the history of Labour as a mass movement. With the unions marginalised, the membership neutered, and much of the PLP itself disempowered as policy and campaigning are increasingly outsourced, the party begins to look more than ever like a fixture of the establishment. If the membership's push for greater democracy has always been a prefiguration of a wider reform of society, then Starmer's adoption of the cartel model and Evans's disciplinary regime suggest a future of rigid authoritarianism and managed democracy. In that light, the gender-critical have good reason to suppose that their binary rigidity will soon find favour.

Friday 17 September 2021

Books Do Furnish a Mind

The pop-philosopher Julian Baggini in the Financial Times asks, Why is it so hard to get rid of our books? This might appear a trivial concern in the face of a pandemic and global warming, but the wry discourse of bourgeois unease as commodities climb up our walls reflects on both. Books have long been seen as dangerous, infectious even - a medium that allows ideas to travel not just in space but in time. They are also a very visible example of the stuff that we accumulate, the clutter of a lifetime's "relentless acquisition", as Baggini puts it. Of course we don't actually find it that hard to get rid of some of our books. If we did, you'd never see any at jumble sales or on eBay. But Baggini isn't talking about dog-eared Danielle Steel paperbacks. He is addressing his library, which is of a different class to your haphazard collection of SciFi and football memoirs. Implicit in the question is the suggestion that, in a world of e-books and the epistemic prosthesis of search engines, physical books are no longer necessary for either entertainment or education, but also implicit is the idea that the book is losing its stature as a work of art in the age of digital reproduction. So perhaps a way of answering the question is to ask another: what do books represent today?

Baggini runs through the usual utilitarian and aesthetic arguments for and against keeping books but his central concern is not with their intrinsic value but the psychological impact of having to "live under the weight of so many of them". That notion of weight suggests a near-unbearable pressure, but the reality seems to be more about lazy conformity than the crushing of a fragile ego: "The suspicion has to be that for many people, the main reason to keep a house full of books is to show ourselves and others that we are intelligent and well-read. Nothing else can signal this so effectively, or socially acceptably". This is a reductive view of cultural capital that dispenses with the wider social framework (habitus, disposition etc) outlined by Pierre Bourdieu in Distinction. While taste may be socially-determined, it doesn't follow that the "main reason" people acquire books is to show off. To think this is to fall back on an antique caricature of society and culture, such as when the nouveaux riches of the nineteenth century would supposedly buy leather-bound volumes by the yard. I suspect the main reason people buy books is to read them.

That books are signalling devices is trivially true, but to imagine the only signal is intelligence is naive. Cultural goods chiefly represent class and status (this was Bourdieu's point), which is why the collected witterings of Prince Charles have a different cachet to the ghostwritten pap of the celebrity book market. The traditional reason for buying books "by the yard" (actually, buying complete libraries) was to display wealth, both in the volumes themselves and in the space (and expensive shelving) needed to display them. As the arrival of cheap paperbacks and the gradual spread of literacy in the nineteenth century democratised reading, class was increasingly performed by a preference for more expensive hardbacks and eventually the weighty coffee table book. Much of the twentieth century criticism of mass media centred on the assumption that the proles would reject literature for the shallow pleasures of radio, cinema and TV - or pornography, according to Orwell in 1984. In the event, people kept on buying and reading books.

In support of the idea that it's all for show, Baggini takes aim at a particular modern habit: "Consider also those who have rows and rows of old travel guides. These books quickly go out of date so are not being kept because they will be used again". It simply isn't true to say that travel guides go out of date quickly. They deliberately avoid recommending hotels or restaurants that aren't well established, or that look like they might not last, so you can rely on a 10-year-old guide being at least 90% accurate, while monuments and museums hardly change over decades. Ironically, travel guides often become more valuable over the long haul as when change does happen they offer a means of time travel, not simply the revisiting of our own memories - no one will have experienced every site or establishment mentioned on every page - but the recreation of a historic environment that can be mentally traversed like an open simulation online. The humble travel guide is more than a souvenir, like the sticker on an old steamer trunk or the sewn patch on a rucksack, and people hang on to them for utilitarian as well as sentimental reasons.

Baggini returns to the theme of identity: "We use books to underline our identities when more often than not they undermine them. Most old books are memento mori for distant selves, since the person who read them no longer exists." That "underline" is another value-laden term, which sets up the pay-off of "undermine" (underpin would have been more accurate than underline, but the rhyme demands otherwise). The implication is that we are trapped in our past and can't move on, which sounds more like the observation of a psychoanalyst than a philosopher (though there is perhaps a hint of existentialism). There is also an obvious contradiction in Baggini insisting on the one hand that we must shed our old books to be true to our contemporary selves while on the other hand recommending that we retain a conventional canon - "Now when I look I see only books that are classics" - which is surely more pretentious than keeping hold of all those old Viz annuals. A better question than why is it so hard to get rid of our books is why do we hang on to some in particular? To talk of "classics" is to avoid the personal, which sits oddly with Baggini's claim that books are a mere projection of the ego.

There are two further problems with the idea of a purge. The first is that books may be literal memento mori, inherited from dead parents or others. In the case of a bookish person, they may be the only significant objects they leave behind. Baggini baldly states "we don’t keep old clothing that no longer fits, or beautiful pots and crockery that is unusable" - but in fact we do keep such things, if they connect us to the dead or to our own achievements (that old football shirt, that wedding dress). The second is that this view requires Baggini to ignore that books also project meaning into the future. Many of the books we acquire remain unread for months or even years. This isn't necessarily pretension or the consequence of retail therapy. They are an investment in a future self, an act of faith in self-development. In some cases, particularly non-fiction, the book is an insurance policy against potential future reference. This essentially academic approach to building a collection has itself become a post-Wikipedia conceit: the antilibrary (supposedly inspired by Umberto Eco) in which your collection of books is an expression of your ignorance rather than your knowledge. 


But this is perhaps too systematic an approach, just as weeding your library of old novels or out-of-date travel guides sounds like the literary equivalent of dry January. Indeed, is it even reasonable to refer to the clumps of books dotted around our homes as a library that needs such careful curation? In Baggini's telling it begins to sound like an estate and you wonder if culling his collection of books is a metaphorical hint to his potential heirs not to get their hopes up. Our politics is suffused with inheritance and intergenerational friction, like a Balzac novel. Is the urge to prune our libraries displaced guilt over a property windfall, or maybe just a general desire to erase the past? But our books define our future as well. As a projection into that future, fiction is often closer to a promise than the speculative contingency of non-fiction. We imagine that we will be better able to appreciate Anna Karenina in our 40s or The Old Man and the Sea in our 60s, so we hang on to them expectantly, even if we bought and first read them in our 20s. Proust's A la Recherche du Temps Perdu is rarely bought and devoured in one go, like a Sci-Fi or fantasy series, and not just because of its length. It is a commitment to the slow unfolding of future time as much as a languid recapture of the past. Likewise, Joyce's Ulysses must hold the record as the book most often started but not finished, largely because it is approached as a single narrative rather than a collection of experimental novellas that can be read in any order.

The books we hold on to are more likely to be works that defeated us than favourites. They remain a challenge. If you're lending a book to a friend, it's probably one you read and enjoyed rather than one you gave up halfway through in frustration. In recommending that we lose the books that we never completed, or even started, Baggini is advocating an honesty about our own limits, an admission that our investment in a future self will not pay off: "To get rid of these books requires confronting some uncomfortable truths. It is to admit failure. To concede that our aspiration to become more widely read, more knowledgeable, more well-rounded, has not come to fruition. Worse, it never will". But, again, this ignores that some books represent appointments with ourselves at a time when we think we'll be ready to appreciate them. You don't cancel appointments because they are far in the future. This lifetime perspective is actually a traditional view, embodied in books of consolation in the face of grief or death. Indeed, it was quite normal for gifts of such books in the past to be made in the expectation that they wouldn't be read for many years.

At the heart of Baggini's argument is a very old philosophical stance, dating back to the impermanence of Heraclitus and the Stoic rejection of vanity: "Letting go of such books is as important as accepting that a wonderful holiday, concert or meal has come to an end. The right way to take pleasure is in recognition of its transience. Even knowledge must also be allowed to pass. Holding on to books seems to be a denial that what enters our heads is also destined to exit them". But behind this lurk some very modern attitudes. The idea that you should cull your books, retaining only those of highest value, suggests a categorical division between commodities and assets. Baggini happily gets rid of books signed by the authors: "This does require tearing out the front page with the signed dedication, which at first felt like a sacrilegious desecration. But I’ve come to see it differently: the fact that I want to keep that page even when I’ve decided the book should move on honours the author rather than insults them". A more cynical view is that the signature is an asset whose value may well appreciate while the rest of the book was merely a commodity that will probably depreciate. Binning books, or ripping them apart, is surely more revealing of the ego than simply letting them gather dust on the shelves.

Friday 10 September 2021

The New Puritans

Anne Applebaum has a new essay in The Atlantic entitled The New Puritans. If that seems a little opaque, the sub-title is only slightly clearer: Mob Justice Is Trampling Democratic Discourse. The focus of her concern is not the Trumpist assault on the Capitol in January, nor the recent Supreme Court ruling to allow a Texas law that places bounties on abortion providers. Rather it is a heterogeneous group of "people who have lost everything—jobs, money, friends, colleagues—after violating no laws, and sometimes no workplace rules either. Instead, they have broken (or are accused of having broken) social codes having to do with race, sex, personal behavior, or even acceptable humor, which may not have existed five years ago or maybe five months ago". On Twitter she emphasised that this is not about "cancel culture" (whose alleged victims rarely suffer any loss and often seem to gain in media prominence) or "wokeism" (which is about the critique of institutions more than the abuse of institutional power). Rather she is highlighting "the kind of self-censorship and intellectual timidity we know from authoritarian societies" and the way that this has been fuelled by social media such that "the values of that online sphere have come to dominate many American cultural institutions: universities, newspapers, foundations, museums".

Much of her understanding of self-censorship stems from her career as a journalist and historian of the Cold War: "the political conformism of the early Communist period was the result not of violence or direct state coercion, but rather of intense peer pressure. Even without a clear risk to their life, people felt obliged—not just for the sake of their career but for their children, their friends, their spouse—to repeat slogans that they didn’t believe, or to perform acts of public obeisance to a political party they privately scorned". (Some of this will have been coloured by her own disillusionment with right-liberals in Poland and Hungary embracing authoritarianism, as outlined in her recent book, Twilight of Democracy). Via a brief interlude in Turkey (the chilling effect of lèse-majesté under Erdogan on writers), she pivots to contemporary America: "But fear of the internet mob, the office mob, or the peer-group mob is producing some similar outcomes. How many American manuscripts now remain in desk drawers—or unwritten altogether—because their authors fear a similarly arbitrary judgment?" Manuscripts have remained in desk drawers throughout history because of the fear of social judgement, particularly those that dealt frankly with sexuality or challenged religious or political orthodoxy. There is little reason to believe that this has become more pronounced because of social media, and plenty of evidence to suggest that the Internet has allowed many more to be published.

Applebaum's essay begins with an allegory in the form of Nathaniel Hawthorne's The Scarlet Letter, which explains the reference to Puritans. But the parallel doesn't really work. The novel's plot may turn on social ostracism but what it's actually about is how sin and its attendant guilt physically manifest themselves, not only in the socially-imposed form of the scarlet "A" sewn onto Hester Prynne's clothing but in the expression of shame through the bodily stigma of the father of her child, the otherwise upstanding Puritan minister Arthur Dimmesdale. It is not about false witness or the absence of due process. Prynne is a saintly, almost Christ-like character, which makes her both sympathetic and utterly unrealistic, but she and her sometime lover are also undoubtedly sinners, both in the eyes of their society and of themselves. Some of Applebaum's examples of supposed mob justice in the present involve individuals whose behaviour has undoubtedly been questionable ("Some have made egregious errors of judgment"), but that is where the parallel with the otherwise impeccable Prynne ends. She has sinned but both atones for that sin and becomes a larger spiritual person than her peers as a result. For Applebaum's modern victims the punishment is not only unjust or disproportionate, it diminishes them as persons.

What is odd about this is why Applebaum should choose to make her point through The Scarlet Letter, an American literary classic but hardly a common reference today. A more famous treatment of the theme of ostracism in Puritan Massachusetts, and one that very much addresses false witness and the corruption of due process, is Arthur Miller's The Crucible. The characters are neither simply saints nor sinners, their motives are messy and compromised, and there is a sense of an institutional interest, beyond fussy legalism, that creates an implacable process that ultimately devours everyone. I suspect the reason why Applebaum doesn't use this literary parallel is because it was an overt allegory of McCarthyism and the comparison between the 1950s and today would not be flattering to her argument. That mid-century moral panic saw hundreds of people imprisoned in the US and thousands fired or blacklisted. Today's victims have suffered loss of jobs in a handful of cases and professional embarrassment in a few more, but there's no evidence that this is out of line with historic norms. People are often sacked for presumed misconduct or cold-shouldered for breaches of etiquette. It's also difficult to know where precisely to draw the line that bounds this group. While Applebaum references Ian Buruma, who left the New York Review of Books as editor after publishing an essay deemed dismissive of #MeToo, there is no reference to Steven Salaita, who was denied tenure by the University of Illinois for pro-Palestinian comments that donors objected to.


The determination to avoid any reference to anticommunist hysteria, notwithstanding her musings on the social psychology of Cold War Eastern Europe, produces a massive elision in Applebaum's essay, namely the twentieth century: "In the 19th century, Nathaniel Hawthorne’s novel argued for the replacement of exactly that kind of [Puritan] rigidity with a worldview that valued ambiguity, nuance, tolerance of difference—the liberal worldview—and that would forgive Hester Prynne for her mistakes. The liberal philosopher John Stuart Mill, writing at about the same time as Hawthorne, made a similar argument. Much of his most famous book, On Liberty, is dedicated not to governmental restraints on human liberty but to the threat posed by social conformism, by 'the demand that all other people shall resemble ourselves.' Alexis de Tocqueville wrote about this problem, too. It was a serious challenge in 19th-century America, and is again in the 21st century." Surely the totalitarian demand for uniformity of opinion was at its peak in the middle years of the twentieth century? (Perhaps this period is omitted because its apparatus of repression was obviously intolerant of organic mobs.) Ironically, one could argue that the era of liberal triumphalism, from 1989 to 2003, was a perfect example of "the demand that all other people shall resemble ourselves".

Applebaum's thumbnail history involves another significant elision: "By contrast, the modern online public sphere, a place of rapid conclusions, rigid ideological prisms, and arguments of 280 characters, favors neither nuance nor ambiguity. Yet the values of that online sphere have come to dominate many American cultural institutions: universities, newspapers, foundations, museums". How could the values of social media (whatever they might be) have come to dominate cultural institutions in such a short space of time? How could the mob have come to exercise such sway over institutions historically dedicated, among other things, to resisting the mob? Applebaum offers no explanation of the dynamic; instead she doubles down on the image of totalitarian repression: "Heeding public demands for rapid retribution, they sometimes impose the equivalent of lifetime scarlet letters on people who have not been accused of anything remotely resembling a crime. Instead of courts, they use secretive bureaucracies. Instead of hearing evidence and witnesses, they make judgments behind closed doors." This is all too reminiscent of postwar Eastern Europe, but the equivalence of a university with the Stasi is far-fetched, despite the conservative demonisation of the former as the enemy of liberty and free speech (a feature of the UK as much as the US).

What has changed in cultural institutions over the last two decades is less the advance of illiberal values promoted by social media and more the advance of neoliberal practice. This is where "secretive bureaucracies" and "judgments behind closed doors" are to be found (think of the IMF or ISDS). There is also the management of reputational risk. This is evident in the fact that while social media may get the ball rolling in terms of publicising specific cases and building support for protest ("calling out"), it is often only when those cases reach traditional media that they become causes célèbres and prompt institutional action. This is something that has long been understood by the left, who have often found themselves crying in the wilderness about bad (even criminal) behaviour that the establishment has chosen to tolerate or simply ignore. If social media really were so powerful, George W Bush and Tony Blair would have been tried for war crimes long ago, and neither has been ostracised despite popular revulsion (Bush may have retired but Blair is as busy and prominent as ever). Applebaum makes her case as a defence of democratic principles, but this misses the irony that Plato, the original critic of the mob and its enabling demagogues and cowardly institutions, was actually criticising Athenian democracy, and that ostracism was one of its notable characteristics.

Cold War Liberalism is conventionally traced to George Kennan's 1946 telegram from Moscow advocating containment and the subsequent emergence of the more robustly confrontational Truman Doctrine, but this narrative ignores the role of McCarthyism, which could only have succeeded with bipartisan support. It was liberals as much as conservatives who enabled and promoted it. The eventual turn against it was essentially a matter of tone ("Have you no sense of decency, sir?") as much as the rulings of the Warren Court, reflecting the full absorption of the anti-communist mindset into establishment thinking. I think Applebaum traces the weakness and decadence embodied in her vision of the New Puritanism to victory in the Cold War and the dissipation of that mindset. In this telling, believing itself unthreatened, Western society has indulged the moral selfishness of social media and the know-nothing politics of populism. The attempt to substitute the War on Terror for the Cold War has failed. The ignominious retreat from Afghanistan and the end of liberal interventionism are the inevitable corollaries of this empowerment of the mob. I suspect she'll only be happy once Russia has become a credible threat once more. The problem of contemporary liberalism is not new Puritans but old warmongers.

Friday 3 September 2021

The Project 2.0

The state mimics the practices of business management to display ideological loyalty as much as for any practical reason. But the adoption of novel techniques often involves a time-lag. Why, if conformity is instinctive? It is not because of the innate conservatism of the Civil Service or a lack of commercial familiarity among politicians. Nor is it a weakness of democracy or a lack of market incentives. There are plenty of examples in the twentieth century, at least up to the 1980s, of the state being at the leading edge of new management techniques and technologies. Rather it reflects the growth of the mediating role of business consultancies and outsourcers. As parasitical entities, they don't originate new ideas but adopt them instrumentally once proven elsewhere. In practice, they tend to dilute and bastardise those ideas, their prime directive being not to evolve themselves but to maintain profitable cashflows from government contracts. The result is that their inhouse culture is often antiquated and their methods regressive (the emblematic case is business process outsourcing, which often involves reverting to older technology augmented by cheap manual labour). 

A good example of this time-lag is Agile, which came to prominence in the 1990s in software development, later branching out into project management and manufacturing, but took a couple of decades to reach the public sector. A further decade on, we see evidence of Agile practices spreading to the political hinterland of internal party management. Though the recent high-profile example is Labour, you can be confident that elements of this are also encroaching elsewhere. The intention of Keir Starmer and David Evans is that "Labour will work “collaboratively” in “multidisciplinary teams”, which will “adopt a product-mindset using agile ceremonies, be empowered to make decisions and encouraged to focus on rapid prototyping, deployment and iteration”." ("ceremonies" in this context means the short, structured meetings that punctuate an iteration, such as the daily stand-up, often held standing, sprint planning and the retrospective). What are we to make of a political party that runs itself as a project with a product deliverable? To understand this, we need first to understand Agile.


Software methodologies, and their related project management styles, reflect the material base, which in practice means that they underwent a major change when mainframe systems were replaced by networked mini-computers and PCs in the 1990s. The 1970s and 80s were marked less by a single industry-wide methodology than by a metaphor that described a whole series of proprietary system approaches: Waterfall. In fact, the better metaphor would have been a cataract, a series of major phases, each of which wasn't started until the preceding one was completed. Those phases were: requirements, design, implementation, verification and maintenance (these were later refined with the addition of a sixth phase, analysis, before design). Though logically sequential, the obvious problem was that errors or mistaken assumptions in one phase might not come to light until later, at which point they would prove costly to address, often leading to incomplete or out of specification software that necessitated further corrective releases. Another metaphor would be a train that can only go forwards.
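To make that sequencing concrete, here is a minimal Python sketch of the cascade (the phase names follow the list above; the toy "reports" requirement and the idea of rework are invented purely for illustration, not drawn from any real project). The point it illustrates is that the first opportunity for feedback only arrives at verification, by which time every intervening phase may need redoing.

# A toy model of the Waterfall cascade: each phase runs once, in order,
# and consumes whatever the previous phase produced. A flawed requirement
# passes silently through design and implementation and is only caught
# at verification, forcing a costly trip back up the cascade.

PHASES = ["requirements", "design", "implementation", "verification", "maintenance"]

def run_waterfall(stated_requirement, actual_need):
    artefact = None
    for phase in PHASES:
        if phase == "requirements":
            artefact = stated_requirement            # the mistake enters here
        elif phase == "design":
            artefact = f"design for '{artefact}'"
        elif phase == "implementation":
            artefact = f"code implementing {artefact}"
        elif phase == "verification":
            if stated_requirement != actual_need:    # first point of feedback
                print(f"Defect found at verification: built '{stated_requirement}', "
                      f"needed '{actual_need}'. Rework all earlier phases.")
                return False
        print(f"{phase} complete")
    return True

run_waterfall("monthly reports", "weekly reports")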

This led to an appetite for an approach to software development that incorporated feedback and allowed for earlier, less costly correction of errors. As software coding became easier with the arrival of fourth generation languages (4GL) in the 1980s, and as running code was no longer a laborious process of compiling and booking precious time slots on a mainframe, an iterative, lightweight approach began to dominate in the 1990s, reflected in distinct methodologies such as RAD, Scrum, DSDM and XP. These were collectively termed Agile, emphasising their common responsiveness and flexibility. As prototyping and iterative releases improved code quality, what quickly became clear was that the chief focus of that agility was not programming but responding to changes in the user requirements. This placed a greater emphasis on close collaboration with the software sponsors and users, which in turn required a more people-centric approach to project management: more focused meetings and negotiation, fewer Gantt charts and reports. The reliance on iterative releases and testing was also congruent with Silicon Valley culture ("Fail fast, fail forward" etc) and would prove highly suitable for the development of smartphone apps.
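By contrast, an iterative loop bounds the cost of a misunderstanding to a single cycle. Below is a comparable sketch, again purely illustrative (the sprint count, the "reports" requirement and the feedback callback are all invented): each short iteration delivers a working increment, gathers feedback at a review, and allows the requirement to change before the next pass.

# A toy model of an iterative (Agile-style) loop: build a small increment,
# show it to the users, and fold their feedback into the next cycle.
# The requirement is allowed to change between iterations, so a
# misunderstanding costs one sprint rather than the whole project.

def run_iterations(initial_requirement, get_feedback, max_iterations=5):
    requirement = initial_requirement
    for sprint in range(1, max_iterations + 1):
        increment = f"working software for '{requirement}'"
        print(f"Sprint {sprint}: delivered {increment}")
        feedback = get_feedback(increment)     # review/demo: users respond
        if feedback is None:                   # users are satisfied
            print(f"Accepted after {sprint} sprint(s)")
            return
        requirement = feedback                 # the requirement changes here
    print(f"Stopped after {max_iterations} sprints without acceptance")

# Users realise on seeing the first increment that they wanted weekly reports.
responses = iter(["weekly reports", None])
run_iterations("monthly reports", lambda increment: next(responses))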

Naturally, the 1990s was the moment that the British government chose to adopt PRINCE2, a highly prescriptive Waterfall-style project management methodology that would subsequently be implicated in many of the high-profile IT failures of the New Labour years. As night follows day, a change in government in 2010 led to a desire to change the approach, with the result that Agile techniques started to appear. The prime example of this was Universal Credit, announced by Iain Duncan Smith in 2011, where the employment of Agile initially foundered on the Waterfall nature of government procurement. The project was reset in 2013 and Agile was persevered with through to the system's launch in 2018. Despite delays - predictable given the highly-contentious nature of the system and frequent changes in policy - it is now displaying genuine agility. Of course, that's agility in response to the DWP's demands, not the needs of benefit claimants who still have to suffer the system's implacable discipline.


Universal Credit marks the point at which Agile enters the mental frame of the UK political class, initially as a silver bullet for complex software development and then as a more general approach for running a project. The variety of Agile methodologies can be plotted on a spectrum that runs from what is partly a cosmetic rebranding of older approaches, such as DSDM with its hybrid Waterfall structure, to more radical departures that question wider business practices, such as eXtreme Programming (XP). This has inevitably taken on a political dimension with XP in particular being accused of having socialist tendencies due to practices such as pair-programming and principles such as collective ownership. This is ironic given that the traditional caricature of socialism, i.e. as practised in the Soviet Union, is essentially the Waterfall approach: Marx and Engels provided the requirements, Lenin the design, Stalin the implementation, Khrushchev the verification and Brezhnev the maintenance. You could argue that Gorbachev's glasnost and perestroika were proto-Agile practices.

A more subtle understanding of the sociology of software development is that the demand for Agile practices, which emerged in the US and UK among developers and junior project managers rather than being imposed from on high, reflects not simply a desire for greater worker autonomy but for personal distinction in a profession that, while it lacks formal credentials and is often highly exploited, is demonstrably middle class in its culture and rewards (e.g. the startup promise of equity). As Michael Eby describes it, Agile reflects "a relatively affluent subset of the younger working population, disaffected with the heteronomy of work, and emphasizing affective-existential themes like boredom, dehumanization, inauthenticity and meaninglessness". In support of this interpretation, it's worth noting that offshore software development teams in developing nations like India still tend to follow a hierarchical Waterfall approach, even if working to the designs of product managers and software architects in the West who fully practise Agile. What holds this together across timezones is the project.

Many of the practices and concepts outlined by David Evans in his reformation of the Labour Party machine, such as multi-disciplinary teams and core functions, long predate Agile, going back to the mid-to-late twentieth century tenets of process engineering and total quality management. The common theme is the project. By now you'll have recalled that "the project" was the catch-all term used to describe New Labour. Whereas Margaret Thatcher would ask "Is he one of us?", Tony Blair's question was "Is she onboard?" (project onboarding is actually a thing). Much of the managerialism that we associate with New Labour was based on the Waterfall model, such as the key role of lobbyists and think-tanks in setting the parameters of design (i.e. requirements and analysis), the importance placed on testing against quantitative targets rather than qualitative outcomes (i.e. verification), and the separation of responsibility for implementation, operations and maintenance (i.e. outsourcing). 


In effect, we're looking at the same project mentality, only now in the raiments of Agile practice. However, it's clear that Labour under Starmer and Evans are chiefly interested in the more mechanistic practices, to the point where some become ossified in jargon like "ceremonies". What's missing is any reference to those humane values and principles that (however self-indulgently) characterise Agile, and particularly the supposedly more leftwing methodologies like XP, such as feedback, respect and courage. This hasn't sprung out of nowhere. You don't need the full Forde Report to recognise that the right of the Labour Party has been running a project since 2016. I would describe it as Agile insofar as it adopted an iterative approach towards the challenge of unseating Jeremy Corbyn - the chicken coup, the antisemitism flap, the 2nd referendum debate - but I don't see much evidence beyond that. Policy development has been weak (i.e. few deliverables), vision has been notable by its absence (no system metaphor), and the party's campaigning, both in by-elections and more generally, has been uninspiring when not catastrophic (repeatedly failing unit tests).

In reality this is a top-down, Waterfall project that seeks to reengineer the party to the point where the left can never secure the leadership again. Everything else appears to be subservient to that one key deliverable. This will be a multi-phase project that will likely condemn the party to opposition for a decade, but as the chatter about a "Kinnock moment" suggests, that is considered a price worth paying. The absence of many Agile principles and values is telling, but the meaning of the jargon promoted by Evans is also pretty transparent: the "hub and spoke" model will centralise authority and emasculate the regions; "serving the needs of voters first" means further disempowering CLPs and relying on focus groups; a "product mindset" and "rapid prototyping" mean that policy will again be subservient to the fickle opinion of the media; while the emphasis on "collaboration" is clearly intended to prepare the ground for more consultants unaccountable beyond the leader's office, like the recently-recruited Philip Collins. It's the same old project that the party right have been pursuing, in one form or another, since the 1950s. It remains overdue and over budget.