One way of looking at the current political landscape is in terms of the broad division between progressive and conservative forces, which we can call "left" and "right" for shorthand. If we classify the Liberal Democrats (and their previous incarnations in the SDP and old Liberal party), along with the SNP, Plaid Cymru, Sinn Fein, SDLP and the Greens, as on the progressive wing (I know, but bear with me), then the left has enjoyed a majority of the popular vote in general elections since the mid-60s. Even before this, the right rarely got over 50% during the twentieth century, the exceptions being the Salisbury administration of 1900 and the National governments of 1931 and 1935 (which depended on Labour and Liberal defectors). This run came to an end in 2015 when the right enjoyed a majority through the combination of the Tories, UKIP and the two Northern Irish unionist parties. The decisive shift was not Labour voters attracted to the Kippers but LibDem voters who deserted to the Tories after 5 years of coalition government.
That desertion produced a Tory majority which in turn obliged David Cameron to cede a referendum on the EU to placate his Europhobic right wing. Though Cameron's decision has been cast as a catastrophic error by some Conservatives, it may paradoxically have created the conditions for electoral dominance by the Tories for the foreseeable future, though I doubt Theresa May will be thanking him personally. The original error was the decision of Nick Clegg to enter into coalition with the Tories in 2010, both because it led indirectly to Brexit and because it revealed that many LibDem voters are actually more inclined to a conservative than a progressive position. This means that the crude left-right distinction outlined above is somewhat misleading; however, it also reflects the fact that voting is often a matter of virtue-signalling, with some small-c conservatives choosing to identify as mild progressives (or even pose as radicals by voting Green) essentially for reasons of self-esteem. Likewise, communitarians who align with Labour on tax and public services may vote Conservative for reasons of "stability".
This realignment of the electorate has been strengthened by May's decision to pursue a hard Brexit. Though the LibDems and their media supporters have claimed that the party is on its way back after its victory in the Richmond by-election, it remains stuck at around 10% in the polls, which is half the figure it was consistently achieving up to 2010. This is despite its opposition to the Article 50 bill and the fact that Labour isn't competing to be the HQ of the remainer irreconcilables. Barely a fifth of the 48% constituency that voted remain seem inclined to throw in their lot with Tim Farron & co. To put it another way, half of the traditional LibDem vote appears to have been lost to the right - mainly to the Conservatives but some to UKIP - or to abstention. The inevitable decline of UKIP post-referendum may see some voters shifting back to the LibDems, but it is hard to believe this will be a significant number. Most Kippers are likely to switch to the Tories or give up voting, with the real hardcore nutters splintering into various far-right groupuscules.
What centrist (and some left) pundits appear unwilling to countenance is that the 48% does not constitute an election-winning progressive base. A significant number of remainers were Conservative voters who, like their MPs, now appear to be either reconciled to the inescapability of a hard Brexit or are hoping for a fudge to limit the damage. Either way, there is no evidence to suggest they could be won over to a wider progressive platform, just as it was always naïve in the past to imagine that Tory "wets" were somehow less than Tory. True-blue remainers aren't going to vote Labour, come what may. For this reason, the belief that a Labour party sans Corbyn could somehow stop Brexit in its tracks is for the birds. Labour might well improve in the polls with a new leader and a more supportive centrist media, but this would largely represent the return of disillusioned supporters (mostly abstainers rather than deserters to UKIP) and would probably do no more than restore the party to the 35% it achieved under Ed Miliband. That's necessary but it isn't sufficient.
The bottom line is that the shift of 10% of the electorate away from the LibDems (and mostly to the Conservatives) after 2010 is likely to be the dominant factor in shaping politics for the remainder of the decade and quite possibly beyond. Labour must hope that most UKIP supporters lapse into abstention rather than commit to the Tories, and to that end the by-election victory in Stoke may prove to be historically pivotal by accelerating UKIP's decay. This is more likely than the idea that the remaining progressive half of the LibDem vote can be won over by a shift to the centre, which would require the Liberal Democrat party to be pretty much squeezed out of existence. The LibDems may be able to tempt back those voters that deserted them after 2010 for the Tories in future by-elections and local council elections, but probably not in general elections. Equally, the remaining 10% of LibDem voters are probably pretty hardcore, so the "swayables" that Labour might poach may amount to little more than 2 or 3%. That's not insignificant, but with the 35% strategy dead and the new target for government probably around 40%, Labour will need more than ex-LibDems votes to win a majority.
Like it or not, the Labour Party has to win over soft Conservative voters, which means a combination of those 2010 vintage ex-LibDems and more traditional swing voters who can be appealed to on pragmatic grounds. This doesn't mean a return to centrism and "Worcester woman" but a pitch that paints the Tories as irresponsible gamblers and incompetents who cannot be trusted to look after the interests of the people. In other words, an appeal to self-esteem and stability. This means a conscious revival of social democratic policies that address voter concerns over insecurity (i.e. against Blairite neoliberalism as much as Tory neoliberalism), over social cohesion (which means shifting the debate from immigration to inequality), and over economic regeneration (which means investment over austerity). What it doesn't mean is the distraction of Blue Labour, with all its overtones of sectarianism, which would allow the Tories to shift the political debate towards patriotism and the evils of multiculturalism and political correctness. Labour needs to revive an inclusive British identity and a positive internationalism in contrast to the Tories' increasingly isolated English chauvinism.
Ironic though it may sound after the years of Blairite managerialism, Labour needs to define itself as a party capable of better managing the state in the interests of everybody while highlighting the Tory government's preferential treatment of vested interests as it negotiates with the EU27 (the City, global capital, the rich). This doesn't mean being cautious, because Brexit will call for radical measures and new ideas. Nor does it mean being nostalgic: the "spirit of '45" can provide some mood music, but "socialism in one country" isn't feasible any longer. It probably does mean a new leader simply because Corbyn cannot project sufficient managerial competence. Labour needs to present a "people's Brexit" in opposition to whatever hot mess the Tories produce. Such a strategy also stands a better chance of rebuilding Labour in Scotland, which is vital to achieving a governing majority. The Scots are unlikely to double-down on risk and back independence in the event of a hard Brexit, unless the May government is mad enough to push them into a corner, which could allow Labour to prosper by positioning itself between the "fundamentalisms" of the SNP and the Tories.
The obvious risk is that Theresa May might achieve sufficient compromises with the EU27 to soften Brexit, so narrowing the ground between the Tory and Labour positions. This strikes me as unlikely both because the Tory right will limit her room for manoeuvre and because the compromises will likely reflect unpopular preferences - e.g. for the City. There is also little in the Prime Minister's history to suggest she has the personal charm or cunning to sway the EU Council of Ministers. The flip-side of this is that she might so alienate the EU27 that the potential for Brexit to be moderated after 2020 by an incoming Labour government would disappear. Indeed, it might be in her interest to burn all the bridges, insisting that making a success of a Tory-designed Brexit was the only option, much as she has sought to close off alternative options since she ran for party leader. To mitigate this, Labour needs a credible alternative Brexit that can command majority support. The final irony of Brexit is that it may return us to a political duopoly and the decisive role of the swing voter.
Monday, 27 February 2017
Friday, 24 February 2017
Curious Beasts
By-elections are curious beasts in that they rarely tell us much about the composition of future governments or even party ideologies. The famous centrist victories, such as the Liberals at Orpington in 1962 and the SDP at Crosby in 1981, were not harbingers of a fundamental shift in the political landscape, while those by-elections supposedly fought over the "soul" of a party (invariably Labour) tend to look absurd with the benefit of hindsight (e.g. the "gay/straight" contest in Bermondsey in 1983 that resulted in the election of the bisexual Simon Hughes). Much of the oddity of by-elections is due to the determination of the media to force a local vote into the straitjacket of a national narrative, but some of it is down to variations in turnout that can skew the result, hence the tendency of some seats that register a "shock result" to revert to their historic norm come the next general election. By-elections are therefore often unrepresentative, but this doesn't mean that they aren't meaningful. To understand them we need to step back and view them in a broader context, rather than treating them simply as a poll on the popularity of party leaders.
The historic significance of the Copeland and Stoke Central by-elections is that they occurred in the period between the passage of the Article 50 bill and the start of Brexit negotiations with the EU27. As such, they may well represent the high-water mark of Conservative popularity and the low-water mark of Labour's. The inescapable complexity of the coming deals, and the inevitability of compromise and choices over priorities, will leave the government vulnerable to substantive criticism as well as leaver disillusion. Labour's decision to vote through the Article 50 bill attracted liberal condemnation, but it means that its future attacks cannot be dismissed by government as the carping of unrepentant remainers, though no doubt the Tory press will try and make the charge stick (and the liberal press will flip from obsessing over Labour's leave-voting constituencies to its remain voters). The next two years will be a contest between versions of Brexit. Though it should set out some general principles now, e.g. in respect of EU citizen rights and worker protection, it makes sense for Labour to build its position incrementally on the back of popular disquiet at Tory mis-steps and emerging risks rather than through a composite motion at conference that offers hostages to fortune. We should never underestimate the Tories' capacity for catastrophic error.
Though much of today's media is focused on the short-term drama of the by-election results, notably in respect of the unpopularity of Jeremy Corbyn (hence Copeland is now centre-stage while former favourite Stoke is being ushered into the wings), the results are consistent with long-term trends. Turnout was down, though that is normal for by-elections. Copeland's drop, from 64% in 2015 to 51% this week, is pretty much what you'd expect, particularly given the bad weather. More troubling is the fall in Stoke Central from 50% to 38%. The disengagement of almost two-thirds of the electorate, despite the high profile of the contest and the insistence on its significance by all parties in the run-up to polling day, does not bode well for democracy. In terms of party share, Labour may have held Copeland (and formerly Whitehaven) for the best part of a century, but its vote has been in steady decline since 1997 when Jack Cunningham got 58%. Jamie Reed's three elections between 2005 and 2015 saw shares of 51%, 46% and 42% in that order. Gillian Troughton's 37% yesterday clearly owed something to the combination of Corbyn and Sellafield, but it is otherwise on-trend.
A similar trend is visible in Stoke, with Labour's vote falling from a high of 66% in 1997 down to 39% in 2015 and then 37% this week. The fact that 37% was enough to win Stoke but lose Copeland is explained by the more even split of the vote to the right of Labour in the Midlands constituency. Up to 2010 this was between the Tories and Liberal Democrats. In the 2015 general election and this week's by-election UKIP supplanted the latter. In Copeland, where the third party has previously been well short of the second, the collapse in the UKIP vote, from 16% in 2015 to 7% now, closely matched the increase in the Tory share, from 36% to 44%, suggesting that the Kippers' focusing of their resources on the Potteries was a key factor in the Conservative victory in Cumbria. This, together with the fact that the Tory vote held up in Stoke (from 23% to 24%) and thus denied Paul Nuttall the chance to pip Labour, is a validation of Theresa May's apparent strategy to colonise UKIP's territory by promising a hard Brexit.
The Conservative leader's problem going forward is that the reality of Brexit will alienate some hardcore leavers, who will be disappointed by the necessary compromises, but also some softcore leavers who may not yet appreciate a lot of the negative consequences. Though some of those hardcore voters will gravitate (back) to the Kippers, UKIP is clearly in long-term decline. Paul Nuttall's strategy of challenging Labour in the North doesn't look any more credible today than it did in 2015, despite the extensive and persistent media support for the idea. Increasing its share in Stoke from 23% to 25% is a pretty poor return for all its efforts, not to mention all those Guardian and Newsnight vox-pops. If nothing else, the results this week should indicate that UKIP remains essentially a Tory ginger group, which will perhaps embolden the likes of Suzanne Evans to challenge the discredited and ridiculous Nuttall (or maybe Farage will stage another comeback now that he doesn't seem minded to spend more time with his family).
Assuming Corbyn stays put - and why wouldn't he? - we can expect the liberal media to once more start favouring the Liberal Democrats as the "Labour buster" of British politics, with perhaps a side order of fresh plaudits for the SNP as "sensible remainers". It will be quite like old times and may even lead to a falling off in the number of articles painting the working class as incorrigible bigots and ignorant fools, though I wouldn't hold my breath on that. Theresa May will no doubt be eulogised as the mistress of all she surveys, though this should remind us that pride goes before a fall. Unless more Labour MPs desert (which is not impossible), the next by-election through natural attrition will probably occur in the middle of the Brexit negotiations. By then, Blair and Mandelson will have positioned themselves as more pro-EU than official Labour, which means that they will become increasingly dogmatic and strident. This presents the ironic possibility that the more traditional Labour right might even start to see merits in Corbyn's approach.
Wednesday, 22 February 2017
Ignorance and Democracy
One thing we can already say in advance of this week's two by-elections is that the structural decline of democracy was not arrested by the jolt of last June's referendum. Popular sovereignty was quickly absorbed by the Tories as they jerry-built a soft plebiscitary dictatorship in which the unilateralism and secrecy of Brexit looks set to infect all areas of government. Despite Tony Blair's bizarre call for an uprising, the fragmented centre has committed to a self-indulgent martyrdom, ironically proving its members to be all heart and no head. Meanwhile, Corbyn's attempt to triangulate between the popular will and the interests of Labour's electoral base has been condemned by those for whom triangulation was once a supreme virtue. As I noted back in June, the response to the vote was heavily conditioned by a theory of democracy, dating back to Plato, that focuses on two failings: the people's lack of expertise and their inability to discern the good.
The latter has proved the weaker critique, reflecting both the subjectivity of liberal modernity (we're told we can each pursue our own good) and the impossibility of getting the mass of voters to recognise themselves in the caricature of an irresponsible and inconsiderate demos (an ironic product of the creation of a demonised underclass as the real "enemy within"). It is the critique of ignorance, and the concomitant defence of expertise, that has proved to have legs, not least because it has provided a theoretical framework for the modish concern with "fake news". But the real reason for this idea's success is that it has never gone out of fashion, whereas the belief that the people couldn't recognise the good was forced to take a back-seat with the advent of universal suffrage. Today's "alt-facts" are part of a long tradition in which democracy is corrupted by the base appetites of the lower orders fed by opportunistic new media and demagogues.
In the current cycle, the finger of blame has been pointed firmly at the Internet, despite the evidence that UK tabloids were more decisive in influencing the leave vote and that US cable news was more influential with Trump voters (and with Trump himself - see his recent comments on Sweden prompted by a Fox News report). The idea is to suggest a better status quo ante, and thus relative decline, though academic evidence suggests that levels of public knowledge in respect of public policy have been pretty consistent over the years - i.e. consistently low - while general levels of trust in experts remain high. This points to two great truths. First, most people take a limited interest in politics because the subject has limited relevance to their daily lives. This is why issues around health and education (and occasionally housing) have resonance when they arise. Second, social hierarchies are nowhere near as fluid as the myth of meritocracy would have it. The idea that society as a whole would suddenly lose faith in "experts", while still retaining respect for a monarchy and an unelected House of Lords, is absurd.
The claim that society has intellectually degraded because of the Internet is a commonplace among both conservatives and establishment liberals, though while liberals often emphasise malign forces (e.g. neo-Nazis gaming Google), conservatives tend to focus on the indolence of the people. Tom Nichols, writing in Foreign Affairs on "How America Lost Faith in Expertise" (a summary of his book on the subject), gives this old idea a modern, special snowflake spin: "Americans have reached a point where ignorance—at least regarding what is generally considered established knowledge in public policy—is seen as an actual virtue. To reject the advice of experts is to assert autonomy, a way for Americans to demonstrate their independence from nefarious elites—and insulate their increasingly fragile egos from ever being told they’re wrong". The culprit is clear: "Ask an expert about the death of expertise, and you will probably get a rant about the influence of the Internet. ... It has allowed people to mimic intellectual accomplishment by indulging in an illusion of expertise provided by a limitless supply of facts." At least he didn't blame postmodernism.
Nichols' disdain for the Internet reflects the professional anxiety of the credentialed academic: "I fear we are moving beyond a natural skepticism regarding expert claims to the death of the ideal of expertise itself: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laypeople, teachers and students, knowers and wonderers—in other words, between those with achievement in an area and those with none". Ouch. But as an academic he cannot avoid admitting the lack of novelty in all this: "Of course, this is no more and no less than an updated version of the basic paradox of the printing press ... Libraries, or at least their reference and academic sections, once served as a kind of first cut through the noise of the marketplace. The Internet, however, is less a library than a giant repository where anyone can dump anything. In practice, this means that a search for information will rely on algorithms usually developed by for-profit companies using opaque criteria."
He doesn't explain how this is different to the bias of traditional publishing houses or newspaper proprietors, he merely asserts that it is much worse: "The Internet is the printing press at the speed of fiber optics", which is as meaningless as his ahistoric use of "marketplace" in respect of knowledge and ideas. Nichols' trawl through history does at least identify his true focus, which is not expertise in general (despite weak asides about cognitive bias and anti-vaccine nutters) but politics: "Over a half century ago, the historian Richard Hofstadter wrote that 'the complexity of modern life has steadily whittled away the functions the ordinary citizen can intelligently and comprehendingly perform for himself ... In the original American populistic dream, the omnicompetence of the common man was fundamental and indispensable. It was believed that he could, without much special preparation, pursue the professions and run the government'". That mythical common man obviously didn't include slaves, native Americans or even white indentured labour.
Hofstadter was the author of The Paranoid Style in American Politics (published in the same year that Dr Strangelove was released), which discussed the instrumental use of conspiracy theories. Without irony, Nichols notes that "Conspiracy theories are attractive to people who have a hard time making sense of a complicated world and little patience for boring, detailed explanations", which might seem to reinforce Hofstadter's point about omnicompetence were it not for the indisputable fact that conspiracy theorists actually have huge patience for boring, detailed explanations, from fluoridation to email servers. This appetite for "theory" sits uneasily with the characterisation of the demos as ignorant and lazy, such as in Nichols' citing of a Washington Post poll on American intervention in Ukraine: "Only one in six could identify Ukraine on a map ... the respondents favored intervention in direct proportion to their ignorance. Put another way, the people who thought Ukraine was located in Latin America or Australia were the most enthusiastic about using military force there".
This is a classic party trick, like asking people to estimate the population or point due north. Most people get this wrong simply because the information isn't necessary to them in their daily lives. It doesn't mean that they are stupid or that their views should carry less weight. Nichols concludes by invoking another trope first deployed by Plato, the demos as children, and uses this both to justify technocracy and excuse it as the inevitable response to populism: "Americans (and many other Westerners) have become almost childlike in their refusal to learn enough to govern themselves or to guide the policies that affect their lives. ... In the absence of informed citizens, for example, more knowledgeable administrative and intellectual elites do in fact take over the daily direction of the state and society. ... Today, however, this situation exists by default rather than design. And populism actually reinforces this elitism. ... Faced with a public that has no idea how most things work, experts disengage, choosing to speak mostly to one another."
This argument seeks to reverse the causal relationship, suggesting that what Peter Mair, in Ruling the Void, called "the withdrawal of the elites" has been occasioned by a recent failure of the public to maintain sufficient knowledge of policy, rather than public disengagement (as measured in falling turnouts and party membership) being the result of the professionalisation of party politics. This idea of a secular decline in public competence competes in the marketplace of conservative ideas with the theory of structural disincentives: "the probability that our votes will make a difference is, for most of us in most major elections, vanishingly small. ... In short, the reason people are mostly ignorant and biased about politics is that the incentives are all wrong. Democracies make it so that no individual voters' votes (or political beliefs) make a difference. As a result, no individual is punished for being ignorant or irrational, and no individual is rewarded for becoming informed and rational. Democracies incentivizes us to be 'dumb'" (that there are no real-world political systems that incentivise everyone to be well-informed suggests an elite bias against popular knowledge).
Though this appears to condemn us all to ignorance, given that we're each subject to the same disincentives, the unspoken assumption is that a reduced, more "qualified" electorate would fix the problem. But qualified in what? While contemporary epistocrats talk about educational achievement, the more traditional Platonists advocate the return of property qualifications or votes proportionate to tax contribution. This reflects the fact that politics is not a natural science with observable laws but a social construct and therefore both contestable and malleable. The problem with the critique of ignorance is that what is considered consequential is politically determined. In other words, the people can be alienated from politics by defining it in terms that are preferential to elites. To that end, "expertise" can play a role in isolating politics rather than opening it up to general understanding. The obvious example is foreign affairs, the continuation of aristocratic governance by other means, though this often backfires when the public does take an interest - hence the "Do you even know where Ukraine is?" manoeuvre.
As a social and thus historically-situated construct, politics is also subject to structural change. Two recent examples are the impact of globalisation and neoliberal practice. The transfer of powers to Brussels may have been exaggerated by a Eurosceptic media, but it was none the less real, as was the role of privatisation in removing housing and much of economic management from public influence. The growth of "independent" regulation and technocratic management has substituted expert scrutiny for public oversight, but at the cost of regulatory capture and groupthink. While experts can legitimately complain about assertive ignorance in the face of scientific evidence, e.g. in respect of climate change or vaccination, this is more difficult to do in the realm of politics when so much of policy has been deliberately steered into areas beyond public purview. The problem is not that people are rejecting the evidence of the experts, but that the evidence is increasingly unavailable to the public.
By-elections have long been promoted as opportunities for a "protest vote". In recent years this has started to morph into the idea of by-elections as exercises in attention-seeking in which electors lash out in frustration: a cry of pain rather than a specific demand. In other words, emotion has got the better of intellect. A hilarious example of this framing was a Guardian-sponsored focus group in Stoke, made up of 10 wavering Labour voters who were subjected to the infantilising exercise of "drawing the parties as cars", which produced this conclusion: "they all agreed that a Ukip win would have an impact on a national level as it would force people to listen to the area's concerns". The rhetorical inflation from "send a message" to "force people to listen" pays lip-service to the instrumental theory of by-elections (though it isn't clear how the election of Paul Nuttall would force anything), but it transfers the electoral outcome from the realm of reason to that of emotion. This is the flip-side of the "listening to people's concerns" cant: the concerns are never interrogated and thus properly engaged with, because the people are deemed capable only of emotional spasms, not reasoned argument.
There is a perceived tension at the heart of representative democracy between the need to emotionally reflect the people and intellectually constitute the state. This is the head/heart dichotomy beloved of the self-proclaimed pragmatists and it reflects Plato's original belief that the people must be guided by their betters because they lack self-control as much as expertise. In practice, many voters are frustrated by their representatives' intellectual timidity. The dissatisfaction with the Article 50 debate and Jeremy Corbyn's election(s) are two different examples of a real appetite for policy. Likewise, many are irritated when politicians indulge in the emotionalism of the state, such as last year's "project fear" or Theresa May holding Donald Trump's hand before wittering on about a special relationship. The people are no more emotional or ignorant today than they have ever been. If they are alienated from politics, then that is the fault of politicians, not the people. I have no idea how the two by-elections will go, but my fear is that turnout may be poor, and not just because of the weather.
Wednesday, 15 February 2017
Algopops
One of the defining characteristics of the debate on the role of software in modern society is the tendency towards anthropomorphism. Despite the stories about job-stealing robots, what we apparently fear most is not machines that look vaguely like humans, with their metal arms whirling over production lines, but malicious code that exists in a realm beyond the corporeal. We speculate about the questionable motives of algorithms and worry about them going "rogue". Such language reveals a fear of disobedience as much as malevolence, which should indicate its origin in the old rhetoric of class (much like the etymology of the word "robot"). In a similar vein, the trope of the hacked kettle recycles the language of outside agitators and suborned servants. In contrast, artificial intelligence is subject to theomorphism: the idea that in becoming superior to human intelligence it becomes god-like, an event that can occur well short of the technological singularity (as Arthur C. Clarke noted, "Any sufficiently advanced technology is indistinguishable from magic").
This distinction between algorithms and AI has echoes of the "great chain of being", the traditional theory of hierarchy that has enjoyed something of a revival among the neo-reactionary elements of the alt-right, but which can also be found buried deep within the ecological movement and the wider culture of Sci-Fi and fantasy. Given that mix, there should be no surprise that the idea of hierarchy has always been central to the Californian Ideology and its (non-ironic) interpretation of "freedom". If Marxism and anarchism treat class and order as historically contingent, and therefore mutable, the defining characteristic of the party of order - and one that reveals the fundamental affinity between conservatives and liberals - is the belief that hierarchy is natural. Inheritance and competition are just different methods used to sort the array, to employ a software term, and not necessarily incompatible.
Inevitably the cry goes up that we must regulate and control algorithms for the public good, and just as inevitably we characterise the bad that algorithms can do in terms of the threat to the individual, such as discrimination arising from bias. The proposed method for regulating algorithms and AI is impeccably liberal: an independent, third-party "watchdog" (a spot of zoomorphism for you there). Amusingly, this would even contain a hierarchy of expertise: "made up of law, social science, and philosophy experts, computer eggheads, natural scientists, mathematicians, engineers, industry, NGOs, and the public". This presents a number of scale problems. Software is unusual, compared to earlier general purpose technologies such as steam power or electricity, in that what needs regulation is not its fundamental properties but its specific applications. Regulating the water supply means ensuring the water is potable - it doesn't mean checking that it's as effective in cleaning dishes as in diluting lemon squash. When we talk about regulating algorithms we are proposing to review the purpose and operation of a distinct program, not the integrity of its programming language.
In popular use, the term "algorithm" is a synecdoche for the totality of software and data. An application is made up of multiple, inter-dependent algorithms and its consequential behaviour may be determined by data more than code. To isolate and examine the relevant logic a regulator would need an understanding of the program on a par with its programmers. If that sounds a big ask, consider how a regulator would deal with an AI system that "learns" new rules from its data, particularly if those rules are dynamic and thus evanescent. This is not to suggest that software might become "inscrutable", which is just another anthropomorphic trope on the way to the theomorphism of a distracted god, but that understanding its logic may be prohibitively expensive. Perhaps we could automate this to a degree, but that would present a fresh problem of domain knowledge. Software bias isn't about incorrect maths but encoded assumptions that reflect norms and values. This can only be properly judged by humans, but would a regulator have the broad range of expertise necessary to evaluate the logic of all applications?
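The point that behaviour may be determined by data more than code can be made concrete with a toy sketch (the loan-approval rule and both data histories are invented for illustration): the decision logic below is identical in both runs, yet the outcome flips because the operative threshold is learned from the data, which is what a regulator would actually need to scrutinise.

```python
# Identical code, different data: the "algorithm" a regulator cares
# about is the learned rule, not the source text.

def learn_threshold(approved_incomes):
    """Derive an approval cut-off from historical data."""
    return sum(approved_incomes) / len(approved_incomes)

def decide(income, threshold):
    """The fixed, inspectable part of the program."""
    return income >= threshold

# Two histories produce, in effect, two different programs.
lenient_history = [15_000, 20_000, 25_000]   # learned cut-off: 20,000
strict_history = [40_000, 50_000, 60_000]    # learned cut-off: 50,000

applicant_income = 30_000
print(decide(applicant_income, learn_threshold(lenient_history)))  # True
print(decide(applicant_income, learn_threshold(strict_history)))   # False
```

Reviewing the `decide` function alone would tell an auditor almost nothing; the consequential behaviour lives in the data-derived threshold.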
Initially, a regulator would probably respond to individual complaints after the fact, but history suggests that the regime would evolve towards up-front testing, at least within specific industries. The impetus for standards and regulation is typically a joint effort by the state, seeking to protect consumers, and capital, seeking to protect its investment. While the former is dominant to begin with, the latter becomes more dominant over time as the major firms seek to cement their incumbency through regulatory capture and as their investors push for certification and thus indemnities in advance. You'd need a very large regulator (or lots of them) to review all software up-front, and this is amplified by the need to regression test every subsequent software update to ensure new biases haven't crept in. While this isn't inconceivable (if the robots take all the routine jobs, being a software investigator may become a major career choice - a bit like Blade Runner but without the guns), it would represent the largest regulatory development in history.
An alternative approach would be to leverage software engineering itself. While not all software employs strict modularisation or test-driven development, these practices are prevalent enough to expect most programs to come with a comprehensive set of unit tests. If properly constructed (and this can be standardised), the tests should reveal enough about the assumptions encoded within the program logic (the what and why), while not exposing the programming itself (the how), to allow for meaningful review and even direct interrogation using heterogeneous data (i.e. other than the test data employed by the programmers). Unit tests are sufficiently black box-like to prevent reverse engineering and their architecture allows the test suite to be extended. What this means is that the role of regulation could be limited to ensuring that all applications publish standard unit tests within an open framework (i.e. one that could be interfaced with publicly) and perhaps ensuring that certain common tests (e.g. for race, gender or age discrimination) are included by default.
The responsibility for independently running tests, and for developing extended tests to address particular concerns, could then be crowdsourced. Given the complexity of modern software applications, let alone the prospect of full-blown artificial general intelligence systems, it might seem improbable that an "amateur" approach would be effective, but that is to ignore three salient points. First, the vast majority of software flaws are the product of poor development practice (i.e. inadequate testing), the indifference of manufacturers to preventing vulnerabilities (those hackable kettles), and sheer incompetence in systems management (e.g. TalkTalk). Passing these through the filter of moderately talented teenagers would weed out most of them. Second, pressure groups with particular concerns could easily develop standard tests that could be applied across multiple applications - for example, checking for gender bias. Third, quality assurance in software development already (notoriously) depends on user testing in the form of feedback on bugs in public releases. Publication of unit tests allows that to be upgraded from a reactive to a proactive process.
Interestingly, the crowdsource approach is already being advocated for fact-checking. While traditional media make a fetish of militant truth and insist on their role as a supervisor of propriety (technically a censor, but you can understand why they avoid that term), some new media organisations are already down with the idea of active public invigilation rather than just passive applause for the gatekeeper. For example, Mr Wikipedia, Jimmy Wales, reckons "We need people from across the political spectrum to help identify bogus websites and point out fake news. New systems must be developed to empower individuals and communities – whether as volunteers, paid staff or both. To tap into this power, we need openness ... If there is any kryptonite to false information, it’s transparency. Technology platforms can choose to expose more information about the content people are seeing, and why they’re seeing it." In other words, power to the people. Of course, near-monopoly Internet platforms, like global media companies, have a vested interest in limiting transparency and avoiding responsibility: the problem of absentee ownership.
The idea that we would do better to rely on many eyes is hardly new, and nor is the belief that collaboration is unavoidable in the face of complexity. As the philosopher Daniel Dennett put it recently, "More and more, the unit of comprehension is going to be group comprehension, where you simply have to rely on a team of others because you can’t understand it all yourself. There was a time, oh, I would say as recently as, certainly as the 18th century, when really smart people could aspire to having a fairly good understanding of just about everything". Dennett's recycling of the myth of "the last man who knew everything" (a reflection of the narrowness of elite experience in the era of Diderot's Encyclopedie) hints at the underlying distaste for the diffusion of knowledge and power beyond "really smart people" that also informs the anxiety over fake news and the long-running caricature of postmodernism as an assault on truth. While this position is being eroded in the media under the pressure of events, it remains firmly embedded in the discourse around the social control of software. We don't need AI watchdogs, we need popular sovereignty over algorithms.
This distinction between algorithms and AI has echoes of the "great chain of being", the traditional theory of hierarchy that has enjoyed something of a revival among the neo-reactionary elements of the alt-right, but which can also be found buried deep within the ecological movement and the wider culture of Sci-Fi and fantasy. Given that mix, there should be no surprise that the idea of hierarchy has always been central to the Californian Ideology and its (non-ironic) interpretation of "freedom". If Marxism and anarchism treat class and order as historically contingent, and therefore mutable, the defining characteristic of the party of order - and one that reveals the fundamental affinity between conservatives and liberals - is the belief that hierarchy is natural. Inheritance and competition are just different methods used to sort the array, to employ a software term, and not necessarily incompatible.
Inevitably the cry goes up that we must regulate and control algorithms for the public good, and just as inevitably we characterise the bad that algorithms can do in terms of the threat to the individual, such as discrimination arising from bias. The proposed method for regulating algorithms and AI is impeccably liberal: an independent, third-party "watchdog" (a spot of zoomorphism for you there). Amusingly, this would even contain a hierarchy of expertise: "made up of law, social science, and philosophy experts, computer eggheads, natural scientists, mathematicians, engineers, industry, NGOs, and the public". This presents a number of scale problems. Software is unusual, compared to earlier general purpose technologies such as steam power or electricity, in that what needs regulation is not its fundamental properties but its specific applications. Regulating the water supply means ensuring the water is potable - it doesn't mean checking that it's as effective in cleaning dishes as in diluting lemon squash. When we talk about regulating algorithms we are proposing to review the purpose and operation of a distinct program, not the integrity of its programming language.
In popular use, the term "algorithm" is a synecdoche for the totality of software and data. An application is made up of multiple, inter-dependent algorithms and its consequential behaviour may be determined by data more than code. To isolate and examine the relevant logic a regulator would need an understanding of the program on a par with its programmers. If that sounds a big ask, consider how a regulator would deal with an AI system that "learns" new rules from its data, particularly if those rules are dynamic and thus evanescent. This is not to suggest that software might become "inscrutable", which is just another anthropomorphic trope on the way to the theomorphism of a distracted god, but that understanding its logic may be prohibitively expensive. Perhaps we could automate this to a degree, but that would present a fresh problem of domain knowledge. Software bias isn't about incorrect maths but encoded assumptions that reflect norms and values. This can only be properly judged by humans, but would a regulator have the broad range of expertise necessary to evaluate the logic of all applications?
Initially, a regulator would probably respond to individual complaints after-the-fact, however history suggests that the regime will evolve towards up-front testing, at least within specific industries. The impetus for standards and regulation is typically a joint effort by the state, seeking to protect consumers, and capital, seeking to protect its investment. While the former is dominant to begin with, the latter becomes more dominant over time as the major firms seek to cement their incumbency through regulatory capture and as their investors push for certification and thus indemnities in advance. You'd need a very large regulator (or lots of them) to review all software up-front, and this is amplified by the need to regression test every subsequent software update to ensure new biases haven't crept in. While this isn't inconceivable (if the robots take all the routine jobs, being a software investigator may become an major career choice - a bit like Blade Runner but without the guns), it would represent the largest regulatory development in history.
An alternative approach would be to leverage software engineering itself. While not all software employs strict modularisation or test-driven development, these practices are prevalent enough to expect most programs to come with a comprehensive set of unit tests. If properly constructed (and this can be standardised), the tests should reveal enough about the assumptions encoded within the program logic (the what and why), while not exposing the programming itself (the how), to allow for meaningful review and even direct interrogation using heterogeneous data (i.e. other than the test data employed by the programmers). Unit tests are sufficiently black box-like to prevent reverse engineering and their architecture allows the test suite to be extended. What this means is that the role of regulation could be limited to ensuring that all applications publish standard unit tests within an open framework (i.e. one that could be interfaced with publicly) and perhaps ensuring that certain common tests (e.g. for race, gender or age discrimination) are included by default.
The responsibility for independently running tests, and for developing extended tests to address particular concerns, could then be crowdsourced. Given the complexity of modern software applications, let alone the prospect of full-blown artificial general intelligence systems, it might seem improbable that an "amateur" approach would be effective, but that is to ignore three salient points. First, the vast majority of software flaws are the product of poor development practice (i.e. inadequate testing), the disinterest of manufacturers in preventing vulnerabilities (those hackable kettles), and sheer incompetence in systems management (e.g. TalkTalk). Passing these through the filter of moderately-talented teenagers would weed out most of them. Second, pressure groups with particular concerns could easily develop standard tests that could be applied across multiple applications - for example, checking for gender bias. Third, quality assurance in software development already (notoriously) depends on user testing in the form of feedback on bugs in public releases. Publication of unit tests allows that to be upgraded from a reactive to a proactive process.
Interestingly, the crowdsource approach is already being advocated for fact-checking. While traditional media make a fetish of militant truth and insist on their role as a supervisor of propriety (technically a censor, but you can understand why they avoid that term), some new media organisations are already down with the idea of active public invigilation rather than just passive applause for the gatekeeper. For example, Mr Wikipedia, Jimmy Wales, reckons "We need people from across the political spectrum to help identify bogus websites and point out fake news. New systems must be developed to empower individuals and communities – whether as volunteers, paid staff or both. To tap into this power, we need openness ... If there is any kryptonite to false information, it’s transparency. Technology platforms can choose to expose more information about the content people are seeing, and why they’re seeing it." In other words, power to the people. Of course, near-monopoly Internet platforms, like global media companies, have a vested interest in limiting transparency and avoiding responsibility: the problem of absentee ownership.
The idea that we would do better to rely on many eyes is hardly new, and nor is the belief that collaboration is unavoidable in the face of complexity. As the philosopher Daniel Dennett put it recently, "More and more, the unit of comprehension is going to be group comprehension, where you simply have to rely on a team of others because you can’t understand it all yourself. There was a time, oh, I would say as recently as, certainly as the 18th century, when really smart people could aspire to having a fairly good understanding of just about everything". Dennett's recycling of the myth of "the last man who knew everything" (a reflection of the narrowness of elite experience in the era of Diderot's Encyclopédie) hints at the underlying distaste for the diffusion of knowledge and power beyond "really smart people" that also informs the anxiety over fake news and the long-running caricature of postmodernism as an assault on truth. While this position is being eroded in the media under the pressure of events, it remains firmly embedded in the discourse around the social control of software. We don't need AI watchdogs, we need popular sovereignty over algorithms.
Friday, 10 February 2017
F is for Fake
Propaganda relies more on reinforcement than persuasion. It doesn't change minds so much as bolster them. It works with the grain, building on existing prejudices and common cognitive biases to provide reassurance in support of already-formed beliefs. Propaganda works where there is a predisposition to believe. For example, Nazi propaganda was effective after 1939 because Germany was at war and the population subconsciously feared retribution. It gradually lost its power after 1943 as defeat became inevitable. In contrast, the propaganda of the USSR in the 70s and 80s was ineffective because the economy was visibly stagnant. The dynamic of reinforcement is key to understanding the current flap over "fake news". The consumers of the product are true believers rather than credulous dupes, but their belief is of a particular sort: they have a unifying theory of everything. In Isaiah Berlin's famous typology they are hedgehogs rather than foxes. If you think the world is explained by a busy God, a conspiracy of seven-foot-tall lizards or the machinations of the Jews, then you will be more likely to believe news that supports your priors and dismiss anything that conflicts.
This might suggest that fake news is limited to an obsessive minority, but the attitude of "true belief" is found across the political spectrum and not just at the extremes. Centrists who insist that the answer to every policy problem is either "competition" or "education" are also in the grip of this monist delusion. Where the centre differs from the right and the left is in not needing proactive reassurance, though this is simply a reflection of the structural reassurance of hegemony, much as the followers of a state religion tend to be theological "don't knows". In other words, centrists don't seek out their fake news because it is pervasive. That said, while the bias of the mainstream media is real, it would be wrong to believe it is simply more insidious, or just more skilful, than that of the extremes. Consumers of fake news often know that what they're seeing or hearing is hyped but they enjoy it none the less because it validates their already-formed beliefs, which in turn encourages the producers to push the limits of credibility further. That's why fake news is often ridiculous.
The post-2008 confusion of the political centre owes much to a crisis of confidence over policy, not least in respect of the efficacy of the panaceas of competition and education, but it has yet to undermine a worldview that holds moderation and triangulation as self-evidently good. This leads to the sight of liberals going through the electoral motions but without a programme to speak of, most obviously in Hillary Clinton's failed campaign in the US last year and currently in the boosting of Emmanuel Macron in France. Even Martin Kettle in The Guardian could not quite quell his doubts over this farcical re-run of the Blairite project: "Whether there would be a durable national embrace of what Macron stands for is far from certain. In large part that is because Macron has not said what he stands for". In a similar vein, the Liberal Democrats appear to have made unconditional EU love their sole policy, which is likely to condemn them to the political margins for a generation.
The policy void has placed a premium on attitude and behaviour, but it has also led to a sense of vulnerability in the face of external manipulation. This is evident not just in the imputed fragility of "generation snowflake" but in the assumed stupidity and credulity of the working class in the face of demagogues and the imagined power of online grooming. Everybody is about to be hoodwinked, hence the salience of fake news. This reductive attitude divides the world into those who are secure in their conventional beliefs and those who are empty vessels at risk of being filled with frothing madness. It's a pretty obvious transference of self-doubt by centrists, but it also over-states the power of manipulation. Like the hype around "filter bubbles" and unconscious bias ("check your privilege" etc.), this leads us to forget that believing bat-shit crazy nonsense and dismissing science out of hand are characteristic of a very small minority, not the majority. Just as most people do not believe the Earth is flat, so most people are not actually contemptuous of experts.
Lacking a positive vision, liberals are reduced to the conservative strategy of defending the status quo through project fear: constructing a deplorable enemy to rally support for the centre. But the consequence of the policy vacuum is that it draws the enemy centre-stage. Nigel Farage's prominence, the obsession with Donald Trump's idiot tweets and the attention garnered by a provocateur such as Milo Yiannopoulos are symptoms of the centre's malaise, not a sea-change in society. This is partly driven by structural change as traditional publishers try to adjust to new media, hence fake news is emblematic of poor quality control and so serves to reinforce the role of gatekeepers. But at heart it reflects the dependence of the media on a political centre that is failing to produce "non-fake news" of sufficient calibre. Once centrists deserted the arena of policy, it was inevitable that "squatters" would move in. Leave didn't win the EU referendum because voters believed £350m would be spent on the NHS but because voters wanted more spent on it and one side of the argument was prepared to agree. The point is not dishonesty but the inevitable attraction of the false promise when the establishment promises nothing.
The centrist choice of the right for the role of "deplorable" - the left being marginalised as irrelevant or incompetent (see the collected works of John Harris) - is clearly a projection of unspoken desires: toying with nationalism entails an illicit thrill greater than toying with nationalisation. This is a miscalculation of epic proportions, not because real Fascists will seize power (not even in France), but because it normalises racial and sectarian discrimination as a lesser evil through such formulations as "legitimate concerns". This ultimately coarsens political discourse and corrodes liberal norms. The problem with demanding a "debate about immigration" is that we've been talking about the issue for decades without any satisfactory conclusion. That's because there isn't one. Xenophobes will never be reconciled while "controls" will crumble in the face of economic imperatives. Centrist politicians who suggest there might be some happy medium are being dishonest. This will eventually lead to intellectual exhaustion and the acceptance of dysfunction, like the government's admission this week that it has no housing policy to speak of.
In the current environment - liberal self-doubt, centrist accommodation of the right-wing agenda and a media fascinated by conservative outrage - it should come as no surprise that the alt-right have bubbled to the surface of public consciousness. They should not be mis-characterised as a political movement, any more than the victory of Trump should be seen as a popular uprising rather than what it is: a coup by a criminal gang that has ridden a populist wave. The alt-right is not the traditional Fascist right. They are dilettantes rather than social revolutionaries, amateur reactionaries rather than violent community activists. As Walter didn't quite say in The Big Lebowski, "I mean, say what you want about Nazis, Dude, but they always punch back". Just as fake news reflects the vacuity of the centre, the alt-right points to the essential fakery of the "nationalist revival" and its inherent instability, which can be seen in the evident tensions within the Front National between its Fascist core and its metrosexual marketing department.
Watching the Channel 4 News report on the UK alt-right on Wednesday evening I was struck by how middle-class this new generation of white supremacists and crypto-Fascists is. These people have always been around but used to be quarantined in the Young Conservatives or the more outré university clubs. The Internet has provided them with a forum independent of institutional restraints and crucially it has allowed them to organise without the need to join traditional right-wing groups like the BNP. They don't have to suffer the embarrassment of going on marches with skinheads and can indulge their fantasies about eugenics for the lower orders without the risk of encountering them "in real life". Their emphasis on IQ (The Bell Curve featured) and racial and cultural purity (as did Mein Kampf) was as sociologically telling as the insistence that they were actually libertarians (the equality of now) defending Western civilisation (the hierarchy of then). These bedroom Nazis are ridiculous. A twenty-something Paleo-conservative trying to hide their alt-right activities from liberal parents sounds like a pitch for a sitcom. The alt-right may well turn out to be the biggest fake news of the decade.