All media are driven by the opinion of others. Factual news is expensive to acquire, unreliable in its supply, and tends towards the dull but worthy. Opinion, on the other hand, is inexhaustible, cheap and reliably contentious. In a variation on the City speculators' mantra, there is always a "greater fool" who will respond to Richard Littlejohn or George Monbiot (I plead guilty on numerous counts, m'lud). We turn gratefully from pictures and testimony of suffering in Gaza to verbal bunfights between Israeli spokesmen and TV news anchors, where the likelihood of a mind being changed is precisely zero (and, surprisingly, not likely to be helped by Mia Farrow and other slebs).
The first newspapers, which were gazettes of court announcements, were of limited interest compared to illustrated chapbooks detailing Catholic atrocities or tracts promising the imminence of God's Kingdom. Though literate snark soon came to the fore (The Spectator was launched in 1711), most newspapers continued to rely on adverts as much as editorial to pique interest until nineteenth-century mass literacy led to the "human interest story" and a realisation of the power of mobilised opinion to sway elected legislatures. The long twentieth century, from the launch of Tit-Bits in 1881 to the Web 2.0 media-moment in 2004, was distinguished by gossip, the privileged opinions of the high and mighty, and messages from "our sponsors".
The Internet has supposedly democratised opinion, providing a variety of platforms for "everyone" to have their say, but this just means a vast increase in content (gossip, opinion, ads) and thus an even greater value accorded to aggregators and filters, which is what Tit-Bits was and arguably Addison and Steele's imaginary spectator was too. Plus ça change. Though some services claim that you, the consumer, are now able to act as your own aggregator (choosing who to follow, specifying your interests etc.), the trope of "content overload" obviously serves the purposes of those who would "pre-curate" your content stream based on algorithms that analyse your history and relationships. Of course, such algorithmic precision is a myth, which even some services are happy to admit.
The churn in technologies and corporate providers obscures the persistence of opinion as the major driver of media. Thus incumbent providers, like the press, lift up their skirts and screech at the thought of Facebook manipulating a user's stream, while music nostalgists bemoan the death of the album (curation by Big Music) under the onslaught of streaming service playlists (dominated by sleb curation, which is the new face of Big Music). The success of Twitter, which is a pretty ropey piece of technology, is down to the simple fact that it provides raw, instant opinion, the slebbier the better (accept it: Rihanna is a better football pundit than you are).
Search engine results can be thought of as a type of playlist, dynamically curated by an algorithm that aggregates and orders relevant pages based on the "opinion" of other pages. The plea by Google that they should not be held responsible for the opinion of others (which has predictably found favour with the House of Lords) depends on a belief that the algorithm is an accurate reflection of that aggregated opinion (impossible to know) and not subject to any manipulation or systemic bias (clearly untrue). The myth of the Internet is that everyone has an opinion (because property is universal) and that the expression of that opinion should be free and unrestricted (because the state should not limit your rights in your property).
The key change that occurred a decade ago was that we moved from a culture in which being a passive consumer of the opinion of privileged others was deemed sufficient to one in which we must now all express an opinion as well, whether on Gaza or a friend's new cardigan. But that opinion does not need to be a reasoned analysis (we don't seriously seek to dethrone the aristocracy of pundits, merely to vote them up or down by the weight of our comments). A simple preference will do, either through a binary "like" or a top 10 ordering (the modern "listicle" is simply a way to teach us correct practice). Creating lists, and curating them on an ongoing basis, has become a social obligation: performative opinionising as a way of defining who we are. By their public playlists ye shall know them.
Thursday, 24 July 2014
L'Etat C'est Moi
Tony Blair's speech this week, twenty years to the day after he was elected Labour Party leader, prompted a number of retrospectives by his supporters in the press, which in turn prompted the usual venom beneath the fold about duplicitous war-mongering. Both sides largely ignored the speech itself, which was delivered in memory of focus group impresario Philip Gould and was eerily anachronistic in its promotion of a supra-ideological "third way", the false dichotomy of the individual and the collective, and the notion that the "zeitgeist" is some sort of constructed reality amenable to policy. It's as if Blair has barely been in the country of late.
Most of his defenders feel that his progressive record has been unfairly tarnished by the fallout from Iraq, which the former PM mentions precisely once in his speech, in a clause starting with "whatever". John Rentoul in The Independent was typical: "The country has changed, mostly for the better, in 20 years and much of it is because of Tony Blair. Unexpectedly, the change was best summed up by David Cameron in his words on entering No 10 four years ago, when he said that the country he inherited was 'more open at home and more compassionate abroad' than it had been". This is Great Man history, which marginalises the contribution of others (the snub to Gordon Brown is deliberate), wallows in nostalgia ("the sun shone more under the old king"), and equates the national mood with a personal style.
Janan Ganesh in The FT suggests that Blair was more pragmatic than he is given credit for: "He governed with the grain of history, nudging it along from time to time, but never upending a country that was functioning well enough". This is vapid insofar as almost all heads of government do exactly the same. It would serve just as well as a testament to John Major. It might appear paradoxical that a self-styled progressive like Rentoul would laud the impact of the individual on history, while a conservative like Ganesh emphasises structural forces, but the former is merely the neoliberal valorisation of "talent" as a proxy for class, while the latter believes that the exceptional Thatcher changed the course of history, tearing up the postwar settlement and embarking on a social and economic revolution that Blair merely continued (the reality was that she rode the wave of global structural change as much as she vigorously turned the tiller).
Much of Blair's achievement is simply down to longevity. If you cling to office long enough, you will get the credit for all sorts of secular changes and social shifts that simply coincided with your watch, while observers will marvel that you haven't stayed exactly the same ("from Bambi to Bliar"). In The Telegraph, Stephen Bush credits Blair with installing a security door on his childhood block of flats, much as good harvests were once attributed to the beneficence of the distant monarch. Similarly, Rentoul reckons Blair "achieved the near-impossible in Northern Ireland" with the signing of The Good Friday Agreement, ignoring the patient build-up since the Downing Street Declaration in 1993, not to mention the obvious readiness of the paramilitaries for a face-saving peace deal long before Blair's accession in 1997.
Ganesh claims that Blair "did not come from anywhere in particular", despite the public school and Oxford background, though his classlessness and cosmopolitan ease were more apparent to journalists than to the average voter. Cameron's belief that he could model himself on Blair sprang from a realisation that he was not a million miles away from the upper middle class lawyer in style and experience. Chris Dillow identifies Blair's "managerialist ideology" as his weakness, leading to over-confidence and poor decision-making. This ideology was part of a wider commitment to neoliberalism at home and abroad, which included the privileging of The City, his mugging by US neocons over Iraq, and his subsequent ascension to the global 0.01%. (In The Guardian, Michael White said: "Someone told me recently he'd brokered an oligarch's yacht sale". The point is not that this tale is true, but that it is credible).
Once they got beyond the noise about the horrors of the Saddam regime (i.e. "he had it coming"), Blair's supporters initially defended the decision to go to war in Iraq on the grounds that it was made in good faith, the absence of WMD notwithstanding. As the cost of failure mounted, that tactical error was subsumed beneath the massive strategic blowback, suggesting that either the plan all along was to trash the region or else the architects of intervention were drunk on their own power. Blair seems oblivious to the irony of his contemporary words: "Third way politics begins with an analysis of the world shaped by reality not ideology, not by delusionary thoughts based on how we want the world to be, but by hardheaded examination of the world as it actually is".
A feature of Blair's delusion, which is being eagerly advanced by his supporters in the press, is that all the structural failings that he ignored or even encouraged, such as the growth of in-work benefits and the indulgence of The City, were the fault of Gordon Brown. This is a monarchical defence, in which the good king is undermined by his bad ministers. For Blairists among the Tories, this allows Ed Miliband to be tainted by association with Brown, though they struggle to square the idea of him as Cardinal Richelieu's Eminence Grise with the Wallace meme. For Labour Blairists, it holds out the prospect that the king over the water (in Jerusalem, mostly) may one day return, like De Gaulle recalled from Colombey. What odds a Blair-Sarkozy entente by 2020?
Monday, 21 July 2014
Damnatio Memoriae
Each time a celebrity is jailed for sexual abuse, or is definitively found guilty by a posthumous review, you can guarantee a flurry of commentaries on how their memory will now be effaced, their image edited out of the Top of the Pops archive and embarrassing memorials removed from view, thereby enacting symbolic violence against their person. Though we may be uncomfortable with the overlap of this practice with Stalinist airbrushing and the "unperson", there is also a sense that removing the disgraced's representation from the public record is a punishment with an impeccable pedigree. As a political act, this goes back to the Romans' damnatio memoriae and the cartouche tippexing of the Ancient Egyptians. As a social act, this is merely a larger-scale example of the willed forgetting and photo-chopping that follows on from bitter divorces and family estrangements. Rewriting the historical record to excise painful memories is normal.
In the early days of this process, while the evil nature of the disgraced is being established beyond doubt, nothing is allowed that might contradict the reduction of the person to the essence of their evil acts. Once the reputation is sufficiently blackened, contrasting imagery begins to reappear. The bigger the evil, and thus the more secure it is from revision, the more of this hinterland can be accommodated over time. A good example of this is Adolf Hitler, whose documentary appearances in recent decades have featured more incidental scenes with children and animals, not to mention the now clichéd horsing around with Eva Braun at Berchtesgaden, rather than just stock footage of ranting or map-pondering. A Channel 4 documentary about Hitler's dogs is surely in development already.
Presenting a rounded and contextual picture of a person is good history, but it also serves to blur the boundary between the person and their environment, i.e. other people. One reason for the extreme editing of the early years is the fear of contamination: the embarrassment of the other DJ seen grinning as Jimmy Savile leers at a young girl on TOTP, or the variety show host introducing his "very good friend" Rolf Harris. Effacement is always a cover-up, obscuring the connivance and negligence of others as much as the memory of the disgraced. This extends to the common stock of social memory, with associates now claiming they "hardly knew" X or had "little to do with" Y.
This willed oblivion will be much harder in future due to the inexhaustible memory of the Internet. The aftermath of the EU Court of Justice ruling on Google and the misnamed "right to be forgotten" has seen a campaign by the US search company to convert the issue from one of data property to one of "press censorship", at the same time as the UK government has forced through the Data Retention and Investigatory Powers (Drip) Bill, which will provide Google et al with the legal cover to continue supplying GCHQ with data. The framing of the EU judgement as a threat to "free speech" and "fair comment" is in the interests of both Google, which uses "freedom" to enclose the commons, and a print media that is largely anti-Leveson, anti-EU and keen to adopt the moral high ground relative to the search giant.
In practice, the threat of censorship arising directly from the ECJ ruling is non-existent, not least because the obligations on Google and others have yet to be established in the national courts. The US company is deliberately and provocatively jumping the gun in order to trigger an anti-ECJ groundswell. It serves Google's interests for people to assume that it and the Internet are one, rather than it being simply a glorified index that only covers a fraction of the Web and is already biased by the needs of advertising and existing government regulation. Google's difficulty with the ECJ is the product of its ambition to construct a persona for all users, i.e. a single profile encompassing both the individual's behaviour in Google applications and the wider "file" of that individual fragmented across the Web. It cannot admit the trivial impact of the ruling without simultaneously robbing its brand of the mystique of omniscience. There is a sound commercial reason why Google does not even acknowledge the existence of its competitors in the search market.
In contrast, while Google gets agitated about something that its users (mostly) don't care about, Facebook finds itself criticised for treating its users like guinea pigs, even though its now notorious "experiment" was simply a standard A/B test of the sort routinely carried out by commercial websites: just substitute the words "interested" and "bored" for "happy" and "sad" (the infantilism of the experiment is noteworthy). The "emotion contagion" finding is hardly news, and the focus on manipulation by Facebook alone is odd when editing content to stimulate interest or reaction is what all media organisations do every day of the week (the negative reaction by old media may indicate a fear that new media is much more effective at shaping public opinion).
The basis of the ethical storm is that while users have willingly alienated their data, they did not believe they were agreeing to be experimented on when they signed the terms of service (which they didn't read anyway). The difference this highlights is that Facebook see the users as an extension of the data (which they ultimately own), while the users still believe they are independent, even if they have ceded rights over that data. Google has the same view as Facebook, hence the importance they give to treating the online persona as superior to the offline person. The ECJ ruling embodies a different philosophy, which sees the data as the inalienable property of the citizen. The dispute is thus being framed as a conflict between previously complementary liberal principles: free speech and private property. Of course, free speech, in the particular form of a "free press", is essentially a property right, so the whole affair (including Drip) boils down to a tussle over property.
Because personal data is increasingly the property of others, I suspect that the convention of damnatio memoriae may well be on its way out. The instinct and desire to obliterate will remain at the individual level, but that just means "unfriending" and deleting accounts so that the offensive memory is hidden from your own view. At the societal level, organisations like the BBC, with a perceived responsibility to curate the historical record to reflect public opinion, will find their actions increasingly irrelevant and difficult to defend. That epigraph is the property of Google, that cartouche the property of Facebook.
Wednesday, 16 July 2014
Curb Your Enthusiasm
Paul Krugman, who has long cultivated the fertile lands between Sci-Fi and economics, has been thinking about the implications of "IT-mediated car services" like Uber. He avoids mentioning robot drivers, perhaps to prevent the discussion tail-spinning into angst about capital-labour substitution before the madness of cars gets a proper airing: "when you think about it, for most people owning a car is quite wasteful. It’s an expensive item of equipment that sits idle most of the time; it requires parking (and often a parking structure) both at origin and at destination; it requires maintenance and is a big hassle all around". In other words, it is an inefficient use of capital that involves a significant waste of land.
The promise of Uber et al is that "reliable, quick-response chauffeur services could free many people from the need to tie up all those resources in a consumer durable that they only use now and then. And from a social point of view it would avoid the need to tie up so much capital that sits unused most of the time". The implication is that the capital could be put to more productive use (from a social point of view) elsewhere, though Krugman is coy about recommending increased investment in public transport, which would simply bring the anti-state harpies down upon him. Instead, he seems keen to get marginalists onside by echoing the thoughts of Tyler Cowen, who previously advocated incentive pricing for parking as a way of managing capacity.
"There is, however, an obvious problem: rush hour. Peak car use comes twice a day, and that would seem to dictate that we have nearly as many cars as we do now even if they’re supplied by the likes of Uber. But here’s where surge pricing comes in. If traveling during peak hours is more expensive than off-peak, people will have an incentive to shave off those peaks. People who aren’t commuting to work will avoid travel at peak hours; some people will find other ways to travel; some people (and businesses) will rearrange their schedules to take advantage of cheaper off-peak travel".
The promise of Uber et al is that "reliable, quick-response chauffeur services could free many people from the need to tie up all those resources in a consumer durable that they only use now and then. And from a social point of view it would avoid the need to tie up so much capital that sits unused most of the time". The implication is that the capital could be put to more productive use (from a social point of view) elsewhere, though Krugman is coy about recommending increased investment in public transport, which would simply bring the anti-state harpies down upon him. Instead, he seems keen to get marginalists onside by echoing, in respect of capacity, the thoughts of Tyler Cowen, who previously advocated the incentive pricing of parking.
"There is, however, an obvious problem: rush hour. Peak car use comes twice a day, and that would seem to dictate that we have nearly as many cars as we do now even if they’re supplied by the likes of Uber. But here’s where surge pricing comes in. If traveling during peak hours is more expensive than off-peak, people will have an incentive to shave off those peaks. People who aren’t commuting to work will avoid travel at peak hours; some people will find other ways to travel; some people (and businesses) will rearrange their schedules to take advantage of cheaper off-peak travel".
As with all incentive-based arguments, the first question to ask is: who are these "people" of whom you speak? If traveling during peak hours is more expensive than off-peak, then peak-time travel will be dominated by the better-off. You could spin this as a form of progressive taxation, but the reality is that peak hours will only command a premium if they are a more attractive good. In other words, this is a system in which the less well-off exchange relative inconvenience for a discount on travel costs (unlike the democratic basis of public transport). While there will always be some individuals whose time preference makes this a win-win, aggregate satisfaction must be reflected in the time-slot price. The most popular slots will cost the most.
Though his blog post is provocatively titled "Life without cars", what Krugman actually envisages is "a society that still relies mainly on cars to get around, but manages to do this with significantly fewer cars than we need at present", which is a pretty accurate thumbnail sketch of New York, where the incentives of housing density, land value and loss of utility (i.e. traffic jams), as well as decent public transport, mean that more households are car-less than own a vehicle. It would be easy to suggest that familiarity with limos makes Nobel laureates comfortable with services like Uber, and easier still to suggest that well-paid economics professors like the idea of buying marginal services while investing their capital for yield, but Krugman is clearly in tune with the zeitgeist.
The wider secular trends are the move from ownership to subscription (already seen in many other areas) and consequent concentration of capital; the emergence of a generation that considers ownership of both homes and cars to be unfeasible in the short term, if only because all their capital went into tuition fees; growing concerns about the environmental impact of cars (which is not unrelated to the decline of car manufacturing in developed nations); and the increased physical control of city centres and public spaces, variously sold on the grounds of security, "traffic management" and pollution reduction (London is well ahead of the field here).
Massively open online car schemes (hey, MOOCS!) may prove to be a stepping-stone to the introduction of robot cars in metropolitan areas - i.e. first get the infrastructure of car pools in place and then substitute the drivers. They promise a brighter and better Sci-Fi future, with minimal discussion about the changing landscape of power. As Krugman himself said (in the Wired interview linked to above), "Evil will come in stylish, Steve Jobs-inspired designs".
Sunday, 13 July 2014
The Coming Terror
The "Westminster Paedophile Scandal" appears to be galloping through the traditional narrative arc of "conspiracy in high places" in record time. As with the "Red Scare" of Joseph McCarthy, the classic modern template, there are three notable features: the suggestion that Westminster is riddled with the guilty; that some parts of government are dodgier than others (McCarthy had it in for the US State Department, his British successors for the Home Office); and that official inquiries are likely to be compromised, if not part of an active cover-up by "the establishment" (a word enjoying a mini-revival). You'll know the scandal is mid-flight over the shark when someone mentions the Freemasons or the Bilderberg Group.
This is not to say that there aren't paedophiles at Westminster or in the Civil Service (statistical probability suggests otherwise and convictions have already occurred), but there is clearly more at issue here than individual crimes, just as Watergate wasn't "about" a break-in. The question is whether the institution can be held responsible for the behaviour of its members, but it is doubtful that culpability goes beyond the willed obliviousness that the BBC exhibited in respect of Savile et al. It seems unlikely that there is anything on a par with the connivance of Richard Nixon, let alone that party whips at Westminster were routinely hushing up criminal acts in order to strongarm MPs.
A legacy of the 1960s "sexual revolution" is the increased public salience of child abuse in the "doings of the mighty". Unconventional forms of consensual sex (adultery, homosexuality) have lost their power to embarrass or provide political leverage (McCarthy also pursued homosexuals, in what became known as the "Lavender Scare"), while at the same time popular therapy has advanced the idea of childhood as a time of unique vulnerability. The structural origin of the "paedo threat" can be found in the increased tabloidisation of the media in the 1980s (the arrival of Today, the start of breakfast television and the evolution of the "sleb"), which recycled historic cases such as The Moors Murders and provided ample exposure for flimsy claims of "satanic ritual abuse". As these modern witch-hunts collapsed under the weight of their own hyperbole, focus shifted to institutional abuse.
Institutions are sites of power, so abuse of all sorts will always be found there. What I'm interested in is the selectivity of the media: sexual abuse is having a moment, but not in its most prevalent form, which is abuse and coercion in the workplace. Just as satanic ritual abuse was an all-too-obvious opportunity to berate social workers (first for incompetence, then for over-reaction), and by extension local councils, so the current focus on claims of a "ring" involving MPs, civil servants and local authority care homes corrals what might be termed "the usual suspects" in the eyes of the anti-state brigade. But this linkage is tenuous at best. The fact that Cyril Smith was an abuser does not implicate Parliament as a whole, nor should it excuse the media's eliding of child sexual abuse with a wider "loss of trust" in politicians and the establishment. Expenses-fiddling does not segue into kiddy-fiddling.
Andrew Rawnsley in The Observer today bemoans the loss of public trust, but manages to exhibit the commentariat's conflicted instincts within a single paragraph: "I like to think I am always on my guard against hysteria ... but I wouldn't be at all surprised if it is shown that there were cases of offenders among MPs whose criminal depredations were hushed up by whips and other party managers for the usual self-serving reasons". This is prurience masquerading as cynicism, which further fuels the nonsensical idea that whips are omniscient and omnipotent. Francis Urquhart is a work of fiction.
Rawnsley attempts perspective: "What is often called the decline of trust is really an evaporation of deference. Where once there was a reflexive respect for authority and a willingness to give it the benefit of the doubt, there is now a default to distrust". Deference was never about giving the benefit of the doubt, i.e. suppressing scepticism, but about the unquestioning acceptance of authority, often in flat contradiction of the facts. It is the reactionary frame of mind. The 60s are meant to have ushered in an era marked by the "healthy distrust of authority", though the roots of this can be traced back to World War One. In practice, the scepticism of the age soon became the motor of modern capitalism: the "me generation", those "crazy individualists", "because you're worth it" etc.
Rawnsley appears oblivious to the role of capitalism, in its anti-state and pro-market form, in the creation of what David Brooks in The New York Times admiringly refers to as a "personalistic culture". According to Rawnsley, who cites Brooks as an expert on modern manners, "It seems we'd now rather trust an individual we don't know than a big institution that we have come to know much too well". Really? You might like to help me with a small problem I have transferring a billion dollars out of Nigeria.
In an amusing juxtaposition, Rawnsley's piece in the print edition appears opposite an editorial on BBC funding, which promotes an impeccably neoliberal position on the benefits of commoditisation: "The more various and often frankly free market ways of raising money the corporation develops, the more free from sticky political fingers it will become". In other words, only the market can save institutions from the further erosion of trust consequent on their association with the state. The ideological logic of modern capitalism, of rational utility maximisers and transactional calculation, is for trust to become a commodity that can be secured or exchanged for money. Filthy lucre becomes a disinfectant, an idea that would have fascinated Sigmund Freud, who first popularised the psychological trinity of childhood, sex and money.
Wednesday, 9 July 2014
The Disambiguation of Belo Horizonte
Last night's demolition of Brazil by Germany in Belo Horizonte was predictable, even if the scoreline was not. For all the wailing and gnashing of teeth, I doubt this will be as traumatic as the famous Maracanazo of 1950, when Brazil narrowly lost the final to an unfancied Uruguay in Rio de Janeiro. There will be plenty of media fuss over the coming days, including predictions of social disorder just short of civil war, but the simple truth is that Brazil looked no better than a last-eight team and were fortunate to have got that far, owing not a little to QPR's second-choice goalkeeper. Germany were excellent, and are now deservedly favourites for the title, but the comprehensive nature of their victory owed much to Brazil's psychological collapse midway through the first half.
The warning signs were obvious earlier in the tournament, from the over-reliance on Neymar The Redeemer through the limitations of Fred (who played like Karl Power), but perhaps the biggest pointer to impending disaster was the sheer weight of emotional anxiety that marked the narrow victories over Chile and Colombia. This went beyond the desire to do well on home turf to a desperation to maintain the ideal of Brazil as the ultimate in footballing art, which translated on the night into suicidal play by the likes of Marcelo and David Luiz. The crowd's booing of Fred was less personal antipathy (he's actually a local boy who played for both Belo Horizonte's main teams) than frustration with the domestic game. It's not producing a lot of talent, and the best players disappear to Europe by the time they're 20.
The great era of Brazilian football, inaugurated by Garrincha and Pele in Sweden in 1958, ended with the team of Socrates, Zico and Falcao, who adorned the 1982 and 1986 tournaments but came away empty-handed. The teams of the last twenty years managed to marry the functional and the (occasionally) elegant, winning in 1994 and 2002, but there was a sense that they were trading on past glories, literally so as the Brazilian FA avoided playing at home to indulge in lucrative overseas friendlies and "Jogo Bonito" became just another piece of royalty-earning IP.
Though the contrast of a freewheeling past and a pragmatic present was a simplification, it did hint at the underlying tensions around class and politics. Jogo Bonito was organic and nativist, but also strongly egalitarian and anti-establishment (during the period of military rule), linked to the 60s Tropicalia movement in culture and exemplified by Socrates' twin roles as imperious midfielder and pro-social doctor. The era bookended by Dunga, as player in 1990 and coach in 2010, saw the adoption of a supposedly more European style in terms of organisation and tactics; however, the emphasis on preparation and "risk management" was part of a much wider neoliberal turn towards the technocratic and commercialised.
The class stratification of sport, embodied in the prawn sandwich meme, has affected both developed and developing nations simultaneously. Globalisation means that positional goods, like iPhones or FIFA match tickets, can be made available to the "new middle class" everywhere. TV reports in the build-up to the current tournament gave the impression that ordinary folk had been priced out of attendance only recently, but this has been going on for years. Brazilian football has become an increasingly middle-class preserve, much as cricket has in India and both sports have in the UK.
This, rather than a sudden discovery of the joys of goalless draws, is why football is growing in popularity in the USA. Twenty years after Brazil won the World Cup in Pasadena, and after twenty years of Soccer Moms, the sport may now be sufficiently middle-class to avoid the charge of being foreign and socialist. The sporting and social gap between the semi-pros and immigrants who made up the USA team in their shock victory over England at Belo Horizonte in 1950 and the Brazil team of 1982 is tiny in comparison to the gap between the latter and any tournament team today. One silver lining for England is that the 1950 "disaster" has now been superseded. If only they could get over 1966 as well.
Sunday, 6 July 2014
By Any Means Necessary
One of the most resonant texts of twentieth-century economic speculation is J M Keynes's 1930 essay The Economic Possibilities for our Grandchildren, in which he predicted a 15-hour week within 100 years. The reason for the continuing fascination is the idea that technology and growth can deliver Utopia; that a life of easeful pleasure is within our grasp, even some way short of a post-scarcity economy. The essay got a further mention the other day in a conventional (though witty) column by Andrew Martin advocating a 4-day working week. What was poignant was to see this staring across the fold in The Guardian at an investigation into the growth of food banks.
Following a gradual reduction in working hours over the nineteenth and twentieth centuries, there was a bifurcation in developed economies after 1980. For unskilled labour, average work hours have shrunk, while low wages have required many to extend their total work hours over multiple jobs. This is the root cause of the growth of food banks: there just aren't enough adequately-paid unskilled jobs to go round and social security is increasingly threadbare. For the skilled, there has been a twin movement of increased time off (paid holidays, parental leave, part-time working etc) and presenteeism (i.e. hanging around for reasons of status rather than productivity).
The high profile of the latter gives rise to middle-class angst summed up by The New Yorker recently as: How did we get so busy? The answer is that work time is increasingly a positional good, i.e. a "fictitious commodity" distinct from labour power. The emblematic forms of this appear first at the margins, in feminised job-shares and nepotistic interns, dissolving the relationship between labour and status. For high-status (and predominantly male) roles, the "always on" joys of corporate email have partially replaced the traditional arenas of performative loyalty and politicking down the pub, club or golf course. A parallel change has been the increasing professionalisation of management - i.e. the construction of a pseudo-scientific discipline through MBAs and rule-books - which means that footloose executives often lack an understanding of a business and value their ignorance of operational detail ("it's about strategic vision"). Rent-seeking remains a growth area of the economy.
Conventional economics sees no mystery in the failure of the 15-hour week to arrive, though it remains fascinated by Keynes's idea. A 2010 MIT collection of essays, Revisiting Keynes, quoted in the New Yorker article, explained this failure variously: human inventiveness creates ever more new products, so we keep working to acquire more new stuff; we also work to achieve self-actualisation, which may just mean avoiding our families and meeting new sexual partners; and how much we work is conditioned by social norms as much as practical need, hence the sensual French have more holidays than puritanical Americans. These are essentially sociological explanations, which might seem odd coming from economists, until you consider the influence of rational choice theory (Gary Becker was a key contributor) and that the essays were written in the immediate aftermath of 2008, when conventional economics was trying to avoid catching our eye.
The influence of Becker leads to the assumption that the hours we work are a matter of personal choice, a trade-off necessitated by the inescapable scarcity of time. This might be dismissed as just a determination to keep the concept of scarcity at the heart of orthodox economics at a time of growing capital abundance, however I think it also relates to the ideological importance of competition, present in the idea that we have many demands competing for our attention. A key feature of the digital economy is its dependence on our time as both a scarce resource and a tradable commodity: we spend time with apps as much as we spend money; we pay for free services with our time and our liminal attention; we seek to turbo-boost our social connections, replacing inefficient face-time with instant messages and broadcasts. The boundary between work and leisure has been blurred by social media. This presents a challenge to both Keynes and his critics: where does work stop and non-work start?
This is part of a wider cultural turn since the 1970s centred on the idea of competition: not just with each other in an economic or quasi-economic context, but with ourselves in the internalisation of biopower (diets, gym, lifelogging), or through fetish objects (from Tamagotchi to smartphones) and the status competition of social media. As Will Davies notes: "At key moment in the history of neoliberal thought, its advocates shifted from defending markets as competitive arenas amongst many, to viewing society-as-a-whole as one big competitive arena. Under the latter model, there is no distinction between arenas of politics, economics and society. To convert money into political power, or into legal muscle, or into media influence, or into educational advantage, is justifiable, within this more brutal, capitalist model of neoliberalism".
Inequality is then the rightful product of competition. However, this is not the same justification as the "superior rewards to talent" deployed in response to Thomas Piketty, but a more brutal acceptance that winning "by any means necessary" is its own justification. This can be thought of as the oligarch's defence, and has obvious implications for the effectiveness of democracy. It also has implications for the working day. If you believe you are engaged in a competition at all levels of activity - from the absurd "global race" through commercial rivalry to empire-building and jockeying within a company - and that this competition has no rules or constraints, not only will corporate malfeasance be normalised and "gamification" accepted as routine, but you will be constantly working the angles to preserve or improve your own position as "the entrepreneur of yourself". In such circumstances, taking time off is tantamount to conceding the game.
But there is a problem here, an incipient crisis. If automation continues to replace labour with abundant capital, if we reach "peak jobs", working time will inevitably become scarcer, even if we engineer more supernumerary roles. The current ideological valorisation of "hard-working" may be nothing more than the anxiety of musical chairs. What do we do once the music stops? A job guarantee scheme cannot work beyond a stop-gap, unless we want to commit to the creation of a new helot class in perpetuity, alongside an employment elite who are constantly at work in the same sense that the aristocracy once was. A guaranteed basic income might be a more palatable solution for capital, as it does not necessarily compromise ownership or power, but this means it would probably be framed as a subsidy to labour (i.e. a dole), rather than a social dividend (i.e. the individual's share of technological progress), so the tendency towards the parsimonious is likely to lead to social conflict.
If we think of a basic income as "buying off" labour (i.e. a market intervention, akin to a set-aside scheme), the principle of social equality will, ironically, lead to that time being priced at a minimum level - we won't buy back the time of the skilled at a higher price. This means we will buy off the time of the poor (the un- and under-employed) first, so retaining high work hours will be entrenched as a sign of status ("scarce" workers, aka "talent", will be exempted altogether). Buying back time on a pro-rata basis across all jobs would actually be better in the long run (it's a transitional measure that would gradually disappear), but it's a non-starter politically in the short run: why should a lawyer's free time be worth more than a binman's?
Another option is to revive popular capitalism by giving workers shares in the robots as their hours decline ("Workers can earn more of their income from capital than from working - by owning part of the robots that replace them ... rather than rely on government income redistribution policies"), but this looks about as credible as previous attempts at popular share-ownership under Thatcher and Yeltsin, which were clearly never intended to dilute actual capital accumulations. Unless the purchasing power of declining wages increases, shares will soon migrate to cash and ownership will once more be concentrated in the hands of the rich.
The point to remember about Keynes's 1930 essay, and the reason it still resonates, is that it was a defence of capitalism, in the form of technological innovation and the wonders of compound interest, at a time when many were questioning it in light of the 1929 crash. Keynes believed that if it were "managed" (through population control, technocracy and the avoidance of war - i.e. Edwardian liberal shibboleths), then it would surely deliver the "bliss" of a 15-hour week. This ignores the fact that all changes in working hours, from the Victorian Factory Acts to the EU Working Time Directive, have depended on legislation and enforcement by the state. Working time is a social construct, and thus a political choice.
That choice is between a competitive, market-based solution, which cannot but immiserate the majority to preserve the privileges of a minority, or a cooperative solution, which cannot avoid the same outcomes unless it is applied equally across all jobs, which means socialism. Given the ideological investment in our flexible labour market by all political parties since the 1980s, and the continuing obeisance to the interests of "business", it is likely that the former path will be the one taken, with the politics reduced to quibbling over the degree of immiseration. In that sense "austerity", at a point in history when we are collectively richer than ever and perfectly capable of spending our way to both higher growth and greater equality, is looking like a fixture of the century rather than a temporary aberration.
Following a gradual reduction in working hours over the nineteenth and twentieth centuries, there was a bifurcation in developed economies after 1980. For unskilled labour, average work hours have shrunk, while low wages have required many to extend their total work hours over multiple jobs. This is the root cause of the growth of food banks: there just aren't enough adequately-paid unskilled jobs to go round and social security is increasingly threadbare. For the skilled, there has been a twin movement of increased time off (paid holidays, parental leave, part-time working etc) and presenteeism (i.e. hanging around for reasons of status rather than productivity).
The high profile of the latter gives rise to middle class angst summed up by The New Yorker recently as: how did we get so busy? The answer is that work time is increasingly a positional good, i.e. a "fictitious commodity" distinct from labour power. The emblematic forms of this appear first at the margins, in feminised job-shares and nepotistic interns, dissolving the relationship between labour and status. For high-status (and predominantly male) roles, the "always on" joys of corporate email have partially replaced the traditional arenas of performative loyalty and politicking down the pub, club or golf course. A parallel change has been the increasing professionalisation of management - i.e. the construction of a pseudo-scientific discipline through MBAs and rule-books - which means that footloose executives often lack an understanding of a business and value their ignorance of operational detail ("it's about strategic vision"). Rent-seeking remains a growth area of the economy.
Conventional economics sees no mystery in the failure of the 15 hour week to arrive, though it remains fascinated by Keynes's idea. A 2010 MIT collection of essays, Keynes Revisited, quoted in the New Yorker article, explained this failure variously: human inventiveness creates ever more new products, so we keep working to acquire more new stuff; we also work to achieve self-actualisation, which may just mean avoiding our families and meeting new sexual partners; and how much we work is conditioned by social norms as much as practical need, hence the sensual French have more holidays than puritanical Americans. These are essentially sociological explanations, which might seem odd coming from economists, until you consider the influence of rational choice theory (Gary Becker was a key contributor) and that the essays were written in the immediate aftermath of 2008, when conventional economics was trying to avoid catching our eye.
The influence of Becker leads to the assumption that the hours we work are a matter of personal choice, a trade-off necessitated by the inescapable scarcity of time. This might be dismissed as just a determination to keep the concept of scarcity at the heart of orthodox economics at a time of growing capital abundance, but I think it also relates to the ideological importance of competition, present in the idea that we have many demands competing for our attention. A key feature of the digital economy is its dependence on our time as both a scarce resource and a tradable commodity: we spend time with apps as much as we spend money; we pay for free services with our time and our liminal attention; we seek to turbo-boost our social connections, replacing inefficient face-time with instant messages and broadcasts. The boundary between work and leisure has been blurred by social media. This presents a challenge to both Keynes and his critics: where does work stop and non-work start?
This is part of a wider cultural turn since the 1970s centred on the idea of competition: not just with each other in an economic or quasi-economic context, but with ourselves in the internalisation of biopower (diets, gym, lifelogging), or through fetish objects (from Tamagotchi to smartphones) and the status competition of social media. As Will Davies notes: "At key moments in the history of neoliberal thought, its advocates shifted from defending markets as competitive arenas amongst many, to viewing society-as-a-whole as one big competitive arena. Under the latter model, there is no distinction between arenas of politics, economics and society. To convert money into political power, or into legal muscle, or into media influence, or into educational advantage, is justifiable, within this more brutal, capitalist model of neoliberalism".
Inequality is then the rightful product of competition. However, this is not the same justification as the "superior rewards to talent" deployed in response to Thomas Piketty, but a more brutal acceptance that winning "by any means necessary" is its own justification. This can be thought of as the oligarch's defence, and has obvious implications for the effectiveness of democracy. It also has implications for the working day. If you believe you are engaged in a competition at all levels of activity - from the absurd "global race" through commercial rivalry to empire-building and jockeying within a company - and that this competition has no rules or constraints, not only will corporate malfeasance be normalised and "gamification" accepted as routine, but you will be constantly working the angles to preserve or improve your own position as "the entrepreneur of yourself". In such circumstances, taking time off is tantamount to conceding the game.
But there is a problem here, an incipient crisis. If automation continues to replace labour with abundant capital, if we reach "peak jobs", working time will inevitably become scarcer, even if we engineer more supernumerary roles. The current ideological valorisation of "hard-working" may be nothing more than the anxiety of musical chairs. What do we do once the music stops? A job guarantee scheme cannot work as more than a stop-gap, unless we want to commit to the creation of a new helot class in perpetuity, alongside an employment elite who are constantly at work in the same sense that the aristocracy once was. A guaranteed basic income might be a more palatable solution for capital, as it does not necessarily compromise ownership or power, but this means it would probably be framed as a subsidy to labour (i.e. a dole), rather than a social dividend (i.e. the individual's share of technological progress), so the tendency towards the parsimonious is likely to lead to social conflict.
If we think of a basic income as "buying off" labour (i.e. a market intervention, akin to a set-aside scheme), the principle of social equality will, ironically, lead to that time being priced at a minimum level - we won't buy back the time of the skilled at a higher price. This means we will buy off the time of the poor (the un- and under-employed) first, so retaining high work hours will be entrenched as a sign of status ("scarce" workers, aka "talent", will be exempted altogether). Buying back time on a pro-rata basis across all jobs would actually be better in the long run (it's a transitional measure that would gradually disappear), but it's a non-starter politically in the short run: why should a lawyer's free time be worth more than a binman's?
Another option is to revive popular capitalism by giving workers shares in the robots as their hours decline ("Workers can earn more of their income from capital than from working - by owning part of the robots that replace them ... rather than rely on government income redistribution policies"), but this looks about as credible as previous attempts at popular share-ownership under Thatcher and Yeltsin, which were clearly never intended to dilute actual capital accumulations. Unless the purchasing power of declining wages increases, shares will soon migrate to cash and ownership will once more be concentrated in the hands of the rich.
The point to remember about Keynes's 1930 essay, and the reason it still resonates, is that it was a defence of capitalism, in the form of technological innovation and the wonders of compound interest, at a time when many were questioning it in light of the 1929 crash. Keynes believed that if it were "managed" (through population control, technocracy and the avoidance of war - i.e. Edwardian liberal shibboleths), then it would surely deliver the "bliss" of a 15-hour week. This ignores the fact that all changes in working hours, from the Victorian Factory Acts to the EU Working Time Directive, have depended on legislation and enforcement by the state. Working time is a social construct, and thus a political choice.
That choice is between a competitive, market-based solution, which cannot but immiserate the majority to preserve the privileges of a minority, or a cooperative solution, which cannot avoid the same outcomes unless it is applied equally across all jobs, which means socialism. Given the ideological investment in our flexible labour market by all political parties since the 1980s, and the continuing obeisance to the interests of "business", it is likely that the former path will be the one taken, with the politics reduced to quibbling over the degree of immiseration. In that sense "austerity", at a point in history when we are collectively richer than ever and perfectly capable of spending our way to both higher growth and greater equality, is looking like a fixture of the century rather than a temporary aberration.
Friday, 4 July 2014
Match of the Day
The 1982 World Cup in Spain was the first that enthralled me as a tournament (1974 and 1978 were both a bit grim), largely because of the fascinating contrast in styles, nowhere better exemplified than in the semi-final between France and Germany, though the subsequent final between the Germans and a victorious Italy was also compelling (Marco Tardelli's manic goal celebration etc). Today's France-Germany clash will do well to stand comparison with a match that finished 3-3 after extra time, featured Schumacher's infamous "bumming" of Battiston, and was the first World Cup match to be settled by a penalty shootout.
France were pretty ordinary at the start of the tournament, beginning with a defeat to England and Bryan Robson's first minute goal. The unusual format, with two group stages before the semis, allowed them time to get their increasingly fluid game together, based around the midfield of Platini, Giresse and Tigana. Though they lost the semi-final, they would go on to deservedly win the European Championship in 1984. Germany had also been unconvincing, losing to Algeria in the first group stage and drawing with England in the second, but would go on to reach a further two finals in succession, winning in 1990. Though the flowering of the game after Italia 90 owed much to economic and social change, the 1982 tournament in Spain, and that semi-final in particular, was a key step in the aesthetic rehabilitation of football that occurred during the 80s.
If form is any guide, this could be a corker (so assume I've put the mockers on it). For the Arsenal fan, Germany must be favourites (3 squad players versus 2 - Sagna having left the building), but the romantic in me will be just as happy if the French win it after an epic struggle. Today's other quarter-final, Brazil versus Colombia, looks difficult to call, largely because it will probably depend on how well Neymar and Rodriguez play on the night. This highlights a difference between the American and European teams, namely that the former are heavily dependent on a single player (Argentina without Messi would be distinctly average), while the latter each have a handful of game-changers, some on the bench. For this reason, I suspect we may see a European team win a tournament in the Americas for the first time.
Though only one of France and Germany can go through, both squads, and those of Belgium and Holland, look to have greater depth and flexibility than Brazil, Colombia, Argentina and Costa Rica. My hunch is that Germany have the edge, mainly because they have more "square pegs" who can spring a surprise, such as Muller, Ozil and Gotze, which is interesting in itself given the national stereotype of precision engineered components. Germany look endearingly shambolic, from their makeshift full-backs, through Mertesacker's lack of a third gear, to Thomas Muller's socks. Up against Jurgen Klinsmann's Californicated chinos and teeth in the last 16 round, Joachim Low's impression of an anxious metrosexual restaurateur was a hoot.
In contrast, the French look like they mean business, closer to Les Paras than the Musketeers of 1982. Didier Deschamps is the embodiment of "obdurate", but the team's success owes much to the rapid forward movement from midfield of Matuidi, Valbuena and Pogba. They have a determination that echoes Platini & co. I suspect Germany will sit deep this time, rather than risk a high line and reliance on Neuer's eccentric sweeping, though this will allow France to bring on Giroud later to exploit crosses and produce layoffs. Germany will need some hard running in the middle of the park, so how they use Khedira and Schweinsteiger could prove crucial to the result. It's a fascinating prospect.
Tuesday, 1 July 2014
Monkey Tennis
The victims of Stuart Hall, Max Clifford and Rolf Harris have talked about the mesmeric nature of fame, and how it can make you unable to resist abuse while simultaneously grateful for the attention. The same mechanism was exploited by Jimmy Savile, but his is a more important case because his career of abuse depended on connivance and willed ignorance by others in authority. In this sense, he has more in common with Rebekah Brooks, despite her innocence of phone-hacking, in terms of modus operandi.
Hall and Harris are examples of the petty abuse of power - the corruption common to minor functionaries and bullies - opportunistically taking advantage of the impressionable and vulnerable who crossed their paths. Clifford was more parasitical, exploiting his proximity to celebrities while exhibiting loathing of himself (the obsession with his knob) and many of his erstwhile clients. The common theme is opportunity and a lack of moral scruple where temptation was concerned. In contrast, Savile and Brooks saw power in instrumental and transactional terms and set out to amass it, the one through a strategy of self-promotion and the other through relentless networking. The distinction is between the idea of celebrity as an intrinsic quality - personal talent or prestige - and a form of capital that is accumulated.
As the media for projecting a personality have increased in scope and number over time, celebrity has not concentrated but proliferated. Instead of Big Brother we have the Z-list. Celebrity has become a profuse commodity. In their different ways, both Savile and Brooks appreciated that the growth in bandwidth meant an increasing demand for content. Savile was a hustler and a "self-punter", forever making grandiose claims and pushing self-interested schemes. It was like Alan Partridge and Monkey Tennis but without the humour. Brooks was ultimately the product of changing economics, as falling paper sales led to a turn away from expensive hard news towards celebrity gossip and synthetic anger. An irony of the phone-hacking trial is the extent to which the tabloids came to rely on premium-rate phone lines, from in-house competitions to sex-chat adverts. That, rather than the vulnerability of voice-mail, was the significant impact of mobile phone technology on newspapers: it changed the revenue dynamic.
Jeremy Hunt recently insinuated that Savile's long history of abuse in NHS hospitals was the product of institutional failure and personal negligence, and thus consistent with the government's critique of the organisation. He was predictably coy about the evidence that hospitals turned a blind eye because of the fund-raising and publicity that Savile's involvement brought. Suggesting that dependence on external finance might be corrupting is not politically helpful when you're trying to open up the NHS to private providers. Savile wasn't indulged because he was "just Jimmy", but because he was important to revenue. This was transactional.
Savile's abuse was not the product of his actual celebrity. Rather he parlayed his limited celebrity as a third-rate DJ and charmless laughing-stock into a career as a charity fund-raiser. It was this that provided the opportunity to gain privileged access to vulnerable victims, and also access to other power networks that could advance and protect him (Christmas with Thatcher etc). His appetite for public service campaigns (Clunk-Click, The Age of the Train) was mirrored in his walking from Land's End to John O'Groats three times: literally national coverage. Jim'll Fix It was an inspired idea, from his perspective. Initially, the BBC did have concerns about what was being "fixed", but in terms of the risks of product placement by conniving businesses. Throughout his life, Savile was a fixer and influence-peddler, and someone who instinctively understood the commodity nature of celebrity.
Rebekah Brooks's defence in the phone-hacking trial boiled down to persistent ignorance in her role as editor. She had to construct a personality for the jury that was both incompetent (she knew nothing) and sympathetic (she was distracted by fertility treatment, the breakup of her first marriage, her anti-paedophile campaign etc). What this left was the reality of a relentless operator who cultivated relationships for instrumental ends and was incapable of distinguishing between sentimentality and ethical principles. The post-verdict speech - "I've learnt some valuable lessons and hopefully I'm the wiser for it" - was straight out of the embarrassed politician's playbook.
Brooks's rise and subsequent fall have fascinated the broadsheets, who despise her for not being a "real" journalist and for being too much the schmoozer (and Blair's mate). The flame-haired witch trope and the many testaments to her flirtatious charm are the sort of jejune sexism you can expect from ex-public schoolboys who go weak at the knees when a woman touches their arm. Some of the contempt is self-deluding romanticism: an affront to the ideal of the principled, campaigning editor from CP Snow to Lou Grant. Some of it is class contempt: the conflation of the tabloid readership's presumed gullibility and prurience with the amorality of the journos and their seedy sub-contractors. Some of it is unease because her career reveals that running a newspaper is just a branch of PR, which perhaps gives an insight into the empathy that Cameron had for both her and Coulson.
Being a propaganda organ is hardly novel for a newspaper, but this is now pursued on an "industrial scale" by a global corporation like News Corp / Fox News, rather than being simply the byproduct of institutional prejudice. It is noticeable how many "campaigns" in the popular press these days are confected: an over-promoted combination of the unobjectionable and the marginal. We have moved from resonant social issues, such as Rachmanism and Thalidomide, to the hyperbolic sentimentality of Sarah's Law and Our Brave Boys. This is partly economics again: the tabloids are averse to funding major investigations while the broadsheets find it easier and cheaper to operate closer to home, hence the bias towards government scandals (cash for questions, MPs' expenses etc) and the incestuous relationship with The Met, who artlessly provide both investigatory legwork and their own dirty linen. Corporate power has never had it so easy, despite the periodic chuntering about tax avoidance.
David Cameron's apology in respect of Andy Coulson suggests a man who didn't trouble to enquire too closely into his appointee's past (which feeds the stereotype of the PM as a lazy chancer), but a better interpretation would be that his willed ignorance was simply the product of being at ease with a man he understood. Coulson's attraction was not his empathy for the common man (feeding the stereotype of the cabinet as out-of-touch elitists), which is hardly a rare gift, but his connections with News International and his proven media-management skills. This was transactional. Rupert Murdoch has emerged from the scandal commercially stronger, largely because of the defensive separation of newsprint from TV. The money spent on defending Brooks, Coulson et al is chicken-feed if it provides an effective firebreak against a prosecution under the US Foreign Corrupt Practices Act.
The different verdicts for Coulson and Brooks reflect their different operational involvement in "editing": the one a hack, the other a networker. Brooks's wilful ignorance about what was going on in her own newspaper would be amusing if it weren't typical of executive behaviour: covering her arse through "empowerment" and plausible deniability. Whereas Coulson was interested in celebrity purely as a commodity, something he could transform into stories and circulation, Brooks was seduced by a belief in her own intrinsic power. She was a user as well as a dealer, trading up through the celeb hierarchy from Piers Morgan via Ross Kemp to Blair and the Chipping Norton set.
There is a contrast to be made here with George Entwistle, a BBC lifer who was technically well-qualified for the Director General role but completely at sea when trying to handle the politics outside the walls of Broadcasting House: the exact opposite of Brooks. In his case, empowerment was his undoing. An experienced arse-coverer would have issued an email to all staff on day one stating that in light of the Savile allegations any stories concerning sexual abuse would have to be cleared with the DG's office before broadcast. This would have been contrary to his instincts and experience, but exactly the sort of thing that an astute politician would have done.
When confronted with the crimes of Savile, some contemporaries, like Esther Rantzen, have excused their blindness by claiming that it was a more innocent era with different standards of behaviour. This is self-serving nonsense. Sexual abuse is not a recent invention but a common aspect of power relations. What has changed since the 1970s is that authority has shrunk in some institutional areas, such as teaching and medicine, while it has grown in other marketised areas, such as management and celebrity. This is a product of neoliberalism, which has delivered enhanced power to executives while imposing greater inspection on workers. This shift in authority is reflected in changes in opportunity and temptation: more cocaine in the boardroom, less petty pilfering. The result is that there are fewer cases of teachers abusing pupils (though these few get blanket media exposure), but more cases of headteachers abusing funds (routinely hushed-up outside the public sector).
The growth in cases of historic and contemporary sexual abuse by celebrities reflects a growth in the quantum of newsworthy "slebs" and a parallel growth in the demand for salacious content by both bored consumers and cash-strapped media. There has not been a shift in morality. What has happened is the arrival of multi-channel TV, the Internet, and the symbiotic recycling of these two by newspapers. We live in the age of Monkey Tennis.
Hall and Harris are examples of the petty abuse of power - the corruption common to minor functionaries and bullies - opportunistically taking advantage of the impressionable and vulnerable who crossed their paths. Clifford was more parasitical, exploiting his proximity to celebrities while exhibiting loathing of himself (the obsession with his knob) and many of his erstwhile clients. The common theme is opportunity and a lack of moral scruple where temptation was concerned. In contrast, Savile and Brooks saw power in instrumental and transactional terms and set out to amass it, the one through a strategy of self-promotion and the other through relentless networking. The distinction is between the idea of celebrity as an intrinsic quality - personal talent or prestige - and a form of capital that is accumulated.
As the media for projecting a personality have increased in scope and number over time, celebrity has not concentrated but proliferated. Instead of Big Brother we have the Z-list. Celebrity has become a profuse commodity. In their different ways, both Savile and Brooks appreciated that the growth in bandwidth meant an increasing demand for content. Savile was a hustler and a "self-punter", for ever making grandiose claims and pushing self-interested schemes. It was like Alan Partridge and Monkey Tennis but without the humour. Brooks was ultimately the product of changing economics, as falling paper sales led to a turn away from expensive hard news towards celebrity gossip and synthetic anger. An irony of the phone-hacking trial is the extent to which the tabloids came to rely on premium-rate phonelines, from inhouse competitions to sex-chat adverts. That, rather than the vulnerability of voice-mail, was the significant impact of mobile phone technology on newspapers: it changed the revenue dynamic.
Jeremy Hunt recently insinuated that Savile's long history of abuse in NHS hospitals was the product of institutional failure and personal negligence, and thus consistent with the government's critique of the organisation. He was predictably coy about the evidence that hospitals turned a blind eye because of the fund-raising and publicity that Savile's involvement brought. Suggesting that dependence on external finance might be corrupting is not politically helpful when you're trying to open up the NHS to private providers. Savile wasn't indulged because he was "just Jimmy", but because he was important to revenue. This was transactional.
Savile's abuse was not the product of his actual celebrity. Rather he parlayed his limited celebrity as a third-rate DJ and charmless laughing-stock into a career as a charity fund-raiser. It was this that provided the opportunity to gain privileged access to vulnerable victims, and also access to other power networks that could advance and protect him (Christmas with Thatcher etc). His appetite for public service campaigns (Clunk-Click, The Age of the Train) was mirrored in his walking from Lands End to John O'Groats three times: literally national coverage. Jim'll Fix It was an inspired idea, from his perspective. Initially, the BBC did have concerns about what was being "fixed", but in terms of the risks of product placement by conniving businesses. Throughout his life, Savile was a fixer and influence-peddler, and someone who instinctively understood the commodity nature of celebrity.
Rebekah Brooks's defence in the phone-hacking trial boiled down to persistent ignorance in her role as editor. She had to construct a personality for the jury that was both incompetent (she knew nothing) and sympathetic (she was distracted by fertility treatment, the breakup of her first marriage, her anti-paedophile campaign etc). What this left was the reality of a relentless operator who cultivated relationships for instrumental ends and was incapable of distinguishing between sentimentality and ethical principles. The post-verdict speech - "I've learnt some valuable lessons and hopefully I'm the wiser for it" - was straight out of the embarrassed politician's playbook.
Brooks's rise and subsequent fall has fascinated the broadsheets, who despise her for not being a "real" journalist and for being too much the schmoozer (and Blair's mate). The flame-haired witch trope and the many testaments to her flirtatious charm are the sort of jejune sexism you can expect from ex-public schoolboys who go weak at the knees when a woman touches their arm. Some of the contempt is self-deluding romanticism: an affront to the ideal of the principled, campaigning editor from CP Snow to Lou Grant. Some of it is class contempt: the conflation of the tabloid readership's presumed gullibility and prurience with the amorality of the journos and their seedy sub-contractors. Some of it is unease because her career reveals that running a newspaper is just a branch of PR, which perhaps gives an insight into the empathy that Cameron had for both her and Coulson.
Being a propaganda organ is hardly novel for a newspaper, but this is now pursued on an "industrial scale" by a global corporation like News Corp / Fox News, rather than being simply the byproduct of institutional prejudice. It is noticeable how many "campaigns" in the popular press these days are confected: an over-promoted combination of the unobjectionable and the marginal. We have moved from resonant social issues, such as Rachmanism and Thalidomide, to the hyperbolic sentimentality of Sarah's Law and Our Brave Boys. This is partly economics again: the tabloids are averse to funding major investigations while the broadsheets find it easier and cheaper to operate closer to home, hence the bias towards government scandals (cash for questions, MPs' expenses etc) and the incestuous relationship with The Met, who artlessly provide both investigatory legwork and their own dirty linen. Corporate power has never had it so easy, despite the periodic chuntering about tax avoidance.
David Cameron's apology in respect of Andy Coulson suggests a man who didn't trouble to enquire too closely into his appointee's past (which feeds the stereotype of the PM as a lazy chancer), but a better interpretation would be that his willed ignorance was simply the product of being at ease with a man he understood. Coulson's attraction was not his empathy for the common man (feeding the stereotype of the cabinet as out-of-touch elitists), which is hardly a rare gift, but his connections with News International and his proven media-management skills. This was transactional. Rupert Murdoch has emerged from the scandal commercially stronger, largely because of the defensive separation of newsprint from TV. The money spent on defending Brooks, Coulson et al is chicken-feed if it provides an effective firebreak against a prosecution under the US Foreign Corrupt Practices Act.
The different verdicts for Coulson and Brooks reflect their different operational involvement in "editing": the one a hack, the other a networker. Brooks's wilful ignorance about what was going on in her own newspaper would be amusing if it weren't typical of executive behaviour: covering her arse through "empowerment" and plausible deniability. Whereas Coulson was interested in celebrity purely as a commodity, something he could transform into stories and circulation, Brooks was seduced by a belief in her own intrinsic power. She was a user as well as a dealer, trading up through the celeb hierarchy from Piers Morgan via Ross Kemp to Blair and the Chipping Norton set.
There is a contrast to be made here with George Entwistle, a BBC lifer who was technically well-qualified for the Director General role but completely at sea when trying to handle the politics outside the walls of Broadcasting House: the exact opposite of Brooks. In his case, empowerment was his undoing. An experienced arse-coverer would have issued an email to all staff on day one stating that, in light of the Savile allegations, any stories concerning sexual abuse would have to be cleared with the DG's office before broadcast. This would have been contrary to his instincts and experience, but exactly the sort of thing that an astute politician would have done.
When confronted with the crimes of Savile, some contemporaries, like Esther Rantzen, have excused their blindness by claiming that it was a more innocent era with different standards of behaviour. This is self-serving nonsense. Sexual abuse is not a recent invention but a common aspect of power relations. What has changed since the 1970s is that authority has shrunk in some institutional areas, such as teaching and medicine, while it has grown in other marketised areas, such as management and celebrity. This is a product of neoliberalism, which has delivered enhanced power to executives while imposing greater inspection on workers. This shift in authority is reflected in changes in opportunity and temptation: more cocaine in the boardroom, less petty pilfering. The result is that there are fewer cases of teachers abusing pupils (though these few get blanket media exposure), but more cases of headteachers abusing funds (routinely hushed up outside the public sector).
The growth in cases of historic and contemporary sexual abuse by celebrities reflects a growth in the quantum of newsworthy "slebs" and a parallel growth in the demand for salacious content by both bored consumers and cash-strapped media. There has not been a shift in morality. What has happened is the arrival of multi-channel TV, the Internet, and the symbiotic recycling of these two by newspapers. We live in the age of Monkey Tennis.