Sunday, 29 July 2012
The unintentionally hilarious comments on the Olympics opening ceremony by various right-wing curmudgeons have been revealing. Most attention has gone to Aidan Burley's attack on "leftie multi-cultural crap", but a review of the rest shows a shared worldview that extends beyond a dislike of urban music and miscegenation. The NHS is seen as "socialist" and a "nationalised stranglehold", and its celebration as a party political broadcast for Labour.
The NHS isn't socialist. It's fundamentally a liberal institution, owing more in its design to Bismarck than Marx. Though hospitals are nationalised, the pharmaceutical and medical supply industries are not, nor is hospital construction, which means private profit is drained from the public sector. Doctors remain private practitioners, with little incentive to invest in preventative care. There is no workers' control, the privatisation of ancillary services has been common for decades, and local democratic oversight is now non-existent. Of course, if you believe "socialist" is synonymous with "not privately-owned", then I can see why the confusion arises.
My favourite comment was Rupert Murdoch's: "London Olympic opening surprisingly great, even if a little too politically correct", implying that there might be an appropriate level of political correctness on such an occasion. What, I wonder, would make it less PC in Rupe's eyes? An all-white cast, a celebration of the entrepreneurial spirit of drug dealers, a tableau of famous Sun headlines? Political correctness is a straw man, so this comment is just an example of how it tends to be crowbarred into the conversation at every opportunity. Coincidentally, I came across an interesting quote on the subject from the misanthropic right-wing psychiatrist Theodore Dalrymple (a sort of minor-key Céline):
"Political correctness is communist propaganda writ small. In my study of communist societies, I came to the conclusion that the purpose of communist propaganda was not to persuade or convince, nor to inform, but to humiliate; and therefore, the less it corresponded to reality the better. When people are forced to remain silent when they are being told the most obvious lies, or even worse when they are forced to repeat the lies themselves, they lose once and for all their sense of probity. To assent to obvious lies is to co-operate with evil, and in some small way to become evil oneself. One's standing to resist anything is thus eroded, and even destroyed. A society of emasculated liars is easy to control. I think if you examine political correctness, it has the same effect and is intended to."
Enforced silence and "assent to obvious lies" are characteristic of the classic Orwellian trope of self-repression in a totalitarian society. This not only "humiliates" the individual, but corrupts them to a point where their ability to resist anything is "destroyed". The problem with this is that it doesn't accord with reality. Did 40 years of Communism result in the people of Eastern Europe losing their sense of probity "once and for all"? Did they "in some small way" become evil? Were they easier to control in 1989 than in 1949?
The segue implies that political correctness should be seen as an organised propaganda effort, with some shadowy bureaucracy coordinating a masterplan. I bet they even use spreadsheets and hold team meetings. Central to this effort is the dissemination of lies and the enforcement of silence on particular topics. Thus political correctness prevents us from criticising Islam for misogyny, or suggesting that rap culture glorifies crime, or that poor people are congenital failures. Nope, you'll never see mention of any of that in newspapers or online.
It's easy to laugh at this paranoid conflation of the Stasi and political correctness, but the point about silence is suggestive. If we are living in a society that is repressed by PC, then the evidence for this would include an unwillingness to talk about certain subjects. Not just minor issues that we can push to the margins, but big issues that affect most people. So what are the things we don't speak of?
We praise democracy and even seek to "spread" it to other countries, sometimes through war, but we have an aversion to it in the workplace. The overwhelming majority of businesses are run as dictatorships. Literally. The word of the guy at the top is law. Dissent will result in sanctions and ultimately expulsion. We secure advancement through flattery and groupthink. The guy at the top doesn't really know what is going on because he is out of touch and we fear speaking truth to power (we even celebrate this in reality TV shows). In this, the NHS is no different to a private business.
One of the themes of the Olympic love-in has been admiration for the opening night supremo, Danny Boyle. Many of the anecdotes tell of his willingness to listen to anyone, his inclusivity and humility. He is, in other words, the ideal boss, and as such he's about as representative as Mary Poppins is of child-minders. Talking of the cast and crew, he said "The show belongs to them, the country belongs to them." A nice sentiment, but neither is true.
Saturday, 28 July 2012
London Calling
I've been conscious for a while that this blog has been a bit light on the music. Even the Arse has taken a back seat to politics and economics, though that reflects the strange times we are living through. So, I had vaguely planned a rambling roundup of the stuff I've been listening to of late, when I was given an elbow-in-the-ribs by last night's telly. Not only was the Olympic opening ceremony better than the Stars on 45 / Jive Bunny melange that we feared, but BBC4 had an excellent documentary on Krautrock at the same time. Were they trying to bury it?
Coincidentally, I had been listening this week to the best of Can (Anthology 1968-93). I had arrived at this after following a musical chain reaction that started with the collected works of the Buzzcocks, progressed via Magazine, then took a back-flip via Iggy Pop (Lust for Life and The Idiot) and David Bowie (Low) before alighting on Holger Czukay (the sublime Movies and On the Way to the Peak of Normal), from which I bifurcated forward to Jah Wobble (I Could Have Been a Contender) and backward to Can, with odd excursions to Captain Beefheart, Pere Ubu and Wire along the way. As you can see, I like the weird stuff.
At t'Olympic Park, there was no doubting the power of Going Underground, Pretty Vacant, Heroes, and I Bet You Look Good on the Dance Floor, and even Macca wasn't too awful, though mainly because he wasn't Elton John. Danny Boyle's slightly more grown-up than usual treatment of history will no doubt go down a treat, though the gaps were telling: plenty of suffragettes, not so many trades unionists; a parachuting Queen, but no mention of Cromwell (ironically, Michael Wood broached the 17th century earlier in the evening on BBC2, and made a passionate case for remembrance of the Levellers).
The focus on the NHS and music was both popular and clever, distracting attention from the diminishing role of industry (you can't dance a hedge fund) and the absence of a modern-day Brunel (Tim Berners-Lee really isn't comparable and Dizzee Rascal is not a renaissance man). Ultimately, it will all be forgotten as we fixate on the German doing the Nazi salute, with Camilla and Boris pissing themselves in the background. What larks.
The Krautrock documentary took a more sober view, noting the effect that superficial denazification had on cultural life and the political development of the Red Army Faction, aka the Baader-Meinhof gang. One member of the gang, Astrid Proll, would end up hiding out in London till her arrest in 1978, at a time when The Clash were wearing RAF emblems for shock value. London Calling was a year away.
On the music front of more recent vintage I would recommend Sharon van Etten: Serpents and Give Out; Dirty Projectors: Gun Has No Trigger; Torche: Kicking; and Washed Out: Amor Fati. I do like Frank Ocean's Pyramids, but I still prefer his Songs for Women.
Friday, 27 July 2012
Enough with the Flying Cars, Where did the 15-hour Week Go?
Paul Krugman got in on the flying car lament this week: "If you look at what futurists were predicting 40 or 45 years ago, they somewhat underpredicted progress in IT (except for the artificial intelligence thing), but wildly overpredicted progress in dealing with the material world. Weren’t we supposed to have underwater cities, commercial space flight, and flying cars by now?" His tongue was obviously wedged in his cheek, but the point about the difference between IT and material science is significant.
Futurists are clairvoyants. In other words, they use cold reading techniques to pick up clues from the here and now in order to make suggestive guesses, so future predictions tend to reflect contemporary concerns and expectations. In the 60s, underwater cities and space flight were reasonable extrapolations of developing technology and also reflected both the positive New Frontier vibe and concerns about over-population. By the 70s, the Green Revolution had eased the latter and attention shifted to fears about energy (following the oil shock of 1973), environmental stress and social breakdown (as globalisation and deindustrialisation kicked in), with a pervading sense that technology was a double-edged sword and perhaps beyond human control (Future Shock).
Underwater cities found no takers (it's easier to use subsea robots), while space flight has devolved to strip-mining asteroids (the new New Frontier). The problem isn't doing it, but finding a viable reason to do it. Today we speculate about the singularity, bioengineering and immersive virtual-reality entertainment systems, which (I think) reflects a belief that a post-work/post-scarcity world is coming, assuming we sort out limitless energy.
And flying cars? Traffic management is bad enough in two dimensions without adding a third (planes need air traffic controllers, cars don't). And how exactly will the car fly? You can't generate enough speed to produce lift in a 30mph zone, and anti-gravity simply isn't going to happen in this particular quantum universe. The future is probably driverless electric cars and automatic flow management, which holds out the prospect of increased capacity with fewer jams and less pollution. A bit dull, really, but it would avoid the need for Olympic lanes, and it might even reintroduce drink-driving.
The flying car lament is so pervasive that you have to suspect some ideological resonance. The opening title sequence of The Jetsons is significant not because of the flying car, but because of its use as a way to get to work, drop the kids off at school, and drop the wife off at the store (this was the early 60s, just before second cars became common in the US), a sequence parodied by The Simpsons (with two cars). Flying cars look futuristic, but they're part of a highly conservative worldview in which we still labour for 40 hours a week, the wife might work but is still primarily a "homemaker", and the kids get a high-quality education so they too can progress to full-time jobs.
John Maynard Keynes, in his 1930 essay Economic Possibilities for our Grandchildren, envisaged a future in which technological advance and the wonders of compound interest applied to capital accumulation would allow us to reduce the working week to 15 hours within 100 years. Despite being a high-minded member of the Bloomsbury set, who went on to become the founding chairman of the Arts Council, he was cautious about prescribing how we should "live wisely and agreeably and well", but chamber music, good books and country walks probably featured. This circumspection lives on among modern advocates, though comments about the "good life" and sustainability are indicative of the moral foundation. We should be honest and cut the value judgements altogether. If you want to spend your days playing on your Xbox and reading Heat, then so be it.
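The compound arithmetic behind Keynes's prediction is worth spelling out. Here's a back-of-the-envelope sketch (the 2% growth rate is my own assumption for illustration, not a figure from the essay, and I've used the 40-hour week as the baseline simply because it's the reference point elsewhere in this post):

```python
# Illustrative only: if output per hour compounds at 2% a year for a
# century, how short could the working week be while still producing
# what a 40-hour week produced at the start?
growth = 0.02   # assumed annual productivity growth
years = 100

productivity_multiple = (1 + growth) ** years   # roughly 7.2x over the century
hours_for_1930_output = 40 / productivity_multiple

print(f"output per hour multiple: {productivity_multiple:.1f}")
print(f"hours needed to match a 1930 40-hour week: {hours_for_1930_output:.1f}")
# Roughly 5.5 hours -- so even a 15-hour week would leave us nearly three
# times better off in material terms than the 1930 baseline.
```

On those numbers, the 15-hour week is, if anything, a conservative prediction.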
Many proposals to reduce the working week present it as a trade-off with growth, i.e. a zero-growth model means shorter hours, and vice versa. I think this is wrong. I suspect we can have both positive growth and shorter hours. The trope of techno-pessimism has been with us a long time. Keynes opens his essay with "It is common to hear people say that the epoch of enormous economic progress which characterised the nineteenth century is over; that the rapid improvement in the standard of life is now going to slow down". That was in 1930, remember. Since the 1970s, I believe we have been offsetting rapid technological advance by creating supernumerary white-collar jobs at the same time as we have automated or offshored blue-collar ones. This has served to depress productivity growth rates and maintain the standard working week, and has also contributed to stagnant median wage growth.
This strategy can be interpreted as a compact between capital and the middle class. Though the former would prefer to maximise profit and thus accumulation, it needs the support of the key electoral bloc to maintain the economic order. Job creation for the middle class is tolerated as a quid pro quo, a form of clientelism. The key difference between Northern and Southern Italy is that "jobs for the boys" (and girls) are mainly through the private sector in one and the public sector in the other. That geographical distinction has become more apparent in the UK over the last 15 years: more hospital administrators in Leeds and more corporate social responsibility managers in London.
As work has increasingly become a token for access to economic rent, ideology has taken on a more moralistic tone centred on just deserts. If you are unemployed, you probably deserve it ("there are plenty of jobs"); if you are poor, it's your own fault ("that's the market rate"); if you're on benefits, you don't deserve them ("they're all cheats"). At the other end of the scale, bonuses are paid for turning up to work, regardless of company performance ("you have to retain the talent"). The 40-hour week isn't necessary for the economy as a whole, but it is necessary to preserve the unequal distribution of work.
If work were rationed, it would be fairer to spread it equally across all those who wish to work, but it would also make sense not to allocate it to those who won't or can't make use of it, as that just wastes an opportunity for someone else. If Brussels sprouts were rationed, I'd be happy to forgo my portion for someone who actually likes them. A rationed approach would inevitably lead to a basic income model, i.e. an unconditional living wage for all. This would allow working hours to be reduced to their underlying (real) level of productivity, while maintaining workers' income. However, that would also mean paying the feckless, which would cause many to gag (like me with sprouts). In truth, we always pay them anyway. We just humiliate them before we allow them to not starve on our watch.
A 15-hour working week is a lot more feasible than flying cars, but it stands no better chance of being implemented any time soon. To do so would require a more egalitarian approach to work than any mainstream political party seems prepared to advance, largely because of the fear of a moral backlash. The demonisation of benefit recipients is less about reducing public expenditure and more about preserving the loyalty of those in work, so you can expect it to get worse. Unlike future predictions, divide and rule never ages.
Saturday, 21 July 2012
des Engländers Angst vor dem Elfmeter
Are penalties a fair way to decide a football match? Are England just plain unlucky? I was reading an article that looked at their poor record (which in passing mentioned a wonderful sounding research paper on The effect of rugby match outcome on spectator aggression and intention to drink alcohol) and noticed a helpful link to a detailed analysis conducted after they made a total "Ashley" of it back in June. What caught my eye was not England's abysmal record (a 14% win ratio) but the almost equally bad record of the Dutch (only 20%).
This should be evidence enough that the problem is not lack of technique. The only slightly better record of the Italians (38%) also shows that familiarity is not decisive either. Italy took part in 8 shootouts over the period analysed (from 1990), compared to England's 7 and Holland's 5. Germany took part in 6 but have a win ratio of 83%. I think we can also dismiss the idea that because England do more running (they don't - this is just an assumption based on the poor technique belief), they are more tired and thus less effective at the end of extra-time. Everyone is shattered.
Looking at the scoring rates, both England and Holland miss 1 in 3 of their penalties, while Germany only miss once in roughly every two shootouts. Conversely, England's opponents only miss 1 in 5 while Germany's miss 1 in 3. On the face of it, this implies that the "problem" is equal parts scoring and stopping. However, the norm seems to be a scoring and conceding rate of about 80%, i.e. you miss 1 in 5 and so does the opposition. Spain and France show that this will produce an overall win ratio of around 50%, which is what you should expect if the shootout were a lottery. Given that England also only concede 1 in 5, this points the finger back at scoring, though this leaves you wondering why Germany's opponents are regularly sub-par.
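The "lottery" baseline is easy to sanity-check with a quick simulation. This is my own sketch, not anything from the linked analysis, and the per-kick conversion rates fed in are just the rough figures quoted above:

```python
import random

def shootout_win_prob(p_us, p_them, trials=100_000):
    """Estimate the chance that 'we' win a shootout, given per-kick
    conversion rates for each side: best of five, then sudden death."""
    wins = 0
    for _ in range(trials):
        us = them = 0
        # Best-of-five phase. Simulating all five rounds for both sides
        # gives the same winner as stopping early once it's decided.
        for _ in range(5):
            us += random.random() < p_us
            them += random.random() < p_them
        # Sudden death: keep taking paired kicks until one side is ahead.
        while us == them:
            us += random.random() < p_us
            them += random.random() < p_them
        wins += us > them
    return wins / trials

print(shootout_win_prob(0.80, 0.80))  # ~0.5 -- the 'lottery' baseline
print(shootout_win_prob(0.93, 0.67))  # a Germany-like mismatch: roughly 0.9
print(shootout_win_prob(0.67, 0.80))  # an England-like mismatch: roughly 0.3
```

In other words, equal 80% rates really do produce a coin-toss, while a few percentage points either way per kick compound into a very lopsided win ratio.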
The evidence, I think, supports the "bottling it" theory, but not in the sense that players simply lack courage. Just being on the pitch requires that. England regularly fail at penalty shootouts because they expect to. It's worth remembering that their only competitive success since 1990 was against Spain in the quarter-final of Euro 96. I happened to be at that game at Wembley (I was a business freeloader), and I think it's fair to say that everyone's expectation was that England would get to the final. It should also be remembered that in the semi-final, England were running well above their trend level, having scored 9 in a row (4 against Spain, 5 against Germany). Their failure was the result of a single miss at the start of sudden death against a team that normally doesn't miss until that stage.
Germany's record is unusual in that they have both a good scoring (93%) and conceding (69%) rate. The two don't often go together - e.g. the Czechs have a 100% scoring rate but concede 84%, while Portugal concede only 55% but score an average-looking 75%. Of course, this highlights that teams who compete in only a few shootouts (3 for the Czechs, 2 for the Portuguese), because they make it to the knock-out stage less often, stand a better chance of a high win ratio overall. Brazil, with a 60% win ratio over 10 shootouts, has an average scoring rate (75%) and a good conceding rate (68%).
What I think these figures may show is that not only are the Germans confident they'll go through, but their opponents seem pretty resigned to the same outcome. In contrast, Brazil's opponents seem slightly more confident in the Selecao than Brazil do themselves. England's opponents are not more confident than average (with the exception of the Germans). Since 1996, they have faced (and failed against) Argentina, Portugal (twice) and Italy. Portugal have a good record, but that is entirely the result of their two contests with England.
The recent match against Italy was typical of the pattern. Everyone was pleased that England had done better than expected, but few genuinely thought the team capable of going any further, and that opinion seemed to be shared by the players, to judge by their performance over the game. Because England are hard to beat conclusively in a knockout match, penalties are always a likely outcome, and they tend to come at the point when the team has run out of ideas and, consequently, of the belief that it can step up another level. Last 8 seems to be the realistic limit of ambition. In that sense, you have to say that penalties are fair.
So, my conclusion is simple and based on irrefutable evidence. England only win penalty shootouts when I'm in the stadium. If the FA would like to fly me out to Brazil in 2014, I would be happy to oblige.
Friday, 20 July 2012
Cities in Flight
Apparently, "if the entire population of the planet – estimated to have passed 7 billion last year – lived like the residents of Tower Hamlets or Kensington and Chelsea, they would all fit in an area the size of France". The point being made concerns population density, and the fact that despite over 50% of the world's people now inhabiting towns and cities, urbanisation remains relatively diffuse. Mega-cities are very much the exception. For example, only 8% of the US population live in cities larger than 1 million inhabitants. The image of a concreted over France got me musing about sci-fi cities and how they differ from real cities. What I'm mainly thinking of is the self-contained unit, or arcology, with high-density and a precisely defined boundary. This is a common trope of sci-fi, from glass-domed Martian biospheres to Judge Dredd's Mega City One. Sometimes the boundary serves to shield the population from a hostile environment; sometimes it is the encompassing wall of a prison. This precision of the city limits has allowed it to slip its planetary moorings to become mobile or even morph into a spaceship (James Blish's Cities in Flight). |
What this indicates may be nothing more than a simple prejudice on the part of SF writers against suburbia, the familiar zone between the relative excitements of city and country, where most of them grew up. While some got to grips with urban sprawl after the 1960s, this tended to be used as a backdrop to resource depletion and social breakdown, which was a realistic concern in New York and some other cities in the 70s. This culminated in the 80s in the iconic vision of a future Los Angeles in Blade Runner, an adaptation of Philip K Dick's Do Androids Dream of Electric Sheep? Since then attention has wandered past regenerated city centres and suburbia to the edgelands, where the urban frays into a confusion of slip-roads, light industrial parks and scrub land, celebrated in the works of JG Ballard, Iain Sinclair and Patrick Keiller.
The desire to precisely mark territory goes back to prehistory, but it appears to be less about defining the state (a modern concept) and more about defining the transition between the sacred and profane in religious complexes. This evolved into the liberties and sanctuaries of medieval cities, and the beating of parish bounds. Through ghettos, phalansteries and kibbutzim, to modern gated communities, the desire to mark "in" from "out", the community from the other, is obviously a social and political imperative. This reached an apogee of sorts in cities divided, rather than surrounded, by walls, such as Berlin, Belfast and Jerusalem. These cities were nothing if not grounded, pinned down by concrete and metal bindings.
Today, in London, we have the spectacle of the Olympic lanes, which seem both physical and yet strangely virtual. Not only do they shift in and out of existence in places, as the reality of traffic flow defeats the intention, but it appears that no earthly power is responsible for them. If we all lived in a France with the housing density of Tower Hamlets, you can still imagine Le Tour taking place, but we'd never be able to cope with Olympic lanes. Of course, this might provide the ideal opportunity to permanently site the games at a rebuilt Olympia (the one in Greece, not West London), a sort of sporting Center Parcs in the deep countryside. IOC grandees and other junket-wallahs would then have no trouble getting to their plushly upholstered seats, and the rest of us wouldn't have to get out of their way.
Wednesday, 18 July 2012
Way Out East
So I got out. Today the wife and I took a trip on the recently opened Emirates Airline, the rather grandiosely-named cable-car that runs from North Greenwich across the Thames to the Royal Docks. Predictably, the sky looked like somebody's grubby fingers had been all over it, and a slight but steady drizzle did its best to lubricate all moving things. It's still a cracking view, mind.
We were on the way to the Compressor House gallery, to see the exhibition of David Bailey's East End photos. I had vaguely thought of walking along the docks to get in the mood, but the realisation that this would only provide an uninterrupted view of the Excel Centre, together with the drizzle, persuaded us to board the DLR for a few stops. The area seemed to be heavily populated by Olympic volunteers, all dressed in their purple and vivid red jackets, though not all wearing their regulation beige slacks (which do look a bit middle-aged, to be fair).
Bailey's selected photos are made up of work from the 60s, 80s and recent years. The earliest include a number of large-scale colour prints of cockernee characters, including those nice boys, the Krays, plus the inhabitants of various boozers and card clubs. They're amusing, but they try too hard. I imagine they were originally intended for photo essays in magazines: "How the other half lives", that sort of thing. The better works are the black and white shots of decrepit areas and their equally ruined denizens, though these too shade into caricature at times. There are old Jewish shop signs aplenty, and images of Brick Lane that wouldn't have been out of place in the Lodz ghetto. It's obvious Bailey was keen to capture what was visibly disappearing through slum clearance and city development.
The 80s shots are concentrated on the Royal Docks area. This was the fallow period between the docks' decline and the LDDC regeneration, and before London City Airport arose in Silvertown just across from the Compressor House. The photos are appropriately depopulated except for a handful featuring Bailey's new wife in ironic, noirish compositions, dressed to the nines and holding a handgun (a small echo of the Krays, perhaps). The skies are generally lowering and the buildings sinister. Not wholly unlike today.
This is the set of photos that shows Bailey in his professional pomp, wholly in control of the subject and able to extract visual interest from even the bleakest environment. They lack the exuberance of the 60s set, and the wit of his more recent work, but they have a cold and compelling beauty to them. The use of black and white might appear regressive, even nostalgic, but it's better suited to what are essentially sculptural subjects, old pubs in otherwise flattened landscapes, dock gates and railings, and the serried ranks of cranes in the background.
The recent photos are large and in colour, many picking up the vibrant hues of saris and modern shop frontages, but interspersed with junk and jetsam from earlier eras. There's no theme here, as these are random shots after all, but there is an air of sadness. Heads are usually turned away from the camera, in contrast with the face-on 60s work, and sometimes cropped out of shot altogether. There is also a single panel of photos of Bailey's family from the 40s and 50s, including the first picture he took, a group shot on the beach. These look like the record of a destroyed Mitteleuropa. His first recognisable art photo is a shot of his sister, Thelma, in silhouette. This photo hints at the faceless compositions to come.
The sequence in chronological order moves from family, through community, via deserted wasteland to multiple communities and personal isolation. You get the sense that Bailey has ambivalent feelings about the evolution of the East End. It's a good exhibition, worth the trip (particularly via the cable-car), but do take a brolly.
Tuesday, 17 July 2012
I'm seeing things
You know those optical illusions where a vase turns into two faces? I was reminded of that when looking at some charts on the age structure of the UK population, which run from 1971 and include projections by the Office for National Statistics up to 2085.
1971 looks like a Dalek, largely because of the dent in the population aged around 53 in that year (i.e. people in their early 20s who died in the war) and the protuberance of people who were 24 (i.e. the product of lots of sex in 1946).
If you run the chart animation forward to today, you see Tony Hancock in a homburg or perhaps it's Telly Savalas as Kojak in a trilby. By 2032 it's turned into a gorilla. 2085 just looks like a pot belly, with a fine embonpoint for both male and female.
I need to get out more. Damn that rain.
Sunday, 15 July 2012
As far as I could throw him
I've noticed a mini upswell in concerns about the erosion of trust lately. This is unsurprising given the background of the LIBOR scandal, phone-hacking, MPs expenses and the like, and the more recent panic over border controls and Olympic security. It appears the Army may have become the irreducible kernel of public trust in British society, which is worrying given the way that sort of thinking tends to pan out. It was not so long ago that the Egyptian Army was being lauded as the defenders of the revolution. I doubt we're headed for a military coup, but the fact that we struggle to find others worthy of our trust is telling.
The degree to which trust has collapsed in the banking industry was nicely illustrated by the shop-front of the "Bank of Dave" (which isn't a bank). Otherwise known as Burnley Savings & Loan (which makes me think of the Bailey Building & Loan in It's a Wonderful Life - perhaps deliberately), this sold itself not just on old-fashioned values but explicitly on "Captain Mainwaring". What this shows is that trust is much more than a contractual adornment between consenting adults. It's a proxy for wider social relationships, for solidarity and shared values, for culture.
From the beginning of the 2008 crisis trust was seen as a critical component of capitalism that had somehow been mislaid, and policy was framed in part as an exercise to restore it. Thus the banks must trust each other and lend on the interbank market to prevent a credit crunch. Whereas confidence and demand are seen as effects, i.e. behavioural responses to the economic situation that fluctuate naturally, trust is seen as a first-order element, a cause, that must be present in the beginning. Without trust, how could a market mechanism evolve?
Trust is an interesting concept in a market society as it extends beyond the transactional to the realm of faith. Indeed, there is something fundamentally suspect about it, if you follow the thinking of Friedrich Hayek and the Austrian School, because it presumes tacit knowledge beyond the rationally calculable. For them, trust must be reduced to a commodity whose value can be determined by the market. The homo economicus model assumes that each party pursues their self-interest and makes balanced decisions in respect of the marginal utility of each transaction. But trust is not explicitly defined by the terms of a contract, nor is it contingent for each transaction. Trustworthiness is deemed to be a quality of the trustee that is general and persistent. In late Victorian society, one's good name was considered to be a core value that transcended the merely transactional, while the terrifying obverse of this was the character whose morality (and name) shifted depending on the transaction: Doctor Jekyll one moment, Mister Hyde the next.
In the modern era, personal core values have been subsumed (or suppressed) by corporate core values that mean nothing beyond the maximisation of shareholder return. Thus Bob Diamond loves Barclays. He identifies with the corporate entity and sees his trustworthiness as an extension of the corporate whole, the Barclays' Borg. In another age, a banker would be expected to exhibit a more distanced (more aristocratic) relationship, valuing his own honour and good name above the day job. My point is not that the old merchant banker model was better, but that it located trust in the social realm, independent of the contingencies of commerce.
Trust is developed in a social context, through relationships and prior behaviour, but its history is generally tacit and its exercise generally implicit. The greater the trust, the less it is openly referred to. The acme of trust is Caesar's wife. Not only can she be trusted, not only is she seen to be trustworthy, but we should not even need to question it. This offers an interesting contrast with social media, which claims to provide quantifiable trust through behavioural tracking and the wisdom of the crowd (your seller rating on eBay etc). In fact, this is just translating trust into a tradeable commodity a la Hayek.
What one might call high quality trust depends on proximity (we talk of trust in terms of distance: "wouldn't trust him an inch"). This means frequent interaction, the knowledge that you'll have to face the people you deal with again, the fear of a bad reputation among your peers etc. Of course such a degree of trust is not always available, particularly in a globalised market, so you (as a supplier) will seek to mitigate the risk that I (as a buyer) may fail to pay you by checking on my reputation. In the past, this meant relying on the opinion of others (whom you trust), who had experience of dealing with me. In the modern era, this essentially social process (which depended on formal and informal networks from the chamber of commerce to the golf club) became increasingly commoditised through the development of the credit rating industry.
The recent financial crisis has called into question the usefulness of credit ratings, notably in respect of CDOs and other derivatives, but the unreliability of such data goes all the way back to the industry's origins in the 19th century. Credit ratings have never been accurate predictors of future outcomes. How could they be? So why do we invest them with so much authority?
The reliance on credit rating agencies went well beyond pragmatic use, i.e. acceptance that they were the least worst option available. Indeed, credit ratings were raised to the status of holy writ, not just in respect of financial securities but in respect of sovereign viability. We have today the bizarre spectacle of nation states responding to a rating agency downgrade like a teenager who has been dumped, and agency ratings being written into regulations despite the agencies' own reluctance to bear such a burden of authority.
The growth of the credit rating industry over the last 50 years is seen as a symptom of generally benign commercial growth, and in particular the growth in credit, but perhaps we should see it more as a growth in the level of mistrust and the desperate search for authority, for faith, in an increasingly marketised society.
We, as individuals, never used to have a formal credit rating because we never had commercial debt, i.e. credit cards and mortgages (in most cases). Our reputation tended to be little more than the consensus view of the street we lived on. Now, we are encouraged to worry about and tend our public credit rating. It has become a mysterious object, an alienated commodity in its own right, that exists independently of us and seems at times both unrecognisable and uncontrollable, like Mister Hyde appeared to Doctor Jekyll. It seems we can't even trust ourselves.
Friday, 13 July 2012
Still doomed, slightly less ill
The Office for Budget Responsibility has published its annual update to the Fiscal Sustainability Report. I wrote a piece earlier this year on the 2011 report, and in particular the way it was used by Newsnight's resident Tory stooge, Allegra Stratton, to assure us that the welfare state was doomed. I made three key points:
1) The OBR's brief means it cannot make assumptions about anything that is not a direct extrapolation of current policy. Thus it can note that some revenue streams will decline (fuel and tobacco duty), but it cannot make provision for new or substitute taxes in the future (e.g. a land value tax or a tax on jet-packs). This means its projections on future tax revenues are understated.
2) The OBR assumes increased longevity would result in increased morbidity - i.e. every extra year of life would be spent in illness. But the reason we are living longer is because we are becoming healthier, so this does not follow.
3) The belief that health sector wages will grow faster than productivity is based on the premise that the health sector is relatively labour-intensive and will remain so. However, the past is not always a guide to the future. It is just as reasonable to anticipate future productivity gains coming through greater preventative care (labour effectiveness) and more investment in technology (labour efficiency).
This year's report attempts to address points 2 and 3 in an appendix. I don't imagine Robert Chote reads this blog, so I must assume lots of other people (presumably including some who know what they're talking about) made the same points, which is why he and his team have taken the trouble.
The OBR now accepts (B.26) that an increase in longevity may not lead to an expansion in morbidity but may instead lead to "an increase in healthy life expectancy". No shit, Sherlock. The grudging improvement in costs is about 1% of GDP. However, they exact their revenge by suggesting that their assumptions about health productivity last year may have been at the optimistic end of the spectrum, and that consequently their projections should now assume lower growth and thus higher costs that more than offset the life expectancy gain (B.35).
The central premise behind their pessimism over health productivity is the labour-intensive nature of the industry and its consequent susceptibility to Baumol's Cost Disease - i.e. capital investment to improve productivity through automation runs at a below-average rate, while wages have to keep pace with the rest of the economy, so pay tends to rise faster than the sector's own productivity and unit costs ratchet up. The OBR's difficulty in pointing to an agreed historical record of health productivity highlights part of the problem. There's no specific product, nor is there a market price for public health that stands proxy for that product. Measures of health activities and opinions on quality are fuzzy at best and open to manipulation at worst.
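To make the mechanism concrete, here's a toy illustration of my own (not the OBR's model, and all the growth rates are assumptions picked for round numbers): if wages in every sector track economy-wide pay but only one sector enjoys much productivity growth, the unit cost of the labour-intensive service drifts inexorably upwards relative to everything else.

```python
# Toy Baumol illustration: both sectors share the same wage growth, but
# productivity grows faster in manufacturing than in health care, so the
# unit labour cost of health care rises relative to manufactured goods.
years = 30
wage_growth = 0.03            # assumed economy-wide wage growth
manuf_prod_growth = 0.03      # assumed productivity growth, manufacturing
health_prod_growth = 0.01     # assumed productivity growth, health care

wage = manuf_output_per_hour = health_output_per_hour = 1.0
for _ in range(years):
    wage *= 1 + wage_growth
    manuf_output_per_hour *= 1 + manuf_prod_growth
    health_output_per_hour *= 1 + health_prod_growth

# Unit labour cost = wage bill per unit of output
print(f"manufacturing unit cost after {years} years: {wage / manuf_output_per_hour:.2f}")  # ~1.0
print(f"health care unit cost after {years} years:   {wage / health_output_per_hour:.2f}")  # ~1.8
```

Whether health care is really condemned to the slow-growth line forever is precisely what's at issue in the next paragraph.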
For me, the biggest flaw is that their model simply extends the past, which means the continuation of what remains at heart a Victorian model of health care (the dominance of hospitals and high-status consultants), which was itself a replica of a medieval religious model (charity hospitals, severity and priests). The foundation of the NHS famously required that Nye Bevan allow this model to continue in order to secure the doctors' cooperation. What was added was a modern bureaucracy. I was amused, watching a documentary on the human gut last night, to see that a consultant doing high-tech keyhole surgery was still being referred to (even by the narrator) as "Mister", which in the NHS is superior to mere "Doctor". But I digress. The point is that the future may look radically different.
In respect of the first point, declining tax revenues, this year's report still has its hands tied with regard to new or substitute taxes. However, they have started to speculate about the future beyond simple extrapolation of recent trends. Unfortunately, this is limited to the prospect of two further revenue declines.
The first area of concern is the impact of globalisation on corporation tax: "global corporation tax rates have been on a declining trend as governments around the world compete to attract mobile profits and capital. If a similar pattern were to persist whilst the UK headline rate remained unchanged, the incentive to draw profits away from the UK would reduce corporation tax receipts over time". (section 4.58, pg 114). Luckily, the Tories are cutting corporation tax from 28 to 22 per cent over the life of the parliament. However, further reductions to maintain a differential with other leading countries would be counter-productive as the loss of domestic tax revenue would be greater than the gains from profit-shifting (i.e. foreign profits immigrating and domestic profits not emigrating) (section 4.37, pg 106). Looks like George Osborne has got it spot on.
In plain English, the OBR thinks the UK corporate tax rate shouldn't go below 22%. As they expect other leading countries to stay around 28%, this means in effect that we're committed to a policy designed to attract profits (and foreign direct investment) to shift to the UK. Happy days for the City and its tax haven offshoots. The clear trend, which the financial crisis has done nothing to arrest, is for more and more profits (and thus capital) to be promoted to a de facto supra-national realm. This calls into question the capability of nation states to manage global capital.
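To see the shape of the trade-off, here's a back-of-the-envelope sketch (the figures are invented purely for illustration; they are not OBR or HMRC numbers):

```python
# Hypothetical illustration of the corporation tax trade-off (all numbers invented).

domestic_profit = 200.0   # £bn of profit already booked and taxed in the UK
current_rate = 0.22       # the post-cut headline rate
proposed_rate = 0.20      # a further hypothetical cut

# Revenue given up on profits that would have been taxed here anyway
revenue_lost = domestic_profit * (current_rate - proposed_rate)

# Extra profit that would have to shift into the UK just to break even
breakeven_inflow = revenue_lost / proposed_rate

print(f"Revenue given up on existing profits: £{revenue_lost:.0f}bn")
print(f"Profit that must migrate here to break even: £{breakeven_inflow:.0f}bn")
```

On these made-up numbers, a further two-point cut only pays for itself if an extra £20bn of profit migrates to the UK, which is, in essence, the counter-productive outcome described above.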
The second area of concern is that "another possible effect of globalisation has been to reduce the price of tradeable goods relative to other goods and services. Most tradeable goods are subject to the standard rate of VAT, so if international trade were to exert downward pressure on such prices, and households spent relatively less money on such goods as a consequence, VAT receipts would fall modestly as a share of GDP". (section 4.58, pg 114). In other words, commodity deflation.
This is likely to be a red herring. If we have more money left over after buying ever-cheaper iPads, we'll probably spend it on one of those new-fangled jet-packs, so the percentage of disposable income that attracts VAT need not change. As the recent reports on the minimum acceptable standard of income show, the material wealth of the developed world continues to race ahead. Commodity deflation (more precisely commoditisation) simply shifts goods from the category of luxury, to nice-to-have, to necessity. As this occurs, new luxuries are developed to back-fill demand at the premium end of the market. The mobile phone has made that journey in little over a couple of decades.
If material wealth is increasing, why is public spending unsustainable? The real issue is one of distribution. The concern over declining VAT receipts masks a fear of continuing stagnation in median wages. If broad inflation remains ahead of wage inflation, this will largely be because of increases in the cost of food, petrol, utilities and housing. As not all of these attract VAT, increases in the cost of these necessities will leave less in the household budget for VATable items.
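The household arithmetic can be sketched like this (purely illustrative figures, mine rather than the OBR's): if largely non-VATable necessities inflate faster than wages, the VAT take falls relative to income even though no rate has changed.

```python
# Illustrative household budget (invented figures). Food, rent and utilities are
# largely zero-rated or exempt; the residual discretionary spend attracts VAT.

VAT_RATE = 0.20

def vat_paid(gross_vatable_spend):
    # a gross spend of X at the standard rate contains X * 0.2/1.2 of VAT
    return gross_vatable_spend * VAT_RATE / (1 + VAT_RATE)

income, necessities = 100.0, 60.0
print(f"Before: VAT = {vat_paid(income - necessities):.2f} on income of {income:.0f}")

# A few years on: wages up 5%, necessities up 20%
income *= 1.05
necessities *= 1.20
print(f"After:  VAT = {vat_paid(income - necessities):.2f} on income of {income:.0f}")
```

On those toy numbers, VAT paid falls from about 6.7% of income to about 5.2%, with the standard rate untouched.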
The reductions in corporation tax, and similar rich-friendly measures, together with median wage stagnation, will result in widening income inequality. It is this which puts pressure on public expenditure and leads to the ideologically-motivated claims that the welfare state is unsustainable. These are policy decisions, not inescapable facts of nature.
Thursday, 12 July 2012
A degree is for life
Should there be a relationship between further education and the retirement age? Specifically, if you get a degree, should you be expected to work longer?
This train of thought was triggered by a new McKinsey report on the future global labour market. As the work of a cheerleader for neoliberalism, this is interesting for the assumptions it makes. The key one is that we face a challenge to increase productivity due to the supply of educated young workers declining, in relative terms, as a result of an ageing population. This is obviously not helped by an absolute decline in student numbers, which may be the consequence of the introduction of tuition fees.
Education is the magic bean of productivity in McKinsey's worldview: "Advanced economies will need to double the pace at which the number of young people earning college degrees is rising—and find ways to graduate more students in science, engineering, and other technical fields". But for the non-college-educated, problems loom. We need to "create more jobs for those who aren’t as highly educated". Solutions include "finding opportunities for workers without a college education to participate in fast-growing fields—such as health care and home-based personal services". The servant class looks like it might be about to enjoy a comeback.
The central demographic assumption is that the numbers of skilled workers joining the labour force will be offset by increasing numbers leaving it for retirement. It is for this reason that tertiary education needs to be expanded at an even faster rate, as well as adopting complementary strategies to further increase female participation, re-train workers mid-career and (of course) defer retirement. This last point emphasises that the motivation for increasing the retirement age is not solely about cutting anticipated pension costs. It's also about extending the useful life of the asset.
There is no explicit mention of class, unsurprisingly, though you can see the references to college education as a proxy. As I've mentioned before, extending the state pension age is a class issue as it imposes an unequal burden. Similarly, the effect of increased longevity on the labour market has a strong class dimension. “We’re all living longer” and “we’ll all have to work longer” mask the reality of widening inequality in both longevity and working years. The better-off will retire earlier (because they can afford to) and will, if trends to date are anything to go by, enjoy faster growth in longevity than the poor. Thus the higher your education and skills (the more middle class you are), the earlier you will retire on average. Together with the deferred start to your working life, because you were in further education, this means that your working years are likely to be 5-10 fewer than those of a manual worker who left school at 15 or 16.
Given McKinsey’s warnings about the skills crunch and the exacerbation of this by oldies leaving the jobs market, it would actually make more economic sense to increase the retirement age for people with tertiary education faster than the rest of the population, or even to limit the increase to just this cohort. That would increase the number of skilled workers as a percentage of the active population, would decrease the number of the unskilled, and would increase the average wage and therefore tax revenues.
As a quid pro quo, we could dispense with student loans. Instead of going into debt, you just accept a deferral of your state pension, with the additional years of tax and avoided benefits offsetting the initial investment. According to the ONS, the longevity difference between the top and bottom socio-economic classes is 5 years, so we could keep the standard retirement age at 65 and increase it for those who went to college to 70.
The wage differential for those with tertiary education (compared to those with GCSEs) is 45%, rising to 85% for those with a degree. Working an extra 5 years (or forgoing the state pension for that period - you might still choose to retire at 65 if you have a private pension) seems a reasonable exchange for 44 years earning roughly one and a half to twice as much as you'd otherwise have got, particularly if you received your education for free and skipped 5 years of NICs.
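As a rough check on that exchange, here's the arithmetic spelt out (the degree premium and working life are the figures used above; the £20,000 non-graduate wage and £6,000 annual state pension are my own round-number assumptions, not ONS figures):

```python
# Back-of-envelope check on the proposed trade (illustrative assumptions throughout).

non_graduate_wage = 20_000      # assumed median wage without a degree (my figure)
degree_premium = 0.85           # the 85% differential quoted above
graduate_wage = non_graduate_wage * (1 + degree_premium)

working_years = 44              # the working life used above
extra_lifetime_earnings = (graduate_wage - non_graduate_wage) * working_years

state_pension = 6_000           # assumed annual state pension in today's money (my figure)
deferred_years = 5              # retiring at 70 rather than 65
pension_forgone = state_pension * deferred_years

print(f"Extra gross earnings over a working life: £{extra_lifetime_earnings:,.0f}")
print(f"State pension forgone by deferring five years: £{pension_forgone:,.0f}")
```

Even on those crude numbers, the graduate premium dwarfs the deferred pension, which is why the swap looks like a reasonable exchange.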
Tuesday, 10 July 2012
Official: old people not worth tuppence
The kite-flying on benefits by Nick Boles, which is generally deemed to be indicative of government thinking, coincides with another gripe-fest about the quality of ICT teaching in schools. There is a connection between the two.
The ICT debate has been framed in largely instrumental terms. Kids need to learn coding because that will lead to future prosperity: "Digital literacy must take its place alongside reading, writing and numeracy as a really valuable component of general economic success". The Guardian's roundtable discussion, sponsored by those anti-evil peeps at Google, identifies teachers as part of the problem: "The point was made that in secondary schools many of the existing cohort of ICT teachers started out teaching business studies – they had little or no coding expertise and may be reluctant to teach the new curriculum". This highlights the danger of an instrumental curriculum, which is what is being proposed. Again. I recall doing "Commerce" at school in the 70s, which meant preparing us for a clerical role rather than understanding trade flows. The world it envisaged, which wouldn't have been unfamiliar to Bob Cratchit, was transformed by IT. Of course, you cannot teach around technologies that have yet to be invented, but that is precisely why instrumental education is problematic.
Public education always exhibits a tension between the desire of the idealists to teach children how to learn and the desire of politicians and business people to teach those practical skills that industry needs now. The latter is vulnerable to redundancy, and arguably at an increasing rate. This same tension is exhibited in Nick Boles's suggestion that public expenditure should be judged on its economic value, its return on investment. Thus resources should be prioritised for workers, to help improve productivity, instead of being spent on un-productive people such as the elderly or children: "Productivity and competitiveness are my lodestars because I am convinced that the only way we can restore sustained improvement in living standards is if most working people in Britain can command high and steadily increasing wages in the market place."
This rhetoric helps explain the real background to the debate that counterposes universal and means-tested benefits. While many old people fear the latter because they remember the abuse and arbitrariness it gave rise to, not to mention the deliberately cultivated shame, most young people struggle to see why we should spend public money on benefits for the elderly rich. The point is that capital has an interest in the state providing benefits for workers only. The NHS keeps workers healthy, therefore boosting productivity. Similarly, state education provides employees who have already been partially trained for the job.
Where neither of these exist, capital must either accept unhealthier and less able workers, which means lower productivity and thus lower profits, or it must invest itself in health care schemes and training. The paternalism of the Victorians is held up by many on the right as evidence that freed of the shackles of the state, charity (aka Big Society) would fill the void for public goods. In fact, as history shows, the pressure of competition between businesses means that some become free-riders, benefiting from the social investment of others.
The evolution of the welfare state could not have happened without the realisation of capital that state provision was preferable because it was more efficient and prevented free-riding. The decision to fund this provision through general taxation also meant that a large part of the cost could be recouped from the workers themselves, rather than being paid for by business. Ultimately, capital has no interest in any public expenditure on the elderly (unless they work) or any investment in education that isn't of measurable value to business. There is no ROI in the old, or in the teaching of critical theory. Means-testing provides a mechanism for the progressive erosion of such benefits.
Interestingly, Boles's thinking is already provoking criticism on the right because it reneges on the quid pro quo of contributory taxes such as NICs, though the purpose of this bleat appears to be merely to recommend that we all rely on private provision from the off. No doubt means-testing will be presented as a "fair" compromise short of the full horror of privatisation. You'd almost think they were in cahoots.
Sunday, 8 July 2012
On Novelty, Abundance and Destruction
The techno-Dämmerung theme ("Oh my God, we've stopped inventing stuff!") got another airing in a recent issue of the American magazine The Baffler. What was interesting about David Graeber's Of Flying Cars and the Declining Rate of Profit was his yoking together of technological disappointment with a critique of the current economic crisis.
Graeber starts with the usual list of stuff that failed to turn up (force fields, teleportation, Mars colonies) before suggesting that cynicism stalks the land. "Might the cultural sensibility that came to be referred to as postmodernism best be seen as a prolonged meditation on all the technological changes that never happened?" This is an intriguing thought, but I think the answer to the question is "no", for two reasons. First, "meditate" (as opposed to pissing about) is hardly the active verb you'd associate with postmodernism. Second, postmodernism was more practice than theory, in the sense that it was largely about stuff that was actually built/made/produced, however ironically it was done. The vast hinterland of the speculative that accompanied modernism (including the golden age of sci-fi) presumed an extensible set of rules that could colonise the future. Postmodernism, with its emphasis on the subjective and relative, questioned the existence of such rules. It focused on really existing modernism rather than potential.
Graeber interprets postmodernism as an attempt to simulate novelty as technology failed to deliver flying cars: the optimism of The Jetsons gives way to the cynicism of The Simpsons. He also believes that the technical advances of the post-WW2 period were increasingly biased towards technologies of simulation, among which he numbers IT and medicine, as opposed to the genuinely labour-saving technologies, such as robotics, that would have led to a post-work society (incidentally, he suggests that 95% of robotics research was devoted to military applications, such as drones). There's an element of techno-snobbery here, treating modern technologies (which are largely incomprehensible to non-experts) as somehow less honest than retro metal-bashing. If robotics isn't simulative, I don't know what is.
It is at this point he introduces the economic dimension, noting that IT and containerisation allowed "industrial jobs to be outsourced to East Asia, Latin America, and other countries where the availability of cheap labor allowed manufacturers to employ much less technologically sophisticated production-line techniques than they would have been obliged to employ at home" (his italics). This is nonsense. Foxconn do not manufacture iPhones using bamboo tweezers and chicken guts. Low-grade technology might have been tolerated at the beginning of offshoring, but the normal pressures of capitalism have meant investment and retooling by firms to stay ahead of the competition. The model is high-tech and low-wage, not low-wage instead of high-tech. The march of offshoring up the value chain from low-tech, labour-intensive jobs to high-tech, creative roles is already well advanced.
Graeber makes the mistake of comparing different eras of the past ahistorically. Thus Jules Verne and H.G. Wells could envisage flying machines and trips to the Moon, but our mid-twentieth century dreams of jet-packs and robot butlers have been unfulfilled. The point is that rockets and flight were already proven technologies when Verne and Wells wrote. The Saturn rocket was a huge advance on that of William Congreve (employed in the Napoleonic Wars), but it was a logical extrapolation. Similarly, the Wright brothers' powered flight was largely due to breakthroughs in glider control, in effect using technology that had been around for years. Jet-packs were eventually developed in the 1960s but proved impractical. It was easier to walk, or use a ladder. Likewise, robot butlers have got nowhere because there isn't the demand - humans remain cheaper and more effective. The problem isn't the technology, it's the sci-fi.
Marx's theory that capitalism suffers from a falling rate of profit over time (profit comes from the surplus value of labour, automation reduces the labour component of production, hence less profit) gets only a passing reference, despite its prominence in the article title: "if it is true, then the decision by industrialists not to pour research funds into the invention of the robot factories that everyone was anticipating in the sixties, and instead to relocate their factories to labor-intensive, low-tech facilities in China or the Global South makes a great deal of sense". If you accept the broader definition of a robot as an automated agent, rather than the narrower definition of an android, then at all times you are statistically closer to a robot than a rat. As already noted, industrialists have sought both automation and low-wages as a combined strategy to maintain the rate of profit. The move into offshore high-tech work is itself a further defensive strategy to maintain profit margins as Asian workers demand higher wages.
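For reference, the standard textbook formulation of that theory (this is the usual Marxian schema, not anything spelled out in Graeber's article):

```latex
% Rate of profit in the Marxian schema:
%   s = surplus value, c = constant capital (plant, materials), v = variable capital (wages)
\[
  r \;=\; \frac{s}{c + v} \;=\; \frac{s/v}{\,c/v + 1\,}
\]
% Automation raises the "organic composition" c/v; if the rate of exploitation s/v is
% unchanged, r falls. Relocating to cheap-labour, low-tech production keeps c/v low,
% which is the logic Graeber is gesturing at.
```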
Graeber, who is an academic (and author of Debt: The First 5000 Years), gets impassioned when he turns to another feature of the modern world, the growth in bureaucracy and modern management techniques, which he characterises as a process of marketisation, the constant need to sell yourself and your work: "You will spend your time writing proposals rather than doing research. . . . It is proverbial that original ideas are the kiss of death for a proposal, because they have not yet been proved to work. That pretty much answers the question of why we don’t have teleportation devices or antigravity shoes". Actually, I think the non-appearance of these two might have something to do with basic physics, such as the difficulty of disassembling and reassembling matter and the need for a force that does not fit into the Standard Model of particle physics. That said, his grumble about MBA bollocks and the internal market paradigm is sound.
His really interesting insight concerns the nature of neoliberalism, which he sees as a bureaucratic, managerialist construct, whose primary goal is to squelch dissent. He also sees it as consequently inimical to technological progress, largely because such progress might lead to demands for a post-work society, or at least a reduction in the working week and a fairer division of the spoils, which would challenge the interests that neoliberalism protects. The central tenet of neoliberal faith is that no other economic system is possible, and for that reason "it needs to suppress not just any idea of an inevitable redemptive future, but any radically different technological future. Yet there’s a contradiction. Defenders of capitalism cannot mean to convince us that technological change has ended—since that would mean capitalism is not progressive. No, they mean to convince us that technological progress is indeed continuing, that we do live in a world of wonders, but that those wonders take the form of modest improvements (the latest iPhone!), rumors of inventions about to happen (“I hear they are going to have flying cars pretty soon”), complex ways of juggling information and imagery, and still more complex platforms for filling out of forms".
I remain unconvinced that neoliberalism must suppress technological advance, though I agree a central feature of its ideology is the belief that "there is no alternative", and that this in turn may bias motivations and expectations in respect of research, such as the insistence that publicly-funded R&D must "pay its way". But this assumes that the hegemony is effective at directing invention, whereas all the evidence points to innovation being unpredictable and essentially accidental. Providing a fertile environment is obviously necessary, but beyond that we have to accept that creativity usually happens in the cracks, even when stimulated by carefully planned programmes and cooperative teams.
One of the myths of the modern age is the belief that major technical breakthroughs have been suppressed to protect vested interests, from everlasting suit material to free energy. Such crude conspiracy is not credible (in reality, someone would break ranks), but the trope does point to an ideological challenge to neoliberalism that might be referred to as the "coming abundance". You don't need to believe we will achieve a true post-scarcity society to recognise that current trends towards that state will continue for a while yet. Without a corresponding change to the social model, this will exert greater strains on the economy, as abundance leads to deflation, which in turn erodes profit, while the social consequences are persistent unemployment and underemployment. Indeed, there are grounds for believing that we may have reached this stage already and that the current banking crisis is symptomatic of it.
The burden of change will be unevenly distributed: "we can transition to an economy where we work more for ourselves, each other, and in teams. A logical consequence of the financial crisis but also one that could be construed as a silver lining is the rapidly growing rate of self-incorporation". This American sunny optimism ignores the class dimension. Low-wage workers, displaced by offshoring and automation, aren't setting themselves up as limited companies. This is a strategy for the already privileged middle classes (in the British sense of that term), and is usually as much about tax avoidance as it is about flexibility and lifestyle choice.
A common feature of these analyses, from both left and right, is the suspicion that if Marx's theory on declining profits is right, the only solution may well be the destruction of capital value. In other words, lots of bankruptcies and write-offs, or perhaps a major war. This economic tabula rasa would allow capital accumulation to begin again from a lower base, thus allowing for larger profit margins during the period of revived growth. The banking crisis (i.e. the banks' hoarding of capital and reluctance to lend), together with large corporate cash balances and corporate and private deleveraging, may simply represent a collective preparation for that moment of reckoning. The Eurozone crisis is shaping up to be the occasion for that moment across the globe.
Thursday, 5 July 2012
36 views of Mount Shard
Today sees the official inauguration of the Shard. Rather than cutting ribbons, this will involve lasers cutting the night air, picking out other London landmarks. Whether this is meant to symbolise architectural solidarity or Nuremberg-style triumphalism is not clear, but the main purpose is presumably a sales pitch for the as-yet unlet offices and apartments. The Shard, by virtue of its size, target clientele and foreign funding, is a handy emblem for the state of the capital.
In yesterday's Guardian, Simon Jenkins laid out the critics' case. While some of this was sound ("money trumping planning", though 'twas ever thus), some of it was just hysterical: "the Shard has slashed the face of London forever". This is a bad case of bonkers metaphor. The Shard is less like a razorboy's calling card and more like a unicorn horn. Perhaps he meant that the Shard itself is a razor, left sticking out of someone's forehead. Nope, still doesn't work.
Jenkins is the chairman of the National Trust, so you'd expect him to harp on about the organic unity of the London skyline: everything in its proper place, well-ordered and well-mannered. Preferably people as well as buildings. However, his nostalgia over the cityscape of Canaletto conveniently ignores almost 300 years of industrial and commercial development, including such skyline-dominating monstrosities (now well-regarded) as Bankside and Battersea power stations. No doubt there were many in the mid-19th century who cited Canaletto's view of London as they regretted all that ugly, modern Victorian architecture. Jenkins regrets that the Shard will ruin the view from Parliament Hill and Primrose Hill, without noting that the best views on the latter are from the top floors of Victorian terraced houses that now retail for millions.
Planning and architecture always have a class dimension. This point has come out very clearly in the excellent BBC series, The Secret History of Our Streets, which incidentally opens with an image of a dark, ugly Victorian city. Class has been centre stage in terms of misguided planning (well-meaning middle class planners poorly serving working class communities) and the process whereby streets "go up" or "go down" in terms of social class, a largely one-way movement in the modern era of gentrification. The recent episode on Portland Road in Notting Hill was particularly good on the starkness of the class divide and the by turns hilarious and depressing lack of interest or understanding between either end of the street. As one of the council tenants noted, it's as if there is an invisible barrier, a sense you get in many parts of London.
In last night's episode about Reverdy Road in Bermondsey, just down the road from the Shard, one of the older residents noted that the council had effectively followed a sons and daughters policy for council housing during much of the latter part of the 20th century, though he suggested this should be seen as intentionally parochial rather than de facto racist. Today the "respectable working class" of the road are being supplanted by middle-class incomers, rather than ethnic minorities, and are making a tidy packet in the process having bought their houses from the council in the 80s. The resident also noted that the council policy was strongly supported by Bob Mellish, the then Labour MP and a resonant name in recent London history. He was a social conservative, bluntly unsympathetic to anti-racism. Together with his vice chairmanship of the LDDC, this brought conflict with his constituency party as it became younger and more left-wing in the 70s. The result was the bitter 1983 by-election in which Peter Tatchell lost to Simon Hughes.
Mellish's failure to build alliances between the old and new working class, not to mention his facilitation of the Docklands property boom, was of a piece with the gradual social fragmentation of London that started in the 70s. Simon Jenkins' claim that the Shard is out of place assumes a unity in the city that has been lacking for decades now. A walk down Bermondsey Street, away from the Shard, is a good example of this. As it happens, I made this walk yesterday as part of a much larger perambulation around the big pointy building. For purely whimsical reasons, I decided to take 36 photos from different vantage points, in homage to the Japanese artist Hokusai's 36 Views of Mount Fuji.
The full series is here.
Up close the building dwarfs the nearby Guy's Hospital and City Hall, yet it doesn't feel quite as intimidating as you might expect, perhaps because it's so incongruous and sci-fi that you wonder whether it may just be CGI trickery. As you walk away, it quickly shrinks until you struggle to locate it on the horizon in Kennington, a mile and a half to the south.
Most people live their lives at ground level, and rarely get a complete vista of the London skyline, which is another reason why Simon Jenkins' complaints sound a little de haut en bas. As Bermondsey found, trying to defend against change can have the ironic effect of facilitating other, equally profound change. Perhaps in years to come we'll develop an affection for the Shard, like Parisians and the Tour Montparnasse, or perhaps we'll simply decapitate it, lopping off the luxury apartments and hotel on the upper floors, leaving a dead ringer for the Ministry of Truth.
Tuesday, 3 July 2012
I blame the job-hoppers
The debate about what to do in response to the LIBOR scandal has predictably begun to fragment. The government continues to try and limit the scope of any inquiry to specific behaviour and nebulous culture, while Labour pushes for Leveson 2. The advertised remit will be "to examine issues of transparency, conflicts of interest and the culture and standards of the financial services industry". Meanwhile, various commentators and insiders pursue their own hobby horses. The bone-dry Terry Smith, for example, continues to advocate the breakup of retail and investment banking, though it's worth remembering that as a broker his client base would increase in size as a result of this.
As I noted the other day, culture (normative behaviour) reflects the structure and purpose of the industry, not the moral habits of the people within it. Bob Diamond's departure this morning from Barclays will not materially affect the culture. Changing the cast will make no difference if you are still putting on the same play in the same theatre, particularly if other theatres continue likewise. It bears repeating that this "culture" is industry-wide, which raises the question: how are cross-company industry norms established and maintained? More specifically, how did collusion over the LIBOR rate across multiple banks come about? In the case of the City there are two structural factors that stand out.
The first is job-hopping. Moving to another bank/brokerage/trading house after a year or two is considered perfectly normal, particularly following the annual bonus round. It's no secret that banking recruiters are at their busiest in February/March for this reason. This regular movement helps establish and spread a common set of norms and values, and also spreads “innovative” financial practices (including, it would seem, fiddling the LIBOR rate). Management don't just tolerate this movement, they encourage it, hence the prevalence of head-hunting. The consequence is an incestuous industry that prides itself on being externally distinct but internally homogeneous, hence the common slang and social rituals, from the excess consumption of Bollinger to nights at Spearmint Rhino.
The second factor is physical proximity, which in turn facilitates the first factor. Investment banking and financial trading tend to concentrate in specific areas, usually with a single dominant location in each jurisdiction. Deregulation and globalisation have resulted in many more country stock exchanges opening up, but large-scale financial trading has been consolidated in just a few global centres, one of which is London's square mile and its Isle of Dogs annex. In the past, this concentration reflected proximity to merchants, with the result that banking culture and merchant culture overlapped considerably (originally they were one and the same).
Since the introduction of electronic trading in the 70s, and the coincident disappearance of merchant business on the doorstep (notably in both New York and London with containerisation killing the docks), the location rationale has been driven more by the legacy of regulatory privilege (e.g. the Corporation of London) and the proximity of legal, political and commercial elites. In practical terms, financial trading could operate just as well in Guildford, but you'll find few takers for relocation. Even hedge funds, which operate at a remove from day-to-day trading, have only gone as far as Mayfair.
The continued concentration in The City, and the absence of countervailing influences from outside, has led to an inward-looking monoculture within the financial sector. This in turn has fed back to influence (or infect) industry and commerce (notably through IPOs and mergers and acquisitions) and politics (through lobbying and funding, but also indirectly via regulatory capture).
When investment banks’ customers were primarily merchants and industrialists, the bankers needed to exhibit norms that secured the latter’s confidence and custom, hence the importance of “my word is my bond” and general probity. Similarly, when retail bankers focused on small businesses and middle class depositors (before the credit card boom in the 70s), they exhibited appropriate norms: your bank manager took a keen interest in your business, and you liked the idea that your bank was run by incorruptible Quakers.
An interesting subtext of the current fretting over City culture is the snide suggestion that the recruitment of oikish barrow boys since the 80s corrupted the blameless public school ethos that had been the norm before. This ignores who did the recruiting, not to mention the long history of banking scandals over the 19th and 20th centuries. Ironically, the one "alien" input that can legitimately be fingered is the conscious importation of US practices (of which Bob Diamond is emblematic) as the old London houses sold out during the wave of consolidation post-Big Bang.
Fully separating retail and investment banking would go some way to fixing the culture (though it’s worth remembering that culturally a high-street cashier is a world away from a forex trader anyway), but ultimately the rapacity of financial traders is the result of the widespread financialisation of the modern economy. Denying access to retail deposits would only make a small difference to them (though it would make the rest of us feel much more secure), given the huge amounts of money swilling around in pensions, insurance, equity markets and sovereign wealth funds. A lot of money = a lot of transactions = more commission. This is why I am sceptical of claims that the banking omniscandal will lead to the decline of The City and by extension do damage to the UK economy.
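To put some rough numbers on that (every figure below is invented, purely to illustrate the scaling), the intermediaries' take depends on how much capital is being churned and how often, not on whether the underlying assets do anything useful:

# A crude sketch of "a lot of money = a lot of transactions = more commission".
# All figures are made up for illustration; only the scaling matters.

def annual_commission(assets, annual_turnover, commission_bps):
    traded_value = assets * annual_turnover        # value of trades per year
    return traded_value * commission_bps / 10_000  # basis points to a fraction

slow_money = annual_commission(2_000e9, 0.5, 10)   # pension-style assets, rarely churned
fast_money = annual_commission(2_000e9, 20.0, 1)   # the same capital, churned constantly
print(f"{slow_money / 1e9:.1f}bn vs {fast_money / 1e9:.1f}bn a year in commission")

Same pot of capital, a tenth of the commission rate, yet four times the take, simply because it changes hands more often.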
The fundamental problem is a surplus of capital, all of it seeking a decent return. This surplus in turn reflects the imbalance in the global terms of trade over the last 30 years. I don't mean the terms of trade between individual countries, or even the (very real) imbalance in savings and consumption between developing and developed nations, but the terms of trade between the classes across all countries. I think it is widely accepted that the increase in global wealth over the period, through technological advances and productivity gains, has been disproportionately captured by the top end of the income scale. Or to put it another way, capitalists, rentiers and the supporting professions are a lot richer now than they were in the 70s. The rest of us are materially better off (because of technology), but relatively poorer in the sense that we haven't gained the full fruits of our labour and have become increasingly dependent on personal debt to maintain lifestyles.
The vast expansion in credit reflects not just the need for lower income earners to augment stagnant wages, but also an opportunity to soak up capital surplus and boost profits. As the net amount of debt at the global level is precisely zero (every debit has a matching credit), the issue is therefore one of distribution. There are many mechanisms that effect this distribution, from price inflation through wage inflation and taxation, but the one that has been on steroids since the 70s (and is now arguably out of control) is financial services.
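If the net-zero point seems slippery, a toy ledger (parties and amounts invented) makes it concrete: every debt is simultaneously one party's liability and another's asset, so the positions can be as lopsided as you like while still summing to exactly nothing.

from collections import defaultdict

# (creditor, debtor, amount) -- each entry is a single outstanding debt.
loans = [
    ("pension_fund",   "household", 150_000),  # mortgage
    ("bank",           "household",  10_000),  # credit card
    ("household",      "bank",        5_000),  # deposit: the bank owes the depositor
    ("sovereign_fund", "bank",      500_000),  # wholesale funding
]

net_position = defaultdict(int)
for creditor, debtor, amount in loans:
    net_position[creditor] += amount  # claim held
    net_position[debtor]   -= amount  # obligation owed

print(dict(net_position))          # very unevenly distributed...
print(sum(net_position.values()))  # ...yet the total is always exactly 0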
Fixing banking therefore means fixing capitalism. An inquiry into "the culture and standards of the financial services industry" will barely scratch the surface.
Monday, 2 July 2012
Passing to feet may work after all
And so Spain saved the best till last. The Euros ended with an engrossing match and some of the finest passing play ever seen. That tiki taka might be worth persevering with. After playing with what appeared to be a hangover during the group stage, La Roja finally got the bit between their teeth and upped the tempo. As we have seen many times at Arsenal (though not as often as we'd like), it's the gear change that makes all the difference.
The result was 4 exquisite variations on the same scoring move. In each case the goal was produced by a fast, accurate pass, along the floor and into feet in the penalty area, bisecting the Italian defence. Every pass originated in the central area, roughly where Pirlo was, which may not have been a deliberate attempt to rub their noses in it, but did serve to emphasise the dominance of the Spanish team. Italy looked very tired by the end, and not just because they were down to 10 men.
The contrast with England is instructive. For all the antique beauty of Gerrard's diagonal cross and Carroll's header, the team's approach to goal was too predictable. The reliance on hard-working but uninspired wingers in Young and Milner, and the lack of a progressive passer playing nearer the opposition penalty area (Parker and Gerrard were too deep and Rooney out of form), meant that England rarely attacked with any menace. Walcott's burst into the box for Welbeck's goal against Sweden was the closest they came to a Spanish-style move, specifically the Fabregas-Silva combo, but the difference in technique is telling: scampering, ball-bobbling and inspired improvisation versus a daisy-cutter, quick feet and a perfect meeting of man and ball.
My online predictions came a cropper as we entered the final bend, mainly because I ignored my own advice and backed an in-form Germany to overturn history and beat Italy in the semis. This left me chasing the leaders for the final furlong (I was 10th) and obliged me to go for a major upset, so I plumped for a 1-0 victory for the Italians yesterday. I finished 15th out of 74, but came 2nd in the all-important Thursday Night Footy sub-league (the one whose password was dontletdavewin). I did well enough, but was unable to kick on. Sort of Portugalish.
Looking back, I also went astray in backing Russia to finally get it together in a major tournament (again, history warned me otherwise), and was guilty of sentimentality in assuming that Ireland might get a couple of draws (though history was on my side there). I did correctly predict that Ashley Young would miss a penalty, but there were no points in that due to a woeful oversight in the game design.
Thoughts now turn to the domestic season, though the back pages of the papers will presumably be dominated by tennis this week (I think that robotic-sounding Scotsman is still in it) and then by that over-inflated sports day in the Lea Valley at the end of next month. Bizarrely, Ju Young Park has been selected for South Korea's Olympic team, which means he could end up more than doubling the number of games he has played in England by August. Like I say, a slow news day.
Sunday, 1 July 2012
The bigger, the better
Michel Platini's suggestion that future European Championships could be held in a dozen cities spread across Europe, rather than in just one country, has been greeted with bafflement, not least because of his praise for Ryanair. It's worth remembering that Platini has now been a UEFA politician for as long as he was a professional player, and he is as adept at misdirection in his current role as he was in France's midfield. The real issue here is the proposed expansion of the tournament from 16 to 24 teams in 2016.
The Euros are generally regarded as superior to the 32-team World Cup during the group stages because the smaller size means there are rarely any whipping boys and thus the results are closer and the surprises greater. This tournament proved the point with the early departures of Russia and Holland, not to mention England topping their group. While Ireland took a near-battering, it's worth remembering that half of their group will now contest the final.
A competition of 24 teams would typically have 6 groups, with the 4 best third-placed finishers proceeding to the knock-out stage. Each group would logically be based in one city with two or more grounds, with different cities hosting some or all of the knockout games. Platini's suggestion of 12 cities is what you typically get for a 32-team, 8-group tournament, which may be where this is ultimately heading. Though Platini has claimed the change to a multi-country format would be financially responsible (no need for new airports and stadiums), it's pretty clear that this is just an attempt to expand turnover and therefore UEFA's revenue.
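Some back-of-envelope arithmetic (the helper below is only a sketch of the formats discussed, ignoring third-place play-offs) shows what is really at stake: going from 16 to 24 teams takes the tournament from 31 matches to 51, roughly two-thirds more product to sell.

# A sketch of the tournament arithmetic, ignoring third-place play-offs.

def tournament_shape(teams, group_size=4, best_thirds=0):
    groups = teams // group_size
    knockout_teams = groups * 2 + best_thirds                  # winners + runners-up + best thirds
    group_games = groups * group_size * (group_size - 1) // 2  # a round robin in each group
    total_games = group_games + (knockout_teams - 1)           # straight knockout thereafter
    return groups, knockout_teams, total_games

print(tournament_shape(16))                 # (4, 8, 31)  -- the current format
print(tournament_shape(24, best_thirds=4))  # (6, 16, 51) -- the proposed 2016 format
print(tournament_shape(32))                 # (8, 16, 63) -- World Cup-sized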
This desire to increase the number of "transactions" is hardly new to football. The Euros went from 4 to 8 teams in 1980 and then to 16 in 1996. More and bigger is a theme of modern life, from super-sized food to the plethora of TV channels. It is an irony that this new age of austerity is one characterised by visible plenty. Across the political spectrum there is a desire to return to growth as quickly as possible. There may be disagreement over how to achieve it, but there is no disagreement that more is better, despite the warning of climate change.
Nowhere has this commitment to unimpeded growth been more spectacular than in financial trading. From the early 70s onwards, trading ballooned in volume as a result of free-floating currencies, the expansion of equity markets, the growth of insurance and private pensions, the boom (in volume and value) of mortgages, and the eruption of personal credit. Because of derivatives and securitisation, the monetary value of trading is many times that of the underlying assets, which is why the size of the market in derivatives vastly exceeds global GDP. The point, if you are a financial trader, is that your commission is based on the transactions not the underlying asset. More is better.
Michel Platini cannot expand the Euros indefinitely without debasing the competition; credit, however, can be created for as long as there is demand for it. The claim that banks manufacture money out of thin air is not quite right: they manufacture credit. Even when the market for credit is weak, as the Bank of England has found with quantitative easing (which has not led to a big increase in commercial loans), opportunities for credit creation and utilisation can be found through proprietary trading by banks.
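The mechanics are worth spelling out, because they explain why credit is limited by demand (and by capital rules) rather than by some pre-existing pot of savings. A toy, single-bank balance sheet (figures invented) shows it: the act of lending creates the deposit that funds it.

# A toy balance sheet: extending a loan books an asset (the loan) and a
# matching liability (the borrower's new deposit) in the same moment.
# The figures are invented; only the loans-create-deposits mechanics matter.

class ToyBank:
    def __init__(self):
        self.assets = {"reserves": 100, "loans": 0}
        self.liabilities = {"deposits": 100}

    def extend_loan(self, amount):
        self.assets["loans"] += amount          # a new claim on the borrower
        self.liabilities["deposits"] += amount  # a new spendable deposit, created alongside it

    def totals(self):
        return sum(self.assets.values()), sum(self.liabilities.values())

bank = ToyBank()
bank.extend_loan(500)
print(bank.assets)       # {'reserves': 100, 'loans': 500}
print(bank.liabilities)  # {'deposits': 600}
print(bank.totals())     # (600, 600): the sheet still balances, but 500 of new credit now exists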
The LIBOR scandal has two aspects to it. The genuine anger of politicians relates to the abuse of the rate in 2008 to avoid Barclays being bailed out and thus nationalised like RBS and Lloyds. The bank lied, which has shown up the contempt they had for the regulators and (by extension) the politicians. But this abuse in turn revealed the existence of earlier cross-bank collusion to game the system to the advantage of traders. The latter has attracted the most public opprobrium with the now traditional attacks on banking culture and the personal immorality of bankers. As ever, the appeal to morality should make us suspicious.
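On the collusion point, it helps to remember how the fixing was compiled: each panel bank submitted an estimate of its own borrowing costs, the top and bottom quartiles of submissions were discarded, and the remainder were averaged. A rough sketch of that trimming (the panel size and the rates below are invented) shows why a lone fiddler achieves little and why reliably moving the published rate required several banks shading their submissions in concert.

# Sketch of a LIBOR-style fixing: discard the top and bottom quartiles of the
# panel's submissions, then average the rest. The panel and rates are invented;
# only the trimming method reflects how the benchmark was compiled.

def trimmed_mean_fixing(submissions):
    ranked = sorted(submissions)
    q = len(ranked) // 4
    retained = ranked[q:len(ranked) - q]
    return sum(retained) / len(retained)

# A hypothetical 16-bank panel submitting rates around 3.5%.
honest = [3.44, 3.46, 3.47, 3.48, 3.49, 3.50, 3.50, 3.51,
          3.51, 3.52, 3.53, 3.53, 3.54, 3.55, 3.56, 3.58]

# One bank lowballing its submission is largely trimmed away as an outlier.
lone_fiddler = honest[:-1] + [3.20]

# Four banks shading their submissions down together drag the retained
# middle of the distribution, and hence the fixing, with them.
colluders = honest[:-4] + [3.30, 3.31, 3.32, 3.33]

print(round(trimmed_mean_fixing(honest), 4))        # baseline fixing, about 3.51
print(round(trimmed_mean_fixing(lone_fiddler), 4))  # barely moves
print(round(trimmed_mean_fixing(colluders), 4))     # drops by a few basis points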
An industry's culture is not some free-standing ethical framework, written on tablets of stone. It is the normative manifestation of the structure and purpose of the industry, its ideology. In other words, blaming culture is attacking the symptom, confusing cause with effect. Likewise, criticising morality is personalising the failure. While we've moved on from "rotten apples" and "rogues", the tone of the criticism still betrays the desire to find individual scapegoats, hence the call for Bob Diamond's head. Diamond, whatever his faults, did not single-handedly design the modern banking system. It was the regulators, and their political masters, who were responsible for that. Diamond merely took advantage of what they created, and then lobbied for its continuation and further development ("the time for remorse is over").
As ever, the political reality is better exhibited through the proposals for action than in the analysis of failure. Thus Vince Cable does his best Old Testament prophet act ("the rot was widespread") before suggesting that ring-fencing, together with some robust shareholder activism, will do the trick. The formal separation of proprietary trading and investment banking from personal and commercial banking, while still part of the same business, is a watered-down implementation of the already cautious Vickers reforms. It will not change the culture one iota. Properly splitting the industry in two would help, but only if this were to lead to the development of distinct professions (and thus cultures), not unlike the way it used to be when merchant bankers were City gents and commercial bankers were provincial Quakers. The cultural problem of modern banking is the blending of executive management into a single financial and legal class, which has now spread beyond the financial sector to the wider corporate world and politics. The culture is bigger than banking.
Equally, the idea that the banks' shareholders will introduce high ethical standards is laughable. These are mainly either institutional investors, and thus members of the same financial class as the bankers, or sovereign wealth funds such as the Qataris who bailed Barclays out in 2008. The plea for better stewardship by the likes of Will Hutton runs up against the problem that the day of the discriminating individual shareholder, in it for the long haul and emotionally as well as financially invested, has long gone. The financialisation of the economy, the excess of surplus capital (like Qatari wealth) chasing limited investment opportunities, and the increasing reliance on debt by median income-earners have all contributed to an investment culture that is short-term, bubble-prone and socially obtuse. The quality of shareholder management is inversely proportional to the volume of financial trades. Reduce the latter and you will increase the former.
Meanwhile, David Cameron continues Janus-like to excoriate immoral bankers (though he's stopped short of the full Jimmy Carr this time) while defending the interests of Britain's banking sector from the regulatory predations of those spendthrift/interfering Europeans. The man is living a lie, in full public view. This makes Michel Platini's refusal to consider goal-line technology in place of goal-line officials ("there was only one mistake") look almost reasonable in comparison.