VR has the attraction of allowing access to virtual people as much as virtual worlds. At the moment, a sleb's Twitter feed is often the work of a marketing intern. In the future, you will be able to interact with a near-flawless bot of Paris Hilton (who will thereby stay forever young). The official announcements on the Oculus deal, and most of the media commentary, have avoided the obvious killer-app for VR technology, which is interactive porn. In time, the Rift headset will be augmented with haptic suits and genital prostheses (the full wank-rig, in other words). The solution to the commercial porn problem of recent years (how do you make money when there is so much free content available?) is now obvious: you sell the hardware and software. The real losers will, as usual, be the models. Hentai (Japanese anime porn) is the future, just less cartoonish and more tactile.
Another way of thinking about this is that Google is increasingly in the business of augmentation, while Facebook is moving towards simulation. Koster is smart enough to recognise that this distinction is actually a carve-up; that underneath the applications, Google and Facebook have a common model: "It’s time to wake up to the fact that you’re just another avatar in someone else’s MMO. Worse. From where they stand, all-powerful Big Data analysts that they are, you look an awful lot like a bot. The real race isn’t over the client — the glasses, watches, phones, or goggles. It’s over the servers. It’s over the operating system. The one that understands countless layers of semantic tags upon every object on earth, the one that knows who to show you in Machu Picchu, the one that lets you turn whole visualizations of reality on and off".
Evgeny Morozov caught the flavour of this totalising ambition last year, and also correctly noted that the threat posed by surveillance is not to privacy but to democracy: "The widespread feeling of emancipation through information that many people still attribute to the 1990s was probably just a prolonged hallucination. Both capitalism and bureaucratic administration easily accommodated themselves to the new digital regime; both thrive on information flows, the more automated the better". Both business and the state seek pre-emption: identifying and intercepting a danger before it fully emerges, or identifying and satisfying a want before it fully forms. Both insist that they have our best interests at heart, both insist that it is what we want.
If this sounds gloomily dystopian, there is yet hope. This can be seen in the public reaction to Google Glass. The backlash against "Glassholes" is not just an example of the age-old tendency to give anyone who wants to be different a good kicking (a "hate crime", no less), any more than it is solely a product of socio-economic resentment in an increasingly divided San Francisco. It is clear that what creeps people out is simply the visible sign of intrusion and observation, even when the device is switched off. There may also be a subtext to do with gender and power relations: Glass turns geeks into people who gaze, in the sense defined by John Berger in Ways of Seeing. It is perhaps this whiff of proprietorial entitlement, rather than the corporate logo, that sees Glass bracketed with Google Buses.
There have been attempts to dismiss the backlash as a reactionary spasm in the face of an inevitable future that we will all come to embrace: "The future is on its way, and it is going to be on your face ... Wearables are where we’re going". Glass-wearers have even attributed the hostility to simple gadget-envy: "Some of the irony is that the people hating on me for wearing Google Glass are probably going to have a pair in six months or a year". But this sounds like misplaced messianic fervour (it reminds me of Clive Sinclair claiming we'd all be driving C5s). It also fails to address the reality that ubiquity does not make something acceptable. There is a reason why CCTV cameras are out of reach, and it's not just a preference for overhead shots.
The common ground between Glass and Rift, as emblems of their owners' view of the future, is wearable computing. The significance of this is not the technology per se, or even its use in a public space (we've had that since the transistor radio), but the integration with the body and the suggestion that it may not be wholly under the control of the wearer: that perhaps the wearer and others in the public space are subject to it. Naturally, boosters insist that the opposite is true - that wearables make the technology less visible ("out of our way") and more controllable. This worry over wearables is not paranoia about cyborgs, but the long-standing (and historically justified) fear that technology is a means of control. When archaeologists study man's first tools, they are also studying his first weapons.
To come somewhat more up to date, if you think about books as a technology, then Chris Grayling's decision to limit prisoners' access to them makes perfect sense. The problem with books is that they are asymmetrical. Despite the best efforts of priests and business gurus, they are always subject to the reader, a point appreciated by thinkers from Martin Luther to Jacques Derrida. Books fail to give authority sufficient power over the reader, while potentially they give the reader some power, via knowledge, over authority. In contrast, the degree of control over the subject promised by wearables is enormous. Perhaps in future, instead of books, prisoners will be issued with glasses that flash up encouraging homilies, or deliver brief electric shocks to the recalcitrant.
The wider political context of wearable computing is the tension between privacy and transparency. To be effective, democracy needs both: the ability to decide without coercion (free assembly, the secret ballot) and the ability to interrogate and limit those who seek a mandate. The mediation of transparency has always been the expression of power, which is why the rich own newspapers and governments employ censors. Transparency should be proportionate to power, but the reality is the inverse: the powerless are the most exposed and inspected, while the powerful enjoy the greatest respect for their privacy. The idea that ubiquitous surveillance enables "sousveillance", allowing us to better hold power to account, is not only naive, it is positively dangerous, because the technology of transparency will always bear more heavily on the powerless than the powerful.
As our ability to interact with the world (both objects and people) is increasingly mediated, transparency, which should contain power, instead serves to increase the cost of privacy, in terms of inconvenience and the loss of utility. Privacy then becomes even more obviously a species of property and thus the preserve of the wealthy. Sometimes, turkeys do vote for Christmas.