
Friday 19 February 2016

We Are The Law

The case of the San Bernardino killer's iPhone 5C is being vigorously promoted as a defence of civil liberties and resistance to the spread of the surveillance state. It's neither. The claim of Tim Cook to be making a principled stand is nonsense. The key thing to grasp from the outset is that this is a contest over property, not privacy, and that the state has a contingent right to seize private property under certain circumstances and assuming due process. The principle is embedded across a number of laws, reflecting the variety of property classes and differing methods of seizure, from eminent domain (compulsory purchase in UK parlance) to civil forfeiture (confiscating criminal proceeds even when no specific crime has been proven).

In the context of the Apple affair, we don't need to worry about what those precise circumstances are, or even the specific act being invoked (though it is amusing to see people who bang on about the sanctity of the constitution appalled at the use of a 1789 statute). So long as the state has operated within the existing law, and the courts accept that it has reasonable grounds to exercise the right of confiscation, then its action is legitimate. Where there is a point of contention is over whether digital information can really be considered property, and who has title to it. You'll notice that Apple have been quiet on this aspect of the case, and for good reason.

In their "Message to Our Customers" (incidentally an example of American documentary fetishism), the company states: "The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand". Citing the government is disingenuous because it is a Federal court judge who has issued the order. This is not a case of the FBI turning up in Cupertino in dark suits and shades to put the frighteners on Tim Cook. Contrary to Cook's claim, this is not an unprecedented step if we consider it as a request for the surrendering of private property (and let us, for argument's sake, assume that smartphone data is property), even if it may be the first time it has happened in respect of an iPhone (in fact it's probably not). This solipsism is typical of Apple, as is the implied trumping of the government's public security concerns by "the security of our customers". The suggestion seems to be that Apple inhabit a parallel legal universe to the rest of us.


The letter insists on the need for encryption. This is not in dispute. Encryption is a necessary technique to ensure privacy, and we should be concerned by any state attempt to undermine encryption standards or tools, but it simply isn't the issue at hand. The FBI wants to hack a password, not break a code. The letter also warns of the dangers of creating a backdoor that could be subsequently exploited by criminals and "bad actors" (i.e. possibly including an abusive government). This danger is also not in dispute, but Apple's characterisation of the FBI's request as the creation of a "backdoor" is questionable and their risk assessment is scare-mongering.
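
To see why that distinction matters, here is a minimal sketch in Python (not Apple's actual key-derivation scheme; the salt, hash function and passcode are invented for illustration) of what an attacker gains once the retry limits are out of the way: a short numeric passcode can be exhausted in moments, while the underlying cipher is never broken at all.

```python
# Illustrative only: a toy "device check" that hashes a salted passcode.
# With no retry limit or delay, a 4-digit passcode falls to brute force
# almost instantly; the cryptography itself is never attacked.
import hashlib
import itertools

SALT = b"device-unique-salt"                          # invented for this sketch
TARGET = hashlib.sha256(SALT + b"7412").hexdigest()   # stand-in for the stored check value

def brute_force_pin(length: int = 4):
    """Try every numeric passcode of the given length until one matches."""
    for digits in itertools.product("0123456789", repeat=length):
        pin = "".join(digits).encode()
        if hashlib.sha256(SALT + pin).hexdigest() == TARGET:
            return pin.decode()
    return None

print(brute_force_pin())                   # recovers "7412" within 10,000 attempts
print(f"AES-256 key space: {2**256:.2e}")  # roughly 1.16e+77 keys: not searchable this way
```

The point of the sketch is the asymmetry: ten thousand guesses versus a key space of 2^256. Removing the guess limit is "hacking a password"; it does nothing to weaken the encryption itself.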

A backdoor is, by definition, a security vulnerability. But it is also, by implication, a general vulnerability that would be present (if latent) on all smartphones running the same operating system. That is not what the FBI has requested. It is proposing that Apple create a bespoke vulnerability, for installation on a single device, essentially by branching the iOS operating system. This is what hackers routinely do (and no, it's not just limited to open-source software - there are iOS mods out there too). There is no "precedent" being set here. The ongoing security of the iPhone depends on the confidentiality of the iOS source code within Apple (and that won't be changed by the FBI's request) plus the integrity of the system upgrade process (which can check to ensure that bespoke changes aren't accepted).
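
To make the integrity point concrete, here is a hedged sketch, again in Python and emphatically not Apple's real update mechanism (a symmetric HMAC stands in for the asymmetric code-signing a real device performs, and the device identifiers are invented): if every firmware image must be vendor-signed and bound to a specific handset, a bespoke build for one device is useless against any other.

```python
# Illustrative only: an update is accepted only if it was signed by the vendor
# for *this* device's identifier, so a one-off build cannot be replayed elsewhere.
import hashlib
import hmac

VENDOR_SIGNING_KEY = b"vendor-private-key"   # stand-in for the vendor's signing secret

def sign_firmware(image: bytes, device_id: str) -> bytes:
    """Vendor side: bind the signature to both the image and the target device."""
    return hmac.new(VENDOR_SIGNING_KEY, image + device_id.encode(), hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes, own_device_id: str) -> bool:
    """Device side: reject anything not signed for this particular handset."""
    expected = hmac.new(VENDOR_SIGNING_KEY, image + own_device_id.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

bespoke_image = b"modified build with retry limits removed"
sig = sign_firmware(bespoke_image, device_id="TARGET-5C-SERIAL")

print(device_accepts(bespoke_image, sig, "TARGET-5C-SERIAL"))   # True: the seized phone
print(device_accepts(bespoke_image, sig, "ANY-OTHER-HANDSET"))  # False: everyone else's phone
```

On that (assumed) model, the security of every other iPhone rests where it always did: on the secrecy of the signing key and the discipline of the upgrade process, neither of which the FBI's request touches.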

Privacy means the right to be obscure: to be able to live your life away from the prying eyes of others. This implies the opportunity (i.e. time and space) to create your own property, but it does not entail an absolute right over that property. There is a distinction to be made between human rights (which are inalienable) and property rights (which are not). The reason why privacy and property are so easily confused is that we have long treated the human body as a type of property. This did not end with the disappearance of feudalism or the abolition of slavery. Military conscription, prison and compulsory education are ways in which the body as property can still be alienated by the state.


The current justification for mass surveillance is that if you've nothing to hide, you've nothing to fear. This is pernicious because it accepts that surveillance of the innocent should be the norm, which necessarily means the end of any universal right to be obscure. In practice, the powerful can still secure their own privacy through privilege, which means that it becomes a relative right and thus a form of property: something you might or might not possess. What we currently lack is a formal right of obscurity (let's call it the Garbo principle: "I want to be alone"), essentially because this is such an innate expectation that we take it for granted. As technology capable of surveillance becomes more ubiquitous, we need to formalise this right.

The letter paints Apple in a noble and patriotic light: "We feel we must speak up in the face of what we see as an overreach by the U.S. government. We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country ... we fear that this demand would undermine the very freedoms and liberty our government is meant to protect." In fact, a better way of looking at this affair is that Apple has failed its customers. Just as the state has a contingent right to seize property, so the individual has an absolute right to dispose of their property (while they have title) as they see fit. This includes destroying it.

In practice, you can erase all of the data on a smartphone, though you'll usually need special software to do so (i.e. securely overwrite the physical media). The iPhone will automatically erase data if you enter the wrong password repeatedly, but this is a long-winded and clumsy method compared to smashing it to smithereens with a hammer. What the iPhone 5C lacks is a self-destruct timer, a la Mission Impossible. Had this been available, there's a good chance the San Bernardino terrorist would have solved Apple's problem for it. Had he chosen not to avail himself of this feature, Apple could in good conscience have agreed to assist the FBI.
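
For illustration, here is a minimal sketch of that auto-erase behaviour (assumed, not Apple's implementation; the class, the in-memory key and the passcode are all invented): after a fixed number of failed attempts the data-protection key is discarded, at which point the stored data is as good as destroyed without anyone reaching for a hammer.

```python
# Illustrative only: discard the encryption key after too many failed guesses,
# which renders the (still present) ciphertext permanently unreadable.
import secrets

MAX_ATTEMPTS = 10

class ProtectedStore:
    def __init__(self):
        self._key = secrets.token_bytes(32)   # stand-in for the data-protection key
        self._failures = 0

    def unlock(self, passcode: str, correct: str = "7412") -> bool:
        if self._key is None:
            raise RuntimeError("data already erased")
        if passcode == correct:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._key = None                  # "erase": without the key the data is junk
        return False

store = ProtectedStore()
for guess in range(MAX_ATTEMPTS):             # ten wrong guesses...
    store.unlock(str(guess).zfill(4))

try:
    store.unlock("7412")
except RuntimeError as err:
    print(err)                                # ...and even the right passcode no longer helps
```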


The reason Apple doesn't provide this facility is that it considers every iPhone ever made to still be its property. Legally, it has some grounds for thinking this way. When you buy an iPhone, you do not buy a copy of iOS, or any of the other software on the phone. You merely buy a licence to use it. The software remains the intellectual property of Apple. Because the company has always insisted on complete control of its products, both hardware and software, it tends to have a very proprietorial attitude towards them, even though the hardware is incontestably the property of the user after purchase. The obvious recent example is "Error 53", where an iPhone 6 will permanently disable itself if it suspects third-party interference. This is an example of Apple's "overreach" (they've now retreated), trying to maintain a monopoly on handset repairs by claiming spurious security concerns.

Apple is encouraged in this attitude by the current uncertainty of the law in respect of digital property. That uncertainty in turn is the result of lobbying by companies such as Apple to prevent the extension of strong property rights to user data. As it stands, the digital economy depends on the ability of businesses to extract value from the data created by their customers. This is possible because digital data can be infinitely copied without reducing the value of the original to its creator. In other words, what is being alienated is the use of the data rather than the data itself. In theory we already have a legal framework capable of controlling this - intellectual property law, specifically copyright - but in practice this is treated as a commercial matter rather than an inalienable right. When you click "I agree", you are conceding that the service provider retains its full IP rights in respect of its software, while you freely give up your IP rights in respect of your data.

The San Bernardino affair is just the latest stage in the ongoing struggle between the state and capitalists for the control of society's data. A true defence of civil liberties would restrain both government agencies and Internet companies, not favour one relative to the other. Apple knows that too much accommodation of the state will tarnish its brand image, and it also calculates that a head-on challenge will boost its assumed right to negotiate privileged treatment. The state calculates that a test case centred on domestic terrorism during a Presidential Election campaign is the ideal opportunity to publicise its demands for "equipment interference". The state has a good, narrow case because it is publicly seizing the data of an individual. Apple's "principled" objection, in pursuit of its own commercial interests, risks encouraging the state to legislate more sweeping powers to seize the data of everyone, thereby eroding the privacy the company claims to respect.

2 comments:

  1. Isn't one part of the Apple case the idea that this action is the small part of a very big wedge? That it can be used as a precedent for introducing back doors elsewhere.

    1. That's their claim, but there is no evidence for it. The requested action might establish a precedent for a custom modification (i.e. a one-off operation on request), but it does not create the precedent for a backdoor (i.e. a permanent fixture on all devices).

      The irony is that by refusing cooperation on this occasion, Apple may encourage Congress to pass an act enabling general access, which would make backdoors ubiquitous.
