March 9, 2016

Techno-Populism Won’t Help in the Apple vs. FBI Debate


When I first read Tim Cook’s “Message to [Apple’s] customers,” I felt I was on the receiving end of a marketing push. Sure, I agree the government should not be able to read my online diary, and sure, I agree the government should not be able to weaken encryption for everyone. But the real problem here is that we lack the shared knowledge and understanding of the linkages between technology and society needed to anticipate the implications of such decisions and to decide what is socially desirable and what is not.

I see at least two reasons to criticize Apple’s letter, one technical and one political. Technically, it relies on a knowledge gap to mislead its customers into thinking this is only an encryption and master key issue. With additional assurances on what the FBI would and would not be able to do, this could be one event in a long history of cooperation between Apple and law enforcement.

Politically, it tries to push onto its customers a notion of what is socially desirable and what is not, a mandate no one gave Apple. I did not elect Apple, nor do I have any democratic means of controlling what tools it feels our society is ready for. What’s more: I don’t think it is socially desirable to put Apple, or any other company, in a position to decide what is socially desirable.


This is not exactly an issue of encryption

The FBI is not asking Apple to weaken the algorithm it uses for encryption, but rather the software that implements it. It is true that many government officials have been arguing in favor of weakening algorithms for several years now, in a surprising return of the 1990s debate on whether to use government-weakened cryptosystems (e.g., the Clipper chip). This return of the Crypto Wars has in the last couple of years focused on whether or not to introduce backdoors on our devices, to which Law Enforcement Agencies (LEAs) would retain a master key. Countless experts have argued that this is, in fact, a terrible idea, and that there is no way to ensure that such a master key could not be used by entities other than LEAs.

But let’s be clear: what is at stake here is not exactly a master key, at least not in the sense Apple’s letter suggests. It is instead a tailored version of Apple’s operating system, iOS, enabling brute-force attacks to be conducted on a device – which, crucially, is only possible with Apple’s digital signature. A brute-force attack is what you do when you forget the code of your gym lock: try every single possible combination. It is not breaking the lock, which would be the equivalent of finding zero-day vulnerabilities in the software we use (zero-days are flaws not yet known to the vendor, and thus never patched, exploited, or weaponized). It is not weakening the system of locks that everyone else uses, which would be the equivalent of introducing a backdoor. It is simply breaking into your own lock, and only this one.
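To make the lock analogy concrete, here is a minimal sketch, in Python, of what a brute-force attack amounts to. The `check_pin` oracle is hypothetical, a stand-in for a device’s real passcode check; nothing here reflects Apple’s actual code.

```python
# Minimal sketch of a brute-force attack: try every combination until
# one is accepted. `check_pin` is a hypothetical stand-in for a
# device's passcode check, not Apple's actual implementation.

SECRET_PIN = "4831"  # unknown to the attacker

def check_pin(candidate):
    return candidate == SECRET_PIN

def brute_force(length=4):
    # Enumerate all 10**length numeric codes: 0000, 0001, ..., 9999.
    for n in range(10 ** length):
        candidate = str(n).zfill(length)
        if check_pin(candidate):
            return candidate
    return None

print(brute_force())  # finds "4831" after at most 10,000 attempts
```

The attack does not touch the encryption itself: it simply walks the small space of possible codes, which is why everything hinges on how fast, and how often, one is allowed to guess.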

Say you forgot the PIN of your iPhone. Is it not desirable to be able to recover the encrypted data on it? Apple decided that security implied that this would not be possible: under the current version of iOS, you only have a certain number of attempts before having to wait a prohibitive amount of time. The FBI thinks it should be able to try passcodes on this specific device at machine speed, and it asks Apple to build the firmware that allows this. That is, in fact, very different from a backdoor. Once again, the idea that we should introduce a backdoor on all our devices is terrible. The idea that we should weaken encryption standards to enable LEAs to access all our devices is equally terrible. Such attacks would weaken our cryptosystems and encryption for everyone, which is not socially desirable. The key here is to understand whether the consequences of what the FBI is asking for play out at the device level or at a systemic level.
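Those prohibitive delays can be pictured as a throttling policy wrapped around the passcode check. The sketch below is loosely modeled on Apple’s published iOS behavior at the time (escalating delays from the fifth failed attempt, optional erasure after the tenth); the exact figures are assumptions for illustration, not Apple’s parameters.

```python
import time

# Illustrative escalating-delay schedule after failed passcode attempts.
# The figures loosely follow Apple's published iOS behavior at the time,
# but they are assumptions for illustration, not Apple's parameters.
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
WIPE_AFTER = 10  # optionally erase the device after ten failures

def guarded_check(candidate, check_pin, failures):
    """One throttled attempt; returns (success, updated failure count)."""
    if failures >= WIPE_AFTER:
        raise RuntimeError("device erased: too many failed attempts")
    if check_pin(candidate):
        return True, 0                   # success resets the counter
    failures += 1
    time.sleep(DELAYS.get(failures, 0))  # wait before allowing another try
    return False, failures
```

Removing this throttling and the wipe, so that guesses can be fed in at machine speed, is essentially what the court order asks the tailored firmware to do; the underlying cryptography is left untouched.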

If the FBI is really trying to conduct its attack at the device level, in a way that could not be replicated on other devices without another court order, then I think it is socially desirable that Apple comply with the court order. When the systems we use rely on strong encryption, brute-force attacks are computationally hard. They take time. They take computing power. They make it hard for LEAs to break into all our devices simultaneously, and hence force them to choose where to allocate their resources in the way that serves society best.
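A back-of-the-envelope calculation shows the scale. Apple’s iOS Security guide at the time stated that passcode key derivation was calibrated to take roughly 80 milliseconds per attempt in hardware; taking that figure as an assumption, exhausting a passcode space looks like this:

```python
# Worst-case time to exhaust numeric passcode spaces at ~80 ms per
# guess, the per-attempt key-derivation cost cited in Apple's iOS
# Security guide at the time (taken here as an assumption).
SECONDS_PER_GUESS = 0.080

for digits in (4, 6):
    worst_case = 10 ** digits * SECONDS_PER_GUESS  # seconds
    print(f"{digits}-digit PIN: up to {worst_case / 3600:.1f} hours")

# 4-digit PIN: up to 0.2 hours
# 6-digit PIN: up to 22.2 hours
```

A single device falls in hours or days; breaking into millions of devices this way would not scale, which is precisely the resource constraint described above.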

The argument made against letting LEAs weaken encryption has been the following: if you want access to encrypted messages, break into the end devices (the phones), but don’t break the cryptosystems that communications rely upon, because this has strong negative externalities. Two points, however, weaken the FBI’s case here. First, the FBI needs to give Apple assurances that this tailored firmware would remain under Apple’s control, which, some argue, might be an issue if the FBI wants to use the data as evidence in court. Second, the FBI should not be able to force Apple to develop such software if this requires unreasonable engineering resources, and Apple should probably be allowed to bill the FBI for the time spent developing it. With assurances on those two points, breaking into the end device would be exactly what the FBI is asking Apple to enable it to do. More broadly, we need to make sure that this does not set a legal precedent for the FBI accessing any encrypted data it wants, meaning that, eventually, lawmakers will have to decide where we draw that line, as they began to do with the March 1 hearing of Apple and FBI officials.


How do we know if it is socially desirable?

It would be an altogether different problem if the FBI were compelling Apple to surreptitiously update its operating system to enable brute-force attacks at a systemic level, on all our devices. In fact, this points to a different and deeper problem that we have to grapple with as a society: how do we make sure that the software we use is not a black box, but operates, as transparently as possible, the way we think it operates? In this case, how do we make sure that, once written, this new version of the operating system is not disseminated to all iPhones around the world by Apple or, worse, by actors with shadier intentions who might come into possession of it? Creating such an operating system is probably not technically hard, but it is politically so.

Indeed, Apple claims that this software is “too dangerous to create,” and that it is not socially desirable to create such a tool. At least one court of justice disagrees. Apple may stand for a certain set of values to earn the trust of its customers, but it is in the business of selling phones, “not civil liberties.” It may be willing to help solve some societal problems disconnected from its business model, but it has no commitment or obligation whatsoever to do anything other than maximize its shareholders’ profits. As Yochai Benkler writes in The Guardian, this is not “a conflict between privacy and security. It is a conflict about legitimacy,” specifically the legitimacy to decide what is or is not socially desirable. And I recognize no mandate for Apple to decide what is socially desirable and what is not.

Do we have enough shared knowledge and understanding of the consequences of creating a specific technology to make an informed decision about whether or not it should be created? Clearly not. Is the letter Apple issued helping in that regard? By framing the issue as one of encryption, by exploiting how poorly understood these technical tools are, and by leveraging the trust of its customers to pit them against LEAs, Apple’s letter is making this problem harder to solve, not easier. We do not have the democratic institutions that would enable us to decide whether such a tool is socially desirable or not. Comprehending the privacy, national security, and other implications of creating such a technology is far beyond our reach, which is in fact the elephant in the room here.

If there is a solution to be found, Apple and the FBI should work together to find it – Apple itself admitted it had been helping LEAs in the San Bernardino investigation, as in many others. Beyond that, they should be transparent enough about what they decide that we as a society can better understand the decisions they made and augment our shared knowledge and understanding of what is really at stake. Our common end goal should be to eventually build democratic institutions designed to solve such problems in a sustainable way.


This is populism, plain and simple

The FBI’s position can, and should, be criticized. The Bureau should prove that what it is asking of Apple is not unreasonable, compensate Apple for the time its engineers will spend on the project, and should not be allowed to use the extracted data in court, since doing so would reinforce the accusation that it is asking for a backdoor.

However, Apple’s choice to publish a letter framing this issue as one of encryption is populism, plain and simple. First, it is based on a biased perception of what is socially desirable. Second, it simplifies the issue to exploit the public’s knowledge gap around encryption instead of bridging it. Third, it tries to leverage Apple’s consumer base, and the feeling that encryption and privacy are rights people should stand up for against their government, to overturn a legitimate judicial decision.

It surely does not hurt Apple’s bottom line to be cast as the guardian of our “freedoms and liberty.” We should collectively move past this corporate posture and quickly start building, with all relevant stakeholders, the kind of cooperative institutions we will need to solve this new kind of techno-political problem. Postures will do little for our privacy, our security, or our democracies.


This piece was originally published in the Harvard Kennedy School Review.
