
Where Does the Greater Risk Lie?: Apple vs. the FBI

Published: February 22, 2016 / Author: Joe Holt



There is currently a high-stakes showdown between Apple and the FBI over the proper balancing of privacy and security concerns. At issue is whether Apple should have to create a tool that will help the FBI break the passcode of a known and deceased terrorist’s iPhone and extract data from the device (which may or may not contain evidence of collaboration in an earlier attack or of planning for future attacks). I’d like to lay out the basic facts and the main argument on each side, and to invite you to reflect on where you stand.

The FBI has a locked, encrypted iPhone 5c used by Syed Rizwan Farook, who, together with his wife, Tashfeen Malik, planned and executed a mass shooting in San Bernardino, California, on December 2, 2015. The horrific attack left 14 innocent people dead and 22 seriously wounded.

After the attack, Apple helped the FBI collect data that had been backed up to the cloud from a work iPhone given to one of the shooters. But the FBI then asked for help gathering unspecified further data from the phone that hadn’t been backed up. Earlier this week a federal judge granted an order that requires Apple to create the special tool investigators need to crack the phone’s passcode and extract its data. Apple is refusing to comply with the order.

To be clear, the court order doesn’t require Apple actually to decrypt the phone for the government. The FBI would like to do that itself by writing a program that sends the device guessed passcodes, one after another, until one of them works. The problem is that a protective feature of the device automatically erases all data on the phone after ten failed passcode attempts. The court order requires Apple to create a special tool that will disable that protective feature.
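To see why that feature is decisive, consider the arithmetic. What follows is a minimal sketch in Python, for illustration only: the names, numbers and logic are hypothetical stand-ins, not Apple’s actual firmware or the FBI’s actual program. A four-digit passcode allows just 10,000 combinations, which a computer can try in seconds, yet a ten-attempt erase limit stops the guesser after sampling a tiny fraction of them.

    # Hypothetical illustration only -- not Apple's firmware or the FBI's tool.
    # A 4-digit passcode has 10,000 possibilities (0000-9999), trivial for a
    # machine to enumerate unless the device erases itself first.

    SECRET = "7391"      # the unknown passcode, simulated here
    ERASE_AFTER = 10     # the protective feature: wipe data after 10 failures

    def crack(limit_enforced: bool) -> str:
        for attempt, n in enumerate(range(10_000), start=1):
            if f"{n:04d}" == SECRET:
                return f"unlocked on guess {attempt}"
            if limit_enforced and attempt >= ERASE_AFTER:
                return "device erased after 10 guesses (0.1% of the keyspace)"
        return "passcode not found"

    print(crack(limit_enforced=True))   # the FBI's problem as things stand
    print(crack(limit_enforced=False))  # the FBI's plan if the limit is disabled

With the limit enforced, the simulated attack fails almost immediately; with it disabled, exhaustive guessing succeeds in moments. That gap is the entirety of what the FBI is asking Apple to close.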

By way of background, Apple’s position on encryption strengthened after the Snowden revelations of how the National Security Agency worked with some tech companies, and hacked others, to access user data. Later stories of certain NSA analysts abusing surveillance capabilities to spy on significant others for personal rather than professional reasons only added to the already high level of concern and anger about unwarranted privacy violations.

In any case, Apple apparently received letters from alarmed customers, fortifying the company’s stance on encryption. In September 2014 the company rolled out its iOS 8 operating system, which essentially prevented Apple’s own engineers from extracting data from its devices. There was only one key to access the phone and its data, and that key, by design and default, was in the customer’s hands. Apple has since rolled out the iOS 9 operating system; devices running it cannot be decrypted by the company even if it receives a warrant.

It should be noted that Apple has helped law enforcement officials access user data in the past. An article in yesterday’s New York Times reports that, according to Apple itself, “In the first half of 2015 alone, the company provided data in response to more than 3,000 law enforcement requests.”

There is, however, a subtle but important difference between what Apple did in the past and what it is being asked to do now: the difference between extracting data from an iPhone without unlocking it – which Apple was capable of doing before the iOS 8 rollout – and creating a special tool that will enable the FBI to crack a locked iPhone. Apple says the tool the FBI is requesting does not exist, and that it does not want to create it out of concern for its customers.

Apple’s argument is that the backdoor it is being asked to create to open the phone in question could end up in the wrong hands and threaten the privacy, and possibly even the personal safety, of its customers. Both privacy and safety would be threatened if, for example, a stalker were able to access the messages or calendar entries of the person they are targeting. Mr. Cook has also argued that it sets a “dangerous precedent” if a company builds tools for law enforcement officials that weaken security.

Usually this issue is framed as a debate between privacy and security, but Mr. Cook argues that the two are related. As he puts it when explaining why encryption has become increasingly important, “Compromising the security of our personal information can ultimately put our personal safety at risk.”


Apple also has a legitimate concern for its brand value and market position. A significant part of its brand value is tied up with its deserved reputation for superior customer privacy protection. And its market position might well be weakened, at home and abroad, if its devices are seen as more vulnerable. As Apple attorney Marc Zwillinger argues, “Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand.”

Finally, there is cause for concern on the global level. Journalists, human rights workers and others working in dangerous places abroad depend on strong encryption for their safety and their work. They could be put at greater risk if malevolent authorities abroad either gained access to the tool that opens locked phones or were inspired by the court order to demand that the company help them pry into the personal data of journalists or human rights workers.

The general argument of the FBI and other law enforcement officials is that increasingly strong encryption technologies limit their ability to prevent and solve crime. Another general argument on the law enforcement side of the debate is that Apple received a lawful order and must either comply with it, if it can, or ask a judge to vacate the warrant, if it cannot. According to that argument, companies and people don’t get to choose when to follow the law (though they can challenge the law when they think it is unjust or misapplied).

The particular argument of the FBI and the White House in this case is that they want to unlock only this single device in this one investigation. Help with that request, they say, does not require changing the encryption default in Apple devices or providing a “master key” that could unlock all iPhones. Law enforcement further argues that the help it needs could be provided by Apple at an Apple facility, lowering the chance that the tool could fall into the wrong hands.

Perhaps the central issue in the debate is whether Apple’s compliance with the court order would create a general security vulnerability that could expose iPhone users in the future. Law enforcement officials generally argue that no such security vulnerability would be created, but Mr. Cook argues otherwise: “Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.” Neither side can offer a complete guarantee, so the question is what should be done in the face of that uncertainty.

A further argument on the law enforcement side of the debate is presented by Orin Kerr in yesterday’s Washington Post. Kerr argues that Apple has the technical capability to send a software update to the phone that would disable “the optional password-guessing-thwarting function” on the device and enable the FBI to use its computers to make as many guesses as it takes to break the passcode. As Kerr points out, Apple has not yet written that software update and is very much opposed to writing it. But apparently it could if it wanted to.

Those are the basic facts of the debate and the main arguments on either side. Each side argues that the consequences will be worse if the other side prevails. Each side believes that it has important duties it should uphold. And each side is acting consistently with its purpose and defining values.

Which side do you believe has the better of the argument, and why? One side argues that your safety is more at risk if Apple does not follow the court order, while the other argues that your safety is more at risk if the company does comply.

So where do you stand? Will you feel more safe and secure if Apple follows the court order or if the company remains steadfast in its opposition to it?

Follow Joe Holt on Twitter @busethicsdude.

