Here’s the setup.
San Bernardino killer Syed Rizwan Farook owned an iPhone 5c, which may have been used — probably was used — in planning and perhaps even executing the holiday party terror attack with his wife, Tashfeen Malik.
That iPhone 5c, just like any other up-to-date iOS or Android smartphone, has disk-level encryption baked into the OS for users who want that level of privacy, for good or for ill.
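To see why that baked-in encryption is so hard to bypass, here is a simplified sketch of the general idea behind passcode-based disk encryption on modern smartphones: the user's passcode is stretched and entangled with a device-unique hardware key, so the disk key can only be brute-forced on the device itself. The names, parameters, and iteration count below are illustrative assumptions, not Apple's actual scheme.

```python
import hashlib
import os

# Stand-in for a hardware-fused, non-extractable device key (illustrative only)
DEVICE_UID = os.urandom(32)

def derive_disk_key(passcode: str, iterations: int = 100_000) -> bytes:
    """Stretch the passcode and mix in the device key, producing a
    256-bit key that cannot be derived off-device."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, iterations)

key_a = derive_disk_key("1234")
key_b = derive_disk_key("1235")
assert key_a != key_b   # a different passcode yields a completely different key
assert len(key_a) == 32
```

Because the derivation is slow by design and tied to the device key, guessing passcodes at scale in a data center is off the table; an attacker, or the government, has to go through the phone itself.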
Yesterday, U.S. Magistrate Judge Sheri Pym ordered Apple to bypass the phone’s security functions, and furthermore “to provide related technical assistance and to build special software that would essentially act as a skeleton key capable of unlocking the phone.”
Here’s what happened next:
Hours later, in a statement by its chief executive, Timothy D. Cook, Apple announced its refusal to comply. The move sets up a legal showdown between the company, which says it is eager to protect the privacy of its customers, and the law enforcement authorities, who assert that new encryption technologies hamper their ability to prevent and solve crime.
In his statement, Mr. Cook called the court order an “unprecedented step” by the federal government. “We oppose this order, which has implications far beyond the legal case at hand,” he wrote.
The Justice Department did not immediately respond publicly to Apple’s resistance.
The F.B.I. said its experts had been unable to access data on the iPhone 5c and that only Apple could bypass its security features. F.B.I. experts have said they risk losing the data permanently after 10 failed attempts to enter the password because of the phone’s security features.
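The auto-erase behavior the F.B.I. describes can be sketched as a simple policy: after a fixed number of wrong passcodes, the key material is destroyed, and the encrypted data becomes permanently unreadable. This is a toy illustration of that policy, not Apple's code; the class and method names are made up.

```python
MAX_ATTEMPTS = 10  # the threshold the F.B.I. cites

class SecureStore:
    """Toy model of a passcode-protected store with an auto-erase policy."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("key material erased; data unrecoverable")
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self._wiped = True      # discard the key material
            self._passcode = None
        return False

store = SecureStore("0000")
for _ in range(MAX_ATTEMPTS):
    store.try_unlock("9999")  # ten wrong guesses trigger the wipe
# from here on, even the correct passcode cannot recover the data
```

This is why investigators cannot simply brute-force the phone: the tenth wrong guess destroys the data they are after, which is precisely what the court order asks Apple to engineer around.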
The Justice Department had secured a search warrant for the phone, owned by Mr. Farook’s former employer, the San Bernardino County Department of Public Health, which consented to the search.
Because Apple declined to voluntarily provide, in essence, the “keys” to its encryption technology, federal prosecutors said they saw little choice but to get a judge to compel Apple’s assistance.
Mr. Cook said the order amounted to creating a “back door” to bypass Apple’s strong encryption standards — “something we simply do not have, and something we consider too dangerous to create.”
Security hawks are on solid ground when they worry (as I do) that Farook’s encrypted iPhone might contain data valuable to government efforts to stop future terror attacks on U.S. soil, or to aid intel efforts to locate, track, and kill Farook’s ISIS contacts overseas.
But that’s not the only worry, as Doug Mataconis explains:
From Apple’s point of view, there seem to be a myriad of issues motivating the decision to take what has the potential to be an unpopular decision given the circumstances of this case. First of all, there is the fact that ever since the company made the decision to strengthen security on its phones in a manner that essentially allows customers to encrypt data in a manner that makes it nearly impossible to access without the appropriate pass code, the concerns about data security have only become more prominent and that providing a backdoor that does not exist right now would only serve to make the data itself less secure overall. Second, as the Post article notes the use of the All Writs Act in this manner appears to be unprecedented and, if upheld, would essentially allow the government to do almost anything in the name of law enforcement and intelligence gathering. Finally, and perhaps most strongly, it’s important to note that law enforcement isn’t asking Apple to provide information that it already has, which is what an ordinary search warrant does. It is essentially asking a Federal Court to compel Apple to do something, in this case create a backdoor that does not exist. This arguably falls well outside the scope of the Fourth Amendment and, if upheld, would give law enforcement authority to compel technology companies to do almost anything conceivable in the name of a purported investigation or surveillance of a target. That seems to go well beyond what the Constitution and existing law permits law enforcement to do.
Or as David Burge summed up the issue on Twitter:
On one hand, Apple is a bank refusing a court order to open a safe deposit box. 1/2
— David Burge (@iowahawkblog) February 17, 2016
On the other, the government is asking for a generalized skeleton key to everybody’s safe deposit box. 2/2
— David Burge (@iowahawkblog) February 17, 2016
Would a government-mandated backdoor into every iOS and Android device be of help in the War on Terror? Almost certainly. But we’d also be giving unprecedented snooping power to a government which can no longer be trusted with simple matters like approving tax-exempt status for grassroots organizations in a non-partisan manner.
Perhaps civil libertarians would be less worried about handing Washington the keys to everyone’s personal data, 24/7, if Washington seemed at all worthy of the trust and responsibility required to carry those keys.