Homeland Security

Apple vs. the FBI: The Problem with the 'One Case' Scenario


Apple has found itself in the middle of a political and public relations firestorm. A few days ago, in an open letter to Apple’s customers, CEO Tim Cook made the case for why Apple should not assist the FBI. In case you missed it, the FBI has the iPhone of Syed Rizwan Farook, one of the San Bernardino terrorists who killed 14 and injured 22 others in December 2015. The FBI has requested that Apple help decrypt the phone, as it believes there may be information about other potential terrorists and terrorist activity on it. Apple has refused to comply, and the legal battle is starting to heat up.

This case has many layers, from the War on Terror to legal precedents for law enforcement. The unfortunate truth is that the technological reality of the situation is not coming to the forefront of the debate. The natural reaction of many has been to condemn Apple for refusing to assist in an investigation of possible terrorists. However, the ramifications of what the U.S. government is asking of Apple could be quite severe.

Having law enforcement decrypt an iPhone for information is not the same as having law enforcement search a house for information. For starters, the house physically exists. It is unique. Furthermore, you must physically be at the house to search it. Finally, searching a house is done in the open; typically, you will notice if law enforcement is in your home.

So what makes decrypting an iPhone so different? You cannot decrypt just one iPhone. If Apple developed a decryption method, ALL iPhones would be decryptable…at once…even remotely…without the owners ever knowing about it. There are a lot of innocent people—who are not terrorists—who happen to own iPhones and would be affected. Cook wrote:

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The FBI is asking Apple—a private company—to perform this task for just one case. But the problem with the “one case” scenario, from a technology standpoint, is that there is no switch Apple can flip to magically decrypt Farook’s iPhone and nothing else. The U.S. government is attempting to compel Apple—again, a private company—to develop software that works around the security measures of an iPhone. Once this software is created, it can be copied and used many times over. There are no physical limits to the reach of this decryption software.
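The point that software, unlike a physical key, copies freely can be made concrete with a toy sketch. The model below is purely hypothetical—it is not Apple’s actual security design, and every name in it is invented for illustration. Here a “phone” is nothing more than a stored hash of its 4-digit passcode, and the requested tool is a brute-forcer with the retry limit simply removed. Notice that nothing in the tool ties it to any one device:

```python
import hashlib

# Toy sketch, NOT Apple's real design: a "phone" is modeled as a stored
# passcode hash, and the requested tool is software that tries every
# passcode with no retry limit or auto-wipe. All names are hypothetical.

def make_phone(passcode: str) -> bytes:
    """Hypothetical 'device': stores only a hash of its passcode."""
    return hashlib.sha256(passcode.encode()).digest()

def cracking_tool(phone: bytes) -> str:
    """Once written, this works on ANY phone image, not just one."""
    for n in range(10000):
        guess = f"{n:04d}"
        if hashlib.sha256(guess.encode()).digest() == phone:
            return guess
    raise ValueError("passcode not found")

suspect_phone = make_phone("1234")
bystander_phone = make_phone("7777")

print(cracking_tool(suspect_phone))    # 1234
print(cracking_tool(bystander_phone))  # 7777 -- same tool, different owner
```

The same few lines unlock the suspect’s phone and a bystander’s phone alike. That interchangeability—one tool, every device—is the technical heart of Apple’s objection, not any single search.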

This fact presents two major problems. First, when it comes to cell phone surveillance of American citizens, our government doesn’t exactly have a great track record of being transparent—or even honest. It wasn’t that long ago that it was revealed the NSA had been capturing and storing millions of phone records of U.S. citizens (shortly after the NSA testified to the Senate that it was not doing so). This action was carried out under a secret court order from the Foreign Intelligence Surveillance Court (the FISA Court). Do we want law enforcement to have this power?

Second, this technology, once created, could be used by malicious entities, including terrorists. There is a major black market for malicious software. What is to prevent one rogue FBI agent from selling the software for millions of dollars on the black market? All kinds of groups, from the Chinese military to Al-Qaeda, would have uses for such software. Once the software leaks out, iPhones will be extremely vulnerable and Apple, a major innovator and employer, could be financially destroyed.

The legal battle between Apple and the FBI will set a precedent for future cases involving technology. Will American corporations be forced to spend time and money to develop software that could be turned against them? Is developing a method for decryption that could fall into the hands of terrorists (whereby they gain access to encrypted cell phones) the best way to fight terrorism? With our world becoming more technologically integrated and new legal precedents being set by those who don’t understand the technology at hand, it is time to take a step back and consider the wider ramifications.