Apple shouldn’t unlock the San Bernardino shooter’s iPhone for the FBI

By now you have likely heard that the FBI wants Apple to unlock the iPhone that belonged to one of the San Bernardino shooters (who is now dead). Apple has refused to unlock the iPhone, and I fully support that decision. This debate is far bigger than one act of terrorism or even the group that orchestrated it. The case has broad implications for security and privacy that should not be treated as lightly as many media accounts have presented them.

This is not just about one iPhone

It isn’t as if the FBI is asking Tim Cook to press a magical button at Apple headquarters that would instantly unlock the San Bernardino shooter’s iPhone. The FBI is asking Apple to develop a weakened version of the iPhone operating system that would exploit a security flaw in that version of the phone and make it easier for the FBI to guess the passcode and access the data on the phone.

(For a more in-depth but not overly technical explanation of the security flaw and what Apple is being asked to do, read this article by Bruce Schneier in the Washington Post.)
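To see why a weakened operating system is so useful to an attacker, consider some back-of-the-envelope math. Normally iOS imposes escalating delays after wrong passcode guesses and can erase the phone after ten failures; the version the FBI wants would remove those limits, leaving only the hardware-bound time each guess takes. Here is a minimal Python sketch of the arithmetic. The roughly 80 millisecond cost per attempt comes from Apple’s published iOS security documentation; everything else is my illustrative assumption, not a description of the FBI’s actual tooling.

```python
# Back-of-the-envelope brute-force math for numeric iPhone passcodes,
# assuming the retry delays and auto-erase limit have been removed.
# The ~80 ms cost per guess is hardware-bound key derivation (per
# Apple's iOS Security guide); all other numbers are illustrative.

SECONDS_PER_GUESS = 0.08  # ~80 ms per passcode attempt

def worst_case_hours(digits: int) -> float:
    """Worst-case time to try every numeric passcode of a given length."""
    attempts = 10 ** digits
    return attempts * SECONDS_PER_GUESS / 3600

for digits in (4, 6, 8):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):,.1f} hours worst case")

# Prints:
# 4-digit passcode: 0.2 hours worst case
# 6-digit passcode: 22.2 hours worst case
# 8-digit passcode: 2,222.2 hours worst case
```

In other words, with the protections stripped away, a common four-digit passcode falls in under fifteen minutes. With the protections intact, the math never comes into play, because ten wrong guesses can wipe the phone. That is exactly what the FBI wants removed.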

The problem is that once this weakened version of the operating system exists, it exists. Apple and the FBI could try to protect it, but there are too many cases of systems being hacked or technology being misused to guarantee it would not be used to access other iPhones, and for reasons far less acceptable than fighting terrorism.

As Apple CEO Tim Cook wrote in “A Message to Our Customers,”

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

This is not just about iPhones

Technology companies have been fighting for years against government demands for backdoors to encryption software. Now the government is emboldened by a case that focuses on a known terrorist instead of an abstraction. If Apple is forced to create a backdoor in a version of its operating system, it is likely that other companies will be forced to put backdoors in their systems as well.

As Bruce Schneier wrote in his Washington Post article “Why you should side with Apple, not the FBI, in the San Bernardino iPhone Case,”

Either everyone gets security or no one does. Either everyone gets access or no one does. The current case is about a single iPhone 5c, but the precedent it sets will apply to all smartphones, computers, cars and everything the Internet of Things promises. The danger is that the court’s demands will pave the way to the FBI forcing Apple and others to reduce the security levels of their smart phones and computers, as well as the security of cars, medical devices, homes, and everything else that will soon be computerized.


Privacy is important

All of those technologies whose security could be weakened, putting data privacy at risk, are things that many people and organizations (even the FBI!) depend on. Encryption backdoors would hurt governments, corporations, and individuals alike.

Of course, the go-to argument in such situations tends to be that “people (or organizations or companies) with nothing to hide don’t have anything to worry about.” The problem is that even good people and good organizations have things to hide.

Below are my “Five reasons you should care about privacy,” which I copied, with only slight modifications, from a post I wrote about the NSA after Edward Snowden leaked documents exposing its privacy-invasive programs. The same reasons explain why you should not support security backdoors such as the one Apple is being asked to create.

1. You probably aren’t as open as you think you are

Maybe you don’t care if the government knows what phone numbers you dial or what your emails say, but there are probably still limits to what you are comfortable sharing with certain audiences.

  • Would you post your Social Security number and credit card information on a public website?
  • Would you show your current spouse or partner correspondence you had with your ex?
  • Do you want your children to see you have sex? What about your neighbor? Or your boss?

When people say they don’t care about privacy, they usually mean they don’t care about a specific privacy issue. Most people still close the bathroom door when they poop.

2. Standards change

What society deems acceptable changes over time. For decades it was perfectly reasonable to be a member of the Communist Party in the United States, but in the 1950s McCarthyism and the “red scare” meant that even a casual history with that party could get you in trouble. A detail about you that seems innocent today could make you a target for harassment, discrimination, or worse in the future.

3. Trust has limitations

By allowing a government or corporation to access your information, you are trusting that their intentions are honorable. Maybe you are comfortable with that, but you are also trusting that all of their employees’ intentions are honorable. And you are trusting that all of their third-party contractors’ employees are honorable. That can be thousands of people.

4. Security has limitations

Maybe you truly do trust all the people given access to your data. What about the people who aren’t granted access? If there is a collection of data about you it is vulnerable to hacking or other security breaches. Security issues can rapidly become privacy issues.

5. Some people do have things to hide

Maybe you don’t have anything to hide, but others do. I’m not just talking about criminals and terrorists. Whistle-blowers, undercover cops, workers at shelters for abused women, and many others rely on secrecy to do good. Surveillance technology puts these people at risk.

As Tim Cook stated in the above-mentioned letter:

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

I strongly support Apple’s position and hope they prevail in this fight. If, however, Apple is forced to create a weakened iOS to help unlock that iPhone, I hope that whatever the FBI finds on it is tremendously valuable in fighting terrorism, bringing down ISIS, ending world hunger, curing cancer, etc. Unlocking that iPhone is opening Pandora’s box, so whatever is on it damn well better be worth the demons it unleashes.
