The Snowdenization of Tech: A Response to Apple's Terrorist Protection Letter

I am from Silicon Valley. I live here, breathe here, see the tech culture here all the time. We are the heart of the technology revolution. Chances are, companies no more than 20-30 minutes from my home are responsible for more than one device in your home. At the same time, these companies can sometimes feel like Snowden-central, with their promises to prevent access to their customers' data even when the law requires it and a subpoena has been issued.

That was at the core of Apple's and Google's promises to encrypt phones. Apple, for example, chose not to hold on to the encryption keys (for the technically less-initiated, an encryption key is a unique code each device uses to unscramble data). The encryption key on an iPhone cannot operate until the user "unlocks" the device using one of a number of methods: a fingerprint scan, a numeric code, or a password. Even Apple couldn't access the data if it wanted to, since it does not have the user's unlock code and it has chosen not to retain the devices' encryption keys (which could directly unlock the data).
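The upshot of that design can be sketched in a few lines. This is a deliberately simplified model, not Apple's actual implementation; the names `DEVICE_UID` and `derive_key` are invented for illustration. The point is that the decryption key is computed from the user's passcode combined with a secret unique to the device, so without the passcode there is simply no key to hand over:

```python
import hashlib
import os

# Hypothetical, simplified model - NOT Apple's real scheme.
# A device-unique secret is fused into the hardware and never leaves it.
DEVICE_UID = os.urandom(32)

def derive_key(passcode: str) -> bytes:
    # The data-encryption key is derived from the passcode "tangled" with
    # the device secret, so no server-side copy of the key exists.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

key_right = derive_key("1234")
key_wrong = derive_key("0000")
```

Because the derivation runs on-device against a hardware secret, nothing stored on Apple's servers can reproduce `key_right`; only the correct passcode, entered on that phone, yields it.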

So, no one can access the data on your phone while it's locked. Sounds pretty good, if you're a customer, right?

It sounds even better if you are a terrorist. Syed Rizwan Farook was one such terrorist with an iPhone, and Farook, with his wife, shot up San Bernardino last year. When the FBI wanted access to the data locked inside Farook's iPhone, Apple refused to cooperate. A court has now issued a ruling ordering Apple to comply, and in response, Apple's CEO, Tim Cook, penned an open letter openly defying the court's order.

By way of background, the order is simple and restricted: because the simplest way to unlock Farook's iPhone is to guess its unlock code, the judge ordered Apple to write software that can be installed over the current software - software that would disable the iPhone features that delay, and eventually wipe, data after too many incorrect passcode entries, and that would allow passcodes to be entered electronically. Once this is done, the FBI can simply use a computer to try different passcodes in the blink of an eye, and eventually the phone will be unlocked and the data revealed.
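To see why those two changes are all the FBI needs, here is a toy sketch of the brute force itself. The names are invented and a bare hash stands in for the phone's real hardware-entangled passcode check; the point is only the arithmetic of guessing once the delay and auto-wipe protections are gone:

```python
import hashlib

def unlock(stored_hash: bytes, guess: str) -> bool:
    # Stand-in for the phone's passcode check; real devices use a slow,
    # hardware-entangled derivation, not a bare SHA-256.
    return hashlib.sha256(guess.encode()).digest() == stored_hash

# The unknown passcode, as the phone would "know" it:
stored_hash = hashlib.sha256(b"7391").digest()

# With retry delays and auto-wipe disabled, all 10,000 codes can be tried:
found = next(
    code
    for code in ("%04d" % n for n in range(10_000))
    if unlock(stored_hash, code)
)
print(found)  # prints 7391
```

A 4-digit passcode space is only 10,000 guesses; even a 6-digit space (1,000,000 codes) falls in moments to a computer once the per-attempt delays are removed, which is exactly what the order asks Apple to enable.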

Note that this is in full compliance with the Fourth Amendment: a court has issued a subpoena for the data trapped behind the encryption, and Apple is inventing a legal excuse of consumer data protection to refuse the court's order. In the process, Apple is in effect protecting data that could very well lead law enforcement to other home-grown terrorist plots.

But that isn't even the strangest part of this situation. The strangest part is Apple's own argument for its position. It argues, first, that this could lead to widespread use of this method by law enforcement - oh, my, God, you mean law enforcement could actually obey court orders to bust open the digital equivalent of physical safes that might contain criminal evidence? I shudder at the implications. And - here's the kicker - it argues that criminals and thieves could get access to the passcode-override software, once created.

No, really. That's what Tim Cook said.

"Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession."

And how, may I ask, Mr. Cook, would this software get into the wrong hands? The court has not ordered Apple to turn the software over to the government - only to let the government use it to recover data from the phone in question - so the software itself would remain in the possession of, and only in the possession of, Apple. It could end up in the "wrong hands" only if Apple allows it to, or if Apple fails to protect it properly. In other words, what Tim Cook is really saying is that Apple is not confident in its own ability to secure this software.

It is pretty ridiculous, then, that Apple's case rests on its claimed commitment to data security. More ridiculous still is its claim that what it is being asked to do in the digital realm has no equivalent in the physical one.

"The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable."

No? But such a tool already exists in the physical world, Tim. Several, actually. Just call a locksmith and see for yourself. Any physical lock can be broken - with a hammer, a cutter, or a drill. If a dead terrorist's locked safe came into the government's possession, and a court issued a subpoena for access to its contents, no one would bat an eye at the safe being opened and its contents revealed. A court could just as easily order a locksmith to break the lock, and the locksmith would have to comply.

That's all Apple is in this case: a locksmith. The FBI may be using an old law - the All Writs Act - to compel this locksmith, but the claim that this is some unprecedented breach of privacy has little credibility.

B..but. BUT. Isn't this a slippery slope? Once a precedent is set, the government can just use this tool anytime it wants, right?

"The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge."

Apart from the fact that this case has nothing to do with turning on a device's functions without the owner's knowledge, as for the rest - tracking messages and locations - yes, the government will, provided a Fourth Amendment-compliant subpoena has been issued against the target. To make matters worse for Apple's argument, it has already admitted, earlier in the very same letter, to sharing exactly such information with law enforcement under subpoena.

"When the FBI has requested data that's in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case."

The data in Apple's possession includes anything on Apple's cloud servers: photos, videos, files, messages (texting between iPhones uses iMessage by default unless turned off), location data (Apple Maps), any health or financial data collected by Apple or by apps that use Apple's cloud services to store data, and, heck, even your passwords.

It is not the FBI that is creating a novel legal theory by forcing Apple to comply with the execution of a subpoena; it is Apple that is inventing a legal novelty by claiming that, because it makes digital products, it should be exempt from such compliance. Apple is pandering to the government-phobic extremists who want tech to be their panacea against the rule of law.

If the delicate balance tips in Apple's favor, terrorists - domestic and international - could plot with no paper trail, store their plans behind a lock and key no one could ever access, and get away with it until the actual execution of their plan. Criminals and rapists could escape the consequences of their actions because the evidence of their crimes is stashed away behind an impenetrable lock.

If the delicate balance of privacy and security tips in Apple's favor, the Fourth Amendment would become irrelevant, as subpoenas, increasingly digital in nature, would be unenforceable.

Apple must respect the rule of law.

