On the morning before suicide bombers rocked Brussels, Apple CEO Tim Cook reiterated, to prolonged applause at a launch event for the iPhone SE, his position of protecting terrorists under the guise of encryption. Apple is refusing to help the FBI unlock the iPhone of San Bernardino terrorist Syed Rizwan Farook so that data trapped behind encryption can be recovered in accordance with a subpoena. As if to mock Cook's position, the FBI filed for a continuance in the case the same day, saying it has likely found a way around this impenetrable wall of encryption Apple is peddling.
For most observers, this development avoids a protracted legal battle for now. It allows the FBI to access the data it needs, while Apple gets to claim a moral victory by keeping its privacy commitment to its customers.
And that is precisely where Apple's house of cards begins to collapse. Think about it. If the alternative method and the experts the FBI is availing itself of succeed in unlocking the iPhone in question, and it appears they will, it unravels the very premise of Apple's fight against the government in this case: impenetrable encryption security. Apple and other tech giants like Google, Microsoft and Facebook have all sold their case to their customers on the basis that even they cannot break the encryption of their own products to access customer data. By implication, no one else can either, and the privacy-crazed customer who happily hands over his credit card to a waiter to take to the back of a restaurant can finally sleep soundly at night.
That case falls apart at the seams if one can in fact find a locksmith to unlock their products without any help from them. Evidently, the only thing Apple and its Silicon Valley cult of Snowdenized companies are selling is a false sense of security, while falling woefully short of providing actual security.
There is another critical problem for Apple and other tech companies. You see, Apple was given the first choice to be the locksmith for the terrorist's iPhone, which would have allowed it to keep tabs on the mechanism for doing so. But now that the FBI has gone out and found a third party to do it instead, Apple will have no information on how the mechanism works, how broad its use can be, or which versions of the iPhone (and/or the iOS operating system) it will work on.
Apple was given the opportunity to ensure that the software overriding the locking mechanism would be narrowly tailored and apply only to a small number of devices. By giving in to the Snowdenite culture of the Valley and refusing to cooperate with law enforcement, however, it has now prompted the FBI to find a potentially much broader tool, one that can be exploited at the whim of law enforcement rather than restrained by Apple's engineers, precisely the outcome Tim Cook said he wanted to avoid by withholding his company's cooperation.
Indeed, experts are already saying Apple is not going to be pleased with the solution the FBI has come up with.
I guess it's really too bad Apple willingly gave up having any say about it.
As I pointed out in my response to Tim Cook's letter, Apple has never had a legitimate case for privacy. The moment it admitted to handing over customer information, including passwords, to the FBI in accordance with a subpoena, the case for refusing to unlock a physical device was greatly diminished. Everything else is marketing designed to lull customers into a false sense of security.
That false sense of security is more dangerous than the ultimately insecure systems Silicon Valley is putting in the hands of its customers. No technology is fully impenetrable, and it is a bad idea to leave the security of your data in the hands of any single technology alone (the same tech companies will tell you this very simple truth when they are trying to sell you "redundant" systems).
Because no technology is impenetrably secure, Apple could have simply cooperated with the FBI and acknowledged this truth. Legions of its fans would have been unhappy, but it would not have contributed to the perception that encryption is a panacea. By refusing, Apple has done a disservice to our country by protecting a terrorist's data against a court order, and it has ultimately made its devices more vulnerable to law enforcement's reach.
Tech companies must begin to do their part, not only by helping law enforcement prevent and solve terrorism cases, but also by reining in the misperception, which they themselves have helped spread, that encryption is a privacy panacea.