Just to expand on a thing I’d been going on about yesterday… Taking the cynical view just for fun: The Apple case is good for open source development.
You know, we get expert witnesses in court cases for all sorts of complex things: forensics, psychology, the various natural sciences, and so forth. But when it comes to technology, that practice flies out the window. Just because we all have iPhones in our pockets, everyone's somehow qualified to rule on how data security architecture actually works. Except they're not. And that includes a San Bernardino magistrate judge acting on behalf of the FBI. And they're messing up the commercial technology sector for the U.S. Let's watch.
The Feds want Apple to jailbreak a county-owned iPhone 5c used by Syed Farook, in building their case against him and his wife Tashfeen Malik for the mass shooting at an office Christmas party in California last December. Caught up? Already knew that? Okay, great. It's an awful idea. No, not getting the specific data off this specific mobile, that sounds like a generally good plan, but the method by which this would happen has less to do with building a case against these two and more to do with creating a precedent and a permanent back door into a popular mobile platform, which they won't need to seek permission to use again. It would be the death knell for commercial, proprietary technology in the U.S.
The request is best summarised by Lawfare blogger Nicholas Weaver: “Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target’s phone, cryptographically sign that malcode so the target’s phone accepts it as legitimate, and run that customized version through the update mechanism.”
Essentially, the FBI wants Apple to create an undetectable malware tool, turn it over and then forget about it. In what business model does that sound good?
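To see why only Apple can do this, it helps to sketch the mechanism Weaver describes: a phone will only install an update that carries a valid signature made with a key the vendor alone holds. The following is a toy illustration, not Apple's actual implementation; real iOS updates use asymmetric signatures, and the key name and firmware string here are made up. HMAC stands in for a proper signature scheme so the sketch runs with the standard library alone.

```python
import hmac
import hashlib

# Hypothetical stand-in for the vendor's private signing key.
VENDOR_KEY = b"held-only-by-the-vendor"

def sign_update(image: bytes, key: bytes) -> bytes:
    """What only the vendor can do: produce a valid signature over an update."""
    return hmac.new(key, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    """What the phone does before installing anything: verify the signature."""
    expected = hmac.new(VENDOR_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

firmware = b"unlock-tool: disable passcode retry limit"

# Vendor-signed code is accepted; anything else is rejected.
assert device_accepts(firmware, sign_update(firmware, VENDOR_KEY))
assert not device_accepts(firmware, sign_update(firmware, b"some-other-key"))
```

The point of the sketch is the second assertion: the FBI can write all the forensic malcode it likes, but without the vendor's key the phone refuses to run it. Hence the court order compelling Apple to sign.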
The request would be disastrous as legal precedent and would basically unleash a nasty piece of software into the wild, one that every SigInt agency from the NSA and GCHQ to Russian, Chinese and Israeli intelligence services would demand copies of, and probably have strong grounds to get one. And as soon as the code exists, it's going to end up in non-government hands as well. That's just the physics of data over a long enough period of time. So you can understand why Apple's CEO is hesitant to go along.
Still, the request itself is fairly revealing as to what the FBI is hoping to uncover. It can't be something backed up to iCloud, because they could get that without the phone. It can't be information about calls made, texts sent, or where these two may have travelled while carrying the mobile; that's all metadata police can (and did) get from mobile service providers. It can't be the last remote backup the iPhone made; they have access to that. It can't be to find out who supplied the weapons, because that's been figured out. They know who the guy targeted (his colleagues) and probably why, and where the couple were "radicalised", to use the parlance of our time. So what is the FBI actually after? We don't know, because the Justice Department doesn't know either. It just wanted a reason to get back into iPhone hard drives, something it hasn't had since the encryption curtain fell after the Snowden NSA leaks. That's what it's hoping to find: a way back in. Nothing else.
What this court order suggests they’re hoping to find is something on the device that hasn’t been transmitted or shared. Something sitting on it that had been added to it after 19 October. So we know that iPhone physical hard drive encryption lives up to the company’s claims, at least. But it’s not as though the Justice Department doesn’t have ways and means to stress test it. They just don’t want to.
But I don't think Apple is exactly a freedom fighter here. Creating an easy-to-use back door for the government to employ on iPhones whenever it liked wouldn't just be a blow to information security for all iPhone users, but a direct hit on proprietary software, which is ironic given the U.S. penchant for over-the-top intellectual property protection. It would be impossible to apply the new rule fairly. Devices made and used outside U.S. jurisdiction would be immune, for example.
But more importantly, it highlights a clear advantage that open source software has: there are no secret places to hide back doors. It can always be independently audited, forked and strengthened. If the court gets its way, it will be handing a clear advantage to open source technology, and encouraging more software developers to release their source code to ensure consumer confidence. So, there's some upside after all.