Last week, a federal judge ordered Apple to help FBI investigators decrypt the employer-owned iPhone 5C used by Syed Rizwan Farook, who took part in a terrorist attack in San Bernardino in December (Ars Technica, “Judge: Apple must help FBI unlock San Bernardino shooter’s iPhone”). The order directs Apple to provide the FBI with custom firmware, for this specific iPhone only, that bypasses the auto-erase function and lets investigators brute-force the passcode efficiently, without the normal restrictions; it also provides that Apple may refuse to share the resulting software outside of Apple. This last part is the troubling point. Apple has challenged the court order (Ars Technica, “Apple CEO Tim Cook: Complying with court order is ‘too dangerous to do’”) and recently wrote in a Q&A, “it would be the equivalent of a master key, capable of opening hundreds of millions of locks.”
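To see why those “normal restrictions” matter so much, here is a rough back-of-the-envelope sketch. The figures are assumptions for illustration, not Apple’s specifications: a four-digit numeric passcode and the widely reported ~80 ms per attempt imposed by the hardware key derivation. With the escalating retry delays and the ten-try auto-erase removed, exhaustive search becomes a matter of minutes:

```python
# Illustrative sketch only; ATTEMPT_COST_S and KEYSPACE are assumed figures,
# not taken from Apple's documentation for this specific device.

ATTEMPT_COST_S = 0.08   # assumed ~80 ms hardware key-derivation time per try
KEYSPACE = 10 ** 4      # a 4-digit numeric passcode: 0000-9999

def worst_case_hours(per_attempt_s: float, keyspace: int) -> float:
    """Worst-case time to try every possible passcode, in hours."""
    return per_attempt_s * keyspace / 3600

# With the custom firmware the order describes (no delays, no auto-erase),
# trying all 10,000 codes takes well under an hour:
print(f"{worst_case_hours(ATTEMPT_COST_S, KEYSPACE):.2f} hours")  # 0.22 hours

# Under stock iOS, escalating delays kick in after repeated failures and,
# if the auto-erase option is on, the encryption keys are destroyed after
# ten wrong guesses, so exhaustive search on the device is not possible.
```

The point of the sketch is only that the security comes from the retry policy, not from the size of the keyspace; strip the policy and the passcode falls almost immediately.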

Tools, processes, and software used for gathering evidence may be challenged in court. The software might need to be validated by a third party, which would open Apple’s backdoor to the world, letting others reverse engineer it for new products or probe it for weaknesses; it could also leak to criminals. As forensic scientist Jonathan Zdziarski notes in his blog post, “Apple, FBI, and the Burden of Forensic Methodology,” an established tool is “validated, peer reviewed, and accepted in the scientific community.” Before iOS 8, the FBI would simply have asked for an image of the data, a copy, a reasonable request using established methods. iOS 8, however, encrypts the data on the device. As Zdziarski points out, it would be difficult to get a judge’s approval to compel Apple to exceed reasonable assistance by hacking the device itself, which is why the request is for the investigators to do the cracking, using a special tool developed by Apple to make it easier. Any such new tool could be challenged in court, and if Apple kept its methods secret, any evidence obtained would likely be thrown out. The FBI knows this.

The FBI isn’t interested in ignoring forensic science, and it knows terrorists go to the trouble of covering their tracks, which may include using other encryption methods. Law enforcement wants the key, or the precedent, to go ahead with other cases and to take another step toward pushing for backdoors.

Complying with the court order by creating a new forensic tool, while maintaining the security the device was designed to provide, is unreasonable. As Apple pointed out in its Q&A, this isn’t about marketing but about risk. It says, “The only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.”