How bad would it be if Apple built a backdoor for the FBI?
The FBI is asking Apple to create a patch that disables the automatic wipe triggered after a number of incorrect passcode entries on the phone. I assume this patch would be built in such a way that it could be loaded onto the phone without the user's intervention.
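To see why that one protection matters so much, here is a minimal sketch of the mechanism in question. This is a hypothetical model, not Apple's actual implementation: the class names, the 10-attempt limit, and the 4-digit passcode space are all illustrative assumptions. The point is that the wipe limit is the only thing standing between an attacker and an exhaustive guess of every possible passcode.

```python
# Hypothetical model of the auto-wipe protection (NOT Apple's real code):
# the device erases itself after too many wrong passcode attempts,
# which is exactly what defeats brute-force guessing.

class Device:
    def __init__(self, passcode, wipe_limit=10):
        self._passcode = passcode
        self._wipe_limit = wipe_limit  # None models the patch the FBI wants
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess):
        if self.wiped:
            return False
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._wipe_limit is not None and self._failures >= self._wipe_limit:
            self.wiped = True  # data destroyed; further guesses are useless
        return False

def brute_force(device):
    """Try every 4-digit passcode; return the passcode, or None if wiped."""
    for guess in (f"{n:04d}" for n in range(10000)):
        if device.try_passcode(guess):
            return guess
        if device.wiped:
            return None
    return None
```

With the limit in place, `brute_force(Device("7319"))` dies after ten guesses and returns `None`; with the limit removed, `brute_force(Device("7319", wipe_limit=None))` recovers the passcode every time. That is the entire value of the patch being demanded.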
This is a very bad thing.
The environment we live in today requires that the technologies protecting your information not be intentionally weakened. We have strong evidence that both organized crime and national governments have used, and continue to use, weakened protections to gain unlawful access to people's information.
Apple, and many others, argue that if they are required to provide a backdoor to the security protections inherent in their platform, it will be abused by those acting unlawfully.
Also, consider that if the United States government has a right to demand that Apple provide it with a toolkit to bypass the security protections on a phone, what’s to stop Russia, China, or any other government in the world from making the same request? After all, Apple is a global company serving people worldwide.
How can Apple protect the data on one phone while allowing the data on another to be compromised, once it has, in effect, handed over a master key to its platform?
Also, take into account that the courts and law enforcement have essentially zero experience dealing with digital security. They think, completely erroneously, that their past experience makes them qualified to handle cell phone security. It doesn't.
These are great people with great intentions, but they are stumbling into an area in which they have no ability to make appropriate decisions, and where a single mistake could screw hundreds of millions of people in a dozen different nations.
Imagine putting a bunch of old army geezers in charge of the Manhattan Project instead of putting physicists in charge. Would they have successfully created nuclear weapons? Nope.