Should Apple Be Forced to Open a Back Door to Customer Security?
On February 16, Apple CEO Tim Cook published an open letter to Apple customers detailing the US government's demand that the company unlock an iPhone used by Syed Rizwan Farook, one of the suspected shooters in last December's San Bernardino attack, which killed 14 people and injured 22.
Cook made clear that Apple intends to fight the FBI's order because, in his words, "It threatens the security of our customers." He adds that Apple is "deeply committed to safeguarding data" and that the government's request has far-reaching implications.
Just imagine if anyone, especially the government, had backdoor access to every piece of information on your smartphone. Your life would be an open book. The security implications and risks are tremendous and could easily spiral out of control because, as Cook explains, "The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices."
Sheri Pym, a federal magistrate judge in Central California, has ordered Apple to create software for the FBI that would bypass the phone's auto-erase feature, which wipes the device after too many failed passcode attempts, and allow investigators to crack Farook's passcode by trying every possible combination, a technique known as brute force.
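To see why that auto-erase feature is the real obstacle, consider a toy sketch (the passcode and check function below are entirely hypothetical, not Apple's actual mechanism): a four-digit numeric passcode has only 10,000 possible combinations, so once lockout delays and the wipe-after-failures safeguard are removed, software can simply try them all.

```python
from itertools import product

# Toy illustration only: why short numeric passcodes fall to brute force
# once a device's lockout and auto-erase protections are disabled.
# SECRET_PASSCODE and check_passcode are hypothetical stand-ins.

SECRET_PASSCODE = "7294"  # hypothetical 4-digit passcode

def check_passcode(guess):
    """Stand-in for a device's passcode check (normally rate-limited)."""
    return guess == SECRET_PASSCODE

def brute_force(length=4):
    """Try every numeric combination of the given length, in order."""
    for digits in product("0123456789", repeat=length):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    return None

print(brute_force())  # finds "7294" after at most 10,000 guesses
```

On real hardware, escalating per-guess delays and the erase-after-repeated-failures setting are what make even this tiny keyspace impractical to search, which is precisely the protection the court order asks Apple to remove.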
Is Terrorism a Good Enough Reason?
The world is becoming a scarier place, and the government, through the FBI and police, is at a disadvantage when fighting crime and terrorism. Often, investigators come up empty, as with Farook's phone, which they have been unable to unlock for the past two months. FBI Director James Comey has been very vocal about how smartphone encryption has hampered efforts to fight crime. For most of 2015, Comey pushed for the All Writs Act of 1789 to be used to compel companies to provide law enforcement with legal access to smartphones. Presidential candidate Hillary Clinton has openly called for a "Manhattan-like project" to break encryption to help law enforcers.
According to Kevin Bankston of New America's Open Technology Institute, compelling a company like Apple to create a backdoor to its security features could set a legal precedent for the government to require the same of other companies in the future. This could lead to the secret installation of malware in the operating systems of all electronic devices. The result would be a loss of trust in digital devices and stunted growth for an industry on which the world has come to depend.
Cook has said that should Apple be forced to create a backdoor to its security encryption and give the FBI access, there is no guarantee the tool would remain safe in the FBI's hands. Such code, like a master key, could open any Apple device, and the company would no longer be able to control access or provide the security guarantee all Apple users enjoy now.
The Question Is: Isn’t There Any Other Way to Fight Crime and Terrorism?
Weakened encryption affects everyone and increases the risk of hacking and cybercrime. Even those who create the backdoor would have the power to break into any phone. The implications are chilling: your personal and private information would be available to anyone holding this master key, and at any time someone with access to the code could be coerced or tricked into giving it up.
In addition, the court order, which requires new software to turn off a security feature so the FBI can brute-force the phone, implies that the code must be handed over to the FBI immediately. This means Apple would have no time to test or debug the new software.
And with all that has happened, another issue has come to light: Apple's security is not as impregnable as it seemed, since someone out there might now decide to create that backdoor independently, knowing it has been confirmed to be possible. Furthermore, from the standpoint of government power, forcing a private entity to do something it does not want to do has "scary" written all over it.
The bottom line: There should be another way to fight crime and terrorism.