Apple is trying to position itself as a staunch defender of citizens’ privacy. But when you extend its arguments to their logical conclusion, it comes out looking like the company is incapable of protecting its secrets.
As the battle between Apple and the FBI over providing access to an iPhone used by one of the San Bernardino terrorists rages on, Apple has been trying to turn the situation to its advantage by garnering all the free publicity it can get. Even the Justice Department, in a filing in federal court, called Apple’s statements a “public brand marketing strategy.”
In its campaign, Apple is mustering all the fear, uncertainty and doubt it can. In an open letter to its customers, it states that “the government would have us write an entirely new operating system for their use. They are asking Apple to remove security features and add a new ability to the operating system to attack iPhone encryption, allowing a passcode to be input electronically. … It would be wrong to intentionally weaken our products with a government-ordered backdoor.” The FUD factor in that statement is “weaken our products.” It is grossly misleading, the plural suggesting that the FBI wants Apple to make this back door a standard part of iPhones. That’s flat-out false. What the government has asked is that Apple modify software to remove a feature that was not present in earlier versions of the software, and then install that new software on the single phone used by the terrorist. Apple can then destroy the software.
Further undercutting Apple’s claims, the FBI has specifically stated that it does not intend to weaken the consumer product and that this case cannot be used as a precedent. Should the government at any time attempt to use it as one, so that back doors could be embedded in products, its own words would be the most compelling argument against that.
A section of Apple’s statement is titled, “Could Apple build this operating system just once, for this iPhone, and never use it again?” Although that is exactly what the government is requesting, Apple’s phrasing makes it sound like the most ridiculous pie-in-the-sky idea ever heard.
The FUD continues, with Apple saying, “Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case.” That might very well be the case. But it has zero relevance. Each of those cases could be resolved only with a court order of its own, regardless of what happens with the San Bernardino iPhone. Even if this case were not in front of the court at the moment, any state, local or federal law enforcement agency could bring a similar case forward. Gaining access to locked data is a legitimate law enforcement issue, and whatever your personal beliefs, all law enforcement officers have a responsibility to attempt to collect all information that is legally possible to collect.
In other forums, Apple has been claiming that if the U.S. requires Apple to cooperate in providing access to the phone, all other governments around the world will then expect the same sort of cooperation. That is a bogus claim, and more FUD. Do Apple’s lawyers really not know that the law of one country does not apply to another? A win for Apple in the U.S. would do nothing to stop another country from initiating a similar action, and a loss should have no influence on whether other countries decide to pursue such matters.
So even if the Supreme Court decides that Apple does not have to modify the iPhone software for this case, China can make such a demand of Apple, and if Apple wanted to resist that demand, citing the U.S. precedent would be irrelevant. Russia can make such demands. Even Canada can do it, assuming its laws allow for such a thing. And frankly, no one can say for sure that Apple hasn’t already been required to provide Russia or China with the ability to backdoor its phones.
But of all of Apple’s arguments, the one that is most ludicrous, or perhaps the most damning of its much-touted security prowess, is revealed in this response to the government’s request for a key that could unlock one phone:
“Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.”
First, Apple is already relentlessly attacked by hackers and criminals. I would like to hope that Apple has better security practices than the IRS. But when you unpack this statement, you are left with the impression that we should not trust any of Apple’s software or products. You have to assume that, should Apple write the software that the FBI wants, it would be among the most protected software in the company. If Apple is concerned about this software being compromised, what does that say about all of its other software?
So if you believe that Apple fears that this software is at risk of compromise and would create a serious security risk, you have to believe that all of its iOS software is at serious risk.
Even assuming that a bad guy gets hold of just the software that law enforcement wants created, it would have to be signed with Apple’s code-signing certificate to load on any phone. If a criminal got a copy of the software that had already been signed, Apple could revoke the certificate. But if a bad guy got hold of Apple’s signing certificate itself, the whole Apple software base would be at risk, and the feature that the FBI wants bypassed would be irrelevant. After all, Apple has stated that it is not immune from attack, and it has implied that compromise of even its most protected software is a reasonable concern.
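To make the logic of that safeguard concrete, here is a toy sketch of a signature-gated software load and certificate revocation. This is not Apple’s actual mechanism: a real iPhone verifies an asymmetric signature chained to Apple’s root key, while this sketch substitutes an HMAC so it runs with only the standard library, and all names (`trusted_keys`, `phone_will_load`, the key ID) are hypothetical.

```python
import hmac
import hashlib

# Stand-in for Apple's signing key (in reality an asymmetric private key,
# never present on the phone; the phone holds only the public half).
APPLE_SIGNING_KEY = b"apple-private-key"

# The phone's trust store: key IDs it currently accepts.
trusted_keys = {"apple-2016": APPLE_SIGNING_KEY}

def sign(firmware: bytes, key: bytes) -> str:
    """Produce a signature over the firmware image (HMAC as a stand-in)."""
    return hmac.new(key, firmware, hashlib.sha256).hexdigest()

def phone_will_load(firmware: bytes, key_id: str, signature: str) -> bool:
    """The phone loads software only if its signature checks out against a trusted key."""
    key = trusted_keys.get(key_id)
    if key is None:
        return False  # certificate revoked or unknown: refuse to load
    return hmac.compare_digest(sign(firmware, key), signature)

fbi_build = b"iOS build with the retry limit removed"
sig = sign(fbi_build, APPLE_SIGNING_KEY)

print(phone_will_load(fbi_build, "apple-2016", sig))  # True: properly signed
del trusted_keys["apple-2016"]                        # Apple revokes the certificate
print(phone_will_load(fbi_build, "apple-2016", sig))  # False: no longer trusted
```

The point of the sketch is the asymmetry in the argument above: leaking one signed build is containable by revocation, whereas leaking the signing key itself compromises everything the phone will ever trust.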
What we have here is really a policy issue about the level of cooperation that law enforcement can force upon software vendors. I am not taking sides on that issue; while Apple clearly has a responsibility to take reasonable steps in the interest of its business, I expect the government to win. But I don’t know that that is the better outcome. What I do know is that Apple is attempting to generate unreasonable fear, uncertainty and doubt to sway people to its position.
But Apple, seeming to take a page from Donald Trump’s presidential campaign, is using the situation to promote its brand with free advertising. I have to assume that the slump in iPhone sales will improve, given that the government is implying that the iPhone is so secure that it needs Apple’s help to compromise it. Luckily for Tim Cook, so far few people realize that he is arguing that Apple can’t keep its own software secure.