
How Apple’s Stand Against the FBI Could Backfire

  

Category:  News & Politics

Via:  johnrussell  •  8 years ago  •  4 comments



"The skirmish between Apple and the FBI is quickly escalating to a battle royal, a fight to the finish over lofty principles and national values, involving not just this company and this bureau but all of Silicon Valley and the entire realm of U.S. intelligence gathering.



Certainly the two combatants are presenting the case in these terms, Apple and its supporters declaring that the  future of encryption and privacy rides on the outcome , the FBI (as well as the National Security Agency and several police chiefs) claiming that well-trod paths to capturing criminals and terrorists will be closed off if the computer giant gets its way.




Both sides are overstating the stakes; both sides are making disingenuous arguments. Yet I think Tim Cook, Apple’s swaggering or courageous chief executive, has miscalculated. The dispute he’s chosen to challenge in the courts—which seems destined for appeal all the way to the Supreme Court—is a weak test case, from his vantage. And in the unlikely event the justices hand him a victory, he’s likely to trigger a political fight that he and every ideal he stands for will probably lose.



The gist of the case is that the FBI wants full access to the iPhone 5C used by Syed Farook, one of the San Bernardino, California, shooters. The bureau isn’t asking Apple to unlock the phone. Owing to its software, which allows users to set their own security code, Apple’s engineers  can’t  unlock it even if they wanted to. In a clever workaround, the FBI has asked Apple to override the feature that wipes out all of the phone’s data after someone enters an incorrect passcode 10 times. With that obstacle lifted, the FBI (or some other government agency) could use “brute force” methods—applying software that can generate thousands of alphanumeric guesses per second—to break the code.
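The article's brute-force math is easy to sanity-check. The sketch below is a hypothetical back-of-envelope estimate, not Apple's actual mechanism or figures; the guess rate is an assumed number in line with the article's "thousands of guesses per second":

```python
# Illustrative estimate: how long an exhaustive brute-force search takes
# once the 10-try data-wipe limit is out of the way.
# GUESSES_PER_SEC is a hypothetical rate, not a measured one.

def brute_force_seconds(alphabet_size: int, length: int,
                        guesses_per_sec: float) -> float:
    """Worst-case time to try every passcode of the given length."""
    keyspace = alphabet_size ** length
    return keyspace / guesses_per_sec

GUESSES_PER_SEC = 5_000

# A 4-digit numeric PIN: 10**4 = 10,000 combinations -> seconds.
print(brute_force_seconds(10, 4, GUESSES_PER_SEC))          # 2.0 (seconds)

# A 6-character lowercase+digit code: 36**6 ~= 2.2 billion -> days.
print(brute_force_seconds(36, 6, GUESSES_PER_SEC) / 86400)  # ~5 (days)
```

The point the numbers make is the one the article relies on: with the wipe feature disabled, a short numeric passcode falls almost instantly, which is why that one feature is the whole fight.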



Apple refused to cooperate with the FBI, so the bureau took it to court, where a magistrate judge ordered the company to comply. On Tuesday, Cook wrote an  open letter  to customers, arguing that the government’s campaign has gone too far. By ordering Apple to write new software that makes one of its own systems vulnerable, the government is making all Apple systems, everywhere, vulnerable. Someone could duplicate the software. “In the wrong hands,” Cook wrote in his open letter, “this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession.”



FBI officials have countered that they are not asking for Apple to carve out a “back door” into all Apple phones.  Their request would affect only this one phone. If Apple took precautions in applying the software, it wouldn’t necessarily leak out.



A retired intelligence official who dealt with software security outlined one way that this could be done. Apple’s engineers, he said, wouldn’t have to make elaborate changes to the code or operating system. Rather, they could simply reset the existing security feature, so that, instead of wiping out the phone’s data after 10 incorrect passcodes, it did so after 1 million tries or 10 million or whatever number was necessary.



More than that, he continued, Apple’s engineers could reset the feature in their own lab, through hard-wired controllers, with no government official present. They could even do the brute-force code-breaking themselves (password-sniffing software is commercially available), after which they could hand over the contents that the FBI wants, then destroy the phone. Problem solved, no leakage.



I asked a few other retired intelligence officials, and some computer-security specialists, whether this plan was feasible. They thought it was. I presented the idea to a senior executive at a software-security company who fervently endorses Apple’s side in the case. He too thought this might be a reasonably safe way to handle the data.



However, this executive, like Cook, is still disturbed by the larger issues of the case. Being compelled to unlock a phone, so the government can look inside, is one thing, he said. Being compelled to alter or write new software, in order to undermine a device’s security, is another matter entirely.



But are the two so different? As Shane Harris reported in  the  Daily Beast , Apple has unlocked phones, at the government’s request or under court order, at least 70 times since 2008. In doing so, Apple implicitly accepted the principle that the government has the right, under court-approved circumstances, to get inside Apple-made phones. Cook designed the iOS 8 operating system in 2014 precisely to evade further requests: Under the new system, the user sets the code, so if the government asks Apple executives to unlock a phone, they can honestly say they  can’t . Now the FBI has devised a way around the problem by asking Apple to shut off the data-wipe feature, so the phone can be unlocked with brute force. The technique is different, but the outcome—letting the government into a phone designed by Apple—is the same. Cook may have changed his mind about the government’s right to his products’—his customers’—contents; he may regret ever cooperating in the first place. But that doesn’t negate the fact that Apple accepted the principle in the past, and the company’s identity and ownership haven’t changed in the interim.



Another weakness in Cook’s argument, from a legal point of view, is that the phone was bought by Farook’s employer, the San Bernardino County’s Department of Public Health. And county officials—the phone’s owners—have consented to an FBI search of the phone. In other words, as George Washington University law professor  Orin Kerr  argued in a balanced, well-reasoned analysis, “There are no Fourth Amendment rights in this case.” "

 

MORE http://www.slate.com/articles/technology/future_tense/2016/02/how_apple_ceo_tim_cook_s_stand_against_the_fbi_could_backfire.html



 
JohnRussell
Professor Principal
link   seeder  JohnRussell    8 years ago

 As Shane Harris reported in  the Daily Beast , Apple has unlocked phones, at the government’s request or under court order, at least 70 times since 2008. In doing so, Apple implicitly accepted the principle that the government has the right, under court-approved circumstances, to get inside Apple-made phones. 

 
 
 
Hal A. Lujah
Professor Guide
link   Hal A. Lujah    8 years ago

"More than that, he continued, Apple’s engineers could reset the feature in their own lab, through hard-wired controllers, with no government official present. They could even do the brute-force code-breaking themselves (password-sniffing software is commercially available), after which they could hand over the contents that the FBI wants, then destroy the phone. Problem solved, no leakage."

Except that Apple would then assume liability in the chain of custody.  What if information were subsequently found on that phone that implicated others?  I'm guessing that Apple's lawyers would advise against getting involved in a situation where someone could legitimately defend themselves by claiming they were set up with evidence that could have been planted while the phone was outside the official chain of custody.

 
 
 
Cerenkov
Professor Silent
link   Cerenkov    8 years ago

Apple is wrong on this issue. If they don't support the FBI in such an obvious case, they court new legislation mandating backdoors. That would be bad.

 
 
 
Hal A. Lujah
Professor Guide
link   Hal A. Lujah  replied to  Cerenkov   8 years ago

What are you saying then?  Apple should unlock the phone with zero FBI oversight?  If not the FBI, then who should oversee the opening of a terrorist's phone?

If you are suggesting that the FBI should be present during the process, then a precedent is being set which effectively mandates a back door—which, as you said, would be bad.

 
 
