
iPhone backdoor brawl at forefront of personal privacy debate

Law enforcement agencies depend on evidence to build and solve criminal cases, and much of that evidence now comes in the form of electronic data on mobile devices such as smartphones, which are increasingly protected by encryption. But should there be an iPhone backdoor, or a backdoor into any phone?

The tension between the need for law enforcement access to electronic evidence and device manufacturers’ desire to protect the privacy of their customers’ personal data erupted recently in a confrontation between the FBI and Apple, Inc.

The case in question involves a single device—the iPhone 5C used by one of the shooters in the San Bernardino, Calif., attack that killed 14 people and wounded 22 others. On Feb. 16, a federal judge in California issued an order compelling Apple to assist the FBI in gaining access to the data on the phone.

Tim Cook, Apple’s CEO, immediately fired back in an open letter posted to Apple’s website, arguing that the government wanted to force Apple to build a software “backdoor,” which, if it fell into the wrong hands, “would have the potential to unlock any iPhone in someone’s possession.”

As the week progressed, other major tech companies began to side with Apple, fearing that helping the federal government bypass iPhone security in this case would lead to a slippery slope in which law enforcement would require access to all mobile devices.

“Ordering a company to hack one targeted system is clearly the first step to ordering them to backdoor them all,” Bob Lord, Yahoo’s chief information security officer, tweeted on Friday, according to The New York Times.

By Friday, the Justice Department hit back at Apple, filing a motion to compel the company to comply with the original court order and arguing that Apple’s refusal was “based on concern for its business model and public marketing strategy,” according to a report in USA Today.

Lawyers for the Justice Department argued in court documents filed Friday that the order “does not require Apple to hack its own users or decrypt its own phones” and would not provide “hackers and criminals” with access to iPhones.

The company already has turned over information that Syed Rizwan Farook, the San Bernardino shooter, stored on its cloud servers. But based on the FBI’s request, the court ordered Apple to do three more things:

  1. Bypass or disable the feature that automatically erases all data on the iPhone after 10 incorrect attempts to type in the 4- or 6-digit passcode.
  2. Allow the FBI to connect an external computer to the iPhone to automatically run an unlimited number of passcode combinations in an attempt to discover the correct passcode. A 4-digit passcode has 10,000 possible combinations, and using what is called a “brute force” approach, an external computer could run through all 10,000 combinations in about an hour, provided no delays are built into the encryption software (a rough version of this arithmetic is sketched after this list).
  3. Ensure the encryption software does not introduce any delays between passcode attempts beyond what is built into the phone’s hardware, about 80 milliseconds.
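For a rough sense of the arithmetic behind item 2, the sketch below estimates the worst-case time to exhaust every numeric passcode of a given length at a fixed per-attempt delay. It is an illustration only, assuming the roughly 80-millisecond hardware delay described in item 3; the function name and parameters are hypothetical and are not drawn from any actual forensic tool.

```python
# Back-of-the-envelope estimate, not an actual forensic tool: worst-case time
# to try every numeric passcode of a given length at a fixed per-attempt delay.
# The 0.08-second default reflects the ~80 ms hardware delay cited in the order;
# the function name and parameters here are purely illustrative.

def worst_case_hours(digits: int, seconds_per_attempt: float = 0.08) -> float:
    """Worst-case hours to exhaust every passcode of `digits` decimal digits."""
    combinations = 10 ** digits  # e.g. 10,000 combinations for a 4-digit code
    return combinations * seconds_per_attempt / 3600.0

if __name__ == "__main__":
    for n in (4, 6):
        print(f"{n}-digit passcode: {10 ** n:,} combinations, "
              f"roughly {worst_case_hours(n):.1f} hours at 80 ms per attempt")
```

At 80 milliseconds per attempt, exhausting a 4-digit passcode takes well under an hour in the worst case, while a 6-digit passcode stretches to roughly a day, which is why the order also insists that the software add no delays beyond the hardware’s own.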

The FBI is not technically requiring Apple to create software that hacks into an iPhone (what industry insiders call an iPhone backdoor). Rather, the court order requires Apple to disable or bypass a feature that would otherwise destroy evidence in a criminal case.

“First, this is not a case about security,” wrote Gus Hurwitz in a blog at the American Enterprise Institute. “It is not about weakening encryption that is used to keep information from the hands of those not authorized to access it. Rather, it is about a technology that potentially destroys data—that is designed, at least in part, to keep data from those who are legally authorized to access it.”

But many digital privacy advocates claim the FBI’s argument is specious.

“This is functionally a backdoor, one that the court has required Apple to create, to allow the FBI to then open using brute force,” wrote Shahid Buttar at the Electronic Frontier Foundation. “If any black hat hacker, foreign intelligence agency, or criminal syndicate got their hands on this tool, they could exploit it for their own nefarious purposes.”

Although the government has tailored the court order very narrowly and does not believe the security and privacy implications are as dire as many in the tech community claim, the showdown between the FBI and Apple will likely end up in higher courts. The company is expected to invoke the First Amendment’s free speech protections as one of its key legal arguments, while government lawyers have claimed the All Writs Act of 1789 (a federal statute authorizing federal courts to issue orders “necessary and appropriate in the aid of their jurisdictions”) is the legal authority for the court order, according to Reuters.

Public opinion over the controversy remains split. In a poll conducted last week by USA Today, 51 percent sided with the federal government, 41 percent sided with Apple, and the rest were undecided. As a court battle looms, even those at the highest levels of national security admit to being conflicted about the circumstances surrounding this case.

“In this specific case, I’m trending toward the government, but I’ve got to tell you in general I oppose the government’s effort,” Gen. Michael Hayden, former director of the National Security Agency, said in an interview with USA Today, noting he believes FBI Director James Comey ultimately wants a global “backdoor” available to law enforcement.

“Frankly, I think on balance that actually harms American safety and security, even though it might make Jim’s job a bit easier in some specific circumstances,” Hayden said.

– By Michael Cochrane, WNService

