The February 16, 2016 order issued by Magistrate Judge Pym gave Apple five days to apply for relief if it believed the order was "unreasonably burdensome". Apple announced its intent to oppose the order, citing the security risks that the creation of a backdoor would pose to its customers.[31] It also stated that no government had ever asked for similar access.[32] The company was given until February 26 to fully respond to the court order.[33][34]
On April 7, 2016, FBI Director James Comey said that the tool used could only unlock an iPhone 5C like the one used by the San Bernardino shooter, as well as older iPhone models lacking the Touch ID sensor. Comey also confirmed that the tool had been purchased from a third party but would not reveal the source,[60] later indicating that it cost more than $1.3 million and that the FBI did not purchase the rights to technical details about how it functions.[61] Although the FBI was ultimately able to access the data on the shooter's iPhone 5C through other technological means, without Apple's assistance, law enforcement continues to express concern over the encryption controversy.[62]
Why Apple is right to oppose an iPhone backdoor for law enforcement
"With Apple's privacy policy for the customers there is no way of getting into a phone without a person's master password. With this policy there will be no backdoor access on the phone for the law enforcement to access the person's private information. This has caused a great dispute between the FBI and Apple's encryption.[62] Apple has closed this backdoor for the law enforcement because they believe that by creating this backdoor it would make it easier for law enforcement, and also make it easier for criminal hackers to gain access to people's personal data on their phone." Former FBI director James Comey says that "We are drifting to a place in this country where there will be zones that are beyond the reach of the law."[62] He believes that this backdoor access is crucial to investigations, and without it many criminals will not be convicted.[62]
Tuesday's court order compelling Apple to hack the iPhone belonging to a gunman who killed 14 people and injured 22 others has ignited an acrimonious debate. CEO Tim Cook called the order "chilling" because, he said, it requires company engineers to create the equivalent of a backdoor that could be used against any iPhone. Law enforcement officials, meanwhile, contend the order is narrowly tailored to ensure only the shooter's phone is covered.
But as the order is drafted now, there are no guarantees that government officials won't get access to the software. That means it's also feasible that any software Apple produces would be reverse-engineered by government engineers, and very possibly by private forensics experts who regularly work with law enforcement agencies. And if past digital rights management bypasses are any guide, odds are that with enough analysis, someone will figure out a way to remove the restriction that the OS install itself only on Farook's phone. From there, anyone with access to the custom iOS version would have an Apple-developed exploit that undoes years of work the company has put into securing its flagship iPhone product.
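To make the fragility of that restriction concrete, here is a purely illustrative sketch in Swift of the kind of device-binding check the order contemplates. Every name and value here is invented; Apple's actual boot and update chain is far more involved and not public. The point is only that a "one phone only" guarantee ultimately reduces to a comparison in code, and anyone who can patch the signed binary can make that comparison pass.

```swift
// Hypothetical device-binding gate. Identifiers and values are made up
// for illustration; this is not Apple's real installation logic.

// The unique chip identifier (ECID) the custom build is supposedly tied to.
let authorizedECID = "0x00001E2D3C4B5A"

// Stand-in for reading the running device's ECID from hardware.
func currentDeviceECID() -> String {
    return "0x00001E2D3C4B5A"
}

// The entire "installs on only one phone" restriction reduces to this check.
// A reverse engineer who can modify the binary simply makes it return true.
func mayInstallCustomBuild() -> Bool {
    return currentDeviceECID() == authorizedECID
}

print(mayInstallCustomBuild() ? "install permitted" : "install refused")
```

This is the crux of the reverse-engineering worry described above: the restriction lives in software, so it is only as durable as the attacker's inability to modify that software.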
It's worth noting, though, that Google's relationship with Android is very different from Apple's with the iPhone. Apple retains direct control over the iPhone's hardware and software, while Google's role is more like a guiding rudder. Google itself might oppose the US government's demands for a backdoor in Android, but that probably wouldn't stop the government from going after mobile carriers or OEMs, both of which could also be compelled to insert one.
The Wall Street Journal has confirmed that there are actually 12 other iPhones the FBI wants to access in cases that have nothing to do with terrorism.
A senior law enforcement official, speaking at a DOJ briefing March 10, accused Apple of creating a diversion by saying the case is not about a single iPhone and trying to alarm the court with issues of network security, encryption, backdoors and privacy, invoking larger debates before Congress and in the news media (see Apple, FBI Battle Before House Judiciary Committee).
Even if it were a simpler process and tech companies could build a backdoor for presumably the right reasons, two enduring questions remain: who gets to decide who the good guys are, and under what legal, ethical, and moral circumstances should investigators be issued the key?
Undoubtedly, criminals and terrorists abuse encryption platforms to stymie law enforcement efforts. Yet these same tools are critical and fundamental to protecting classified materials from unauthorized disclosure. DOJ claims that allowing exceptional access to these protected systems would not undermine their security, implicitly arguing that such access will not degrade overall national security or personal and commercial privacy. These arguments, however, fail to account for existing lessons about the dangers of such access, and they overstate the importance of backdoors while distracting from more basic investment in building capabilities to combat crime. Encryption backdoors cannot be the shortcut we pursue to tackle internet-enabled crime: the risk to national security is far too great and the upside decidedly questionable. Instead, Congress should ensure that federal agencies bolster public-private partnerships and receive the resources to pursue criminals, and it should hold the Executive Branch accountable for implementing existing laws rather than seeking a shortcut that is inconsistent with national security and civil liberty principles.
For years, the US government begged Apple executives to create a backdoor for law enforcement. Apple publicly resisted, arguing that any such move for law enforcement would quickly become a backdoor for cyberthieves and cyberterrorists.
Personally, I think that Apple should not add a backdoor to its phones; it should keep its products the way they have always been. In tough times like terrorist attacks, having that backdoor into phones could be vital to finding a terrorist's motives, but in some cases the FBI doesn't even get the information it needs by hacking its way through the door. As big a company as Apple is, I think that changing the makeup of its products to include a "backdoor" could really hurt its business. If people see that their privacy is suddenly threatened because of a new backdoor in iPhones, that could really hurt Apple as a company. Privacy is becoming very hard to come by as time moves on. Therefore, I think it is a mindful marketing decision for Apple to keep the backdoor closed.
In my opinion, it would not be wise for Apple to open a backdoor into its technology. By refusing, Apple is protecting people's right to personal privacy and creating a sense of security for all its customers. On the other hand, I understand why some people would want Apple to change its technology: it would be very useful to the government in the case of a terrorist attack or an upcoming one. But doing so would take away people's right to personal privacy and possibly even endanger their personal well-being. Therefore, I believe that Apple made the right decision not to open the backdoor to its technology.
After reading Dr. Hagenbuch's post and the responses above, I agree that Apple should not make alterations to its phones and products that would allow for easier entry. While it might benefit the FBI and the government in some ways, I fail to see how it would reduce crime on a substantial level. In fact, after reading this post, it seems as though 'opening a backdoor' would do just the opposite: if Apple were to make these changes, hacking individuals' private information might become easier. At a time when hackers cost the nation billions of dollars annually and people are more concerned with online privacy than ever, it seems only right that Apple is taking steps to address both of these concerns.
As the comments above have said, I agree that Apple should maintain the privacy and security of its phones by keeping the "backdoor" closed. It seems to me that the government can get the information it is looking for on an iPhone from other sources, such as cell phone companies and web services. If Apple were to allow this backdoor to be opened, it could also be exploited by hackers or by corrupt or former government officials. Data privacy is one of the top issues in technology right now, and proving that it keeps data secure allows Apple to sustain brand loyalty and keep customers happy by doing what is right.
Do I think that Apple should change all its privacy policies so the U.S. government can keep a bit more of a handle on crooks? No, I don't. If there were a way for Apple to do that without threatening the rights and privacy of all the non-crooks, I'd be all for it. This is similar to the gun-control argument that is everywhere in the U.S.: do we get rid of all guns so a few fewer people might get killed by them, or do we risk being helpless in the face of governmental tyranny? The two issues are a little different, granted, but the idea is the same: do we throw away our own personal safety and rights in an attempt to stop a select few bad guys? I don't believe that would be a good solution in either circumstance, so no, I don't think Apple should create a "backdoor".
I believe that Apple is doing the right thing by not allowing backdoor access to its phones. This is an example of mindful marketing, especially because it upholds the value of privacy for cellphone users. That said, in a situation where investigators are trying to find out who terrorists were communicating with, Apple could have unlocked the phone as a matter of security for others; I think that, postmortem, it is acceptable to access a terrorist's phone. But I also understand that if Apple gives in to this scenario, the FBI and other agencies will push the line even further, so I understand Apple wanting to nip this in the bud.