By Klarize Medenilla
Apple made headlines this month when the multinational technology company refused to comply with the FBI’s request to gain access to the iPhone of Syed Rizwan Farook, one of the shooters in the San Bernardino attack that killed 14 people on Dec. 2, 2015.
Specifically, the FBI has requested that the tech giant create a new version of its operating system that bypasses several iPhone encryption features and install it on this one iPhone.
FBI Director James Comey maintained that the issue is over a single device. But Apple CEO Tim Cook said in a customer letter that if the company creates this backdoor to this singular iPhone then that technique can and will be used to access other phones, reducing the security of all iPhone devices.
Following the controversy, Pew Research Center conducted a public opinion survey on the issue and found that 51 percent of Americans think Apple should assist the FBI in unlocking Farook’s phone, which may contain critical information linked to other terrorists and terrorist attacks.
Apple supporters generally agree with Cook’s statement that if the FBI is granted access to Farook’s iPhone by way of a new backdoor, the same technique could then be used to access other phones.
The core of this issue is the broad matter of encryption, a way to safeguard privacy between distant communicators by way of mathematical operations, something essential in the age of the Internet, where “anything can be seen by anyone,” according to Max Wolotsky, a fourth-year computer science student at Cal Poly Pomona.
Generally, there are two ways to bypass encryption: cracking the code from a distance over the network, or getting hold of the physical phone and trying to decrypt it by taking it apart, according to Mohammad Husain, an assistant professor in CPP’s Department of Computer Science.
Apple’s dedication to user privacy is reflected in its encryption structure and vision. With each new version of the iPhone, the company promises complete data privacy by way of customized passcodes.
“At one point in the manufacturing, Apple had complete control of your iPhone, so if they wanted to they could have kept a backdoor or another special matter to read your data without your knowledge,” said Husain. “Now the claim is that the data on your iPhone is hidden by your password, so even Apple can’t do anything about it.”
Some believe that the very existence of this process of creating a backdoor, even for this one phone, would likely threaten the privacy of all iPhone users.
“I don’t like the idea of it,” said Sean Corlin, a first-year computer science graduate student (’15, computer science). “And not even because of the fact that it will tell people that it is possible, but even if it’s at Apple, it means there’s someone at Apple who has the ability to do that. That person, at any point, can quit Apple and take it on a flash drive. Who knows where it will go next? The fact it exists is in itself a problem, regardless if they gave it to the FBI.”
Fourth-year English education student Kenny Cameron believes that Apple is doing the right thing by strengthening its encryption so that only the user can access his or her iPhone, further ensuring users’ privacy.
“Privacy is valuable, and it should be valued more, especially by the government,” said Cameron. “I don’t think it’s the government’s job to want a backdoor into ordinary people’s phones. Obviously they’re looking into terrorists, but when this backdoor exists, more than the government will just be allowed to get into it.”
In terms of bypassing security measures to seize terrorists, this situation, which specifically pertains to a highly encrypted smartphone device, is unique and unheard of, according to Marc Scarcelli, assistant professor in the Department of Political Science.
He referenced a common metaphor: an individual owns a personal safe that neither the safe company nor law enforcement can physically open, even with a court warrant. Should there be a way to circumvent the security features of this particular safe if that method could also potentially open other safes?
“It’s unprecedented, and we’ve never had a situation before in which you really could secure personal effects in a way in which absolutely no one could access even with the appropriate court order,” said Scarcelli.
Scarcelli compared the FBI’s request to an instance after the 9/11 attacks when the U.S. National Security Agency secretly paid network security company RSA Security $10 million to institute a new cryptography system that had a backdoor and allowed the NSA to decode the encryption.
That cryptography system is now discredited, but until the Edward Snowden leaks in 2013, this system was the default for a “wide range of Internet and computer security programs,” according to a Reuters article.
Scarcelli believes that Apple should comply with the request as long as the FBI maintains its stance that it applies to only one device. Investigating, in this case, what might be linked to other terrorists and terrorist attacks is crucial, said Scarcelli. He does, however, agree with Apple’s stance that this has the potential to “worsen security for everyone.”
“It still comes back to that technical aspect: if it were true that Apple could just unlock this one phone and it doesn’t affect anyone else, then I say they should do it,” said Scarcelli. “But if this is going to reduce security for everybody, as it seems to be, then it is not acceptable to say that we should all have less security just so law enforcement can get access to a small number of phones.”
Jean-Paul Escobar / The Poly Post
Apple vs. FBI