
Industry Reactions to FBI’s Request for iPhone Backdoor: Feedback Friday

A judge has ordered Apple to help the FBI search an iPhone belonging to the man who shot and killed 14 individuals in San Bernardino in December, but the tech giant will not give in without a fight.

The FBI wants Apple to create firmware that would allow the agency to brute-force the PIN set by Syed Rizwan Farook on his work-issued iPhone 5C. Apple CEO Tim Cook says his company has been assisting the FBI in its investigation, but argues that asking the company to create a backdoor to the iPhone goes too far, because there is no guarantee that it will only be used in this one case, or that it will not fall into the wrong hands.
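The order turns on two safeguards built into the phone: a cap on failed passcode attempts, after which the encryption keys are destroyed, and escalating delays between attempts. The following is a minimal conceptual sketch of those mechanics, using illustrative numbers rather than Apple's actual parameters, to show what the requested firmware would effectively switch off:

    # Illustrative model only, NOT Apple's implementation. The 10-attempt cap
    # and the lockout schedule below are assumptions chosen for demonstration.
    import time

    MAX_ATTEMPTS = 10                                    # assumed cap before erase
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}   # assumed lockout, in seconds

    class PasscodeGuard:
        def __init__(self, correct_pin: str, erase_on_limit: bool = True):
            self.correct_pin = correct_pin
            self.erase_on_limit = erase_on_limit
            self.failed = 0
            self.erased = False

        def try_pin(self, pin: str) -> bool:
            if self.erased:
                raise RuntimeError("data erased; key material destroyed")
            if pin == self.correct_pin:
                self.failed = 0
                return True
            self.failed += 1
            if self.erase_on_limit and self.failed >= MAX_ATTEMPTS:
                self.erased = True                       # keys wiped after too many failures
            time.sleep(DELAYS.get(self.failed, 0))       # escalating delay between tries
            return False

    # The court order effectively asks for firmware in which erase_on_limit is
    # False and the delays are skipped, so the PIN can be guessed electronically.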

The news has sparked a debate between those who want tech companies to introduce backdoors into their products to facilitate criminal and national security investigations, and those who want data to be properly protected.

Experts determined that it’s technically possible for Apple to comply with the court order, but many believe the U.S. government is trying to set a precedent and make similar requests in other cases as well.

The company’s customers, privacy advocates, Google, Microsoft, and other tech companies have sided with Apple in this debate, but the FBI also has some supporters, including many politicians and the White House.

Industry professionals contacted by SecurityWeek commented on the story, providing good arguments on both sides.

And the feedback begins…

Mary Ann L. Wymore, Officer with Greensfelder, Hemker & Gale, P.C.:

“Both sides have compelling arguments. As communications technology becomes more sophisticated and allows for greater privacy, it also provides criminals and terrorists with ever-increasing means to grow their syndicates and to plan criminal action clandestinely. But even if one accepts the government’s goals as valid, orders such as that directed to Apple are unlikely to solve the problem. Hundreds of encryption products are available on the global market. Wrongdoers intent on keeping secrets won’t rely on Apple products to protect their secrets. They’ll simply turn to methods and technology outside the reach of the U.S. government.


Overlaying the important privacy and security interests at stake, this case also raises collateral questions regarding the government’s ability to force private companies into the role of a quasi-governmental actor or agent. Apple is in a no-win situation. It well may be perceived by advocates of tighter governmental regulation as more concerned with protecting terrorists, yet simultaneously distrusted by privacy advocates concerned that their private communications will be accessed and misused by others.”

Lance James, Chief Scientist, Flashpoint:

“Apple didn’t need to react this way – it was premature, and it’s apples and oranges. Forensically speaking and legally speaking, the judge asked for reasonable assistance on unlocking THIS SPECIFIC phone. Even if that requires them to modify the firmware with a key they have, they don’t have to give that software to the FBI. They can simply do a few steps: 1) give the phone to Apple; 2) Apple runs their secret sauce and makes a backup image of the data/phone info; 3) they give that backup image to the FBI, which only contains the data, not the key. This is how forensics on mobile devices is done: by a backup image.


There is no threat of mass surveillance here. It was a reasonable search warrant request, no different than a warrant to the free webmail services or Facebook asking for data. You’re not giving them your keys to ALL your data; you’re only giving them the very specific data of the account that was requested.


All companies have a way to modify their own devices and software – it’s like car companies having spare keys for individual cars… they exist. They don’t have to provide a back door to the FBI – they can provide a subkey, individual key, or Apple can take the device and unlock it and give them the data they requested.


This argument is not the proper argument. If they want to say it’s about privacy, Apple shouldn’t have the backdoor either. It would have been better for Tim Cook and Apple to say: We simply just don’t have a way to do that…”
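The hand-over James describes, in which the vendor keeps its unlocking capability in-house and releases only a data image to investigators, can be sketched roughly as follows. Every name here is hypothetical; unlock_with_vendor_key stands in for whatever internal process the vendor might use and is not a real API:

    # Rough, hypothetical sketch of an image-only evidence hand-over. The key
    # (or whatever unlocking mechanism exists) never leaves the vendor.
    import hashlib
    from dataclasses import dataclass

    @dataclass
    class EvidenceImage:
        device_serial: str
        data: bytes          # decrypted user data only
        sha256: str          # integrity hash for chain of custody

    def produce_evidence_image(device_serial: str, raw_flash: bytes,
                               unlock_with_vendor_key) -> EvidenceImage:
        """Runs entirely inside the vendor's facility; only the result leaves."""
        decrypted = unlock_with_vendor_key(raw_flash)    # unlocking step stays internal
        return EvidenceImage(
            device_serial=device_serial,
            data=decrypted,
            sha256=hashlib.sha256(decrypted).hexdigest(),
        )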

Chris Eng, VP of Research, Veracode:

“The issue here is not one of creating a backdoor; nor is the FBI asking Apple to decrypt the data on the phone. They’re asking for a software update (which could be designed to work only on that one particular phone) that would then allow the FBI to attempt to crack the passcode and decrypt the data. Such a solution would be useless if applied to any other phone.


In the past Apple has complied with requests to, for example, bypass lock screens in aid of criminal investigations. It’s only in recent years that they’ve taken an ideological stance on consumer privacy. I believe Apple is taking this position less as a moral high ground and more as a competitive differentiator, betting that Google won’t do the same.


The broader discussion around whether generic backdoors should be provided by technology providers to law enforcement is completely different, and the continued backlash against this is fully warranted. There is no way to do this safely without endangering users. Put a different way, if a backdoor exists for law enforcement, it also exists for criminals. However, this isn’t what’s being proposed in the Apple case right now and it’s important to make the distinction.”

Danelle Au, VP of Marketing, SafeBreach:

“Apple is calling the FBI’s request to unlock the San Bernardino iPhone a backdoor and a master key to data on any phone. It’s not. It’s not a request for encryption keys. It’s new firmware that can enable more than ten PIN attempts without erasing data on the phone, and without waiting for delays between attempts. As long as this firmware is restricted to running only on the San Bernardino phone, it can’t be used for brute-force attempts on other phones. However, the bigger implication here is the government’s desire to get to encrypted data on a phone. This is one workaround that gets them closer. It also sets a precedent for software companies everywhere that the FBI can dictate (or remove) features on their products when it suits them.”
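Au's caveat about restricting the firmware to a single handset can be pictured as a simple device check. The identifier and function below are hypothetical; in practice any such restriction would be enforced by Apple's code signing rather than a runtime comparison:

    # Hypothetical illustration of device-bound behavior; not Apple's design.
    AUTHORIZED_DEVICE_ID = "EXAMPLE-DEVICE-ID-0001"   # placeholder hardware identifier

    def configure_passcode_limits(current_device_id: str) -> dict:
        if current_device_id != AUTHORIZED_DEVICE_ID:
            # On every other phone, the normal safeguards stay in force.
            return {"max_attempts": 10, "erase_on_limit": True, "delays": True}
        # Only on the authorized phone are the safeguards relaxed.
        return {"max_attempts": None, "erase_on_limit": False, "delays": False}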

Eve Maler, VP Innovation & Emerging Technology, ForgeRock:

“In my mind, in this case the request was reasonable since Farook was clearly guilty and a warrant was issued for the information. However, it’s a testament to the strength of the technical protections that the process requires cycling through however many combinations are needed to unlock the phone, depending on Farook’s PIN code — it could have been numeric or alphanumeric, which could complicate things tremendously. If Farook’s employers mandated a mobile security profile on the phone, which they likely did since they issued the phone, it should be possible to figure out how many combinations are possible.


However, Apple has a business model (selling hardware and services in a vertically integrated fashion) and a dominant market position that help it use more transparent and stronger measures in the face of what may often be unreasonable government requests. A backdoor iOS is beyond the pale, and good for Apple. And lucky for the rest of us they have the right business model to incentivize that answer. This was a reasonable request from the government but at the end of the day Apple has the authority and right to also say ‘no’.


The refusal is due to unwillingness to create an OS with backdoors: This shows that the conversation about government encroachment on encryption soundness is reaching a fever pitch. Building in encryption backdoors is a foolish approach that doesn’t have the effectiveness nor the beneficial societal effects promised.”
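Maler's point about cycling through combinations is easy to quantify: the keyspace, and the time needed to exhaust it, differs enormously between a short numeric PIN and an alphanumeric passcode. The guessing rate below is an assumption chosen for illustration, not a measured figure for this hardware:

    # Back-of-the-envelope keyspace comparison; the rate is an assumption.
    ATTEMPTS_PER_SECOND = 12.5   # assumed electronic guessing rate

    def keyspace(alphabet_size: int, length: int) -> int:
        return alphabet_size ** length

    for label, alphabet, length in [
        ("4-digit numeric PIN", 10, 4),
        ("6-digit numeric PIN", 10, 6),
        ("6-char alphanumeric", 62, 6),
    ]:
        total = keyspace(alphabet, length)
        worst_case_days = total / ATTEMPTS_PER_SECOND / 86400
        print(f"{label}: {total:,} combinations, "
              f"~{worst_case_days:,.2f} days to exhaust")

Under these assumptions a 4-digit PIN falls in minutes, a 6-digit PIN in under a day, and a 6-character alphanumeric passcode would take on the order of decades, which is the complication Maler alludes to.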

James L. Bindseil, CEO, Globalscape:

“Apple’s dilemma is unenviable: balancing the desire to see justice served in the San Bernardino shooter case with the desire to avoid setting a dangerous precedent by creating, at the government’s behest, a backdoor that would undermine the security of its product and overall trust in the Apple brand. If Apple gives in, not only will it become harder for smaller companies to stand up to similar pressure, but it will serve as an example and possibly a new standard for the FBI.


Now that the question has been asked, we’re seeing a very important public debate unfold that cuts to the heart of how much we as a society value our privacy. Whether it is Apple or Globalscape or any of thousands of companies working to create products that enable individuals and organizations to securely share sensitive, private information, there is responsibility to deliver on the promise of trust. Asking a private company to create a back door that, regardless of what the FBI says to the contrary, could then be exploited by others would be a serious violation of the public trust.”

Bill Blake, President, Fasoo:

“As a company whose primary business is the encryption and protection of confidential data, we support Apple’s position to resist the government request to provide a back door to encrypted data on the iPhone. Should Apple be forced to comply with the government request, there is no guarantee that the back door won’t be used in any circumstance that the government deems necessary. All you need is another Edward Snowden in the right place to expose the code necessary to unlock data, and our entire way of life will be at risk.


If the government invokes the All Writs Act of 1789 to justify an expansion of its authority, we will all eventually be at risk of exposing our personal information and communications. Unfortunately, I believe that is where this is heading.”

Johannes Lintzen, VP of Business Development, Utimaco:

“It’s important to realize that the Apple vs. FBI controversy is partly fueled by commercial interests. The issue lies in the fact that if Apple is forced to add a backdoor to its product, a competitor will step up and provide a similar product but without the backdoor. Cryptography isn’t a secret science, and if I add a backdoor to anything, even if supposedly only law enforcement has the key, I have still added a new vulnerability to my product and made it less secure.


This issue also highlights an interesting divide between the United States and the European Union regarding who protects privacy and how. In Europe, this role is in the realm of the government, while in the U.S., it’s the technology companies that take on this task, although mostly because they benefit the most from collecting consumer data and therefore have a commercial interest in protecting their customers’ privacy.”

Michael Harris, CMO, Guidance Software:

“We support Apple CEO Tim Cook’s position to oppose the FBI order demanding that Apple create a backdoor for iPhones in order to assist with the investigation of the San Bernardino shooter case. Apple should act in the best interests of the company and its shareholders. We fully support the need for the FBI and other law enforcement agencies to discover digital evidence in criminal investigations, but we believe this problem should be solved by and between the agency of investigation and forensic security experts.


Having a back door to the iPhone operating system may make evidence collection easier, but it is not the only way to discover or access deep forensic evidence in most cases. As long as the use of encryption technology is a legal way of protecting user data privacy, the burden is on forensic security experts to innovate new ways to unlock evidence.”

Mike Davis, CTO, CounterTack:

“If Apple does decide to help, I do agree it creates a horrible precedent, and there is little way to ensure that the changes don’t get out. However, I feel it is important to realize it is NOT impossible to change the failed-attempt behavior the government is asking about. The settings, such as the number of passcode attempts before erase, are configurable via policy. For example, tools such as MobileIron and AirWatch allow companies to increase this limit, disable it, or disable the wiping altogether. If you use your iPhone for work, your company may choose to modify these settings to help the government even though Apple doesn’t want to make the default weak.


What Apple is doing is courageous and something I am glad is happening. While I cannot comment on the legal nature of this, the discussions I am having with friends and family who are not security or even tech people are great. Questions that require explaining how the encryption really works and where the weaknesses really are (like the fact that if you get the machine paired to the phone, you don’t need the PIN to get the data) help raise the bar globally. This awareness is beneficial to all privacy advocates and detrimental to the government long term.


The more we talk about this, I believe the more people will enable a PIN on their phone and leverage the capabilities provided by Apple. To me, this is a great win regardless of whether Apple decides to help or not.”
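Davis's point about managed devices refers to the passcode policies that MDM products push down as Apple configuration profiles. The sketch below uses Python's standard plistlib to emit such a profile; the payload keys are modeled on Apple's Passcode payload and the identifiers are placeholders, so treat the exact names as assumptions to verify against Apple's configuration profile documentation:

    # Hedged sketch of an MDM-style passcode policy profile, built with the
    # standard-library plistlib. Key names and identifiers are assumptions.
    import plistlib
    import uuid

    passcode_payload = {
        "PayloadType": "com.apple.mobiledevice.passwordpolicy",
        "PayloadVersion": 1,
        "PayloadIdentifier": "com.example.passcode-policy",   # hypothetical
        "PayloadUUID": str(uuid.uuid4()),
        "PayloadDisplayName": "Corporate Passcode Policy",
        "forcePIN": True,
        "maxFailedAttempts": 10,   # an admin could raise this, as Davis notes
    }

    profile = {
        "PayloadType": "Configuration",
        "PayloadVersion": 1,
        "PayloadIdentifier": "com.example.profile",           # hypothetical
        "PayloadUUID": str(uuid.uuid4()),
        "PayloadDisplayName": "Example Device Policy",
        "PayloadContent": [passcode_payload],
    }

    print(plistlib.dumps(profile).decode("utf-8"))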

Stephen Cobb, Senior Security Researcher, ESET:

“Apple’s position, as articulated in CEO Tim Cook’s Message to Our Customers, is widely supported by privacy advocates and cybersecurity professionals, many of whom agree that complying with this particular FBI request would set a damaging precedent.


Indeed, it has been suggested that the FBI is looking for a precedent, a legal ruling that would apply well beyond this single and very emotional case. However, if Apple were to comply with this court order, the ramifications for US companies and consumers would be significant, from undermining international commerce to eroding trust in the technology on which so much of daily life and business in North America depends.


For example, the EU and the US are currently engaged in very delicate negotiations to preserve the free flow of data across the Atlantic. Past arrangements, known as Safe Harbor, came undone because of the apparent inability of US companies to control government access to personal data. US negotiators have been assuring the Europeans that there has been significant reform of government surveillance, but those assurances would be greatly weakened if Apple were to comply with this court order, potentially placing the new Privacy Shield agreement in doubt even before it has been implemented.”

Philip Lieberman, president, Lieberman Software:

“It is well known that both the phone carriers and manufacturers of locked cell phones maintain their own set of keys within their publicly declared ‘walled gardens’ to the devices they sell. This barrier to competition and their ability to select winners and losers in their app store, as well as patch and improve their operating system at any time, is also the back door they have to get into any phone they wish, and do as they wish at any time, irrespective of a customer’s wish to maintain privacy or security. It will be interesting to see how all parties respond to a Federal order to comply with a lawful order designed to counter terrorism.”

Igor Baikalov, chief scientist, Securonix:

“It’s a really sensitive topic, although most of the sensationalism around it comes from attempts to institute a generic approach to a very diverse set of circumstances. The San Bernardino case is a no-brainer, but when one considers the long line of inquiries lined up after that one, claiming similar urgency plus preventive potential, but not having the benefit of hindsight, the question becomes where to draw the line and who is the one to draw it. Since the technology vendors seem to be the ones in the cross-hairs of both customers and law enforcement, it’s only fair to leave the determination to them. Give them the ability to balance their privacy policies with legal pressure, because the success and often the survival of their business is at stake here, whether they want it or not.”

Bryan Glancey, Chief Technology Officer, OptioLabs:

“Since we are talking about an existing phone that already has software on it which we do not have the password for, Apple does not have the capability to break the in-place protections. Apple phones are evaluated as in compliance with the Common Criteria Mobile Device Fundamentals Protection Profile – which means that this phone has been independently validated to operate in a manner that would protect the information on it from disclosure.


The device in question is a corporate owned device, so there are no issues of privacy involved in this particular request. The deficiency in the current implementation for this device is that the information on a corporate owned device is outside of the control of the owner of that device. Proper security controls should have been put in place prior to the use of this device by an organizational employee.”

Written By

Eduard Kovacs (@EduardKovacs) is a managing editor at SecurityWeek. He worked as a high school IT teacher for two years before starting a career in journalism as Softpedia’s security news reporter. Eduard holds a bachelor’s degree in industrial informatics and a master’s degree in computer techniques applied in electrical engineering.
