Backdoors don’t just let law enforcement in—they open the door to attackers, insider threats, and broken trust.
When government demands something, ‘No’ is not an acceptable response. Government simply waits, rephrases the demand, and then demands again.
The debate over law enforcement access to encrypted content is not new – it has been almost continuous since the 1970s. We hear much about the views of government (in favor), vendors (opposed), and civil liberties groups (total rejection of the idea). But we hear little of the views of the security professionals who are tasked with navigating regulations and maintaining the security of IP, PII, and business continuity.
Historical setting
The growth of encryption in the 1970s led to government concern that it would give adversary nations an advantage with impenetrable communication. The US government responded by classifying encryption as a munition and applying export controls.
At the same time, the global internet was making technology itself global. After Phil Zimmermann released PGP as freely available on the internet, he was investigated for three years. The government contended that in allowing the code to spread internationally (via the internet), Zimmermann had breached US export controls. The investigation was dropped in 1996 after MIT Press published the source code as a book, which could bypass export controls under the banner of free speech.
Meanwhile, the government had already switched its focus from international to intranational encryption with its 1993 proposal for the Clipper Chip. The chip could be accessed by a key held in escrow but retrievable via a warrant. Vendors objected, experts objected (Matt Blaze discovered vulnerabilities in the Clipper Chip design), and the public objected to government having access to private communications. The Clipper Chip was abandoned. The government floated different key escrow ideas, but these also failed for similar security and privacy reasons.
By the end of the 1990s it appeared as if the Crypto War had been won by civil society. But what really happened was that the government itself went dark, building a widespread technological surveillance system largely operated by the NSA with assistance from GCHQ. This was exposed by Edward Snowden in 2013. On the user side it created a desire for ever stronger encryption (driving the demand for end-to-end encryption – E2EE), while on the government side it kicked off a move toward legal enforcement rather than mutual agreement.
The difference between legal enforcement and mutual agreement was highlighted in 2016. Firstly, the FBI asked Apple to create a special OS that could be loaded into the temporary memory of a security-protected iPhone, disabling the security and allowing an electronic brute force against the passcode. Apple declined; had it agreed, this would have been an example of mutual agreement.
Secondly, in the same year, the UK introduced its Investigatory Powers Act (IPA, popularly known as the ‘Snoopers’ Charter’). This act includes a non-disclosure requirement: anyone served a warrant to give up an encryption key is forbidden (under threat of imprisonment) from announcing that they have received the warrant. We do not know how often the Act has been used in earnest, because the targets are not allowed to tell us.
However, it is almost certain that it was used against Apple. In February 2025, Apple ceased offering Advanced Data Protection – which encrypts data between users’ devices and iCloud – within the UK. The strongest proof that this was in response to a government demand for keys under the IPA is that Apple has never explained why it did so.
The IPA would not be effective against E2EE. A fundamental principle of E2EE is that the provider has no access to the decryption keys and cannot give them to law enforcement. So once again, law enforcement has shifted its focus, now demanding a backdoor into E2EE itself.
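The principle that the provider never holds key material can be sketched with a toy Diffie–Hellman exchange. This is purely illustrative: the prime, generator, and XOR cipher below are deliberately simplistic stand-ins, not the authenticated protocols (such as the Signal protocol) that real E2EE apps use.

```python
import hashlib

# Toy parameters: 2^127 - 1 is prime but far too small for real use.
P = 2**127 - 1
G = 3

# Each endpoint generates a private value; only the public halves ever
# transit the provider's server. Fixed values keep the sketch deterministic.
alice_priv, bob_priv = 123456789, 987654321
alice_pub = pow(G, alice_priv, P)   # what the server sees from Alice
bob_pub = pow(G, bob_priv, P)       # what the server sees from Bob

# Each side combines its private value with the other's public value.
# Both arrive at G^(alice_priv * bob_priv) mod P; the server, holding only
# the public values, cannot compute it without solving a discrete log.
shared_alice = pow(bob_pub, alice_priv, P)
shared_bob = pow(alice_pub, bob_priv, P)
assert shared_alice == shared_bob

key = hashlib.sha256(str(shared_alice).encode()).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR keystream -- stands in for a real AEAD cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

ciphertext = xor_cipher(b"meet at noon", key)
plaintext = xor_cipher(ciphertext, key)
```

Because the server relays only `alice_pub` and `bob_pub`, a warrant served on the provider yields nothing it can decrypt – which is exactly why law enforcement attention has shifted from demanding keys to demanding backdoors.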
Government has decided it needs access to encrypted communications. After decades of trying to achieve this, it has failed. But it won’t give up. The Crypto War remains ongoing.
Practitioners’ views on E2EE backdoors
Boris Cipot, senior security engineer at Black Duck, calls E2EE backdoors a ‘gray area’. On the one hand, “What if such access could deliver the means to stop crime, aid public safety and stop child exploitation?” But on the other hand, “The idea of someone being able to look into all private conversations, all the data connected to an individual, feels exposing and vulnerable in unimaginable ways.”
As a security practitioner he has both moral and practical concerns. “Even if lawful access isn’t the same as mass surveillance, it would be difficult to distinguish between ‘good’ and ‘bad’ users without analyzing them all.” Morally, it is a reversal of the presumption of innocence and means no-one can have any guaranteed privacy.
Professionally he says, “Once the encryption can be broken, once there is a backdoor allowing someone to access data, trust in that vendor will lessen due to the threat to security and privacy introducing another attack vector into the equation.”
It is this latter point that is the focus for most security practitioners. “From a practitioner’s standpoint,” says Rob T Lee, chief of research at SANS Institute and founder at Harbingers, “we’ve seen time and again that once a vulnerability exists, it doesn’t stay in the hands of the ‘good guys’ for long. It becomes a target. And once it’s exploited, the damage isn’t theoretical. It affects real people, real businesses, and critical infrastructure.”

J Stephen Kowski, field CTO at SlashNext, agrees. “Security by obscurity just doesn’t hold up – history shows that any secret backdoor, no matter how well hidden, eventually gets found and abused, whether by hackers or insiders. Once a backdoor exists, it’s not just the ‘good guys’ who can use it; attackers can too, putting everyone’s private messages and sensitive business data at risk.”
The insider risk is often ignored in this debate but is real. Audian Paxson, principal technical strategist at Ironscales, explains, “Privileged access lands in the hands of employees, contractors, even officials. And they don’t need bad intent to cause harm… sometimes curiosity or carelessness is enough.”
There’s another difficulty. What about privacy laws? If hackers access and steal PII via law enforcement’s backdoor, who is to blame? The company for inadequate encryption, or law enforcement for breaking that encryption? Backdoors introduce potential complexities for security leaders navigating the sometimes conflicting demands of legal regulations, corporate expectations, user misuse, and adversarial attacks from both criminal and nation state attackers.
The backdoor debate, continues Paxson, “becomes a real-world operational problem for the security teams… they’re the ones who inherit the mess. Every exception becomes something they have to monitor, defend, and justify – trading security maturity for compliance theater. Once trust breaks, security teams are the ones left cleaning it up.”
He goes further. It won’t work – he calls encryption backdoors an exercise in chasing shadows. “I’ve seen how fast criminals adapt. When one platform gets compromised (or monitored), they move to decentralized apps or niche encrypted tools. Adding backdoors to mainstream platforms won’t stop them; it just leaves businesses and everyday users using them exposed.”

Jason Soroko, senior fellow at Sectigo, summarizes the practitioners’ consensus. “History shows that secret access points never stay secret. Once a backdoor exists it becomes a target for sophisticated adversaries, from criminal gangs to nation‑state actors. The complexity of modern software means unintended flaws will accompany any intentional bypass, creating opportunities for data breaches, espionage, and corporate sabotage. No technical mechanism can guarantee a backdoor remains in the hands of only those deemed lawful.”
Soroko’s preference would be for the FBI to focus on traditional policing to disrupt criminals and terrorists. “Modern policing relies on court‑issued warrants, digital forensics, and human intelligence to disrupt criminal networks. Those tools respect privacy while allowing focused intervention. Expanding police powers by embedding backdoors trades short‑term gains for enduring insecurity. Good old fashioned investigative work guided by legal oversight offers a sustainable path to justice without sacrificing the integrity of our most critical communications.”
An alternative proposal
Cipot sees no obvious solution. Access for law enforcement agencies may help the fight against terrorism and organized criminality but will definitely introduce another attack vector for defenders to navigate.
“Ultimately,” he says, “this comes down to finding a solution to have both – provide public safety and retain public privacy.” He calls the problem a Gordian knot with no sword. “We need a better, more technically fit solution that will watch out for moral usage and provide information needed to protect and secure.”
Ilia Kolochenko, CEO and founder at ImmuniWeb, floats just such an idea – difficult, but technically feasible. It is grounded in his PhD thesis in Computer Science at Capitol Technology University – Framework Proposal to Regulate Lawful Hacking by Police Within Criminal Investigations – and expanded on in conversation with SecurityWeek. The basic idea is that rather than reduce security for everyone, guilty and innocent alike, it would be more efficient to block access to E2EE for all convicted criminals.

Kolochenko wrote, “The encryption criminalization approach addresses the bad-faith use of encryption to further a criminal conduct or to deliberately hinder investigations by law enforcement agencies.” The UK’s alleged use of the IPA against Apple is an example of such a criminalization process, but it cannot be used for government access to E2EE.
Here, Kolochenko floats the possibility of blocking criminal access to E2EE services rather than breaking E2EE security for everyone. He believes it is possible and scalable, and he offers gun licensing as an analogy. “Guns are legal in most countries. But access to gun purchase or ownership is restricted by licensing. Untrusted persons cannot get a license to own a gun,” he explains.
Variations on this process exist in most countries – indeed, the details vary almost state-by-state in the US alone. But the principle is clear: the provider of E2EE services should verify the customer is not excluded (by criminal record or judicial warrant) from accessing E2EE services.
Here lies the first problem: vendors will protest, insisting that it would adversely affect their business. So, it would need to be enforced by national legislation, even though it’s not altogether clear that this would damage business. Legal action against Apple, and Apple’s subsequent removal of ADP from the UK, is unlikely to damage iPhone sales in the UK (although that remains to be seen). However, refusal to sell iPhones to individuals included on the exclusion list would inevitably reduce sales fractionally.
But it would protect the privacy of the vast majority of ‘innocent’ users by eliminating law enforcement’s argument for demanding an encryption backdoor: if there are no criminals using E2EE, there is no need for a backdoor to access non-existent criminal communications. The same principle would apply to all providers of E2EE services, such as Telegram and Signal.
Further problems remain, including scalability and the cost of that scalability. Technology can solve the scalability issue – fundamentally it just requires a very secure database of excluded persons. This can come from existing criminal records databases and be supplemented with names where law enforcement can persuade an independent judicial office that this person, who has no criminal record, is nevertheless a terrorist.
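The provider-side check such a scheme implies could be very simple. The sketch below is a hypothetical illustration only – the salted-hash scheme, identifier format, and function names are all assumptions, not anything Kolochenko or any regulator has specified.

```python
import hashlib

# Hypothetical exclusion-list check a provider might run before enabling
# E2EE for a customer. Storing salted hashes rather than raw identifiers
# means the list itself leaks little if stolen. All names and the salt
# are illustrative assumptions, not a proposed standard.
SALT = b"jurisdiction-specific-salt"

def hashed(identifier: str) -> str:
    # One-way, salted digest of a customer identifier.
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()

# Populated from criminal-records feeds, plus names added under an
# independent judicial warrant (the article's "supplemented" category).
exclusion_list = {hashed("person-id-001"), hashed("person-id-002")}

def may_enable_e2ee(identifier: str) -> bool:
    # The provider checks the applicant against the exclusion list;
    # anyone not on it gets E2EE as normal.
    return hashed(identifier) not in exclusion_list
```

A set lookup like this is O(1) per check, which is why the scalability question is less about the lookup than about securely maintaining and distributing the list itself.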
The cost of maintaining this list of exclusions, suggests Kolochenko, could be met by a very tiny tax on the vendors. “The precedent already exists. Both the tobacco and oil companies already pay a tax to remediate the harm they do to society,” he says.
The final problem, and it’s hard to see any immediate solution for this, is that an E2EE exclusion list would be as controversial and widely challenged as the existing No Fly List. But controversial and challenged as that list is, it still exists and is in use.
The basis of Kolochenko’s idea is certainly valid. It would be easier, more secure, and fundamentally fairer to exclude criminals from E2EE than to effectively criminalize all users without due process via a backdoor into the system. It certainly deserves exploration.
Ultimately, it is hard to find a single security practitioner who would support government demands for an E2EE backdoor. We cannot say that none exist, but we can say we couldn’t find any. “For practitioners,” says Paxson, “this isn’t just a philosophical debate, it’s stuff that reshapes their priorities.”
Related: Google Brings End-to-End Encrypted Emails to All Enterprise Gmail Users
Related: Apple, Civil Liberty Groups Condemn UK Online Safety Bill
Related: Encrypted Services Providers Concerned About EU Proposal for Encryption Backdoors
Related: DoJ Again Asks for Encryption Backdoors After Hacking US Naval Base Shooter’s iPhones
