The Open Technology Institute (OTI) has responded to GCHQ/NCSC’s article on ‘Principles for a More Informed Exceptional Access Debate’ with an ‘Open Letter to GCHQ on the Threats Posed by the Ghost Proposal’.
‘Exceptional access’ is the law enforcement term for accessing encrypted messages: the so-called government backdoor into end-to-end encryption services. ‘Going dark’ is the term law enforcement uses to describe its inability to read encrypted messages between subjects of interest, who increasingly use encryption. ‘Ghost proposal’ is OTI’s term for GCHQ’s proposed method of preventing going dark.
Law enforcement in this instance includes both police services, such as the FBI and the Met, and intelligence services, such as the NSA and GCHQ/NCSC. Few people deny the benefit of law enforcement being able to access individual messages, with a court order, between specified persons of interest. The security industry resists a government backdoor into encryption because it would break encryption for everyone, guilty and innocent alike.
At the end of November 2018, Ian Levy (technical director of the UK’s National Cyber Security Centre — NCSC), and Crispin Robinson (technical director for cryptanalysis at GCHQ) published an article that “outlines how to enable the majority of the necessary lawful access without undermining the values we all hold dear.”
The Levy/Robinson paper starts from what they call the ‘Five Country’ statement on access to evidence and encryption. The five countries are the Five Eyes (U.S.A., UK, Canada, Australia and New Zealand, the world’s largest single SigInt alliance). The statement describes three principles: access to encrypted messages is the mutual responsibility of both SigInt agencies and vendors; access should be subject to oversight and judicial review; and different countries and different vendors should develop their own solutions.
The statement is currently available through the web archive. It concludes with a strong warning: “Should governments continue to encounter impediments to lawful access to information necessary to aid the protection of the citizens of our countries, we may pursue technological, enforcement, legislative or other measures to achieve lawful access solutions.”
Levy/Robinson build on this foundation with six principles for controlled and limited exceptional access. In brief: exceptional access should be legitimate, least intrusive and authorized (a principle adopted from the U.S. Cloud Act); governments should collaborate with vendors; any solution should not undermine public trust; LEA access should be targeted, not unfettered; any solution should be subject to peer review and incremental implementation; and the process should be transparent.
The authors then propose possible methods of gaining access while conforming to the principles. Encrypted cloud backups are conceptually easy: “If those backups are encrypted, maybe we can do password guessing on big machines,” suggest the authors. Such access would be focused and could be given judicial oversight and legitimacy relatively easily.
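The ‘big machines’ approach the authors allude to is, in essence, an offline dictionary attack against a password-derived backup key. A minimal sketch of the idea, assuming a PBKDF2-protected backup; the function names and parameters here are illustrative, not any vendor’s actual scheme:

```python
import hashlib
import hmac

def derive_backup_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stand-in for a backup service's password-based key derivation."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

def guess_password(candidates, salt: bytes, target_key: bytes):
    """Offline dictionary attack: try candidates until one derives the target key."""
    for pw in candidates:
        if hmac.compare_digest(derive_backup_key(pw, salt), target_key):
            return pw
    return None
```

A high iteration count slows down each guess, which is why weak passwords, rather than the cipher itself, are the realistic target of this kind of attack.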
Of more interest, however, is the proposed possible route into encrypted chats in real time. “It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved — they’re usually involved in introducing the parties to a chat or call. You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication.” This is the so-called ‘ghost user’ solution.
The authors state very clearly that this does not interfere with encryption. “We’re not talking about weakening encryption or defeating the end-to-end nature of the service. In a solution like this, we’re normally talking about suppressing a notification on a target’s device, and only on the device of the target and possibly those they communicate with.”
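In simplified form, the mechanism turns on who controls the key directory. The toy model below (invented names, with XOR standing in for real encryption) shows how a provider-controlled directory can add an extra ‘end’ without touching the cipher: each leg of the chat remains encrypted, but one of the keys belongs to the listener.

```python
# Toy model of the 'ghost user' idea: the provider's identity directory,
# not the encryption itself, is what gets manipulated.
import secrets

class Directory:
    """Provider-controlled mapping of user -> device keys."""
    def __init__(self):
        self.devices = {}   # user -> list of device keys
        self.ghosts = set() # keys silently added by the provider

    def register(self, user, key):
        self.devices.setdefault(user, []).append(key)

    def add_ghost(self, user, ghost_key):
        # Attach an extra 'end' to the target's identity; in the proposal,
        # the clients would also suppress any new-device notification.
        self.devices.setdefault(user, []).append(ghost_key)
        self.ghosts.add(ghost_key)

def encrypt(key, plaintext):   # stand-in for a real E2E scheme
    return bytes(a ^ b for a, b in zip(plaintext, key))

def decrypt(key, ciphertext):
    return bytes(a ^ b for a, b in zip(ciphertext, key))

def send(directory, recipients, plaintext):
    """Fan-out: encrypt the message once per recipient device key.
    The encryption is untouched; only the key list was tampered with."""
    envelope = {}
    for user in recipients:
        for key in directory.devices[user]:
            envelope[key] = encrypt(key, plaintext)
    return envelope
```

The point of the sketch is that nothing in `encrypt` or `decrypt` changes; the interception happens entirely in the directory that decides which keys a message is encrypted to.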
In its open letter (PDF) to GCHQ, the OTI acknowledges that vendors’ encryption algorithms will not be manipulated, but suggests that implementing the ghost user will create significant other problems. For example, while the encryption itself does not need to be redeveloped, the method of authenticating users (the check codes that ensure the chat is between the expected users) will have to be rewritten. Susan Landau points out that the ghost proposal “involves changing how the encryption keys are negotiated in order to accommodate the silent listener, creating a much more complex protocol — raising the risk of an error.”
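The check codes OTI refers to are key fingerprints, loosely analogous to Signal’s ‘safety numbers’: both parties compare a short hash computed over every device key in the conversation. The sketch below (invented names, simplified to a single hash) shows why a silently added key is detectable, and therefore why it is the authentication step, not the cipher, that would have to change:

```python
import hashlib

def safety_number(keys_a: list, keys_b: list) -> str:
    """Fingerprint over every device key both clients believe is in the chat.
    Users compare this short code out of band; any extra key changes it."""
    digest = hashlib.sha256()
    for key in sorted(keys_a + keys_b):
        digest.update(key)
    return digest.hexdigest()[:12]
```

To keep the number stable while a ghost key is present, clients would have to be modified to exclude certain keys from the hash, which is precisely the more complex, error-prone protocol change Landau warns about.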
On top of this, GCHQ’s own principle of transparency over when the option is invoked will reveal that it is sometimes invoked, meaning that users of encryption (for entirely legitimate purposes such as journalism and conversations between vulnerable people) will never know, nor be able to trust, that their own conversations are genuinely confidential.
This latter argument is further illustrated by cases on both sides of the Atlantic. In the UK, in September 2018, GCHQ, MI5 and MI6 admitted to unlawfully intercepting the private communications of Privacy International as part of the Bulk Communications Data (BCD) and Bulk Personal Datasets (BPD) programs. In the U.S., a former police officer discovered that 104 officers in 18 different agencies across her state had accessed her driver’s license record 425 times, using the state database as their personal Facebook service. On the one hand, the mere existence of a law will not prevent individuals from breaking that law; on the other, foreign nations with lower principles will almost certainly abuse the process.
The introduction of a principle of distrust will inevitably damage the very basis of cybersecurity: trust. Since the new system would have to be introduced via software updates, it could persuade some users to turn off automatic updates to avoid the possibility of third-party access. This risk was cited by President Obama’s working group exploring the problem of LEA access in 2015. “Individual users, aware of the risk of remote access to their devices, could also choose to turn off software updates, rendering their devices significantly less secure as time passed and vulnerabilities were discovered [but] not patched,” they wrote.
The trust issue is supported by Chris Morales, head of security analytics at Vectra. “This is technically possible, but the idea scares me,” he told SecurityWeek. “I don’t have anything to hide, but it is a complete invasion of personal privacy. So much personal data is shared in electronic communication. I understand why government agencies want access, but I also believe it fundamentally breaks the entire trust model of encryption and makes the entire point of encrypting data for privacy useless. If the backdoor exists, who is going to ‘watch’ the ‘watchers’?”
The conclusion reached by the Open Technology Institute and the open letter’s 47 signatories, drawn from civil society, vendors (including Apple, Google, Microsoft and WhatsApp), and cryptologists (including Neumann, Schneier, Shostack, and Zimmermann), is unequivocal: they “urge GCHQ to abide by the six principles they have announced, abandon the ghost proposal, and avoid any alternate approaches that would similarly threaten digital security and human rights.”
NCSC declined SecurityWeek’s request for a briefing on this, but did send a statement from Ian Levy by e-mail:
“We welcome this response to our request for thoughts on exceptional access to data — for example to stop terrorists. The hypothetical proposal was always intended as a starting point for discussion,” Levy said. “It is pleasing to see support for the six principles and we welcome feedback on their practical application. We will continue to engage with interested parties and look forward to having an open discussion to reach the best solutions possible.”
There are many, however, who consider that LEA demands for encryption backdoors are overstated. “The ‘going dark’ theory is overblown now and always has been,” Marcus Carey, CEO and founder of Threatcare, told SecurityWeek. He points out that even during the ‘Cold War’, Russian and American spies frequently didn’t bother using the encryption they had available. “The same can be said for criminals and bad actors today. The bad guys could use encryption (and sometimes they do), but there is a tremendous amount of communication and activity that shows their hand.”
Carey believes that encrypted content is a lesser problem than poor use of existing intelligence, and that encryption is used as an excuse for failure. “They blame encryption and going dark as excuses for failure. Later, we find out that there were tons of indicators, and, even worse, sometimes the actors were under surveillance. Why should people trust the government with ‘exceptional access’ when they can’t leverage the tools already at their disposal?”
The disappointing aspect of this GCHQ proposal is that it isn’t new. Former President Obama’s encryption task force examined a similar proposal and rejected it four years ago.