EU Proposal Aims to Prevent Human Rights Violations Through Cyber-Surveillance Exports
The European Union is expected to propose tighter rules on the export of dual-use technologies in September. The Union has been embarrassed by evidence that surveillance technology from companies such as Germany’s FinFisher GmbH and Italy’s Hacking Team has been used by repressive regimes to target activists and journalists. Such use sits uneasily with the EU’s central commitments to human rights and personal privacy.
A new draft proposal expanding on export controls for dual-use products has been obtained and released by the EurActiv publication. The details of this proposal will undoubtedly be amended before September, but the draft does show the current thinking of the European Commission. The main new thrust is to include intrusion and surveillance software where it is likely to be used to violate human rights: “cyber-surveillance technology which can be used for the commission of serious violations of human rights or international humanitarian law, or can pose a threat to international security or the essential security interests of the Union and its Member States.”
There seems to be no attempt at a blanket ban on the export of these technologies. Friendly nations and their law enforcement agencies, and indeed EU member states, will still be able to purchase the technologies for their own use for ‘national security’ and law enforcement purposes. Furthermore, some media claims that the proposals ‘could classify smartphones as weapons’ because of their tracking capabilities are also far-fetched. Indeed, this seems to be implicitly excluded by the statement, “These measures should not go beyond what is proportionate. They should, in particular, not prevent the export of information and communication technology used for legitimate purposes, including law enforcement and internet security research.”
While this statement is important, it is included in the preamble rather than the Regulation itself. It highlights one of the greatest difficulties in defining ‘dual-use’ software: how do you define ‘legitimate purposes’; how do you define ‘repressive regimes’; and how do you define acceptable ‘internet security research’? It is an area that needs to be clarified before the proposal is finalized.
In an initial analysis of the EU proposal by Privacy International (PI), PI research officer Edin Omanovic discusses the potential ‘chilling effect’ of getting it wrong. An absence of controls is dangerous. “They can be used by governments, and potentially private sector contractors, for internal repression by targeting devices and infrastructure,” writes Omanovic.
“However,” he adds, “PI recognizes the central role offensive tools play in producing defensive countermeasures to keep us all safe. As such, these technologies must not be controlled where they are exported for defensive purposes or where the purpose has not been determined.”
This is a view broadly supported by the security research industry. F-Secure security advisor Erka Koivunen sees one particularly encouraging feature. “The most important ‘news’ here is that this Regulation makes an attempt to factor in the end user’s intent and track record of human rights abuses when deciding whether or not to permit an export.”
Nevertheless, his primary concerns remain. “One problem,” he told SecurityWeek, “is that you don’t necessarily know who the buyer is, nor who the buyer works for. It would be unreasonable for a provider of COTS software or a researcher writing a study paper to demand a list of customers or to seek prior permission before ‘delivering’ the goods to the end user.”
Study papers could be a victim of the ‘chilling effect’ described by PI. “A potential unintended consequence of this type of dual-use regulation,” said Koivunen, “would be that security researchers would not be able to collaborate, share information or publish their results in fear of breaching the rules. It is not clear at this stage whether this is an unfounded fear, but I think it is correct to say that as a company we are following this regulation carefully.”
The reality is that this leaked draft proposal from the European Commission shows recognition that European technology can be used by repressive regimes in defiance of generally held human rights — but it does not yet show how the problems can be solved. Nevertheless, as PI concludes, “This is a leaked proposal, and it could look drastically different when finally implemented. Nevertheless, the recognition that human rights considerations should play a role in this huge area of trade policy is to be celebrated.”