SecurityWeek

Data Protection

DeepMind’s Use of NHS Patient Data Contravenes Data Protection Act

The UK’s Information Commissioner’s Office (ICO) has ruled today that the Royal Free NHS Foundation Trust contravened the Data Protection Act when it provided the personal data of 1.6 million patients to Google-owned DeepMind. The purpose of the data transfer was to help develop the healthcare app, Streams — a diagnosis and detection system for acute kidney injury.

DeepMind is an artificial intelligence research company determined to use AI to solve complex problems — such as health issues. Like all machine learning and AI, it needs large amounts of data from which to learn and from which its algorithms are developed. It is where and how this data is acquired that is at issue.

The Information Commissioner said in a statement, “There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights. Our investigation found a number of shortcomings in the way patient records were shared for this trial.”

Those shortcomings included a failure to properly inform patients that their data would be used in this way, and a lack of transparency from the Trust about the arrangement. However, the ICO stopped short of delivering any sanctions over these shortcomings, even though it states, “patient identifiable data was not subject to pseudonymisation.”

In theory, the ICO could have fined the Royal Free up to £500,000. Instead, it has merely asked the Trust to establish a legal basis for the Google DeepMind project and for any future trials; to set out how it will comply with its duty of confidence in any future trials; to complete a privacy impact assessment; and to commission an audit of the trial which may be published by the ICO.

There are no specific requirements on DeepMind, since under the law it is the ‘data controller’ — in this case, the Royal Free — that is responsible for data protection.

The lack of specific legal sanctions may be because of the willingness of both the Trust and DeepMind to cooperate with the ICO. In a blog posted today, DeepMind first justified the project, and then admitted its failings. “We’re proud that, within a few weeks of Streams being deployed at the Royal Free, nurses said that it was saving them up to two hours each day, and we’ve already heard examples of patients with serious conditions being seen more quickly thanks to the instant alerts.”

But it added, “In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health… We made a mistake in not publicizing our work when it first began in 2015, so we’ve proactively announced and published the contracts for our subsequent NHS partnerships.”

In its own statement, the Royal Free commented, “We accept the ICO’s findings and have already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.”

Perhaps the key aspect of today’s ruling is the ICO’s final comment, “The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.” This suggests that a light and pragmatic approach to applying current and future data protection laws will be the approach adopted by the UK regulator.
