Data Protection

DeepMind’s Use of NHS Patient Data Contravenes Data Protection Act

The UK’s Information Commissioner’s Office (ICO) has ruled today that the Royal Free NHS Foundation Trust contravened the Data Protection Act when it provided the personal data of 1.6 million patients to Google-owned DeepMind. The purpose of the data transfer was to help develop the healthcare app, Streams — a diagnosis and detection system for acute kidney injury.
DeepMind is an artificial intelligence research company determined to use AI to solve complex problems — such as health issues. Like all machine learning and AI, it needs large amounts of data from which to learn and from which its algorithms are developed. It is where and how this data is acquired that is at issue.

The Information Commissioner said in a statement, “There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights. Our investigation found a number of shortcomings in the way patient records were shared for this trial.”

Those shortcomings included a failure to properly inform patients that their data would be used in this way, and a lack of transparency from the Trust about the arrangement. However, the ICO stopped short of delivering any sanctions over these shortcomings, even though it states that “patient identifiable data was not subject to pseudonymisation.”

In theory, the ICO could have fined the Royal Free up to £500,000. Instead, it has merely asked the Trust to establish a legal basis for the Google DeepMind project and for any future trials; to set out how it will comply with its duty of confidence in any future trials; to complete a privacy impact assessment; and to commission an audit of the trial which may be published by the ICO.

There are no specific requirements on DeepMind, since under the law it is the ‘data controller’ (in this case, the Royal Free) that is responsible for data protection.

The lack of specific legal sanctions may be because of the willingness of both the Trust and DeepMind to cooperate with the ICO. In a blog posted today, DeepMind first justified the project, and then admitted its failings. “We’re proud that, within a few weeks of Streams being deployed at the Royal Free, nurses said that it was saving them up to two hours each day, and we’ve already heard examples of patients with serious conditions being seen more quickly thanks to the instant alerts.”

But it added, “In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health… We made a mistake in not publicizing our work when it first began in 2015, so we’ve proactively announced and published the contracts for our subsequent NHS partnerships.”

In its own statement, the Royal Free commented, “We accept the ICO’s findings and have already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used. We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.”

Perhaps the key aspect of today’s ruling is the ICO’s final comment, “The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.” This suggests the UK regulator will take a light and pragmatic approach to applying current and future data protection laws.

Copyright © 2024 SecurityWeek ®, a Wired Business Media Publication. All Rights Reserved.