Infrastructure > Devices

ICO: data use for DeepMind's Royal Free app test was illegal

Neil Merrett | Published 03 July 2017

Royal Free London trust is allowed to continue using the Streams app developed with the Google-owned company after agreeing to revise future testing and patient data use

The Information Commissioner’s Office (ICO) has said that the price of innovation in data-led technologies does not justify the erosion of privacy regulations after concluding that Google-owned AI group DeepMind had illegally made use of patient records in testing a new healthcare technology.

According to its investigation, the data regulator concluded that Royal Free London NHS Foundation Trust had failed to comply with the Data Protection Act by providing personal data of 1.6m individuals to test the Streams kidney health app that is now in use by the trust for direct care purposes.

With the ICO pointing to several shortcomings in how the data used for testing Streams was handled, specifically in informing patients that their data was involved, information commissioner Elizabeth Denham noted that the trust has been asked to sign an undertaking committing it to keep future testing within the law.

“The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights,” wrote Denham in a blog post. “I’ve every confidence the Trust can comply with the changes we’ve asked for and still continue its valuable work. This will also be true for the wider NHS as deployments of innovative technologies are considered.”

However, Denham maintained that while the trust and DeepMind had breached the Data Protection Act during testing, there were not currently grounds for concern around the processing of data for the live Streams service.

With Streams continuing to be used by the trust, which claims it is realising significant operational benefits from the technology for its staff, the ICO is demanding a privacy impact assessment to ensure transparency. The trust must also carry out an audit of last year's testing, which could be published for public scrutiny.

The measures are unlikely to appease privacy organisations and campaign groups that have continued to maintain that DeepMind had infringed data protection legislation in partnership with the trust.

Data guardian view

National Data Guardian Dame Fiona Caldicott, who has also questioned the legality of the ‘implied consent’ with which patient data was originally used to test Streams, welcomed the conclusions of the ICO investigation that she helped inform.

While welcoming the development of new technologies that are sufficiently tested to provide a more accurate and secure service to patients, she highlighted a need for transparency in making use of personal and confidential information.

“I concur with the points that the ICO has made that much more should have been done to inform patients about the project and to allow them to opt out of their data being used to develop and test the technology if they were not happy for it to be used for this purpose,” she said.

“If patients and service users discover their information has been used in unexpected ways, we risk damaging public trust and losing support for the technological advances that could benefit us all."

Caldicott said she was pleased that the Department of Health was closely considering existing regulatory frameworks and guidance provided around making use of health data, with the findings of a review yet to be published.

Ongoing use

The Royal Free trust said that it had been allowed to continue using the technology to treat patients, and that it would co-operate with the ICO, having been involved with the regulator’s investigation since it began in May last year.

“We have signed up to all of the ICO’s undertakings and accept their findings. We have already made good progress to address the areas where they have concerns. For example, we are now doing much more to keep our patients informed about how their data is used,” said the trust. “We would like to reassure patients that their information has been in our control at all times and has never been used for anything other than delivering patient care or ensuring their safety.”

In seeking to play up the operational benefits of Streams’ use, DeepMind issued a statement noting that the ICO has recognised that the Royal Free was in control of all data used for testing, with the company processing the information in line with the trust’s instructions.

In a statement, the company argued that no issues were raised about safety or the security of data being used. 

“Although today’s findings are about the Royal Free, we need to reflect on our own actions too. In our determination to achieve quick impact when this work started in 2015, we underestimated the complexity of the NHS and of the rules around patient data, as well as the potential fears about a well-known tech company working in health,” the company said. “We were almost exclusively focused on building tools that nurses and doctors wanted, and thought of our work as technology for clinicians rather than something that needed to be accountable to and shaped by patients, the public and the NHS as a whole. We got that wrong, and we need to do better.”

In responding to the ICO, the company said that its initial legal agreement with the trust to develop Streams could have carried much more detail about the project and its rules for data handling. DeepMind argued these concerns had been addressed by a more comprehensive contract agreed last year, which has served as the basis for wider arrangements with NHS organisations.

In claiming to have made a “mistake” in not publicising its work back in 2015, the company said it would publish contracts for all subsequent agreements with NHS trusts, and has also established an independent review body to scrutinise its work.

The MedConfidential pressure group accused Google DeepMind of having given “various contradictory quotes about its intent over time, repeatedly asserting that what it was doing was lawful.”

With the independent review body established by the company set to publish its own report on the project this week, MedConfidential coordinator Phil Booth said the group looks forward to the publication on Wednesday (July 5).
