
Data guardian challenges legal basis for initial DeepMind NHS app data test

Neil Merrett Published 16 May 2017

Streams app, now live at London hospital for use in direct care, did not have appropriate legal basis to use patient information specifically for its testing phase, leaked letter argues


The Royal Free London hospital will continue to use a kidney health app developed with Google-owned AI group DeepMind despite the UK’s National Data Guardian (NDG) questioning the legality of the ‘implied consent’ with which patient data was originally used to test the technology before launch.

The Streams app was developed to improve acute kidney injury (AKI) detection by immediately reviewing blood test results for signs of deterioration and delivering results and alerts to relevant clinicians via a mobile device, according to the trust.

While testing was underway on the app last year, the Information Commissioner’s Office (ICO) asked NDG Dame Fiona Caldicott to examine the legal basis for sharing the identifiable information of 1.6m patients ahead of its launch. It is understood that the ICO has yet to rule on the overall legality of how information was used to test the technology, which is now used in the context of direct care to support clinicians.

Regardless of the eventual decision, Caldicott has stressed the need for clearer guidance on patient data use when testing future technologies for the NHS.

In a letter sent to DeepMind in February from the office of the National Data Guardian, which was confirmed as genuine after being leaked to the media this week, the 'implied consent' used to justify use of the records in tests was deemed inappropriate as a legal basis. In the specific case of the app’s testing phase, the data guardian concluded that development of the technology could not be counted as ‘direct care’, which would have allowed patient data to be shared with the company to develop the app.

The Royal Free maintained that the Streams app is now in use at the hospital and was devised in close collaboration with its clinicians under strict conditions. The trust claimed that the technology helps clinicians provide faster and more effective care, arguing that strict instructions were in place during initial testing on how DeepMind used data.

"We took a safety-first approach in testing Streams using real data. This was to check that the app was presenting patient information accurately and safely before being deployed in a live patient setting. Real patient data is routinely used in the NHS to check new systems are working properly before turning them fully live,” said the trust.

“No responsible hospital would ever deploy a system that hadn't been thoroughly tested. The NHS remained in full control of all patient data throughout.”

The Royal Free said the project and the resulting technology were intended to prevent unnecessary deaths and represented a near-unique development within the NHS.

“We take seriously the conclusions of the NDG, and are pleased that they have asked the Department of Health to look closely at the regulatory framework and guidance provided to organisations taking forward this type of innovation, which is essential to the future of the NHS,” said the statement.

The office of the National Data Guardian, which has no regulatory powers to directly affect projects, said it considered the sharing of patient data by the Royal Free with DeepMind for Streams at the request of the UK's data regulator.

“In discussions with the ICO about this, the NDG agreed to provide advice on the use of implied consent for direct care as a legal basis for the sharing of data by the Royal Free with DeepMind. While the ICO investigation is ongoing the NDG will provide any further assistance to the ICO as required, but will not be commenting further on the matter at this point,” noted the NDG’s office.

Responding to the data guardian’s conclusions on its testing, DeepMind said Streams was improving how staff at the hospital delivered care, with the app not having been used for commercial purposes or combined with other Google products or services. The company argued this would continue to be the case.

The company argued that a “full set” of patient data was used in the app’s testing before going live in line with NHS safety testing requirements.

“Safety testing is essential across the NHS, and no hospital would turn a new service live without testing it first. We’re glad the NDG has said that further guidance would be useful to organisations which are undertaking work to test new technologies,” said the company.

“We also recognise that there needs to be much more public engagement and discussion about new technology in the NHS. We want to become one of the most transparent companies working in NHS IT, appointing a panel of independent reviewers, embarking on a major patient involvement strategy, and starting a groundbreaking project called Verifiable Data Audit. We believe that these steps are helping to set a new standard of transparency across the health system.”

Responding to the publication of the leaked letter, Phil Booth of the MedConfidential pressure group said the NDG’s response raised significant concerns over the legality of using identifiable patient data for the development of DeepMind’s technology before going live.

“Every flow of patient data in and around the NHS must be safe, consensual and transparent. Patients should know how their data is used, including for possible improvements to care using new digital tools. Such gross disregard of medical ethics by commercial interests – whose vision of ‘patient care’ reaches little further than their business plan – must never be repeated.”

