
NHS Trust broke privacy law while sharing data with Google

Andrew Hamilton


The UK’s Information Commissioner’s Office (ICO) has ruled that patient data from a London NHS trust was shared unlawfully with Google’s DeepMind during trials of a new detection system.

The regulator found that the Royal Free NHS Foundation Trust in London breached data protection law when it handed over 1.6 million patients’ records to Google for its DeepMind programme.

The data was provided as part of a trial to develop an app – named Streams – intended to act as an ‘alert, diagnosis and detection system’ for acute kidney injury (AKI). The ICO said its investigation unearthed ‘several shortcomings’ in the Trust’s handling of patients’ personal data, including that patients were not adequately told their information would be used in this way.

Elizabeth Denham, the Information Commissioner, said: “Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.

“There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.”

More specifically, the year-long ICO investigation found that the exchange of data between the NHS and Google – which began on 18th November 2015 – contravened several principles of the Data Protection Act, including the requirements that data processing must not be excessive and that all data processed must be strictly relevant to the purpose at hand. The investigation was launched in May 2016 after media reports alerted the ICO to the deal between the Trust and Google.

Controversy flared again in March when law academic Julia Powles of Cambridge University and journalist Hal Hodson examined how the data of 1.6 million patients might be misused under the auspices of the DeepMind project. Their paper, titled ‘Google DeepMind and healthcare in an age of algorithms’, concluded that hospital administrators had been lax in surrendering large amounts of patient data to an unaccountable international company. The paper also argued that such deals could allow Google to obtain a ‘monopolistic’ position in future healthcare analytics, data being the driving force behind such research.

Powles and Hodson argued: “The public’s situation is analogous to being interrogated through a one-way mirror: Google can see us, but we cannot see it. The company benefits from relying on commercial secrets and the absence of public law obligations and remedies against it. This leaves it with few incentives for accountability.” DeepMind and the Trust claimed the paper contained ‘significant factual and analytical errors’, and said they would review it before publishing a full response.

With the conclusion of the ICO’s investigation, Google and the Royal Free Trust will escape fines of up to £500,000. The Trust, however, has been asked to adopt a number of policy changes to ensure compliance with data protection law going forward. Ms Denham said: “We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.”

These undertakings include operating fully under the Data Protection Act in any future trials, creating a plan to assure patients that their data will remain secure, and fully assessing the impact the infringement may have had on patients. The Royal Free agreed to the terms, confirmed in a public statement released alongside the ICO’s announcement of its findings. According to the Trust, the personal data of its patients was never used for any purpose other than developing improved care, and the Streams app will continue to be developed ahead of wider deployment.

The Trust said: “We passionately believe in the power of technology to improve care for patients and that has always been the driving force for our Streams app. We are pleased that the information commissioner [sic] supports this approach and has allowed us to continue using the app which is helping us to get the fastest treatment to our most vulnerable patients – potentially saving lives.”

Meanwhile, DeepMind has released a statement of its own, asserting that the Royal Free remained in control of patients’ information at all times. However, DeepMind also conceded shortcomings on a number of issues, admitting that its original 2015 agreement with the NHS was insufficiently publicised and that its processes for handling data were not transparent enough. The company says it commissioned an independent audit of its practices well before the negative media attention, and is due to publish its own conclusions soon.

DeepMind said: “We’re proud that, within a few weeks of Streams being deployed at the Royal Free, nurses said that it was saving them up to two hours each day, and we’ve already heard examples of patients with serious conditions being seen more quickly thanks to the instant alerts. Because Streams is designed to be ready for more advanced technology in the future, including AI-powered clinical alerts, we hope that it will help bring even more benefits to patients and clinicians in time.”

Under GDPR, the penalties for both parties could have been far steeper. Any firm processing the personal data of EU citizens will face fines of up to €20 million or 4% of global annual turnover, whichever is greater, when the regulation comes into force next May.

Andrew Hamilton is PR & Content Executive at Hutchinson Networks.
