A controversial deal between Google’s artificial intelligence unit DeepMind and The Royal Free London NHS Foundation Trust has come under fire for “inexcusable” failings and a lack of transparency in an academic paper published in Health and Technology.

In 2015 the parties signed an initial agreement to develop an app, called Streams, designed to improve outcomes for patients at risk of acute kidney injury by getting "the right data to the right clinician at the right time", notifying nurses and doctors immediately when test results show a patient is at risk of becoming seriously ill.

The relationship between DeepMind and the Trust was subsequently expanded with a new information-sharing deal and a wider scope for the initial agreement, including plans for a data-sharing access infrastructure for the hospital.

In 2016 the deal came under the spotlight after a New Scientist report revealed that DeepMind was being given access to a huge amount of identifiable patient data, triggering an investigation into patient consent by the UK watchdog, the Information Commissioner's Office, which is still ongoing but close to conclusion.

Patient data for Streams was obtained without patient consent on the grounds that consent was unnecessary because the app is used for direct patient care. However, the paper - titled Google DeepMind and healthcare in an age of algorithms and written by Dr Julia Powles, a research associate in law and computer science at the University of Cambridge, and The Economist journalist Hal Hodson - says “the failure on both sides to engage in any conversation with patients and citizens is inexcusable”.

Furthermore, it questions the use of ‘direct care’ as a premise for not obtaining consent, as the data includes patients who have not been tested or treated for kidney injury.

“Since the large, Trust-wide group whose data has been transferred includes individuals who have never had a blood test, never been tested or treated for kidney injury, or indeed patients who have since left the constituent hospitals or even passed away, the position that Royal Free and DeepMind assert - that the company is preventing, investigating or treating kidney disease in every patient - seems difficult to sustain on any reasonable interpretation of direct patient care”.

Further questioning the deal, the paper argues that “the amount of data transferred is far in excess of the requirements of those publicly stated needs, but not in excess of the information sharing agreement and broader memorandum of understanding governing the deal, both of which were kept private for many months.”

It also points out that the data transfer was done without consulting relevant regulatory bodies, “with only one superficial assessment of server security, combined with a post-hoc and inadequate privacy impact assessment”.

“We do not know - and have no power to find out - what Google and DeepMind are really doing with NHS patient data, nor the extent of Royal Free’s meaningful control over what Google and DeepMind are doing,” it points out, and notes that “any assurances about use of the dataset come from public relations statements, rather than independent oversight or legally binding documents”.

However, DeepMind and the Royal Free hospital told the media that the paper “completely misrepresents the reality of how the NHS uses technology to process data”, and “makes a series of significant factual and analytical errors, assuming that this kind of data agreement is unprecedented”.

Powles and Hodson responded to PharmaTimes that DeepMind and the Royal Free's allegations of misunderstanding and incorrectness are entirely unsubstantiated, and invited the parties to respond on the record in an open forum. They added: "The obvious fact is that we care about Google and DeepMind getting into healthcare because it is a break from the norm."