Last year, New Scientist revealed that DeepMind Health (DMH), a London-based, Google-owned artificial intelligence (AI) company, had access to the health records of 1.6 million patients, and had received them without proper consent.
This occurred through a data-sharing agreement the company made with the Royal Free NHS (National Health Service) Foundation Trust to test and develop an app called Streams, which monitors patients with kidney disease.
In a report released Monday (July 3), the Information Commissioner’s Office (ICO), an independent body in the U.K. set up to uphold information rights, found that the Trust breached the Data Protection Act when it provided Google with patients’ information. The ICO also noted several shortcomings in how the data were handled, including a lack of proper informed consent.
“Patients would not have reasonably expected their information to have been used in this way, and the Trust could and...
One day after the ICO’s review was released, an independent review panel set up by DMH released its own report, which also raised concerns, including a lack of clarity in the initial information-sharing agreement and minor vulnerabilities in the app, the application programming interface, and some of DMH’s servers. However, it concluded that DeepMind itself had not broken any laws.
“DeepMind had a view that they could use AI to significantly improve healthcare within the NHS and frankly around the world,” former Cambridge Member of Parliament Julian Huppert, who served on the independent review panel, said during a press briefing (via Business Insider). “That was the starting conceptual point. I think it would then be fair to say that as they discussed this with people they found that the state of data and the state of information flow within the NHS was simply not as good as they had hoped.”
DMH could have done more to make sure the data it received from the Trust were properly sourced, says Regina Lally, codirector of the data protection company DataBasix, tells BuzzFeed. “Given the sensitive, personal nature of the data, they could have asked more questions about what needed to happen,” she adds.