Court of Appeal pours more cold water on group compensation claims for breach of data protection law & privacy rights

In Prismall v Google Ltd, the Court of Appeal considered whether a representative action for misuse of private information could be used to claim compensation on behalf of 1.6 million patients whose personal data had been used in ways to which they did not consent. The case illustrates the difficulty of "coming at justice" in the social media age, and is closely linked to the earlier case of Lloyd v Google LLC, a representative action for breach of data protection law.

Background

In October 2015, London’s Royal Free NHS Trust (the Trust) shared the sensitive personal data of 1.6 million NHS patients with Google and its artificial intelligence subsidiary, DeepMind. The data – which was not anonymised – included X-rays, blood tests and medical notes. Patients were not informed that their data was being shared.

The patients’ data was used to develop an app that would help to diagnose acute kidney injuries. However, Google and the Trust had also entered into contracts that would potentially have allowed the data to be used for a much broader range of purposes.

An investigative journalist at the New Scientist got hold of the agreement between Google and the Trust and in April 2016 published an article criticising the volume of information that was shared and questioning the purposes for which it would be used.

Andrew Prismall, one of the patients whose information was shared, took legal action against Google seeking compensation on behalf of all 1.6 million patients. That claim has now been resolved by the Court of Appeal, in Google’s favour.

Timeline of key events

  • September 2015: the Trust enters an information sharing agreement with Google. This agreement envisages Google using the data for a broad range of purposes “to support treatment and avert clinical deterioration across a range of diagnoses and organ systems”.
  • October 2015: the Trust transfers 1.6 million patient-identifiable medical records to Google. DeepMind used some of this data to test its Streams app, which was to be used by the Trust to accelerate the diagnosis of acute kidney injuries. A live data feed was established around the same time in respect of subsequent medical records.
  • November 2015: the NHS Research Ethics Committee approves Google’s application for a project “using machine learning to improve prediction of acute kidney injury”.
  • January 2016: the Trust enters an MOU with DeepMind envisaging “a wide-ranging collaborative relationship for the purposes of advancing knowledge in…life and medical sciences through research”.
  • April 2016: New Scientist article published, prompting an investigation by the Information Commissioner’s Office (ICO) into whether the sharing of patient records was a breach of data protection law.
  • February 2017 – November 2019: the Trust uses the Streams app in its hospitals to assist in diagnosis of acute kidney injuries (the app generated alerts for clinicians based on changes in blood creatinine levels).

Was data shared for the purposes of direct care?

The use of the Streams app from 2017-2019 reflects the fact that it is lawful to share patients’ personal data without their consent for direct care (eg medical treatment). Before the ICO investigation, both Google and the Trust took the position that their information sharing agreement was entirely lawful, on the basis that the patient data was to be used for direct care. The ICO did not share this view because:

  1. Although some of the data was eventually used for direct care, it was initially used to test an app. It would have been possible to obtain the data that was needed by informing patients and asking for their consent.
  2. The Streams app could have been tested with far less data, meaning that it was not necessary or proportionate to have shared 1.6 million patient records.  

The ICO found that the information sharing agreement between Google and the Trust was not compliant with data protection law, and required the Trust to enter into a series of undertakings to rectify this.

Were the claimants lawfully entitled to compensation?

Prismall’s claim was a representative action for misuse of private information under rule 19.8 of the Civil Procedure Rules. This mechanism allows an individual to make a claim on behalf of an unlimited number of other people who have “the same interest”. Framing the compensation claim in this way allowed Prismall to include all 1.6 million fellow patients in his claim without each of them having to opt in.

Breach of data protection law might have been a more logical foundation for the claim, but that route had already been tried without success in the Supreme Court case of Lloyd v Google. In that case, Google had covertly tracked the internet activity of iPhone users and used the data obtained to serve targeted advertising. The problem for the claimant was showing that Google’s breach of the Data Protection Act had caused damage to each of the millions of iPhone users party to his claim. It might have been possible to show damage by collecting facts relevant to a small subset of those claimants, but doing so would have significantly increased the time and expense of bringing the claim.

Prismall would have faced the same difficulties as Lloyd if he had relied on breach of data protection law. For that reason, his claim was founded on the tort of misuse of private information. In such cases, there was good reason to believe that compensation might be awarded without the need to prove damage or distress (see Gulati v MGN Ltd). Ultimately, however, the court did not agree. In a representative action – whether for breach of data protection law or misuse of private information – a “threshold of seriousness” must be cleared to justify a claim for compensation. On that score, Prismall’s case was undermined by the following facts:

  • Some of the patients’ data was not used by DeepMind for testing – it was simply held on their systems for 12 months and then deleted
  • Some patients publicised the treatment they had received in national newspapers
  • Some of the patient data was relatively anodyne in nature

A subset of the 1.6 million patients might have been able to show that they had a "reasonable expectation of privacy" in relation to their data and that the breach of privacy they suffered was serious enough to justify compensation. Proving those things on behalf of all claimants was ultimately an impossible task. 

Conclusion

Data controllers & processors shouldn't rest too easy following the decisions in the Lloyd and Prismall cases. If you use personal data in ways for which you lack permission, you are still open to fines from data protection regulators – and still open to compensation claims from individuals, albeit probably not in such large groups. Interestingly, there are some signs that the European courts may take a different approach in similar circumstances. In one recent case, the German Federal Court of Justice ruled that a “mere loss of control” of personal data could give rise to a right to compensation under Article 82(1) of the GDPR.  

