Why We Need Collective Redress For Data Protection
This post was written by Chair Emeritus of PI’s Board of Trustees, Anna Fielder.
The UK Data Protection Bill is currently making its way through the genteel debates of the House of Lords. We at Privacy International welcome its stated intent to provide a holistic regime for the protection of personal information and to set the “gold standard on data protection”. To make that promise a reality, one of the commitments in this government’s ‘statement of intent’ was to enhance people’s ability to enforce the rights provided by the EU General Data Protection Regulation (GDPR), which this Bill implements.
So far, however, our Government is reneging on this commitment to ensure effective redress for those impacted by unlawful practices undermining data protection safeguards, or by data breaches that may result in ID theft or worse. Clause 173 of the proposed Bill does not provide for qualified non-profit organisations to pursue data protection infringements of their own accord, an option the GDPR explicitly allows. This kind of enforcement, with one person or body representing a group of individuals who have suffered the same harm, is known as ‘collective redress’.
Ironically, the issue of collective redress has been one of the most controversial and hotly debated in the House of Lords. Labour and Lib Dem Peers have proposed at Committee stage an amendment that would implement the government’s earlier promises, but the Government flatly rejected it, using Trump-esque arguments, which we have largely addressed in our briefing for the Report stage of the bill.
One government argument is that consumers would not want to have their rights defended without their explicit consent (i.e. to ‘opt-in’ to be represented rather than ‘opt-out’ if they don’t wish to be). This is contradicted by survey evidence that shows a significant majority of consumers support such group actions: Gallup research for the European Commission found that 87% of UK consumers would be more willing to defend their rights if they could join with other consumers who suffered the same harm.
Weak enforcement provisions were one of the widely acknowledged reasons why the current data protection laws, in the UK and elsewhere in Europe, were no longer fit for purpose in the big data age. As a result, it has been more convenient for organisations collecting and processing personal information to break the law and pay up if found out, than to observe the law — as profits made from people’s personal information vastly outweighed even the most punitive of fines. The GDPR and the UK Data Protection Bill have corrected this enforcement imbalance to a large extent.
However, the information and power imbalance between individuals and those controlling their personal information remains, notwithstanding the improved individual rights in the legislation. Massive data breach cases, such as those of Equifax or Uber, are now so commonplace as to become the subject of artistic endeavours. They make media headlines, and can be readily addressed by enforcement authorities.
But there are many other unlawful practices that can affect hundreds of thousands of individuals and that take place under the bonnet. Those can only be revealed by independent research and investigations, most often carried out by civil society organisations and charities. Examples are numerous: Privacy International has recently published a report on the use, and possible abuse, of personal data in connected rental cars; Which? has carried out research on connected toys widely available in this country that could pose child safety risks; the Norwegian Consumer Council has exposed data-related safety problems with the now famous Cayla doll and kids’ smart watches, as well as unlawful practices by health and dating apps; and a US consumer group has just this month exposed a potential mass surveillance “Orwellian future” enabled by the Amazon Echo and Google Home digital assistants, through a detailed study of the patent applications for these devices.
Such cases can be taken up on behalf of individual consumers, and individuals and consumer groups have done so in the past across Europe. But experience shows that infringing companies and organisations do not necessarily correct their practices for all individuals affected, or in all countries where they trade. Empowering collective action would ensure corrective action by organisations processing personal information, to the benefit of all those affected. It would act as a deterrent for companies, and it would save time and money for the courts. It could have effects across several jurisdictions, since with regard to data flows the UK is not an island, and most companies handling data operate internationally.
Some industry associations claim that provision for collective redress would open the floodgates to US-style class actions and profiteering third parties financing such actions. Such fears are wide of the mark, as the UK (and European) legal systems are very different from the US system: there are no punitive damages, the ‘loser-pays’ principle deters cases without merit, and the courts rule on whether a case is justified before it can proceed.
There are successful all-encompassing collective redress systems in Belgium, Italy, Portugal, Spain and Sweden, as well as Canada and Australia, and there is no evidence that these schemes have harmed industry or the digital economy. On the contrary, there is evidence to show how a powerful industry can play different redress systems to its advantage: in the ‘Dieselgate’ VW case, where the company cheated on car emissions tests, effective compensation and enforcement action have been possible in the US on a unified basis, while only wronged consumers in those EU countries where collective redress systems exist may receive compensation. In the rest of Europe not much has been offered to deceived consumers, with the result that some VW customers will end up being more equal than others. Besides, collective redress is not a new concept in the UK legal system: collective action is already enabled under the Consumer Rights Act 2015 and under the “super-complaint” system (Enterprise Act 2002) for market failures that harm the interests of consumers.
In the longer term, we need a collective redress system that can deal with the widest range of harms, including shoddy services, dangerous products, uncompetitive behaviour and breaches of data protection. Tomorrow (10 January), Peers in the House of Lords will once again debate this issue at the Report stage of the Bill. Far from being deterred by earlier government arguments, the amendment proposed by Labour, Lib Dem and Cross-bench Peers recommends a ‘real’ kind of collective redress, in which both affected individuals and suitably qualified organisations can represent all consumers who are victims of the same harm, and can demand compensation too.
By voting to include a collective redress system in the Data Protection Bill, UK legislators will take one further step in the right direction towards justifying the Bill’s claim and ambition to set a ‘gold standard’ on data protection.