Clearview and the long arm of the law

A legal ruling from the UK puts the onus back on Clearview to change its ways, while also raising important questions about the cross-border application of laws that regulate tech companies. 


A tribunal in the UK has overturned a legal judgment about Clearview’s objectionable scraping of images of people’s faces from the internet. The latest ruling helpfully clarifies what should be in scope of data protection law, and provides a sensible view on how companies that operate across many jurisdictions should not be able to dodge the application of local laws.

Clearview’s model of selling intrusive surveillance to law enforcement agencies is not just grim, it also undermines people’s freedom to speak, act and post online without fear. Its business model is to provide services that can be abused to intimidate and oppress. That’s why we’re pleased to see this judgment bring the UK back in line with other jurisdictions that have banned Clearview’s processing, especially at a time when powers to hold tech companies accountable are starting to fray.

Summary of judgment

In 2021, Privacy International and others made complaints about Clearview in five different jurisdictions. We weren’t impressed by the use of web-scraping and facial recognition technology to extract the unique features of people’s faces, effectively building a gigantic database of our biometrics. Across the board, regulators found Clearview’s behaviour to be illegal. However, Clearview appealed its fine in the UK, arguing that the GDPR didn’t apply to it because it provides services to foreign national security and law enforcement agencies and does not conduct business in the UK.

The appeal was at first upheld. But this latest judgment of the Upper Tribunal re-confirms that Clearview’s processing is within the scope of the GDPR. Privacy International, represented by AWO, intervened in the case to push back against the creation of loopholes for international actors in data protection law. Our arguments were accepted in the judgment which, while technical in nature, brings out three wider trends worth commenting on.

Private companies can’t easily escape application of GDPR (No ducking, no diving) 

One of Clearview’s major arguments was that the law shouldn’t apply to it because it was operating ‘in furtherance of state functions’. Clearview provides services to a number of foreign government agencies, that much is true. But it went on to argue that what it does is so intertwined with state functions that the two were “effectively merged”, and that it should therefore benefit both from exemptions to the GDPR and from separate, established principles of international law that prevent one country from making law that restricts how another country’s government can act.


The Tribunal rejected those claims. On the exemption to GDPR, the Upper Tribunal agreed with the legal interpretation PI had offered, finding the exemption too narrow to provide protection to Clearview. The Tribunal went on to note that the “relationship between the activities of Clearview and the activities of its clients are no more ‘merged’ or ‘fundamentally intersected’ than the activities of parties to any transaction that involves transfers between them of electronic data.” Clearview and the states purchasing its services thus would be considered separately in the legal analysis. And whether Clearview was regulated by the GDPR would hinge on whether and how it was processing the personal data of people in the UK.

Recent months have seen a number of tech companies that are active across the world trying to push back on the application of local laws in the countries where they operate. It’s no secret that Meta aren’t impressed by the EU fining it (and have rallied the US President to be their cheerleader on that front). Mark Zuckerberg’s company has also pushed back against regulatory demands from three Nigerian agencies. Apple too seem unimpressed at the idea that operating in the EU might mean adhering to rules that seek to guide innovation. The effect is that multinational (large, often enormous) companies are contesting the legitimacy of governments to regulate how they operate within their territory.

At the same time, governments can get the regulation of companies (whether foreign or not, tech or not) pretty badly wrong. The UK Government’s use of secret orders to compel Apple to undermine encryption is a good example. Sometimes laws are simply ill thought through. But it must be possible for democratic governments to set and enforce rules that protect people’s rights, while attempts to use state power to undermine rights are resisted. Relying on the Trump administration to strike the balance via its unpredictable programme of undermining foreign regulatory efforts would be an unwise gamble.

While the Clearview case was about a narrow aspect of a specific law, and while we must remain wary of regulations that weaken our rights and threaten our privacy, the Upper Tribunal’s judgment is a reminder of the importance of having courts, legislators and regulators that are able to stand firm against political or commercial pushback. Many tech companies seem to have a love/hate relationship with being rule-setters, but it’s a problem if private actors have too much sway over the governance of digital infrastructure. No company is too big or too important to have to care about the law.

National Security and law enforcement exemptions must be narrow (Don’t look away)

Clearview’s arguments also hinged on the nature of the services their processing provides: supporting the work of law enforcement and national security agencies. For example, Clearview has a number of contracts with US agency ICE, which has been hitting the headlines for a range of dubious practices over recent months.


The GDPR does exempt certain state functions from its scope. But that’s because there’s a sister law (the Law Enforcement Directive) and further human rights requirements that regulate in its place. Exempting a private company from legal oversight because it is serving a state entity, while also exempting the state entity from oversight because it relies on a private company, could create an untenable legal gap. The Upper Tribunal declined to address this specific concern as it found there was not sufficient evidence or reasoning to connect Clearview’s activities closely enough to the states to which it sells its services in order to invoke the immunity Clearview claimed.

But the fact that Clearview ran this argument at all should serve as a warning. Carve-outs for national security and law enforcement often appear in these sorts of laws aimed at private actors. For example, the EU AI Act (another law currently under attack because of its effect on foreign companies) includes exemptions in relation to national security and immigration. The AI Act then proceeds to prohibit the development of facial recognition databases based on untargeted scraping of the web (sound familiar?).

National security and law enforcement exemptions need to be interpreted narrowly to prevent unwinding and undermining what these laws are meant to achieve in the first place. Banning an activity for commercial purposes while permitting it in other circumstances may not remove the financial incentive for private companies to pursue it when attractive government contracts are on the line.

Behavioural monitoring can be automated (Nothing to see here)

Lastly, the judgment is also important because the Tribunal found that Clearview’s processing amounts to ‘behavioural monitoring’ (and so was within the extraterritorial reach of the GDPR). Clearview had tried to claim its processing was not ‘behavioural monitoring’ because it was simply passively gathering and amassing data through automated means - there was no analysis or active ‘watchfulness’. But this claim was emphatically rejected by the Tribunal, which rightly recognised that analogies to an analogue human carefully trawling through data or watching a screen are irrelevant to today’s means of mass data collection. As the Tribunal notes, “[t]he key to establishing monitoring is not that someone or something actually accesses the output; it is that the data is available to be accessed should access be needed, and the data has been gathered in contemplation of that potential eventuality.” [312]

Firmly locating Clearview’s activities within the scope of data protection is a must: emerging tech is exacerbating the intrusion that can be caused by mass data scraping and facial recognition. Whether it’s strapping PimEyes to VR glasses, embedding facial recognition in Amazon Ring, or the possibility of using publicly available generative AI tools for facial recognition purposes, companies must be alive to the harms of behavioural monitoring facilitated through the mass gathering of personal data.

Ghost of Clearview & beyond

Despite the thoroughness and clarity of the Upper Tribunal’s judgment, we understand Clearview is seeking to appeal it. Clearview seems intent on fighting on jurisdictional grounds - perhaps because the most significant time it engaged in a battle over the substance of its processing, it lost in Illinois.

Despite reprimands in the UK, France, Italy, Greece, Austria, the Netherlands and Australia, Clearview has rejected, ignored or evaded the fines issued. A criminal complaint has now been launched, but Clearview shows no signs of stopping its expansion - seeking instead to enhance its privacy-busting technology by acquiring arrest records and social security numbers, benefiting from single-source contracts, expanding into new countries such as Argentina, and even attempting to sell to schools.

It’s essential that multinational tech companies are recognised as being within the scope of laws that protect our rights. But it may not mean enough if enforcement falls flat. Better international co-operation may be needed, at a time when that seems in ever shorter supply.