Get out of our face, Clearview!
Our legal action against a company that collects photos of you and your loved ones online.
Companies like Clearview AI are in the business of hunting faces. They trawl through sites like Instagram, YouTube and Facebook, as well as personal blogs and professional websites, and save a copy of public photos that contain a face. They then use facial recognition technology to extract the unique features of people’s faces, effectively building a gigantic database of our biometrics. We have taken legal action to stop their practices in Europe.
Why do they do this?
Because they make money by giving access to their huge database of faces to the police or even private companies.
What this means is that without you knowing, your face could be stored indefinitely in Clearview AI’s face database, accessed by a wide variety of strangers, and linked to all kinds of other online information about you.
PI thinks what Clearview AI is doing goes against privacy laws and is incredibly invasive and dangerous. It also threatens the freedom of opinion and expression of many. That’s why we, together with three other organisations, have taken legal action against the company.
What’s the problem with companies using public images like this?
Almost everyone has photos of themselves online, whether they know it or not. You could be uploading photos of yourself daily on your social media accounts. You could appear in photos from a work conference you recently attended. You could be sitting at a cafe and end up in the background of another customer's photo. Or you could be participating in a street protest and end up in a journalist's photo coverage of it. In many cases, you did not choose for your face to appear online, and where it appears can say a lot about you and your life. But Clearview's technology allows its clients, at the click of a button, to identify you and retrieve all of that information about you.
This form of surveillance constitutes a serious interference with privacy rights. More generally, the development and deployment of this sort of surveillance by private actors has a chilling effect on people's willingness to express themselves online, and can be a threat to people going about their lives freely. It is crucial for a healthy, thriving and open Internet that people feel free to share personal information and photos however and wherever they want, without the fear that these might be 'grabbed' by private companies and shared with strangers.
These systems can also cause particular harm to vulnerable communities, who are at heightened risk of harassment and discrimination. In the hands of law enforcement and other authorities, tools like these could enable the grouping of people based on their ethnicity or other characteristics, opening the door to discriminatory tracking and monitoring, or practices like predictive policing.
What you can do!
We want people to know that there are things that can be done. While we take legal action to ask regulators to stop Clearview AI's practices, people in the EU can ask the company whether their face is in its database. They can also request that their face no longer be included in the searches that the company's clients perform.
While Clearview AI has now removed from its website information on how to submit a request for access and deletion, we believe that it remains under a legal obligation to respond to people's requests. PI has provided general guidance on exercising your data access rights.