Our response to the Westminster Hall Debate on Facial Recognition

This week a public debate on facial recognition will take place in Westminster Hall.

Following a public request for comments from Darren Jones MP (Science and Technology Committee), we sent him our responses to the questions posed.

Below you can find the full text of our letter.

 

1. Would you consent to the police scanning your face in a crowd to check you’re not a criminal?

Facial recognition technology uses cameras with software to match live footage of people in public with images on a ‘watch list’. It is often unclear who might be on a watch list or where the authorities obtain the images included in their watch list databases. The images in a watch list could come from a range of sources and do not just include images of people with criminal records or people suspected of criminal wrongdoing. For example, the images may come from a custody images database, which contains pictures of people who have come into contact with the police, including thousands of innocent people. Images could also come from social media. 
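
To make the mechanism concrete, the sketch below shows, in simplified Python, how this kind of watch-list matching typically works. The `extract_embedding` function is a stand-in for a vendor’s proprietary face-encoding model and the threshold value is illustrative – but the structure is the point: every passer-by is reduced to a biometric template and compared against the entire watch list, whatever its provenance.

```python
import numpy as np

def extract_embedding(face_image: np.ndarray) -> np.ndarray:
    """Stand-in for a vendor's face-encoding model, which maps facial
    geometry (eye spacing, face shape, etc.) to a numeric template.
    Here we simply flatten and normalise pixels so the sketch runs."""
    v = face_image.astype(float).ravel()
    norm = np.linalg.norm(v)
    return v / norm if norm else v

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def scan_crowd(detected_faces, watchlist, threshold=0.6):
    """Compare every face in the live footage against every watch-list
    entry, flagging any pair whose similarity exceeds the threshold.
    Note that a template is computed for *everyone* who passes the
    camera, whether or not they are suspected of anything."""
    alerts = []
    for face in detected_faces:
        probe = extract_embedding(face)
        for identity, template in watchlist.items():
            if cosine_similarity(probe, template) >= threshold:
                alerts.append((identity, probe))
    return alerts
```

The choice of threshold is a trade-off: set it low and the system wrongly flags innocent passers-by; set it high and it misses the people it is looking for. We return to this under question 2.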

So the very premise of the question – ‘Would you consent to the police scanning your face in a crowd to check you’re not a criminal?’ – is fundamentally flawed. Facial recognition systems do not ‘flag criminals’; they cast a much wider net. And that’s before we get into the problem of false positives (see below).

Facial recognition cameras are far more intrusive than regular CCTV. They scan distinct, specific facial features, such as face shape, to create a detailed biometric map – which means that being captured by these cameras is like being fingerprinted, without knowledge or consent.

This clearly infringes basic human rights: 

  • Privacy: The use of facial recognition in public spaces is a hugely disproportionate crime-fighting technique, scanning the face of every single person who passes by the camera, whether or not they are suspected of any wrongdoing. And the biometric data it snatches can be as uniquely identifying as a fingerprint or DNA – which are usually only taken from you if you’ve been arrested. But this is taking place on the street without your consent – and often without you knowing at all. Because there is no law covering police use of facial recognition, there is nothing to stop forces from holding onto your image once you’ve been scanned.
  • Freedom of expression and association: Being watched and identified in public spaces is likely to lead us to change our behaviour, limiting where we go, what we do and who we spend time with. For instance, several protest groups have told us they would avoid demonstrations if facial recognition was used in the area. 

 

2. Do the advances in this sector make you feel more or less secure?

Biometric data can identify a person for their entire lifetime. This makes the creation of biometric databases problematic, as their operators have to anticipate risks far into the future – whether that be a change in political situation or regime, a future data breach, or developments in technology that allow biometrics to be used for more purposes, revealing more information and intelligence about individuals than is currently possible.

When adopted in the absence of strong legal frameworks and strict safeguards, biometric technologies pose grave threats to privacy and personal security, as their application can be broadened to facilitate discrimination, profiling and mass surveillance.

The varying accuracy and failure rates of the technology can lead to misidentification, fraud and civic exclusion. 
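
To see why those failure rates matter at crowd scale, consider a back-of-the-envelope calculation. The figures below are illustrative assumptions, not measurements from any deployed system, but they show how even a seemingly accurate system produces mostly false alerts when almost everyone scanned is not on the watch list:

```python
# Illustrative base-rate arithmetic; every figure here is an
# assumption for the sake of the example, not a measured rate.
faces_scanned       = 100_000  # people passing the cameras
watchlisted_present = 20       # of whom are genuinely on the watch list
true_match_rate     = 0.90     # the system flags 90% of those 20
false_match_rate    = 0.001    # and wrongly flags 0.1% of everyone else

true_alerts  = watchlisted_present * true_match_rate
false_alerts = (faces_scanned - watchlisted_present) * false_match_rate

precision = true_alerts / (true_alerts + false_alerts)
print(f"True alerts:  {true_alerts:.0f}")           # 18
print(f"False alerts: {false_alerts:.0f}")          # ~100
print(f"Alerts that are correct: {precision:.0%}")  # ~15%
```

Under these assumptions, roughly five out of every six alerts would point at innocent passers-by – and if the error rates are higher for some groups than others, the misidentifications fall disproportionately on those groups.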

By 2016, numerous examples had surfaced of bias in facial recognition systems: they failed to recognise non-white faces, labelled non-white people as ‘gorillas’, ‘animals’ or ‘apes’ (Google, Flickr), told Asian users their eyes were closed when taking photographs (Nikon), or tracked white faces but couldn’t see black ones (HP). The consequences are endemic unfairness and a system that demoralises those who don’t fit the ‘standard’.

A global industry is fuelling the biometrics boom. But these systems are expensive, and their procurement by governments is deeply opaque. It is often unclear how the systems actually work – and therefore why and how they fail, or whether lessons learned are acted upon. There is also a lack of clarity about the duties and responsibilities of the different parties involved, particularly when systems are deployed in a legal and regulatory void.

Privacy International and partners have observed that governments are keen to develop data-intensive projects, but often lack any pre-assessment of the risks, or consideration for securing the personal data those projects generate. Purported benefits of the systems are stated without the necessary baseline studies to back up the claims.

Governments and non-state actors in other sectors adopting biometric systems need to ask: why do this at all? What problem is a biometric database trying to solve? How will it succeed? Has it succeeded? Or have new risks been created? It is no use throwing technology at a problem that is not technical; there may be other ways to solve the perceived problem. There must be more transparency and public discussion of the privacy implications and the measures put in place to protect individuals’ biometric data.

 

3. Are you concerned about how this technology could be used by the state?

Yes, very much, for the reasons cited above – the building of a biometric database, the lack of consent, the misidentification, the effect on freedom of expression and the right to peaceful protest, and the further targeting of already over-policed communities. If we accept that creating a national fingerprint database would be draconian, then we must also accept that the much greater violation that occurs when our ‘faceprint’ is continually checked in public places without our consent or knowledge is anathema to our democracy.

Facial recognition technology has been deployed by police forces despite the fact that there are often no laws or guidelines giving them the power to use this form of surveillance.

In recent years, this technology has been used at music concerts and football matches, shopping centres and high streets, festivals and protests. It could eventually be rolled out across all public spaces.

Most recently in the UK:

Facial recognition is currently being ‘trialled’ by the Metropolitan Police in London and by South Wales Police – but it is likely these trials will be used to justify nationwide use.
