Why Amazon's temporary ban on police use of facial recognition is not enough

Amazon announced that it will be putting a one-year suspension on sales of its facial recognition technology to law enforcement. Here is why we think there is still a long way to go.

Yesterday, Amazon announced that it will be putting a one-year suspension on sales of its facial recognition software, Rekognition, to law enforcement. While Amazon’s move should be welcomed as a step towards curbing corporate opportunism at the expense of our fundamental freedoms, there is still a lot to be done.

The announcement speaks of just a one-year ban. What exactly does Amazon expect to change within that year? Is one year enough to make the technology stop discriminating against women? Is one year enough to eliminate the bias inherent in algorithmic systems that can deprive people of social benefits or even put them in jail?

It would be hypocritical, and nothing more than a bad PR stunt, for Amazon to take a stance like this if its engagement with the police on other technologies continues as normal. Amazon’s video doorbell Ring, for instance, which allows people to see who passes by their front door and share footage with each other, has established partnerships with several police forces around the world. It has even been reported that Ring actively coaches police officers on how to obtain doorbell footage without a warrant.

We wonder whether this is also among the activities to be suspended, or whether Amazon thinks that turning neighbours into surveillance officers is acceptable. Let’s not forget that, according to reports, the great majority of people reported as “suspicious” on Neighbors, a crime-reporting social media app owned by Amazon, are people of colour. If this technology too is helping perpetuate stereotypes, why not shut it down?

Just this week, we saw IBM, a company that coined the term “smart city” and has been instrumental in building up the technical capabilities of police forces, announce that it will “no longer offer general purpose IBM facial recognition or analysis software”.

When left unchallenged, public-private surveillance partnerships can eventually normalise surveillance and place us all on watchlists. They can have a negative impact on our right to protest, our ability to freely criticise the government and to express dissenting ideas. Just look at the vicious cycle we are in: big tech offers surveillance technology to governments; these solutions are discriminatory and biased, resulting in serious abuses; people protest against the systemic racism our societies suffer from; and police turn the same surveillance technology on the protesters.

If Amazon and its peers want to send a strong message against the abuses that big tech has reportedly enabled through public-private surveillance partnerships, then they need to stop rushing to develop the latest "shiny thing" that is ultimately detrimental to society and inevitably results in human rights abuses.

Starting now, we expect Amazon, IBM and others to use this year and beyond to make human rights the starting point for all of their products and services, with no exceptions.

This is the only way to effectively redeem themselves and take responsibility for the injustices that are now putting people on the streets. Otherwise they, as much as the governments that use their products, will have to face up to their role in the abuses yet to come.