Facial recognition spreads in Wales despite 92% false positive rate
At the 2017 Champions League Final, South Wales Police deployed an automated facial recognition system that wrongly identified more than 2,000 people in Cardiff as potential criminals. The system's cameras watched 170,000 people arrive in Cardiff for the football match between Real Madrid and Juventus, and identified 2,470 potential matches. According to the force's own figures, 92% (2,297) of these matches were false positives. SWP has also deployed the technology at the annual Elvis festival in Porthcawl, concerts, and a Royal visit, and in coordinated crackdowns on certain types of crimes.
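The headline rate is simply the share of alerts that turned out to be wrong. A quick sketch of the arithmetic, using only the figures reported above (the exact ratio of 2,297 to 2,470 comes out at roughly 92–93%, depending on rounding):

```python
# Figures reported by South Wales Police for the 2017 Champions League Final.
total_scanned = 170_000   # people whose faces the cameras scanned
alerts = 2_470            # potential matches flagged by the system
false_alerts = 2_297      # matches later judged incorrect

# Share of all alerts that were false positives (the reported ~92% figure).
false_alert_share = false_alerts / alerts
print(f"False alerts as a share of all alerts: {false_alert_share:.1%}")

# In a crowd this large, even a small per-person error rate
# produces thousands of false alerts.
false_alerts_per_person = false_alerts / total_scanned
print(f"False alerts per person scanned: {false_alerts_per_person:.2%}")
```

The calculation shows why critics focus on deployment scale: the number of false alerts grows with the size of the crowd scanned, not with the number of genuine suspects present.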
SWP defended its use of the technology, saying it had led to more than 450 arrests since its introduction, that no one had been arrested on the basis of an incorrect match, and that it had produced several successful convictions and helped identify vulnerable people in crisis, including a body recovered from the River Taff. The force said the accuracy rate was improving, attributing the high false positive rate to the poor quality of images supplied by agencies such as UEFA and Interpol, and argued that the technology was necessary at large events because of the risk of terrorist attacks. It also described as a significant advantage the fact that the system requires no cooperation from the person being identified.

SWP hopes to integrate the technology with other databases, such as the 19 million-image Police National Database, the Automatic Number Plate Recognition database, and passport and driving licence records. Police in Leicester and London's Metropolitan Police are also trialling the technology, with similarly low accuracy rates. Big Brother Watch and the biometrics and surveillance camera regulators have called for stronger rules governing the use of this technology.
Writer: Press Association; Matt Burgess
Publication: Guardian; Wired UK