Content type: Examples
In November 2018 the UK Information Commissioner's Office issued an enforcement notice against London's Metropolitan Police, finding multiple and serious breaches of data protection law in the force's use of the Gangs Violence Matrix, which it had operated since 2012. The ICO documented failures of oversight and coherent guidance, and an absence of basic data protection practices such as encryption and data-sharing agreements. Individuals whose details are…
Content type: Examples
On 14 May 2018 the victim's husband, a pharmacist living in Linthorpe in Middlesbrough, subdued his wife with an insulin injection before strangling her. He then ransacked the house to make the death look like the result of a burglary. Data recorded by the health app on the murderer's phone showed him racing around the house as he staged the burglary, running up and down the stairs. The victim's app showed that she remained still after her death, apart from a movement of 14 paces when her husband moved her…
Content type: Examples
In 2018 a report from the Royal United Services Institute found that UK police forces were trialling automated facial recognition, crime-location prediction, and decision-making systems while offering little transparency about how those systems were evaluated. An automated facial recognition system trialled by South Wales Police incorrectly identified 2,279 of 2,470 potential matches. In London, where the Metropolitan Police used facial recognition at the Notting Hill Carnival in 2017, the system was wrong 98% of…
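The South Wales trial figures above imply an error rate that is easy to verify; a minimal sketch, assuming the reported counts (2,279 incorrect out of 2,470 potential matches) are the relevant numerator and denominator:

```python
# Error rate implied by the reported South Wales Police trial figures:
# 2,279 of 2,470 potential matches were incorrect identifications.
incorrect = 2279
total_matches = 2470

error_rate = incorrect / total_matches
print(f"{error_rate:.1%}")  # roughly 92.3% of matches were wrong
```

The percentage here is calculated from the counts given in the text, not quoted from the RUSI report itself.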