Content type: Advocacy
Generative AI models are based on indiscriminate and potentially harmful data scraping
Existing and emergent practices of web scraping for AI are rife with problems. We are not convinced they stand up to the scrutiny and standards expected by existing law. If the balance is struck wrongly here, people stand to have their right to privacy further violated by new technologies. The approach taken by the ICO towards web scraping for generative AI models may therefore have important downstream…
Content type: Press release
9 November 2023 - Privacy International (PI) has just published new research into UK Members of Parliament's (startling lack of) knowledge about the use of Facial Recognition Technology (FRT) in public spaces, even within their own constituencies. Read the research in full: "MPs Asleep at the Wheel as Facial Recognition Technology Spells The End of Privacy in Public". PI recently conducted a survey of 114 UK MPs through YouGov. Published this morning, the results are seriously…
Content type: Long Read
What Do We Know?
In late March, the NHS quietly announced that it would give technology businesses access to unprecedented quantities of patient data for processing and analysis in response to COVID-19. One of those businesses is CIA-backed Palantir Technologies. Palantir’s software is allegedly “mission critical” to US Immigration and Customs Enforcement’s (ICE) mass raids, detentions, and deportations. Despite trusting Palantir with patient data, the NHS has been tight-lipped about the scope…
Content type: News & Analysis
Yesterday, Amazon announced that it will put a one-year suspension on sales of its facial recognition software Rekognition to law enforcement. While Amazon's move should be welcomed as a step towards sanctioning company opportunism at the expense of our fundamental freedoms, there is still a lot to be done.
The announcement speaks of just a one-year ban. What exactly is Amazon expecting to change within that one year? Is one year enough to make the technology non-discriminatory…