Content type: Advocacy
15th June 2022
Despite repeated recommendations by the UN Human Rights Council and the UN General Assembly to review, amend or enact national laws to ensure respect for and protection of the right to privacy, national laws are often inadequate and fail to regulate, limit or prohibit the surveillance powers of government agencies or the data-exploitative practices of companies.
Even when laws are in place, they are seldom enforced. In fact, PI notes how it is often only following legal challenges in national or…
Content type: News & Analysis
6th December 2021
What if we told you that every photo of you, your family, and your friends posted on your social media or even your blog could be copied and saved indefinitely in a database with billions of images of other people, by a company you've never heard of? And what if we told you that this mass surveillance database was pitched to law enforcement and private companies across the world?
This is more or less the business model and aspiration of Clearview AI, a company that only received worldwide…
Content type: Examples
20th August 2020
After governments in many parts of the world began mandating the wearing of masks in public, researchers in China and the US published datasets of images of masked faces scraped from social media sites for use as training data for AI facial recognition models. Researchers from the startup Workaround, who published the COVID19 Mask Image Dataset to GitHub in April 2020, claimed the images were not private because they were posted on Instagram and therefore permission from the posters was not…
Content type: Long Read
8th July 2020
Over the last two decades we have seen an array of digital technologies deployed in the context of border controls and immigration enforcement, with surveillance practices and data-driven immigration policies routinely leading to discriminatory treatment of people and undermining people's dignity.
And yet this is happening with little public scrutiny, often in a regulatory or legal void, and without understanding of, or consideration for, the impact on migrant communities at the border and…
Content type: Press release
29th April 2020
Photo by Ashkan Forouzani on Unsplash
Today Privacy International, Big Brother Watch, medConfidential, Foxglove, and Open Rights Group have sent Palantir 10 questions about its work with the UK's National Health Service (NHS) during the Covid-19 public health crisis and have requested that the contract be disclosed.
On its website, Palantir says that the company has a "culture of open and critical discussion around the implications of [their] technology", but the company have so far…
Content type: News & Analysis
15th May 2019
Photo by Mike MacKenzie (via www.vpnsrus.com)
Ever, a cloud storage app, is an example of how facial recognition technology can be developed in ways people do not expect and can risk amplifying discrimination.
Ever is a cloud storage app that brands itself as "helping you capture and rediscover your life's memories", including by uploading and storing personal photos; Ever does not advertise that it uses the millions of photos people upload to train its facial recognition software, which…
Content type: Examples
1st December 2017
Few people realise how many databases may include images of their face; these may be owned by data brokers, social media companies such as Facebook and Snapchat, and governments. The systems in use by Snap and the Chinese start-up Face++ don't save facial images, but map detailed points on faces and store that data instead. The FBI's latest system, as of 2017, gave it the ability to scan the images of millions of ordinary Americans collected from millions of mugshots and the driver's licence…