Companies selling diet programmes are using tests to lure users. Those tests encourage users to share sensitive personal data, including about their mental health. But what happens to the data? We investigated to find out.
Companies are deploying satellites capable of tracking signals and selling access to the data collected to government agencies. We explain what this nascent industry is selling, why border agencies are among their customers, and why it matters.
In this briefing, PI, together with Amnesty International and SOMO, seeks to support civil society efforts toward greater oversight of, accountability for, and remedy against corporate structures reported to contribute to government surveillance of individuals, including human rights defenders.
An array of digital technologies are being deployed in the context of border enforcement. To effectively critique state use and delve into potential benefits of satellite and aerial surveillance, we must first understand it.
PI's submission on France's compliance with the International Covenant on Civil and Political Rights, highlighting concerns regarding Covid-19 emergency measures, surveillance powers, and the difficulties faced by transgender people in changing their identity documents.
This article was written by Abdías Zambrano, Public Policy Coordinator at IPANDETEC, and is adapted from a blog entry that originally appeared here. Digital identity can be described as our digital personal data footprint, ranging from banking information and statistics to images, news we appear in
The controversial Police, Crime, Sentencing & Courts Bill (‘Policing Bill’) includes provisions for ‘extraction of information from electronic devices’ by immigration officers. The provisions to seize and extract rely solely on ‘voluntary provision’ of devices and ‘agreement’ to extract data.
We are concerned not only that immigration officers lack the requisite skills, but also that the power imbalance between state and migrant calls into question whether provision of a device can ever be truly voluntary.
This proposal comes at a time when there is a total lack of transparency around Home Office use of mobile phone extraction.
On 28 June 2021, Privacy International, alongside 15 other civil society organisations, published an open letter to all Members of the European Parliament Committee on the Internal Market and Consumer Protection raising concerns about the European Commission's Digital Markets Act (DMA) proposal. Privacy International believes that the proposal needs further strengthening, as we also noted in our analysis of the DMA.
The controversial Police, Crime, Sentencing & Courts Bill includes provisions for 'extraction of information from electronic devices'. It relies solely on voluntary provision and agreement. We analyse the power imbalance between the State and individual - which calls into question 'voluntary provision' and 'agreement' as a basis for seizure of a device and extraction of data.
Read our new ‘Free to Protest’ guides and learn about the high tech surveillance tools that enable the police to identify, monitor and track protestors, indiscriminately and at scale - and find out how you can better protect yourself.
‘Free to Protest: The protestor’s guide to police surveillance and how to avoid it’ (UK edition) is a collection of bite-sized guides about high-tech police surveillance capabilities at protests, including tips and strategies about how you can protect yourself from being identified, tracked and
Privacy International, Liberty, Defend Digital Me, Open Rights Group and Big Brother Watch submitted a response to the College of Policing's public consultation on police use of live facial recognition technology.
In the response, we make it clear that all the aforementioned organisations believe that LFRT poses significant and unmitigable risks to our society.
The global Covid-19 pandemic has acted as a catalyst for technology-intensive initiatives for welfare distribution, coming at a high cost to human rights and inclusion: enforcing automated discrimination, exacerbating existing inequalities and compromising access to essential benefits.
Privacy International submitted its input to the forthcoming report by the UN High Commissioner for Human Rights (HCHR) on the right to privacy and artificial intelligence (AI).
In our submission, we identify key concerns about AI applications and the right to privacy. In particular, we highlight concerns about facial recognition technologies and the use of AI for social media monitoring (SOCMINT). We document sectors where the use of AI applications has negatively affected the most vulnerable groups in society, such as the use of AI in welfare and in immigration and border control.
The briefing also argues for the adoption of adequate and effective laws accompanied by safeguards to ensure AI applications comply with human rights.