Producing real change: key highlights of our 2025 results by season

Our key achievements since the beginning of 2025.


We continue producing real change by challenging governments and corporations that use data and technology to exploit us.

Since the beginning of the year, we’ve achieved some wins and would like to share them with you.

Creating change is hard and takes time. We have to uncover problems, draw attention to them, and apply pressure for change. In the first quarter of the year, we've been able to challenge secret surveillance powers, research the data practices of menstruation apps, contribute to the fight against the use of facial recognition technology in Brazilian schools, and support our partners in protecting people's biometric data in Kenya.

Take a look below for a quick overview of the results we produced or contributed towards, by season.

Spring 2025

UK tribunal pushes for more transparency and will hear our challenge against secret order

In March 2025, PI launched a legal challenge against the UK government's use of a Technical Capability Notice. Following our complaint, on 7 April the Investigatory Powers Tribunal confirmed it will hear our challenge to the legality of the Home Secretary's decision to use a Notice allegedly forcing Apple to secretly reduce security in order to give the UK Government access to users' secured data stored on its iCloud service. The Tribunal also rejected the Government's request to keep basic details of Apple's case secret. We launched this case in partnership with the UK campaign group Liberty.

What this means in short: The Tribunal agreed that it is in everyone's interest to have an open examination of the government's obscure surveillance powers. The next stage of the case will delve into the substance of our legal objections.

Our menstruation app research gets industry attention

As part of revisiting our earlier research on the data practices of period-tracking apps, we contacted the companies whose apps we tested in this latest round and received significant feedback from them. One company said in its reply that it had slightly revised its privacy policy to reflect our position, and others provided relevant comments or recommendations. Given that we only recently launched our report, this is a good indication that our research can drive change.

What this means in short: Period-tracking apps must reconsider their data-sharing practices amid heightened concerns and a challenging policy environment in order to better protect their users. Getting feedback from companies on our research is a first step towards making their products better.

Lawsuit against use of facial recognition in Brazilian schools

In April 2025, the Public Prosecutor's Office of Paraná, Brazil, sued the state government for using facial recognition in public schools. The Prosecutor's Office argues that the program violates children's data protection rights, especially given their vulnerability and the lack of informed consent. The lawsuit demands: (i) the immediate suspension of biometric data collection from nearly 1 million students; and (ii) R$15 million in damages for collective moral harm. This came after the UN Special Rapporteur on the right to education published her report on academic freedom, which recommends that states ban facial recognition technologies from educational institutions. We have been advocating for this by exposing the problem and calling for the Special Rapporteur to recommend banning FRT in schools.

What this means in short: Children in Brazilian schools should be free from harmful uses of facial recognition technology, and their rights should be properly protected. Our demands materialised in concrete action from an independent legal body.

Kenyan High Court rules against Worldcoin in case brought by ICJ Kenya

On 5 April 2025, the International Commission of Jurists (ICJ) Kenya, our partner organisation, together with the Katiba Institute, filed a case against Worldcoin in the High Court of Kenya. They challenged Worldcoin's collection, processing, and transfer of biometric data (such as iris and facial scans) without proper consent or a legally required Data Protection Impact Assessment. On 5 May, the court ruled that the company had violated Kenya's Data Protection Act and ordered the deletion of illegally collected data. PI was not directly involved in the case, but supported the advocacy and research around it.

What this means in short: The ruling sets a legal precedent for how tech companies must handle sensitive personal data responsibly and strengthens the protection of people's biometric data against misuse by private companies.

Please consider supporting us as we continue our work to protect privacy and human rights around the world.