
Photo by Joel Filipe on Unsplash
Our key achievements since the beginning of 2025.
We continue producing real change by challenging governments and corporations that use data and technology to exploit us.
Though 2025 has been a tumultuous year, we’ve achieved some wins and we would like to share them with you.
Creating real change is hard, and worthwhile change takes time. We uncover problems, draw attention to them, and apply pressure for change. In the second quarter of the year, we drew attention to menstruation apps and explored improvements to their safety and privacy; we continued to challenge secret surveillance powers; we pushed the International Labour Organization to agree to develop standards for platform workers; and we secured support for a ‘rights-based approach’ to safety in education.
Take a look below for a quick overview of the results we produced or contributed towards, quarter by quarter.
Following the launch of our research on menstruation apps’ data-sharing practices, we were approached by two app providers. One provider’s app was not part of our analysis, but after reading the report they offered to meet with us and have an open dialogue about their app and its privacy policy.
What this means in short: Period-tracking apps must protect their users. Our research initiated conversations on how firms can make their products better.
In June, the US House Judiciary Subcommittee on Crime and Federal Government Surveillance held a hearing on how foreign governments might gain access to Americans’ data through the CLOUD Act. The spotlight was on a controversial data-sharing agreement between the US and the UK, and a secret UK order (known as a ‘Technical Capability Notice’) that allegedly required Apple to create a backdoor into its iCloud storage, granting potential access to users’ private data. PI was one of two civil society organisations invited to testify, a strong recognition of our expertise and leadership on these critical issues. Meanwhile, our legal challenge against the secret UK order is moving forward: the Investigatory Powers Tribunal (IPT) has expressed interest in hearing as much as possible of Apple’s and PI’s claims in public in early 2026.
What this means in short: A secret order that can be used to force the re-architecture of entire systems represents a dangerous, disproportionate and intrusive surveillance power, and it must be restrained. PI’s extensive experience equips us to challenge that power.
On 12 June 2025, the General Conference of the International Labour Organization (ILO) agreed a resolution committing to adopt a binding Convention, supplemented by a Recommendation, on decent work in the platform economy. Discussions will continue with a view to adopting these new standards at the International Labour Conference in 2026.
This decision came a couple of weeks after a joint declaration by PI and 32 other organisations, in which we asked the ILO to protect workers from algorithmic harms by adopting legally binding standards on decent work in the platform economy.
What this means in short: Platform workers should be protected by international labour standards. PI is working with other organisations to demand the necessary standards.
In June, the UN Special Rapporteur on Education issued her report on “Safety in education”. The report declares that the “right to be safe in education requires an all-encompassing rights-based approach to safety, for all rights-holders, in all contexts, for all hazards” and calls on states to ensure all security measures respect human rights. It also states that “facial recognition must be banned in all education”. These were among the key advocacy points in our submission informing the report.
What this means in short: Students should be protected from abusive use of tech, particularly Facial Recognition Technology.
In March 2025, PI launched a legal challenge against the UK government’s use of a Technical Capability Notice. Following our complaint, on 7 April the Investigatory Powers Tribunal confirmed it will hear our challenge to the legality of the Home Secretary’s decision to use a Notice allegedly forcing Apple to secretly reduce security in order to give the UK Government access to users’ secured data stored on its iCloud service. The Tribunal also rejected the Government’s request to keep basic details of Apple’s case secret. We launched this case in partnership with the UK campaign group Liberty.
What this means in short: The Tribunal agreed that it is in everyone’s interest to have an open examination of the government’s obscure surveillance powers. The next stage of the case will delve into the substance of our legal objections.
As part of revisiting our earlier research on the data practices of period-tracking apps, we contacted the companies whose apps we tested in this latest round, and we received significant feedback. One company said in its reply that it had slightly revised its privacy policy to reflect our position, and others provided relevant comments or recommendations. Given that we only recently launched our report, this is a good indication that our research can drive change.
What this means in short: Period-tracking apps must reconsider their data-sharing practices to better protect their users, especially amid heightened concerns in a challenging policy environment. Feedback from companies on our research is a first step towards making their products better.
In April 2025, the Public Prosecutor’s Office of Paraná, Brazil, sued the state government for using facial recognition in public schools. The Prosecutor’s Office argues that the programme violates children’s data protection rights, especially given their vulnerability and the lack of informed consent. The lawsuit demands: (i) the immediate suspension of biometric data collection from nearly 1 million students; and (ii) R$15 million in damages for collective moral harm. This came after the UN Special Rapporteur on the right to education published her report on academic freedom, which recommends that states ban facial recognition technologies from educational institutions. We have advocated for this by exposing the problem and calling for the Special Rapporteur’s recommendation to ban FRT in schools.
What this means in short: Children in Brazilian schools should be free from harmful uses of Facial Recognition Technology, and children’s rights should be properly protected. Our demands materialised in concrete action by an independent legal body.
On 5 April 2025, our partner organisation the International Commission of Jurists (ICJ) Kenya, together with the Katiba Institute, filed a case against Worldcoin in the High Court of Kenya. They challenged Worldcoin’s collection, processing and transfer of biometric data (such as iris and facial scans) without proper consent or a legally required Data Protection Impact Assessment. On 5 May, the court ruled that the company had violated Kenya’s Data Protection Act and ordered the deletion of illegally collected data. PI was not directly involved in the case but supported the advocacy and research around it.
What this means in short: The ruling sets a legal precedent for how tech companies must handle sensitive personal data responsibly, and it protects people’s biometric data from misuse by private companies.
Please consider supporting us as we continue our work to protect privacy and human rights around the world.