A public-private partnership and the technologies it deploys must be subject to continued independent oversight, to ensure they remain circumscribed to their stated purpose, to detect abuses or resulting harm, and to require redress.
The UN Guiding Principles on Business and Human Rights require that states exercise “adequate oversight in order to meet their international human rights obligations when they contract with, or legislate for, business enterprises to provide services that may impact upon the enjoyment of human rights.”
Continuing oversight of the deployment and results of a technology is essential to ensure that accountability mechanisms are properly used and work to constrain the use of the technology to its stated purpose, detect abuses or resulting harm, and require redress. The UN Special Rapporteur on Counter-Terrorism and Human Rights has explained that “[s]urveillance systems require effective oversight to minimize harms and abuses.” The Special Rapporteur recommended that “[s]trong independent oversight mandates […] be established to review policies and practices, in order to ensure that there is strong oversight of the use of intrusive surveillance techniques and the processing of personal information.” The safeguards in this section therefore recommend concrete ways of establishing relevant oversight mechanisms that address the harms the deployment of private technologies may cause to affected individuals and communities.
Safeguard 19 - Independent oversight body
When a new PPP is deployed, establish or designate an independent oversight body (depending on the technology and authority concerned, this could be the country’s data protection authority if one exists, or an authority responsible for overseeing investigatory powers) responsible for (1) reviewing, approving or rejecting new proposals for use of the technology or system deployed as part of the PPP, (2) undertaking regular audits of the technology deployment, including public consultations on the impact of the technology on the rights of civilians and on the achievement of its intended objective(s), and (3) receiving grievances from the public and mediating between complainants and the entities using the technology. This independent oversight body should be given appropriate human and financial resources to perform its duties.
No independent entity responsible for overseeing the partnership and its obligations to the public
Example(s) of abuse
- Mobile phone extraction (‘MPE’) in the UK: police forces used MPE technology for years in ways the Information Commissioner’s Office (‘ICO’) later found inappropriate and unlawful.
Safeguard 20 - Civilian control board
When a technology is likely to affect certain communities in a disproportionate way, institute a “civilian control board” composed of individuals directly affected by the technology, in particular those at risk of discrimination. This control board should be consulted prior to deployment of the technology, seek consent of the affected population, and be tasked with receiving and voicing grievances as to the impact of the technology on individuals’ rights throughout the deployment’s lifecycle.
Lack of consultation of communities and civilians affected by the deployment of technologies
Example(s) of abuse
- Amazon Ring and police forces: communities were not consulted before police co-opted Ring’s private security cameras for law enforcement purposes.
Safeguard 21 - Regular audits/impact assessments
Throughout the lifecycle of a technology’s deployment, public authorities ought to record indicators of the technology’s performance, such as successes, failures, accuracy levels, purpose and outcome. Through an independent oversight body, and in collaboration with a civilian control board, they should carry out regular audits of the technology and update the relevant human rights impact assessments (‘HRIAs’). These audits should include regular consultations with groups and individuals affected by the technology (in particular those at risk of discrimination) and with civil society organisations (‘CSOs’), to evaluate the ongoing or potential impacts of the technology in a holistic way.
A “retrospective” audit should also be performed after the contracting relationship has ended, as the impacts of a technology on human rights can sometimes be delayed. The conclusions of such audits should be published and should inform the assessments of all future PPPs.
Lack of ongoing impact assessments
Example(s) of abuse
- Police forces in the US do not record questionable or negative results of facial recognition technology (‘FRT’) searches, producing a one-sided, entirely positive view of FRT’s performance.