Legality, Necessity and Proportionality

The use of a privately provided technology or system to deliver public functions must be legal, necessary to achieve a defined goal, and proportionate (any adverse impact on citizens’ rights and freedoms must be justified). Any partnership must be able to show that legality, necessity and proportionality assessments have been performed.

The use of a technology or system to deliver public functions can only ever be legitimate if it is “legal”, in the sense of falling under an appropriate legal framework that authorises the use of such technology for such purposes. This is the principle of legality, a fundamental principle of international human rights law that requires any interference with human rights to be “prescribed by law”. International human rights law further requires that any interference with the right to privacy be necessary and proportionate. A state deploying a technology that has an impact on its citizens’ privacy must therefore demonstrate in “specific and individualized fashion the precise nature of the threat” that it seeks to address. Finally, the principle of proportionality requires that the interference with privacy be both “in proportion to the aim and the least intrusive option available.”

In the context of PPPs, assessments of legality, necessity and proportionality should be performed before any contract is signed with a private company, and again throughout the contractual relationship, before each individual deployment of the technology.

Safeguard 16 - Legality

Safeguard

When considering the need for, and the deployment of, a technology to address a public need or fulfil a public function, the state must consider whether an appropriate legal framework authorises the use of such technology for the intended purpose. The technology should be neither experimented with nor deployed before appropriate statutory (not secondary) legislation is passed. Legislation will be appropriate only if it authorises the use of the specific technology, by the specific authorities, for the specific purpose – general legislation (e.g. granting blanket powers or complete discretion to law enforcement authorities) will not be sufficient. A proper legal framework must also contain specific policies and guidance governing the use of the technology (such as the Technology Use Policy put forward in Safeguard 13).

Issue addressed

Privacy-invasive technologies are deployed without an appropriate legal framework authorising and governing their use

Example(s) of abuse

  • MPE in the UK: Mobile Phone Extraction (MPE) technology has been deployed by police forces in the UK for years without a proper legal framework.
  • Huawei in Valenciennes: Huawei deployed surveillance cameras equipped with facial recognition technology (FRT) in the city of Valenciennes, even though FRT is not legally authorised in France.

Safeguard 17 - Necessity

Safeguard

As part of an adequate DPIA and/or HRIA, a necessity assessment must be conducted to clearly demonstrate that recourse to a particular technology or data analytics system is necessary to achieve the defined goals, rather than merely advantageous. As part of this assessment, any projected positive effects of the technology should be assessed against independent sources of evidence and comparative practice.

Issue addressed

Technologies deployed through PPPs are not always necessary to achieve their stated goals

Example(s) of abuse

  • Huawei in Belgrade: the DPIA did not establish that the use of smart video surveillance was necessary for public safety, as it overestimated the technology’s positive effects on crime reduction.

Safeguard 18 - Proportionality

Safeguard

As part of an adequate DPIA and/or HRIA, a proportionality assessment must be conducted to measure the adverse impact on citizens’ rights and freedoms and to demonstrate that it is justified by a corresponding positive impact on citizens’ welfare. The assessment should take into account potential chilling effects on other rights, such as the rights to freedom of expression and freedom of assembly, which surveillance and data processing systems can affect in ways that are difficult to anticipate and measure.

Issue addressed

Technologies deployed through PPPs often have an impact on human rights that is disproportionate to their intended purpose

Example(s) of abuse

  • Huawei in Como: the need for a facial recognition system was justified in official documentation by reference to an isolated incident that had occurred years earlier.