Data Protection Impact Assessments and ID systems: the 2021 Kenyan ruling on Huduma Namba


In a ruling handed down on 14 October 2021, on an application filed by Katiba Institute calling for a halt to the rollout of the Huduma card in the absence of a data protection impact assessment, the High Court of Kenya found that the Data Protection Act applied retrospectively.

In this article we provide background on the initial challenge to the Huduma Namba and the subsequent developments that led to this important ruling of the High Court of Kenya, and reflect on its wider implications for the governance and regulation of digital ID systems.

Background to the case

Huduma Namba as initially proposed

In January 2019, the Kenyan Statute Law (Miscellaneous Amendment) Act No. 18 of 2018 came into effect, introducing a raft of amendments across several laws including the Registration of Persons Act ("the Act").

The amendments to the Act established a National Integrated Identity Management System (“NIIMS”) that was proposed as a single source of personal information of all Kenyans as well as foreigners resident in Kenya. According to Section 9A(2)(a) and (b) of the amended Act, NIIMS was introduced to:

  • create, manage, maintain and operate a national population register as a single source of personal information of all Kenyan citizens and registered foreigners resident in Kenya;
  • assign a unique national identification number to every person registered in the register.

To populate the NIIMS database established under section 9A of the Act, the government ordered Kenyan citizens and foreign nationals to provide sensitive personal information, purportedly to establish, verify and authenticate their identity, through a 30-day nationwide mass biometric registration exercise that began in March 2019. Each registered person would subsequently receive a unique identity number, known as the Huduma Namba.

Registration for the Huduma Namba was initially compulsory, and it was proposed that it would become a prerequisite for accessing government services after 12 December 2021.

Timeline of Huduma Namba roll-out & brief overview of legal challenges

Towards the end of 2019, three petitions were filed at the Kenyan High Court challenging various aspects of the proposed NIIMS. Privacy International submitted a witness statement in support of the application filed by the Nubian Rights Forum.

One of the main legal arguments contained in these petitions was that NIIMS called for the collection of huge amounts of personal information in the absence of a national data protection law. With no adequate data protection law in force, the collection and processing of personal and biometric data was effectively unregulated.

Establishing limits on the data collected by NIIMS

The Kenyan High Court joined all three petitions and issued a judgment on 30 January 2020. In a decisive move, the Court agreed with the petitioners that the proposed collection of DNA and GPS coordinates for purposes of identification under NIIMS was "intrusive and unnecessary" (para. 784). After recognising that DNA and GPS coordinates amounted to personal data of a sensitive and intrusive nature requiring heightened protections (para. 772), the Court found their collection to be unjustified in light of the government's practical inability to process both DNA and GPS data for the entire population (para. 781).

Accordingly, we find that the provision for collection of DNA and GPS coordinates in the impugned amendments, without specific legislation detailing out the appropriate safeguards and procedures in the collection, and the manner and extent that the right to privacy will be limited in this regard, is not justifiable. (para. 919).

Despite its robust findings against the collection of DNA and GPS data, the Court agreed overall with the government that the collection of biometric data was "necessary for the purposes of identification" (para. 910), and therefore that the introduction of NIIMS did not amount to a violation of the right to privacy, as the petitioners had argued. You can read more about the ruling and its impact here.

Setting minimum standards for the operationalisation of NIIMS

Despite its overall conclusion in favour of deploying NIIMS, the Court extensively commented on the regulatory vacuum in which NIIMS was conceptualised. As the Kenyan Data Protection Act had not come into force at the time the petition was made, the compatibility of NIIMS with data protection laws was outside the scope of the challenge. The Court nonetheless noted the absence of a regulatory framework governing the operations and security of NIIMS at the time of its conception, as well as the government's failure to provide "any cogent reason for this obvious gap" (para. 1038), which led it to conclude that NIIMS "appeared to have been rushed" (para. 922). Unsurprisingly, the Court found that the legal framework on the operations of NIIMS was "inadequate, and poses a risk to the security of data that will be collected in the system" (para. 1038).

The Court briefly commented on the new Data Protection Act, which had come into force roughly a month before the judgment, noting that it was not enough simply to have a data protection legal framework in place: "once in force, data protection legislation must also be accompanied by effective implementation and enforcement" (para. 1035) and "adequate protection of the data requires the operationalisation of the said legal framework" (para. 1036).

Accordingly, the Court ordered that the Government could not proceed with the implementation of NIIMS until there was "an appropriate and comprehensive regulatory framework on the implementation of NIIMS".

Basis of the October 2021 ruling

Following the judgment in the Nubian Rights Forum case described above, the Cabinet Secretary of the Ministry of Interior and Coordination of National Government announced the launch and availability of the Huduma (identity) card from 1 December 2020. This announcement took the form of a press statement published on 18 November 2020, which implied that the Kenyan government had met any prerequisite requirements for setting up the NIIMS scheme when it appointed a national Data Commissioner.

On 24 November 2020, Katiba Institute filed an application with the High Court of Kenya calling for a halt to the rollout of the Huduma card on the basis that the card was being launched without a data protection impact assessment. Katiba Institute submitted that this was contrary to the provisions of section 31 of the Data Protection Act and also in defiance of the orders and directions of the same Court in the Nubian Rights Forum case.

Key aspects of the ruling

In a ruling handed down on 14 October 2021 in relation to Katiba Institute's application, the Kenyan High Court found that the Data Protection Act applied retrospectively. Therefore, the requirement contained in section 31 of the Data Protection Act, to conduct a data protection impact assessment where a data processing operation is likely to result in high risk to the rights and freedoms of a data subject, applied.

Reading data protection into the right to privacy

It was undisputed that the mass collection of personal data under the National Integrated Identity Management System took place in March 2019, well before the Kenyan Data Protection Act came into force on 25 November 2019.

Therefore, the question before the Court was not whether the Act was in force at the time the data collection exercise complained of took place, but whether the Data Protection Act, and in particular section 31 concerning data protection impact assessments, was intended to have retrospective effect. The High Court answered this question in the affirmative, finding that the Data Protection Act had been enacted to give effect to the right to privacy as protected by Article 31 of the Kenyan Constitution. This was apparent to the High Court both from the substantive content of Article 31 of the Constitution and from the explicit provisions of the Data Protection Act itself, which, in its preamble and again in section 3, refers to its role in protecting the privacy of individuals, whether by direct reference to Article 31 of the Constitution or otherwise.

The High Court accordingly found:

Needless to say, the need to protect the constitutional right to privacy did not arise with the enactment of the Data Protection Act; the right accrued from the moment the Constitution was promulgated. It would be unreasonable, in these circumstances, to argue [...] that the obligation to protect the individual rights under Article 31 of the Constitution is a new obligation or duty imposed on the state only when the Data Protection Act came into force and that for this reason, Section 31 of the Data Protection Act cannot be said to be retrospective.

The High Court was clear that it would have been preferable for the Data Protection Act to be in place prior to the collection and processing of personal data under NIIMS. In the absence of a Data Protection Act at the time of the mass data collection exercise, it reasoned:

[...] since the state chose to put the cart before the horse, so to speak, it has to live with the reality there now exists legislation against which its actions must be weighed irrespective of when they were taken so long as those actions touch on the individual's right under Article 31 of the Constitution.

To the extent that the government had failed to comply with the provisions outlined in the Data Protection Act, it had accordingly failed to fulfil its legal obligations. The government's actions in rolling out the Huduma Namba cards had therefore been ultra vires.

While the judgment did not explicitly address the benefits of data protection impact assessments more generally, the Court specifically considered the fairness of enforcing section 31 of the Kenyan Data Protection Act on data protection impact assessments.

In the Court's view:

The question would be, where does fairness lie in terms of protection of the individual right to privacy? Is it in retrospective application of section 31 of the Data Protection Act or in the rule against such application? I would stand with the individual or the citizen against the might of the state and hold that fairness is in interpreting section 31 as being retrospective in its application.

Practical impact

The High Court quashed the government's decision of 18 November 2020 to roll out the Huduma Namba cards, and issued an order compelling the government to conduct a data protection impact assessment in accordance with Section 31 of the Data Protection Act before continuing to process data and rolling out the Huduma Namba cards.

Why this latest ruling matters, and what we can take from it

As we have highlighted elsewhere, ID systems are not without risks. And yet one of the key problems we continue to observe in decision-making processes around the deployment of digital ID systems is the lack of careful consideration, identification and deliberation of the impact such systems could have on the people and communities who interact with them, as well as on those who won't, because existing factors and barriers mean they will be excluded by the system.

Impact assessments in context

Adopters and proponents of such digital ID systems are still not systematically identifying and recording the associated risks, which should inform the design of the system and the measures that must be adopted to prevent or mitigate those risks. Importantly, this first step in the process could also help to identify proposals that should never go beyond the concept stage: those where the risks outweigh the benefits and cannot be sufficiently mitigated to a level that is deemed acceptable.

There is a wide variety of approaches to impact assessments, and what they are called varies. Common names include: risk assessments; risks, harms and benefits assessments; human rights impact assessments; and data protection or privacy impact assessments.

In data protection terms, this assessment is often called a data protection impact assessment (DPIA), and it is an obligation that data protection laws around the world commonly impose on data controllers, those responsible for deciding how and why data is processed. The language may vary, but the obligation typically requires a data controller to assess the following (an illustrative sketch follows the list below):

  • the necessity and proportionality of the processing
  • the risks to individuals, and
  • how these risks are to be addressed, namely the mitigation strategies.
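
As a purely illustrative sketch (neither the Kenyan Act nor data protection laws generally prescribe a format for a DPIA, and every name below is a hypothetical of ours), these three elements could be captured in a structured record along the following lines:

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    """One identified risk to the rights and freedoms of data subjects."""
    description: str                     # e.g. "breach of a centralised biometric store"
    likelihood: str                      # "low" / "medium" / "high"
    severity: str                        # impact on individuals if the risk materialises
    mitigations: list[str] = field(default_factory=list)

@dataclass
class DPIARecord:
    """Hypothetical record mirroring the three elements listed above."""
    processing_description: str          # what data is processed, and for what purpose
    necessity_and_proportionality: str   # why this processing, at this scale, is needed
    risks: list[Risk] = field(default_factory=list)

    def unmitigated(self) -> list[Risk]:
        """Risks with no recorded mitigation: candidates for redesign or abandonment."""
        return [r for r in self.risks if not r.mitigations]
```

On this sketch, a record whose unmitigated() list is non-empty signals exactly the situation described above: a proposal that should not proceed beyond the concept stage until the risks can be addressed.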

In the human rights sector, these are often referred to as human rights impact assessments, which entail considering the broader human rights implications of a policy or practice. They provide a process by which to:

  • Identify and address adverse human rights impacts;
  • Undertake and contribute to effective human rights due diligence, cross-checking proposed activities against the existing obligations of Member States, the role of industry, and how industry's engagement is regulated;
  • Facilitate meaningful dialogue between stakeholders in a particular context enabled by a process of open, inclusive consultations; and
  • Empower rights-holders to hold governments and businesses accountable for adverse human rights impacts by integrating rights-holders into the decision-making process, which also provides them with a platform to express their needs, interests and concerns.

Ultimately, regardless of the approach taken and what they are called, these assessments all serve the same objective: to understand the positive and negative impacts of a particular data processing activity or larger system, identify the risks, and then prevent or mitigate those risks through a variety of measures.

Undertaking such assessments as part of broader human rights due diligence processes is something that PI and others have long demanded of governments and companies, in an attempt to limit and prevent digital initiatives from resulting in, contributing to, or exacerbating existing human rights abuses and violations.

Assessing ID systems

By choosing to enforce the Kenyan Data Protection Act's provision on data protection impact assessments as a matter of fairness, the ruling indicates that undertaking a data protection impact assessment is, if not a key component of accountability mechanisms, at least an important mechanism for ensuring the protection of and respect for fundamental rights and freedoms. That is the core message of the ruling. It also aligns with the pitfalls identified by courts in other countries where legal challenges were brought against digital ID systems: the Mauritian Supreme Court highlighted that security risks associated with biometrics were not adequately guarded against; the Aadhaar judgment in India raised concerns about centralised databases; the Supreme Court of the Philippines identified the risk that an individual's movements could be tracked through a national identity system; and the Kenyan High Court pointed to risks of exclusion resulting from biometric failures as well as other identity-system registration failures.

In the case of a digital ID system, this would require any actor planning to deploy such a system to assess its impact, both negative and positive, before any other processes, and for that assessment to form the basis for all decisions made from that point onwards, including decisions related to the following aspects of the ID system (a hypothetical screening sketch follows the list):

  • its design, shape and form, e.g. whether to use biometrics, whether the database is encrypted, and whether it is proprietary or open-source;
  • what data will be processed and for what purpose, e.g. identification, authentication or verification;
  • the conditions for interacting with the system, which may help identify risks of exclusion and discrimination, e.g. prerequisites for registration;
  • the contexts in which it will be deployed and be required, which may help identify risks of exclusion as well as data exploitation, surveillance and generally mission creep, e.g. to access public services such as health and financial assistance.
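
To make this concrete, here is a minimal, hypothetical screening sketch along those four dimensions; every field name, flag and threshold below is an illustrative assumption of ours, not a criterion drawn from the Act or from any of the rulings discussed:

```python
# Hypothetical screening of an ID-system proposal along the four dimensions above.
# All field names and flags are illustrative assumptions, not legal criteria.

proposal = {
    "design": {"uses_biometrics": True, "database_encrypted": False, "open_source": False},
    "data_and_purposes": ["identification", "authentication", "verification"],
    "registration_prerequisites": ["existing national ID card", "birth certificate"],
    "deployment_contexts": ["health services", "financial assistance"],
}

def screen_for_high_risk(p: dict) -> list[str]:
    """Return reasons the proposal likely poses a high risk and so needs a full assessment."""
    flags = []
    if p["design"]["uses_biometrics"]:
        flags.append("processes biometric (sensitive) data")
    if not p["design"]["database_encrypted"]:
        flags.append("holds a centralised database without encryption")
    if p["registration_prerequisites"]:
        flags.append("registration prerequisites create exclusion and discrimination risks")
    if p["deployment_contexts"]:
        flags.append("gatekeeps access to essential services, raising mission-creep concerns")
    return flags

if __name__ == "__main__":
    for reason in screen_for_high_risk(proposal):
        print("-", reason)
```

Any non-empty list of flags would, on this sketch, point towards the kind of high-risk processing for which section 31 of the Kenyan Data Protection Act requires an assessment before deployment.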

There is ample evidence of the risks and harms likely to emerge if decision-making around digital identity systems, from concept through design to implementation, does not carefully consider the implications of such systems for the people and communities who interact with them, as well as, as noted above, for those who do not or cannot.

We need to see governments, whether or not they are legally bound to do so, undertake impact assessments, along with other safeguards, as a best practice at the initial stages of decision-making on the deployment of a digital ID system.