Protecting persons with disabilities in a digitised world

Support systems are undergoing significant digitisation and automation under the banner of efficiency. Privacy International calls for the impacts of these innovations on the rights of people with disabilities to be comprehensively assessed and addressed.

Key findings
  • When governments interact with people with disabilities to fulfil their obligations, they are likely collecting sensitive data, which requires heightened safeguards
  • The digitisation of social support systems represents an opportunity for accessibility and inclusion, but it also comes with risks that must be assessed and mitigated
  • It is key that implementing bodies and oversight mechanisms take steps to ensure that new technologies keep privacy and data protection at the forefront

Our world is undergoing a seismic process of digitisation: new technologies are proliferating and being integrated ever more deeply into public services, which rely increasingly on vast amounts of personal data and on automated processes.

This phenomenon has a unique impact upon the rights of persons with disabilities. As societies worldwide undergo this digital metamorphosis, persons with disabilities find themselves at an intersection of empowerment and vulnerability. It is critical that the incorporation of data-driven technologies into the fabric of our societies does not come at the expense of their fundamental rights and freedoms. This article unpacks the unique challenges a digital world poses to the realisation of the rights of persons with disabilities and puts forward key considerations for implementing bodies and oversight mechanisms.

This piece was informed by and expands upon our submission to the UN Office of the High Commissioner for Human Rights (OHCHR), as well as consultations we have conducted over the last year with Organisations of Persons with Disabilities (OPDs) across the world, which have provided us with invaluable support and guidance.


A treaty-based connection between data collection and people with disabilities

The rights of persons with disabilities are enshrined in the international legal framework. The starting point is the Convention on the Rights of Persons with Disabilities (CRPD) and its Optional Protocol, both of which came into force in 2008. Other key elements of international human rights frameworks pertaining to the rights of persons with disabilities include the International Covenant on Civil and Political Rights (ICCPR) and the Universal Declaration of Human Rights (UDHR), as well as the Protocol to the African Charter on Human and Peoples' Rights on the Rights of Persons with Disabilities in Africa and the Inter-American Convention on the Elimination of All Forms of Discrimination against Persons with Disabilities.

A balancing act must be struck between preserving the fundamental human right to privacy and collecting personal data, which, given how digital welfare and social protection programmes have evolved over time, is in many cases required for persons with disabilities to gain access to essential social benefits and welfare schemes. International law sets out specific standards for the processing of the data of persons with disabilities. For example, Article 31 of the CRPD imposes an obligation to collect data, and requires that states' collection and maintenance of data on persons with disabilities must:

“[c]omply with legally established safeguards, including legislation on data protection, to ensure confidentiality and respect for the privacy of persons with disabilities”, as well as adhere to “norms to protect human rights and fundamental freedoms and ethical principles in the collection and use of statistics”.

The obligations imposed upon states by Article 31 are laid out in detail in a December 2021 report from the UN OHCHR, which states that “data protection laws and policies should include persons with disabilities” and emphasises that states should apply privacy and data protection principles when developing policies that may affect persons with disabilities.

This balancing act makes it all the more important that states explicitly consider persons with disabilities when digitising access to public services and participation in society, since such changes will affect them, even if only indirectly.

The rights of persons with disabilities in the context of global digitisation and the use of new technologies

The significance of this shifting context has been recognised by the UN Department of Economic and Social Affairs, which stated that “Information and communication technologies and infrastructures are rapidly growing in importance in the provision of information and services to the population”. Whilst such a context presents key opportunities in terms of accessibility, as documented by the World Health Organization, it also risks undermining the right to privacy in the absence of appropriate safeguards and mitigations.

Automated decision-making (ADM)

Imagine a world where decisions, pivotal and mundane alike, are shaped not by human judgment but by silent algorithms orchestrating our daily lives. These algorithms tend to be designed by third parties far removed from the reality of those whose lives they will affect. This is the world of automated decision-making (ADM), in which technology quietly weaves itself into the fabric of our societies. ADM is increasingly involved in deciding who can access a public service or benefit, and the implications for persons with disabilities are profound.
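To make the concern concrete, below is a minimal, hypothetical sketch (in Python) of what an automated eligibility check might look like. Every field name, weight and threshold here is invented for illustration and does not reflect any real welfare system; the point is that the applicant sees only the outcome, while the logic that produced it stays hidden.

```python
# A minimal, hypothetical sketch of an opaque ADM eligibility check.
# All field names, weights and thresholds are invented for illustration
# and do not reflect any real welfare system.

def eligibility_score(applicant: dict) -> float:
    # Hard-coded weights chosen by a distant third party; the applicant
    # has no way to see or contest them.
    weights = {"income": -0.4, "years_resident": 0.2, "prior_claims": -0.6}
    return sum(w * applicant.get(field, 0) for field, w in weights.items())

def decide(applicant: dict, threshold: float = -1.0) -> str:
    # Only the outcome is communicated, never the score, the threshold,
    # or which inputs drove the result.
    return "granted" if eligibility_score(applicant) > threshold else "denied"

print(decide({"income": 1.2, "years_resident": 3, "prior_claims": 2}))  # -> denied
```

Even in this toy example, an applicant denied a benefit has no way of knowing whether a single data point, or an error in one, tipped the outcome, which is precisely the transparency problem described below.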


The threats posed by ADM to fundamental rights

ADM harnesses algorithmic systems, increasingly powered by Artificial Intelligence (AI), and plays a growing role in making social welfare determinations. The use of this technology by governments in the provision of social services brings an array of challenges, from the opacity of its operations to concerns about reliability and bias, and thus has the potential to seriously undermine the rights of persons with disabilities, particularly where meaningful human intervention is lacking.

These concerns have been expressed repeatedly across the international community. In 2023, for example, the UN Special Rapporteur on the right to health voiced such concerns, as did the UN Special Rapporteur on extreme poverty and human rights in 2019, and the European Parliament voted to prohibit AI systems that pose an “unacceptable level of risk to people’s safety”, such as “biometric categorisation systems using sensitive characteristics […]”. Similarly, the UN Special Rapporteur on the rights of persons with disabilities commented on the “well-known discriminatory impacts” of ADM, in part due to its black-box operation, which makes transparency and accountability for its decisions extremely difficult to obtain. They went on to recognise the specific impacts ADM can have on the rights of persons with disabilities, stating: “Biased data sets and discriminatory algorithms can restrict persons with disabilities from employment or benefits making them even more vulnerable to poverty and marginalization, and in ways that are more systematic and harder to detect”.

Broadly, there exist two types of ADM: full ADM, where decisions unfold without any human intervention, and semi-ADM, where a human is involved in the decision-making process. Both have been subject to scrutiny because of the concerns they raise for people and their rights. Article 22 of the EU General Data Protection Regulation (GDPR) recognises the right not to be subjected to full ADM. It is especially important in social welfare systems that decisions which significantly affect individuals’ lives are not left to the cold calculations of machines alone, and that the human component of a semi-ADM process be meaningful; the Wisconsin Supreme Court touched on this in 2016 when it held that the COMPAS risk assessment system, criticised as racially discriminatory, could inform but not determine sentencing decisions. In the Netherlands, a 2023 ruling by the Amsterdam Court of Appeal went further, holding that purely symbolic human involvement does not prevent a system from being classified as fully automated, or full ADM.
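The distinction between symbolic and meaningful human involvement can be illustrated with a short, hypothetical sketch. The function names and case structure below are invented; the contrast is between a reviewer who merely rubber-stamps the machine's output and one who can inspect the case file and overturn it.

```python
# Hypothetical sketch contrasting symbolic and meaningful human review
# in a semi-automated decision pipeline. All names are illustrative.

from dataclasses import dataclass

@dataclass
class Decision:
    outcome: str      # "granted" or "denied"
    reviewed_by: str  # records who, if anyone, meaningfully checked it

def rubber_stamp_review(machine_outcome: str) -> Decision:
    # The human clicks "approve" on every machine output, with neither
    # the information nor the authority to change it. On the Amsterdam
    # Court of Appeal's 2023 reasoning, this remains full ADM.
    return Decision(machine_outcome, reviewed_by="nominal")

def meaningful_review(machine_outcome: str, case_file: dict) -> Decision:
    # The reviewer can inspect the evidence and overturn the machine,
    # for instance where a disability explains an anomaly the model flagged.
    if machine_outcome == "denied" and case_file.get("extenuating_circumstances"):
        return Decision("granted", reviewed_by="caseworker")
    return Decision(machine_outcome, reviewed_by="caseworker")
```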

Real-world scenarios: the impact of ADM and digital social welfare on persons with disabilities

As part of the growing digitisation of our societies, public services around the world increasingly rely on technologies, including ADM, to make decisions such as who is eligible to receive a government benefit. The eruption of the Covid-19 pandemic rapidly accelerated governments’ roll-out of digital welfare programmes. Social protection programmes’ growing reliance on digitisation and technology raises real concerns that its use will infringe upon fundamental rights and discriminate against persons with disabilities.

These risks were recognised by the UN Special Rapporteur on extreme poverty and human rights, who highlighted the “various forms of rigidity and the robotic application of the rules” involved in digital welfare states and noted that digital systems often fail to take into account extenuating circumstances resulting from a disability. Similarly, in 2023 the UN Special Rapporteur on the rights of persons with disabilities warned of serious risks accompanying the advancement of technologies, despite the opportunities they present for realising the rights of persons with disabilities.

Real-world examples of ADM illustrate these concerns. In the Netherlands, ADM systems in public services have been found to discriminate on the basis of nationality and race, both in Rotterdam’s welfare fraud algorithm and in the automated fraud detection system adopted by the Dutch tax authorities. Equally, the lack of transparency of ADM systems was highlighted in Colombia, where Solidarity Income (“Ingreso Solidario”), the social protection benefit rolled out in response to the Covid-19 pandemic, made flawed decisions about who was eligible for the programme, and it was impossible to assess how the technology selected each beneficiary.

In Nigeria, Angola and Mozambique, the eligibility criteria dictating who would and would not benefit from digital Covid-19 social protection response programmes were not made public, as documented by Privacy International's research; these cases serve as cautionary tales.

Beyond these examples of the general risks of digitising public services and incorporating ADM, the following case studies demonstrate the specific harms that digitised social protection programmes can cause to the rights of persons with disabilities:

  • Serbia's Social Card Law: The Serbian government uses ADM in its Social Card programme, established through the Social Card Law introduced in March 2022. The introduction of this law prompted NGOs to file a joint legal complaint urging a halt to its implementation. Concerns revolve around the system being deemed an "intrusive surveillance system" with potential to harm marginalised members of society. Notably, persons with disabilities, who make up a significant portion of social support recipients, are affected because the ADM system assesses their eligibility for state benefits under the programme. This has raised questions about the legislation's compliance with human rights obligations.
  • The United Kingdom's Department for Work and Pensions (DWP): Privacy International’s research uncovered that the DWP explicitly aims to assess the legitimacy of disability claims by scrutinising individuals' declared capabilities. The department’s guidelines explicitly endorse the use of surveillance for this purpose, emphasising that it should capture evidence of both physical and, in some cases, mental capabilities. Adding complexity to the situation is the DWP's reliance on computer algorithms that profile persons with disabilities on the basis of unknown data points. The DWP currently faces legal action challenging the fairness and allegedly discriminatory nature of the algorithm used to identify individuals as 'fraud risks'. Without clear insight into the assessment criteria, the systems in place may pose serious risks to the integrity of the welfare system and the rights of individuals with disabilities.
  • The Allegheny Family Screening Tool: The impact of ADM transcends borders, as seen in Pittsburgh, USA, where an algorithmic tool faced scrutiny for its biased risk assessments, sparking a US Justice Department investigation into potential discrimination against persons with disabilities. The ADM-driven child welfare tool included disability status as a data field upon which risk was calculated, and also tracked whether parents had ever received public benefits, a data point that can serve as a proxy for disability. In one case, two disabled parents’ baby daughter was taken into foster care after the automated system flagged them as a risk to their child.

Towards a privacy-preserving approach to processing the data of persons with disabilities

When it comes to preserving the privacy of persons with disabilities, data protection is just a single, albeit critical, piece of the puzzle. There are well-established and internationally accepted norms and principles for the data protection of persons with disabilities. States should incorporate these principles into their national legislation and frameworks in order to ensure the rights of persons with disabilities are protected.

As noted above, Article 31 of the CRPD obliges states to “Comply with legally established safeguards, including legislation on data protection, to ensure confidentiality and respect for the privacy of persons with disabilities” and further to “Comply with internationally accepted norms to protect human rights and fundamental freedoms and ethical principles in the collection and use of statistics”.

In our submission to the UN OHCHR, we urged the office to call upon governments to ensure that their deployment of digital technologies is in line with data protection principles, and to adopt and enforce national laws and frameworks that enshrine these principles. More than that, however, we underlined the urgency for governments to systematically conduct human rights and data protection due diligence assessments that take into account persons with disabilities, as well as the dangers that the use of ADM can pose to their rights. We also encouraged the OHCHR to recall the responsibility of businesses in the context of public-private collaborations and of facilitating the provision of assistive technologies, products or services.

Implementing data protection principles for people with disabilities 

Key issues pertaining to data protection principles when it comes to persons with disabilities include the following:

  1. Ensuring there is a legal basis prior to any data processing: The processing of the data of individuals with disabilities must rest on a basis provided by law. Various legal bases can apply, independently or concurrently, to the processing of sensitive health data of people with disabilities. The CRPD mandates states to collect the information necessary to give effect to the Convention, which constitutes a legal basis insofar as it reflects a state's compliance with its legal obligations. The Council of Europe's Convention 108 recognises the legitimacy of legal bases such as free, specific and informed consent. Legal frameworks require the communication of the legal basis to data subjects, along with information on the categories of data processed, their recipients, and the means to exercise rights. Clearly articulating legal bases is essential for individuals to exercise their rights, especially where processing relies on free and informed consent, where the dignity and autonomy of persons with disabilities must be respected.
  2. Observing heightened safeguards for the processing of health data: International data protection rules treat information about health conditions, including disabilities, as sensitive personal data. This higher level of protection is recognised in Article 6 of Convention 108, which states that such data "may not be processed automatically unless domestic law provides appropriate safeguards". This principle is echoed in national data protection laws globally, which mandate additional safeguards for the processing of health data. In essence, data from individuals with disabilities should only be processed when necessary for the intended purpose, and where the data pertains to their disability, heightened safeguards are required to ensure the sensitive nature of the information is respected during collection and processing.
  3. Using only the data that is strictly necessary: The principle of data minimisation mandates that any entity, whether public or private, should process only the minimum data necessary to achieve a specific and legitimate purpose. Disability groups have called for enhanced privacy protections for individuals with disabilities due to the increasing volume and diversity of processed data; even non-disability data can inadvertently reveal and identify persons with disabilities. To uphold privacy rights and mitigate risks, it is crucial to minimise the data collected from individuals with disabilities, reducing their exposure to potential violations or abuse and preserving their right to privacy (a minimal sketch of this principle follows this list).
  4. Limiting use of data to specific purposes: Data processing should have a clear, specific, and legitimate purpose, and any subsequent processing should align with the initially specified purposes. Indiscriminate data collection is not permissible; instead, data collection should be purposeful and well-defined. Concerns arise from observed state practices involving indiscriminate or widespread data collection for assessing social welfare claimants. This includes surveillance and fraud detection. For example, authorised fraud-detection practices in the UK range from physical surveillance and open source intelligence to obtaining information from private companies and approaching individuals. Such practices pose a significant threat to the right to privacy of persons with disabilities, along with other rights, including the right to non-discrimination.
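As a concrete illustration of the minimisation principle (item 3) and the heightened safeguards for sensitive data (item 2), here is a hypothetical sketch. The field names and the "transport subsidy" purpose are invented for illustration; the pattern is simply that fields not needed for the declared purpose, including disability-related ones, never enter the processing pipeline at all.

```python
# Hypothetical sketch of data minimisation with heightened safeguards
# for sensitive data. Field names and the stated purpose are invented.

SENSITIVE_FIELDS = {"disability_status", "diagnosis", "assistive_device"}

def minimise(record: dict, purpose_fields: set) -> dict:
    # Keep only the fields required for the declared, specific purpose.
    return {k: v for k, v in record.items() if k in purpose_fields}

def process_transport_subsidy_claim(record: dict) -> dict:
    # Purpose: verify a transport-subsidy claim. Address and income are
    # needed; diagnosis details are not, so they are dropped before any
    # further processing takes place.
    needed = {"claim_id", "address", "income"}
    assert not (needed & SENSITIVE_FIELDS), "sensitive fields need extra safeguards"
    return minimise(record, needed)

claim = {"claim_id": 42, "address": "10 High St", "income": 1800,
         "diagnosis": "redacted", "disability_status": "redacted"}
print(process_transport_subsidy_claim(claim))  # sensitive fields never processed
```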

Ensuring that assistive technologies respect their users' privacy

Assistive Technologies (ATs) include assistive products and services as well as medical assistive devices, such as hearing aids, and these may need to be accessed through state-run social welfare programmes or other public services.

For persons with disabilities, accessing ATs raises unique concerns over the preservation of their right to privacy including:

  • Private sector and PPPs: States can contract private companies, via Public Private Partnerships (PPPs), to facilitate access to ATs for persons with disabilities. When doing so, governments should ensure that adequate safeguards are firmly in place before privatising public responsibilities. Privacy International has developed a set of recommended safeguards for public-private partnerships involving surveillance technologies and data processing, which centre transparency, proper procurement, legality, accountability, oversight and redress.
  • Centring user privacy: The use of, or access to, ATs by persons with disabilities must not come at the expense of their privacy. There is a risk that data collected by the private sector from persons with disabilities’ use of ATs may be exploited or abused. Indeed, the EU Accessibility Act (Directive 2019/882) clearly states that products must “protect the user’s privacy when he or she uses the accessibility features”, and the European Parliamentary Research Service has warned of the unique threats to privacy that ATs can pose, arguing for a privacy-by-design approach to ensure patient safety (a sketch of this approach follows below).
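To illustrate what a privacy-by-design default might look like in practice, here is a minimal, hypothetical sketch of telemetry handling for a connected assistive device. The class and its fields are invented and imply no real product API; the design choice shown is that raw usage data stays on the device, and only a coarse aggregate ever leaves it, and only with explicit opt-in consent.

```python
# Hypothetical sketch of a privacy-by-design default for a connected
# assistive device (e.g. a hearing aid). The class and its fields are
# invented; no real product API is implied.

class AssistiveDeviceTelemetry:
    def __init__(self, cloud_sync_opt_in: bool = False):
        # Privacy-by-design: sharing is off unless the user opts in.
        self.cloud_sync_opt_in = cloud_sync_opt_in
        self._local_log = []  # raw usage data never leaves the device

    def record_usage(self, hours: float) -> None:
        # Detailed records are stored locally only.
        self._local_log.append({"hours": hours})

    def export(self):
        # Only a coarse aggregate is ever shared, and only with consent.
        if not self.cloud_sync_opt_in:
            return None
        total = sum(entry["hours"] for entry in self._local_log)
        return {"total_hours": round(total, 1)}
```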


In an increasingly digitised world, safeguarding the rights of persons with disabilities is paramount. The challenges to upholding these rights posed by automation, data collection, and assistive technologies underscore the need for states and international funders to ensure that the rights of persons with disabilities are specifically addressed and centred. Privacy International remains committed to advocating for the rights of persons with disabilities in this ever-evolving landscape, and to this end we made an August 2023 submission to the UN High Commissioner for Human Rights, laying out in more detail the concerns described in this article and making a series of recommendations.

In our submission, we laid out some of the key human rights issues that an expanding digital world poses for persons with disabilities, as our rapidly changing global context introduces new and mounting challenges to preserving their rights to non-discrimination, equal access to public services, health and privacy. Following its call for inputs, the UN OHCHR is expected to publish a report on "good practices on support systems to ensure community inclusion of persons with disabilities, including as a means of building forward better after the COVID-19 pandemic", which will make direct recommendations to governments and is set to be presented at the 55th session of the UN Human Rights Council in Geneva, beginning in February 2024. These recommendations will be key to informing changes in the social protection field going forward.

PI will be closely following the publication of the UN report and will continue our global advocacy work to help ensure the realisation of the full spectrum of human rights for persons with disabilities.

Read more about our project work on disabilities, and go here to read our UN submission in full.