“Do no harm” in the digital age: Privacy and security cannot be ignored
Why is a privacy organisation working with the humanitarian sector, and why does it matter? We may seem like strange bedfellows, but today's ever-growing digital world means that, more and more, people who receive humanitarian assistance are being exposed to unexpected threats.
According to the 2018 Global Humanitarian Overview, more than 134 million people across the world are in need of humanitarian assistance. Of these, about 90.1 million are expected to receive aid of some form. It is likely that the data of each of these people will be collected and processed at some point.
One of the most visible examples is the collection of fingerprint and iris biometric data when people are registered as refugees. But increasingly, there are less obvious instances of data collection and processing whose risks are even less well understood. These include metadata, i.e. data about data, which is generated across our digital interactions, whether we are communicating, making financial transactions, or accessing services.
As explored in our new report with the International Committee of the Red Cross, The humanitarian metadata problem: ‘Doing no harm’ in the digital era, metadata is generated by humanitarian organisations as they coordinate responses, communicate with staff, and engage with the people they serve.
These activities include information sharing initiatives over messaging apps, SMS or social media; the delivery of aid projects, such as cash transfer programmes that use mobile cash or smartcards; and even internal monitoring and evaluation systems that use data analytics to detect fraud.
But are humanitarian organisations aware of the data their operations generate, which in turn feeds the data exploitation ecosystem? And do they have the mechanisms in place to identify and mitigate the risks?
By shedding light on the key risks emerging from the use of digital and mobile technologies in this sector, we hope that humanitarian organisations will think through some of these questions as they fulfil their duty of care.
This requires them to understand the bigger picture: an awareness of how the systems on which they rely (often provided by industry) operate in practice, knowledge of what data is being generated, collected and processed, and an assessment both of the safeguards in place to regulate access to and use of that data, and of the actors operating in the humanitarian ecosystem.
The right to privacy is upheld by Article 12 of the Universal Declaration of Human Rights. Over 130 countries have constitutional statements regarding the protection of privacy. Just like any other person, those seeking protection must be able to enjoy this fundamental right.
There is no question that advancements in technology, communications and data-intensive systems have significantly changed the way humanitarian assistance is provided. But unless measures are taken to protect humanitarian organisations and the people they serve, sustaining the impartiality, neutrality and independence of humanitarian action will continue to be challenging.
Humanitarian initiatives must undertake thorough risk and impact assessments prior to implementation so they are able to adopt measures that protect people and their data. This is particularly important in light of the continuous drive to adopt more digital solutions and new technologies such as AI and blockchain.
As recommended in our new report, humanitarian organisations should review their internal processes, build their knowledge and expertise, and, importantly, acknowledge that the implications of their actions may run counter to their altruistic efforts, so that they can better mitigate the risks.
We hope that this report will be an opportunity for us to continue engaging with the humanitarian community as we continue our mandate to demand the highest privacy and security safeguards for all.