Companies must “do no harm” in the humanitarian sector


Photo credit: screenshot from the video produced by ICRC, ‘What are metadata’.

 “The platform [Facebook] does not intentionally cause harm to users. Too often, however, Facebook’s business model allows harm to occur.”

The Editorial Board, The Financial Times, 2 December 2018


Imagine if a recipient of humanitarian aid were denied a loan because receiving assistance made them appear ‘uncreditworthy’. Or if a refugee were refused asylum because their social media activity told a different story. In some cases, you don’t even have to imagine, given reports of humanitarian organisations being spied on by intelligence agencies.

Humanitarian organisations are responding to multiple changes in their sector: not only growing demand for their assistance, but also requirements to be more effective, transparent and accountable. To address these challenges, they have turned to technological innovation and data processing for solutions.

The problem is that these digital solutions almost always rely on the private sector, such as social media platforms or messaging apps, for their design, implementation and maintenance. Moreover, certain sectors, such as telecommunications and banking, are rigidly governed, and anyone wishing to use their infrastructure must accept their conditions. In a sense, these companies are increasingly becoming intermediaries between the humanitarian sector and the people to whom it provides assistance.

Humanitarian organisations are becoming dependent on services, devices and infrastructure over which they have no control, run by companies whose business models drive their decision-making. This means that data, even when generated and processed for humanitarian purposes, are vulnerable to the same exploitation as any other data controlled by these third parties, whether a service provider, a social media platform or a banking institution.

And whilst one would hope that these third parties would design systems that protect people and their data by default, that is often not the case. You don’t have to look far to find a myriad of data breaches, data leaks, hacks, mission creep and other violations of privacy and data protection. And many industries flourish on their ability to exploit data, as the growing data broker, advertising and fintech industries show.

These industry practices are compounded by a lack of transparency about what metadata are generated through the use of these services. Metadata, i.e. data about data, are generated for every single transaction across our digital interactions, whether communications, financial transactions or access to services. The lack of accountability and regulation raises further concerns about how such data are then processed: to inform decision-making (e.g. using social media data to assess a person’s creditworthiness), for profiling (e.g. advertising), or to monitor sentiments and behaviours.
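To make this concrete, here is a minimal, hypothetical sketch (every name, field and value below is invented for illustration, not drawn from any real service) of the kind of metadata a messaging service might record for a single message, without ever touching its content:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical illustration: the metadata a messaging service might log
# for one message, even when the message body is end-to-end encrypted.
@dataclass
class MessageMetadata:
    sender_id: str           # who sent it
    recipient_id: str        # who received it
    timestamp: str           # when it was sent
    sender_ip: str           # roughly where the sender was
    device_model: str        # what device was used
    message_size_bytes: int  # how large the message was

record = MessageMetadata(
    sender_id="aid-worker-042",
    recipient_id="beneficiary-107",
    timestamp=datetime(2018, 12, 2, 9, 30, tzinfo=timezone.utc).isoformat(),
    sender_ip="203.0.113.7",       # documentation-range IP, not a real address
    device_model="generic-handset",
    message_size_bytes=512,
)

# The service never needs to read the message itself to learn who talks
# to whom, when, from where, and how often -- that is the metadata problem.
print(asdict(record))
```

Aggregated over thousands of messages, fields like these are enough to reveal networks of contacts, daily routines and locations, which is why metadata deserves the same protection as content.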

Our new report with the International Committee of the Red Cross, ‘The humanitarian metadata problem: “Doing no harm” in the digital era’, raises key questions about the practices of corporations across sectors and their impact on the humanitarian sector.

We highlight how companies are failing to understand and acknowledge the responsibilities they have when their business models, either intentionally or not, are exposing people who are already in the most vulnerable positions to even greater risk. Those seeking protection should not have their location tracked, recipients of cash transfer programmes should not have their data exploited by financial institutions, and those seeking or sharing information through platforms must not be subject to increased surveillance, profiling and targeting.

These companies must take responsibility and respond to the concerns highlighted in our joint report. They can no longer hide behind excuses of ignorance. Civil society, the media and academia will continue to expose these practices, but there are also real opportunities for humanitarian organisations to use their own leverage to demand legal and regulatory safeguards across these services, in particular for humanitarian metadata. They must act to protect the impartiality, neutrality and independence of humanitarian action in an ever-growing digital world.

Find out more about Privacy International’s work challenging corporate surveillance, and sign up to keep up to date!
