

Chapter: Opportunities for securing health information

Even if we were to deploy eHealth systems in developing countries and emergency situations with the same safeguards we are developing for systems elsewhere in the world, there would remain significant challenges. The environmental and cultural differences are vast, even within a given country. We therefore need careful consideration and planning on the ground even as we devise policies and strategies at the national, regional, and international levels.
One of the ideal mechanisms for ensuring privacy and security is to work through the infrastructure providers: the key implementing partners, the international organisations and community, and the funders.

Implementing partners

At the earliest stages, eHealth systems need to be designed with privacy and security in mind. This requires implementing partners to hold discussions on privacy and security as part of the early processes of devising a new system. This could be done as part of eHealth readiness assessments.1 Retrofitting privacy and security at a later date is insufficient: vulnerabilities will already exist, controls perceived as inconvenient will be worked around, and it will be nearly impossible to impose principles of data minimisation. Enabling this will require the use of risk assessments, including privacy impact assessments.2
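To make the principle of data minimisation concrete, the sketch below (in Python, using hypothetical field names, identifiers, and a placeholder secret key) illustrates one way a system might strip a patient record down to only the fields a given purpose requires, replacing the direct identifier with a keyed pseudonym before the data leaves the clinical system. It is an illustration of the principle under stated assumptions, not a prescription for any particular eHealth platform.

    import hmac
    import hashlib

    # Fields that a hypothetical aggregate-reporting purpose actually needs;
    # everything else in the clinical record is dropped before sharing.
    REPORTING_FIELDS = {"year_of_birth", "sex", "diagnosis_code", "district"}

    def pseudonymise(patient_id: str, secret_key: bytes) -> str:
        """Derive a stable pseudonym so reports can be linked over time
        without exposing the real identifier (keyed HMAC, truncated)."""
        return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

    def minimise_for_reporting(record: dict, secret_key: bytes) -> dict:
        """Return only what the reporting purpose requires, plus a pseudonym."""
        minimal = {k: v for k, v in record.items() if k in REPORTING_FIELDS}
        minimal["pseudonym"] = pseudonymise(record["patient_id"], secret_key)
        return minimal

    full_record = {
        "patient_id": "PT-000123",     # hypothetical identifier
        "name": "A. Example",
        "phone": "+00 000 0000",
        "year_of_birth": 1987,
        "sex": "F",
        "diagnosis_code": "B20",
        "district": "North",
    }
    print(minimise_for_reporting(full_record, secret_key=b"replace-with-managed-key"))

The design point is that minimisation happens at the boundary, before data is shared, rather than relying on recipients to discard fields they should never have received.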

We must also ensure that local educational and training programmes are continued throughout the life-cycle of a project. All institutions and their staff that interact with the eHealth systems need regular training regarding privacy and security issues, and the attendant concerns about responsibility, accountability, and transparency. Each institution needs security and privacy staff members who can administer the appropriate privileges and maintain oversight mechanisms including audit trails.
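As a sketch of how privilege administration and an audit trail might fit together, the hypothetical Python below checks a user's role against a small privilege table before releasing a record and writes every attempt, allowed or refused, to an audit log that privacy and security staff could review. The role names, privileges, and identifiers are illustrative assumptions, not a model drawn from any specific system.

    from datetime import datetime, timezone

    # Illustrative role-to-privilege mapping; a real deployment would manage
    # this centrally and keep it under change control.
    ROLE_PRIVILEGES = {
        "clinician":    {"read_clinical", "write_clinical"},
        "pharmacist":   {"read_prescriptions"},
        "data_officer": {"read_aggregate"},
    }

    audit_trail = []  # in practice: an append-only, tamper-evident store

    def access_record(user: str, role: str, patient_id: str, action: str) -> bool:
        """Grant access only if the role holds the privilege; log every attempt."""
        allowed = action in ROLE_PRIVILEGES.get(role, set())
        audit_trail.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "role": role,
            "patient_id": patient_id,
            "action": action,
            "allowed": allowed,
        })
        return allowed

    access_record("nurse01", "clinician", "PT-000123", "read_clinical")      # allowed
    access_record("admin02", "data_officer", "PT-000123", "read_clinical")   # refused, but still logged
    for entry in audit_trail:
        print(entry)

The point of the sketch is that refusals are recorded as well as grants; a trail of both is what makes the oversight, accountability, and transparency described above possible in practice.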

International community

The international community must provide leadership in this domain and share the best practices from around the world. We are heartened to hear of interesting initiatives within industry to develop best practices and even standards on privacy and security. Similarly, international organisations are in an ideal position to promote discussion and consideration of these issues. The WHO’s planned work on developing thought-leadership on patient identifiers is also a promising development.

Funders

The funders of these projects and initiatives have the heaviest responsibility for ensuring that their projects are fit for purpose. The worst-case scenario must be avoided: an eHealth system that increases access to healthcare for vulnerable people while making them vulnerable to abuse through weak privacy and security controls. A false sense of security is a grave breach of trust and confidence.

Funders must do far more than they have done to date, requiring partners and grantees to conduct assessments and consider these ethical issues at the outset. Resources will also be required for regular audits and subsequent follow-up work. Privacy and security are processes, not products or plug-ins.
Funders must also treat law and policy change as an integral component of deploying new systems and practices. They should promote legal rights, effective regulatory controls, and accessible rights of remedy even before the systems are specified.

Next steps

As we move forward in the eHealth privacy and security space, we aim to assist these actors in these efforts by:

  • engaging with the international health community (e.g. WHO, ICRC, etc.) to identify the most pressing dimensions of eHealth privacy and security;
  • further engaging with the funding bodies that make eHealth systems possible in developing countries and humanitarian and relief operations;
  • continuing our engagement with technology developers and experts in order to assist in the design and implementation of privacy-friendly and secure health information systems;
  • developing detailed case studies to increase the knowledge base in this area;
  • developing recommended content coverage for ethics courses in medical schools in developing countries; and
  • developing recommended policy frameworks with which to guide future decision-making.

Concluding remarks

There are times when the interests of the funders, international community, and implementing partners will conflict with the issues that we have raised in this report. There will also be times where conflicts will arise between each community, or even within each community. This is a sign of a healthy discourse about technology and policy.

These disagreements will occur particularly when these communities are all acting in the best interests of patients and citizens. They may concern data-sharing for health surveillance, reporting in order to understand programme effectiveness, using information for accounting to manage costs, or accessing information for medical research. The same debates occur around the world, and we must always recall that a well-rounded debate is necessary for a healthy discourse.

The particular challenge in developing countries and humanitarian operations is our tendency to think and act on behalf of citizens and patients. At such times we must remember that underlying everything we do, and all the issues raised in this report, is a fundamental understanding within the practice of medicine: we are here to protect the rights of humans.

Footnotes

  • 1. Cf. ‘e-Health readiness assessment tools for healthcare institutions in developing countries’, S. Khoja et al., Telemed J E Health, 13(4): 425–431, 2007.
  • 2. See, for instance, ‘A Conceptual Privacy Impact Assessment on Canada’s Electronic Health Record Solution’, Blueprint Version 2, Canada Health Infoway, February 12, 2008; as well as ‘Privacy impact assessment in the design of transnational public health information systems: the BIRO project’, C. T. Di Iorio et al., J Med Ethics, 35: 753–761, 2009.