Chapter: Medical privacy in practice

EHealth systems have much to offer medicine in the developing world and in humanitarian operations. The deployment of effective technologies to facilitate and manage the provision of healthcare where there was previously little infrastructure can improve healthcare by leaps and bounds. But with a weaker legal infrastructure, a likely lack of deliberative and consultative regimes, and scarce resources, we will be compelled to question many of the fundamentals of privacy and confidentiality.

Even as countries like the United States and the United Kingdom deploy electronic medical record systems with varying success, they too are encountering privacy and security concerns.1 A recent study from the U.S. claims that security and privacy problems result in data breaches that cost the healthcare system billions of dollars, even though such breaches are difficult to detect and few resources are applied to ensuring security and privacy.2 Meanwhile, a recent survey in the U.S. found that 97% of Americans believe that medical institutions should not be allowed to share or sell sensitive health information without consent.3

In resource-constrained environments the situation is even more problematic. Neither patients nor practitioners are particularly aware of their rights and responsibilities.4 Literacy may be minimal, so written notices are insufficient. Populations may be more mobile, so patient registration may be even more important and yet harder to achieve. Care providers may be responsible for larger numbers of patients. Staff may not be trained in procedures. The technical infrastructure may vary, with unreliable electricity, so running additional processes and procedures may prove too challenging. Multiple organisations, implementing partners, and government agencies may be operating in the same space, making it difficult to identify the primary custodians of the information. All of these barriers are exacerbated in humanitarian operations.

The greatest irony is that the protection of confidentiality and privacy is perhaps even more important in these very same environments. Different societies have different ideas of what is acceptable, aberrant, and abhorrent. In some countries the sensitive issue is HIV status;5 in others it is mental health, or whether you are likely to develop diabetes.6

  • The rights of young people, and especially the impoverished and underprivileged, require special attention. Issues of consent become particularly problematic when dealing with these youth in developing country contexts.7
  • Religion and morals may play stronger roles in social lives, and in turn any information that calls into question an individual’s abidance by social and religious morals may hurt that individual’s reputation. One’s ‘moral fibre’ may be questioned, thus leading to social exclusion.
  • Sexuality is a sensitive topic in nearly all cultures, but the ramifications of wrongful disclosure in some contexts may result in severe actions being taken against individuals, sometimes even involving death.8
  • Diagnoses can be interpreted in a variety of ways, leading to discrimination, or worse. For instance, the diagnosis of a recessive genetic disorder can inadvertently reveal non-paternity if the father and child are both tested (a recessive disorder requires the disease-causing mutation to be present in both parents and for the child to inherit both copies). There are many countries where adultery is a criminal offence for women, and in some cases the inadvertent release of such information could lead to severe punishment.
  • While these matters are particularly sensitive, they are also integral to many of the health programmes in developing countries. The ailments that lead to stigma and social exclusion are exactly what we need to treat. We need patients to be willing to come forward and share, and not recoil in fear of information being leaked. Once entrusted with their information, we need to ensure that our systems and procedures live up to the faith they have placed in them.

The most basic safeguard of individual consent is possibly a luxury that is not afforded to individuals in some environments, where access may be limited or even provided through another individual (e.g. in some contexts, women may not access healthcare when it is provided by men, so husbands and fathers will go to the surgeries on behalf of the women). If the basic safeguards that are enshrined in rights and principles are too difficult to maintain, then rather than abandoning all safeguards we must seek others, and do so urgently. Otherwise wrongful disclosures can result in the breakdown of family cohesion, social exclusion, and persecution.

Organisational dynamics of data practices

We must also recognise the threats that emerge as a result of the inherent dynamics of health organisations at work – how medical practitioners and administrative staff collect, store, use, and share health information.

It is well established in the field of information security that the greatest threats to an organisation's information assets come from within. The threat of internal abuse of sensitive medical information should therefore not be underestimated when designing threat models and building safeguards for privacy and security. In this context, internal abuse typically involves staff inappropriately accessing or disclosing medical information without the patient's authorisation. These acts are motivated by spite, curiosity, or simple caprice. We learned during our engagement with medical practitioners from the developing world that such incidents regrettably occur all too often and, when sensitive or embarrassing information is disclosed, can lead to patients being chastised by community members.
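
What such a safeguard can look like in practice is illustrated below with a minimal sketch in Python. It is a hypothetical example, not drawn from any particular eHealth system: the record store and the function and file names are our own. The idea it demonstrates is simply that every access to a patient record should leave a reviewable trace of who looked, when, and for what stated reason.

    import json
    from datetime import datetime, timezone

    # Append-only audit file, ideally reviewed by someone other than
    # the staff whose accesses it records.
    AUDIT_LOG = "access_audit.log"

    def fetch_record(records, patient_id, staff_id, reason):
        """Return a patient's record, writing an audit entry first.

        Curiosity- or spite-driven look-ups are not blocked, but they
        leave a trace that supervisors or auditors can review later.
        """
        entry = {
            "time": datetime.now(timezone.utc).isoformat(),
            "patient": patient_id,
            "staff": staff_id,
            "reason": reason,
        }
        with open(AUDIT_LOG, "a") as log:
            log.write(json.dumps(entry) + "\n")
        return records.get(patient_id)

An audit trail of this kind does not prevent a first abuse, but it deters casual snooping and makes investigation possible; where computing resources are scarce, even a paper access register serves the same purpose.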

We must also consider the threat of external abuse of medical information, including unauthorised disclosure through covert channels. Whereas the potential for abuse of medical information stored on paper records is physically limited, abuse is still possible and the consequences are great for those affected. Of course the introduction of information and communication technology into the healthcare context multiplies and complicates the risks of outsider abuse of medical information,9 but the fundamental problem is not technological in nature but rather organisational: healthcare organisations are either unable or unwilling to secure their records and guard patient privacy.

External abuse can also result from data-sharing, which too often goes unquestioned by the medical community. In developing countries, data-sharing for medical research or disease surveillance regularly takes place without patients' awareness or informed consent. A warning of the dangers of these practices and their implications for privacy comes from Haiti: the Haitian government requested the medical records of all individuals infected with HIV from the public health organisations working in the country. Government officials wanted to use the data to populate a national database for calculating and tracking the prevalence of HIV. While many organisations complied, others were hesitant about sharing such sensitive information without first consulting their patients.10 To our knowledge, Haiti lacks a legal framework for the protection of privacy, and the government does not provide guidance to organisations on how to provide safeguards when sharing information.
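
One safeguard that allows prevalence statistics to be compiled without handing over identities is pseudonymisation. The sketch below is illustrative only, and the record fields and function names are our own assumptions: direct identifiers are replaced with a keyed hash, so the recipient can count patients without duplicates (the same patient always yields the same pseudonym) but cannot recover names or registration numbers.

    import hashlib
    import hmac

    # Secret key held only by the data custodian and never shared with
    # the recipient; without it, pseudonyms cannot be recomputed from
    # known identities.
    SECRET_KEY = b"replace-with-a-randomly-generated-key"

    def pseudonymise(patient_id):
        """Map a patient identifier to a stable, non-reversible pseudonym."""
        return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

    def prepare_surveillance_export(records):
        """Strip direct identifiers before sharing data for surveillance."""
        return [
            {
                "pseudonym": pseudonymise(r["patient_id"]),
                "hiv_status": r["hiv_status"],
                "district": r["district"],  # coarse location only
            }
            for r in records
        ]

Pseudonymisation is not anonymisation: in a small district, a handful of attributes may still identify an individual. Techniques like this complement consultation and consent; they do not replace them.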


Capturing the users and understanding empowerment

In the design of any system we always need to ensure that we capture faithfully the interests and characteristics of the stakeholders. If you wrongly presume that the users are proficient with technology, then you risk building a system that is overly complicated for average users. Likewise, if you wrongly presume that your users are seeking simple solutions, then you are likely to frustrate them with simplistic interfaces and functions. As the user base increases and the scope of use widens, it becomes difficult to envision a single type of user, and so we must design our systems with greater care and with multiple audiences in mind.

The risks are much greater in the design of public-facing systems, as there is another stakeholder to consider: the individual. This individual, whose information is being processed by a system, may also be a citizen or a consumer. In the case of eHealth, the individual patient is sometimes an amalgam of both.

The way we consider these individuals affects the way we design our processes and technologies. In this sense, designing for privacy differs from designing for security, in that only the former considers the essence of the individual and the corresponding responsibilities placed upon institutions and technologies. Put simply: citizens and consumers have rights that are not defined or limited by technology. In our experience and in our discussions with many system designers, one of the great challenges in providing care in developing countries and humanitarian operations is that developers make problematic presumptions about the people who are implicated by the systems.

Where there have been policy discussions about the constitution of the 'individual', mostly in North America and Europe, the individual patient is considered a citizen or consumer who can make decisions. As an eHealth policy becomes more developed, the language of individual empowerment emerges almost naturally. This matches well with European human rights law, which requires that any eHealth system that collects information must be established either under law or with the consent of the individual. Even if there is a law or consent, the individual still retains his or her rights over how the information is used, and consent may be withdrawn.11
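
Taking the right to withdraw consent seriously has concrete consequences for how records are modelled. The sketch below is a minimal illustration under our own assumptions (the class and field names are hypothetical): consent is recorded per purpose rather than wholesale, withdrawal is recorded rather than erased, and every use of the data is checked against consent that may since have been withdrawn.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class ConsentRecord:
        patient_id: str
        purpose: str  # e.g. "treatment", "research", "surveillance"
        granted_at: datetime
        withdrawn_at: Optional[datetime] = None

        def withdraw(self):
            # Withdrawal is recorded, not erased: what was consented to,
            # and when, remains auditable.
            self.withdrawn_at = datetime.now(timezone.utc)

    def may_process(consents, patient_id, purpose):
        """Allow processing only under a live, purpose-specific consent."""
        return any(
            c.patient_id == patient_id
            and c.purpose == purpose
            and c.withdrawn_at is None
            for c in consents
        )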

We have indeed encountered this first hand in humanitarian operations, where individuals are seeking access to emergency care and services and are, in turn, relatively unconcerned with issues around consent or information control. Empowerment comes with access to services that are otherwise inaccessible.

In our discussions with systems developers and policy-makers, they applied the same thinking to the security of medical information. In their minds the risks of unapproved information disclosure were quite low: the only users of the systems would be authorised individuals accessing the information for the purpose at hand. The 'western' concerns regarding malicious hackers or security vulnerabilities in software12 were dismissed because the general population lacked the computing resources to try to break into systems. As a result, security would be sorted out at a later time, if at all. The insider threat was not even considered.

Under the rubric of 'medical informatics for developing countries', many assumptions were being made about the type of environment where these systems may be deployed, and many more about the types of individuals implicated. It was our impression that the worst-case scenario was being assumed in both respects: individuals were so much in need of healthcare that they cared for little else, and resources were so poor that the risks of abuse were limited. Put simply: patients are poor and users are noble. Even if this is an adequate assessment of the needs and threats (and from our limited field research and interviews we were unable to verify such claims), we cannot imagine that this situation permeates every medical environment in every developing country. Assumptions made in the abstract and enshrined into technology are just another form of universalism.

To remedy these types of problems, systems designers are compelled to look at more than mere risk perception. Independent risk analyses are needed to complement consultation with stakeholders. Inadvertent or unintentional breaches and disclosures of personal information are still breaches. We must consider the systems and practices that are deemed essential in other environments, and then justify why we would exclude such safeguards as we develop systems for developing countries and humanitarian operations. Threat analyses and privacy impact assessments conducted at the earliest stages help to identify and understand the risks of information collection and processing.

After all, if the development goal is achieved and the societies in which we implement these technologies eventually develop sustainable economies, social structures, and political systems, then the same risks that exist in the rest of the world may one day apply in developing countries. The infrastructure left behind by our development goals in eHealth may limit that society's ability to move beyond a perceived 'needs'-based approach to one based on the rights of autonomous patients.

Footnotes

• 1. 'Do summary care records have the potential to do more harm than good? Yes', Ross Anderson, BMJ, 340: c3020, 2010.
• 2. 'Benchmark Study on Patient Privacy and Data Security', Ponemon Institute LLC, November 2010.
• 3. Zogby International online poll conducted for patientprivacyrights, '2000 Adults' views on privacy, access to health information, and health information technology', published November 2010. Findings available at http://patientprivacyrights.org/wp-content/uploads/2010/11/Zogby-Result-...
• 4. 'The importance of patient privacy during a clinical examination', Shailaja Tetali, Indian Journal of Medical Ethics, IV(2): 65, April-June 2007.
• 5. 'Religious leaders key in the Middle East's HIV/AIDS fight', Jan McGirk, The Lancet, 372(9635): 279-280, July 26, 2008.
• 6. In our discussions and interviews, we heard a number of references to social stigmas, including mental health and diabetes. For referenced examples see 'Experience of social stigma by people with schizophrenia in Hong Kong', Sing Lee, Margaret T.Y. Lee, Marcus Y.L. Chiu, Arthur Kleinman, British Journal of Psychiatry, 186: 153-157, 2005; 'Living on the Edge: The Stigma of Diabetes', MedIndia, November 10, 2008; and 'Social stigma and discrimination: a care crisis for young women with diabetes in India', DiabetesVoice, 54: 37-39, May 2009. In our interviews with practitioners in developing countries, they identified a myriad of domains where individuals could be harmed through inappropriate information processing.

  As examples:

  • In many parts of the world women are particularly vulnerable, as they may be discriminated against in the provision of healthcare (if they are even able to seek access to services because of gender discrimination, or, in the case of mHealth, because they may not have direct access to mobile devices). This is particularly problematic where reproductive rights are involved, including the sensitive issues of sexual activity and abortion. In Indonesia, there are practices of mandatory pregnancy tests, virginity tests, and 'a web of discriminatory laws and practices that deny Indonesian women who become pregnant outside marriage full access to maternal care and reproductive health'. 'Left without a choice: Barriers to reproductive health in Indonesia', Amnesty International, 2010. Available at: http://www.amnesty.org/en/news-and-updates/report/indonesia-left-without...
• 7. 'Obtaining informed consent: observations from community research with refugee and impoverished youth', R. Nakkash et al., J Med Ethics, 35: 638-643, 2009.
• 8. 'Uganda's Rolling Stone paper told to stop outing gays', BBC News, November 1, 2010.
• 9. 'Privacy, Information Technology, and Health Care', Thomas C. Rindfleisch, CACM, 40(8): 92-100, 1997.
• 10. 'Electronic records pose dilemma in developing countries', Nature Medicine, 16(3): 249, March 2010.
• 11. 'Consent Mechanisms for Electronic Health Record Systems: A Simple Yet Unresolved Issue', K.T. Win and J.A. Fulcher, Journal of Medical Systems, 31: 91-96, 2007.
• 12. See, for example, 'Killed by Code: Software Transparency in Implantable Medical Devices', Karen Sandler et al., Software Freedom Law Center, 2010. Available at: http://www.softwarefreedom.org/resources/2010/transparent-medical-device...