The hidden cost of digital health services

New research from our partners at the Centre for Internet & Society (CIS) reveals Indian health websites and apps are sharing intimate health-related data with third parties such as Facebook and Google. 

Key findings
  1. Data entered into health apps or websites is shared in ways that are likely to violate people's privacy expectations. This often happens without sufficient transparency or consent.
  2. Our partners at the Centre for Internet & Society (CIS) researched nine web and mobile applications across the Indian healthcare sector, revealing some of the sensitive data they share with third parties, including advertisers.
  3. Together we recommend best practices to limit this troubling sharing.

Introduction

Data about our health reveals some of the most sensitive, intimate - and potentially embarrassing - information about who we are. Confidentiality is, and has always been, at the very heart of medical ethics. People need to be able to trust their doctors, nurses and other healthcare providers so that they are not afraid to tell them something important about their health for fear of shame, judgement or social exclusion.

It’s no surprise then that data protection regimes around the world often treat data related to health as a special separate category of data. More stringent rules on how it can be collected and shared are put in place, in part because of this sensitivity and its potential impact on our dignity and rights. Health data deserves the highest standards of care and attention.

Too often, however, we hear that new online or digital ways of managing our health don’t live up to these standards. And in India, the Digital Personal Data Protection Act does not provide a distinct category to protect health data - it is regulated under the broader umbrella of personal data.

The hidden “value” of digital health services

Many of us use digital health services not just for their convenience, but as a way to avoid the complexity and discomfort of talking to a healthcare professional face-to-face. But what if we told you that far from being more discreet, many online services share data related to your health widely? Your data may not be staying only with your chosen health service: multiple advertisers and data brokers may be listening in, taking notes, ready to sell you something.

To find out more, our partners at the Centre for Internet & Society (CIS) conducted research into nine web and mobile applications across the Indian healthcare sector. CIS used PI's Data Interception Environment to see what's going on behind the slick apps and websites that put convenience first when we're worried about our health.

CIS looked at digital services providing information about symptoms and conditions, selling drugs through online pharmacies, and even allowing users to directly book tests and minor surgeries.

Privacy International's Data Interception Environment (DIE) tool can be used to analyse how data flows between an app and the servers it communicates with. It uses open-source tools to intercept the network activity of mobile applications, letting you see what data apps send from your device to their developers and to third parties.

PI has used the DIE to research everything from low cost phones to menstruation apps. And we’ve made it openly available for public use so that you can do your own research on how devices and apps use your data.

Learn more about PI’s Data Interception Environment here
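
By way of illustration, the short Python sketch below uses the open-source mitmproxy tool to do the same kind of logging. It is our own minimal example of this interception technique, not part of the DIE itself, and the domain list is illustrative:

    # tracker_logger.py - a minimal mitmproxy addon that flags requests
    # made to a few well-known tracker domains (the list is illustrative).
    from mitmproxy import http

    TRACKER_DOMAINS = (
        "facebook.com",
        "google-analytics.com",
        "doubleclick.net",
        "criteo.com",
    )

    class TrackerLogger:
        def request(self, flow: http.HTTPFlow) -> None:
            host = flow.request.pretty_host
            if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
                # The path often carries the page URL or event parameters
                # being reported to the tracker.
                print(f"[tracker] {host} {flow.request.path}")

    addons = [TrackerLogger()]

Running mitmproxy with this script (mitmproxy -s tracker_logger.py) while a test device's traffic is routed through it surfaces exactly the kind of third-party requests documented in the Technical Annex.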

CIS discovered that some of the sensitive data users entered into these healthcare services is shared directly with third parties, including advertisers. This may happen either without any consent banner being presented to the user, or with relevant (but limited) information buried in the terms and conditions or privacy policy of the service. The result? Any user too busy or too overwhelmed by legalese and complicated language to read the whole privacy policy will remain unaware of an app's data sharing practices.

Advertisers want this data for the purpose of targeted digital advertising. They can combine it with what they already know about us to build up an ever-more detailed profile about who we are, what we like and where we go.

The advertising value of pregnant women

Targeted digital advertising is a lucrative business that accounts for the majority of the income of big tech companies like Google and Facebook (and also feeds smaller AdTech companies you might have never heard of). This can lead to people being profiled and their personal situations or vulnerabilities targeted and exploited. For example, information about who is pregnant is particularly valuable to advertisers. New and expecting parents are likely to change their purchasing habits. 

In PI's last investigation into menstruation apps, we found that the data of an average person in the US is worth $0.10, while a pregnant woman's is worth $1.50. As of 2022, Gizmodo reports that advertisers are still actively segmenting the data of pregnant women, with the cost per user reached by an advert going as high as $2.25.

As PI has covered in the past, access to data around women’s health is of particular value to advertisers. But sharing such sensitive data so widely may not just come as a shock and make people feel uncomfortable because of targeted advertising: it may even lead to unsafe or threatening situations. We have seen examples of law enforcement agencies buying geolocation data which may reveal who has visited a sexual health clinic. And we have heard concerns about hostile actors (including governments) collecting data on people who have been seeking information about abortion or other socially controversial health issues (for example in relation to HIV/AIDS, sexuality or drug use).


Who wants your health data?

If you've ever installed and used a health app, you might know how quickly you can start seeing ads for health-related services somewhere else. To see how this profiling works, we're going to go through what can happen when a hypothetical user tries a range of online healthcare services and apps that were tested by CIS.

From our searches for symptoms to our shopping for medications, advertisers and trackers can glean enough information to build an incredibly detailed picture of our health, and to make inferences about us as a result.

These inferences could result in certain products or services being aggressively advertised to us. Some of these ads may be fraudulent or misleading, whilst others may be from companies more interested in advertising their product than providing support.

Company policies are changing in relation to how (and in which countries) health data is used in advertising - potentially in response to the EU’s Digital Services Act, which bans targeted advertising based on health-related data. Big Tech companies might want to gather information about your health for all sorts of reasons, however, besides advertising. For example, Google has a number of interests in health: it owns FitBit, is working with health insurance companies, has developed AI-powered skin-care tools and previously acquired patient data from the UK National Health Service.

Information about your health might also be passed on to other organisations. The end result could be a health insurer inflating your premiums, or an employer changing its hiring or sick leave policies.

CIS’s research demonstrates rampant sharing of health data with advertisers. To visualise this better, let’s play the role of a hypothetical online healthcare user in India and see what advertisers find out from day-to-day healthcare queries. 

All of the examples and screenshots below are taken directly from CIS’s research. Read the Technical Annex here.

A day in the life: using digital health services in India

You wake up in the morning with pain in your abdomen, and so you search online to see what might be wrong. You find some information about abdominal pain on the Apollo Clinic site:

Apollo Clinic website screenshot
  • The page URL - which includes the name of the selected symptom - is shared with Google, LinkedIn, and Facebook (see Technical Annex); a sketch of what such a request can look like follows below.
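
To give a sense of the mechanics: tracking pixels embedded in a page typically report the full page URL as a query parameter. The Facebook Pixel, for instance, sends the current page address in its dl ("document location") parameter, so an intercepted request can look something like this (the values below are hypothetical, not taken from CIS's captures):

    https://www.facebook.com/tr/?id=<pixel-id>&ev=PageView&dl=https%3A%2F%2Fclinic.example%2Fsymptoms%2Fabdominal-pain

Because the symptom name is part of the URL, the tracker learns it even though you never typed anything into a Facebook-owned page.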

After some reading you think there might be a problem with your kidneys. You do some more research about kidney disease and end up on Netmeds to learn more:

Netmeds website screenshot (circle added)

  • The medical condition searched for is sent to Google, Bing, Facebook, and Criteo (see Technical Annex).

Note that Criteo is a large French AdTech company that was fined €40 million - following a complaint PI filed in 2018 - for failing to ensure that the data it received from its 40,000+ website partners had been lawfully collected.

Finally, you decide that you need to take a urine test and arrange one through MaxLab:

MaxLab website screenshot

You feel calmer and assure yourself that you’re dealing with the problem, unaware that these big companies are receiving this information, potentially to target you with ads later. Those ads might try to convince you that you need further treatment or medication, keeping your concerns at the front of your mind and increasing your worry about the state of your health. The sharing of your data may also result in some of the wider harms outlined above, especially if it is also made available to third parties such as insurers or law enforcement agencies. 

Two weeks later

The test results haven't come back yet, and as your worries start to grow, you decide to look for help with how to deal with your anxiety. Again, your first step is to look online, and you end up back on Netmeds:

Netmeds website screenshot (circle added)

PI has investigated mental health websites, and what they share with advertisers, in the past.

In 2020, we looked at Doctissimo - France's largest online health information provider. As a result of a complaint we made, France's Data Protection Authority (the CNIL) fined the company €380,000 for failing to obtain individuals' consent to the collection and use of their health data.

After learning more about anxiety, you decide to book a virtual consultation with Practo:

 

Practo website screenshot
  • When looking for a doctor, the chosen medical speciality and the exact GPS location of the user are shared with Google and Facebook.
  • When trying to book an appointment, the doctor’s last name, medical specialisation, appointment time and appointment cost are also shared with Google and Facebook (see Technical Annex).

You receive a digital prescription from Practo for a drug and decide to buy it from Netmeds’ online pharmacy:

Netmeds website screenshot
  • Even before clicking “Add to Cart”, the name of the drug is sent to Google, Bing, and Facebook.
  • Once added to cart, the name of the drug is again sent to Google, Bing, Facebook, and this time Criteo.
  • Facebook and Google get the cost and quantity of the drug added to the cart, as well as the medical condition that led you to the drug (see Technical Annex). A sketch of what such an event can look like follows below.
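
E-commerce tracking events typically carry structured parameters on top of the page URL. An "add to cart" event reported to the Facebook Pixel, for example, can include the product name, value and currency as cd[...] custom-data parameters, so a request might look roughly like this (hypothetical values, not taken from CIS's captures):

    https://www.facebook.com/tr/?id=<pixel-id>&ev=AddToCart&cd[content_name]=<drug name>&cd[value]=120&cd[currency]=INR

A single parameter like content_name is enough to tell the advertiser exactly which medication you are buying.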

Online advertisers are not only able to predict when your mood is low; they can even actively bid to show ads to people classified as depressed. Because of this, you might start to see adverts around mental health following you around in unrelated apps on your phone and in your web browser.

Two of the serious side effects of the drug you buy are losing/gaining weight and changes in your period. Just to make sure, you install an app to track your periods (Maya) and one for your weight (Healthify):

Healthify app screenshot
  • Every action you undertake in the Healthify app - signing up, adding a workout activity, tracking food, or completing an objective - is shared with Facebook.
  • During sign-up, information such as email address, name, age, gender, weight and medical conditions is shared with CleverTap and Intercom.
  • Additionally, your email address, name, age, gender and weight are shared with Google (see Technical Annex).

Maya has been the subject of PI investigations in the past. Both Facebook and Google are notified when you launch the app, and Google gets analytics for any page you visit on it. The companies are also notified when a user adds a symptom, mood, or birth control method (though they are not told the specific symptom or method).

The Maya app also has a section for users to record when they last had sex and whether it was protected or unprotected. While this feature is not mandatory, its use is also sent to Google (see Technical Annex).

The true cost of sharing your health data

This hypothetical user’s story begins to illustrate the true cost of sharing health data with some online services CIS analysed. A summary of all apps covered and their data collection practices is shown in the table below (see Technical Annex): 

Overview of CIS research findings

The overall picture painted is that our data is shared, often without sufficient transparency or consent, in ways that are likely to be a violation of our privacy expectations. The apps covered in CIS’s research have more than 10 million installations. This means that intimate details of millions of people in India have been shared with third parties.

Apps and websites should be transparent about when they share our data. Indeed, transparency is a key component in data protection legal frameworks.

While the privacy policies of the services covered in this report do usually mention the processing of health data and its sharing with third parties (e.g. Netmeds Privacy Policy), this does not overcome the “information asymmetry” between the providers and the users of online healthcare services. Those running the show know much more about the collection and use of private health data than those relying on the services. Reducing that asymmetry will mean people are able to make more informed choices about the online services they use.


When it comes to sensitive health data, best practice is to be very upfront and explicit about data sharing - or, better still, not to share the data at all. Users should not have to dig around to find out whether their sensitive health data may be shared with mysterious and gargantuan companies.

Burying this information in privacy policies - which the vast majority of users will neither read nor grasp the wider implications of - is not good enough to meet the requirement for explicit consent. Explicit consent is a legal requirement under many data protection laws around the world for the processing and sharing of sensitive health data. As a minimum, such consent must be explicit, freely given, unambiguous and reflect an individual's informed choice.

Using an app to manage intimate and sensitive aspects of our lives should not come at the expense of our privacy. 

India’s 2023 Digital Personal Data Protection Act

India's 2023 Digital Personal Data Protection Act ("DPDP Act") goes some way towards placing these requirements on the apps and websites CIS has analysed, but there is more work to be done. The DPDP Act has not corrected the problems CIS identified when the law was first proposed, and it fails to give special protection to sensitive health-related data. And because the DPDP Act doesn't require privacy notices to indicate the recipients of data, we don't think consent based on those notices can ever be fully informed. Given the push towards the digitisation of healthcare, better and more effective protection of health data is essential.

Websites and apps should not only comply with their legal obligations, but also live up to the trust that users have placed in them when deciding to use their service to help them manage and improve their health. In order to guide best practices, we make the following recommendations:

Recommendations for online health websites and apps

Any app or website that offers a service designed to help someone manage their health should of course comply with all relevant data protection laws, such as the EU General Data Protection Regulation (GDPR), the US Health Insurance Portability and Accountability Act (HIPAA) and India's Digital Personal Data Protection Act.

Even where not required to do so by law, we recommend that such apps should always:

  • Undertake in-depth data protection and human rights impact assessments that consider the potential harms users could experience.
  • Limit the data collected: only data that is necessary for the stated purpose of the website or app should be collected.
  • Limit data sharing to only what is strictly necessary for the purpose of providing the services offered. This requires checking the default data-sharing settings of tools provided by third parties, such as software development kits (SDKs) or third-party data management tools (an example follows this list).
  • Obtain consent that is free, unambiguous, informed and explicit before collecting any data. This entails providing clear and transparent information upfront, not buried in a privacy policy, about what data will be collected, who will have access to it, and how it will be used.
  • Only share data with third parties if users actively and explicitly opt-in to that sharing.
  • Never share health-related data for the purposes of advertising.
  • Give users access to and control of their data. People should be able to access, modify, and/or delete their personal data from the app or website.
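
On the developer side, much of this sharing happens silently through SDK defaults. As an illustration - this is our own sketch, not a finding from CIS's research - the Facebook and Firebase Android SDKs both log data automatically unless the app's manifest opts out; the entries below use their documented opt-out flags:

    <!-- AndroidManifest.xml: opting out of two common SDK defaults -->
    <application>
        <!-- Facebook SDK: stop automatic logging of app events -->
        <meta-data
            android:name="com.facebook.sdk.AutoLogAppEventsEnabled"
            android:value="false" />
        <!-- Firebase: keep Analytics collection off unless explicitly enabled -->
        <meta-data
            android:name="firebase_analytics_collection_enabled"
            android:value="false" />
    </application>

Auditing defaults like these, rather than shipping them unchanged, is a concrete first step towards the data minimisation recommended above.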

By following these recommendations, healthcare apps and websites can prioritise the privacy and security of personal data, build trust with users, and promote responsible data handling practices. It can be financially viable to offer a service for free using non-targeted advertising.

Recommendations for users


Even if existing privacy settings and tools will not fully counter the kind of tracking described in this report, we recommend that people make full use of them.

To learn more about protecting yourself from online tracking, see our guides below. 

It's also always worth taking a moment to look through a privacy policy or do some research into a company before installing and using their apps if you can.

Our guides

Android

  • Delete your advertising ID, or reset it regularly.
    • Reset can be found on most Android devices under Settings > Google > Ads > Reset Advertising ID.
    • On newer Android versions, there is also the option to delete your advertising ID. This can be found under Settings > Google > Ads > Delete Advertising ID.
  • Opt out of ad personalisation.
    • This can be found on most Android devices under Settings > Google > Ads > Opt out of personalized advertising.

Apple

  • Opt out of cross-app tracking.
    • Go to Settings > Privacy > Tracking and turn “Allow Apps to Request to Track” OFF.

App Permissions

  • Regularly review the permissions that you have given to different apps and limit them to what is strictly necessary for the way in which you want to use that app.
  • For an individual app, the permissions you have granted can be found on most Android devices by long-pressing on the app, clicking info, then Permissions.

Other useful guides