No Body's Business But Mine: How Menstruation Apps Are Sharing Your Data
PI undertook dynamic analysis of various menstruation apps using its own data interception environment to look at the data they share with Facebook.
- Research highlights that the menstruation apps we have exposed raise serious concerns when it comes to their compliance with their GDPR obligations, especially around consent and transparency.
In December 2018, Privacy International exposed the dubious practices of some of the most popular apps in the world.
Out of the 36 apps we tested, we found that 61% automatically transfer data to Facebook the moment a user opens the app. This happens whether the user has a Facebook account or not, and whether they are logged into Facebook or not. We also found that some of those apps routinely send Facebook incredibly detailed and sometimes sensitive personal data. Again, it didn’t matter if people were logged out of Facebook or didn’t have an account.
This sharing happens through the Facebook Software Development Kit (SDK), a set of software development tools that can be used to develop apps for a specific operating system. In an email to us on 29 December 2018, Facebook described how their product works:
“Developers can receive analytics that allow them to understand what the audience of their app enjoys and improve their apps over time. Developers may also use Facebook services to monetise their apps through Facebook Audience Network. Subject to that Facebook user's prior consent, Facebook may also use this data to provide that user with more personalised ads.”
Facebook routinely receives data about users, non-users and logged-out users outside its platform through Facebook Business Tools. For instance, any website that has integrated a Facebook “Like” button or a tracking pixel automatically sends data to Facebook.
Facebook's SDK for Android allows app developers to integrate their apps with Facebook’s platform and contains a number of core components: Analytics, Ads, Login, Account Kit, Share, Graph API, App Events and App Links. For example, using Facebook's SDK allows apps to use a "Login with Facebook" based authentication, meaning users can log in using their Facebook account.
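The automatic transfer described above happens over ordinary HTTPS requests to Facebook's Graph API. As a rough illustration of the shape of such an “app events” request, here is a hedged Python sketch; the endpoint path follows Facebook's App Events endpoint, but the app ID, advertising ID and event payload below are hypothetical:

```python
import json
from urllib.parse import urlencode

def build_app_event_request(app_id, advertiser_id, events):
    """Sketch of an SDK-style App Events POST to graph.facebook.com.

    Field names mirror what such traffic typically contains; the
    values used here are purely illustrative."""
    body = urlencode({
        "event": "CUSTOM_APP_EVENTS",
        "advertiser_id": advertiser_id,       # the device's resettable advertising ID
        "application_tracking_enabled": "true",
        "custom_events": json.dumps(events),  # the interactions being reported
    })
    url = f"https://graph.facebook.com/{app_id}/activities"
    return url, body

# Hypothetical example: reporting that the app was opened.
url, body = build_app_event_request(
    "123456789",
    "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",
    [{"_eventName": "fb_mobile_activate_app"}],
)
print(url)
```

Note that a request of this shape carries the device's advertising ID regardless of whether the user has a Facebook account, which is what makes the reported events linkable to a device.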
Following the revelations, two thirds of the companies we exposed have updated their apps. This year, we decided to follow the same methodology to look into the apps we share some of our most sensitive data with: menstruation apps.
Menstruation apps are not just concerned with your menstruation cycles. As our partner organisation Coding Rights showed in their research, Menstruapps – How to Turn Your Period Into Money (For Others), they collect information about your health, your sexual life, your mood and more – all in exchange for telling you what day of the month you’re most fertile or the date of your next period. In fact, the data you share with your menstruation app is probably information you would not share with others.
We therefore wanted to make sure that they keep this information to themselves, rather than sharing it with other companies. We initially looked at the most popular apps: Period Tracker by Leap Fitness Group; Period Tracker Flo by Flo Health, Inc.; Period Tracker by Simple Design Ltd.; and Clue Period Tracker by Biowink.
We did a dynamic analysis of the apps using our data interception environment (available here; see Annex 1 for methodology) to look at the data that those apps share with Facebook. We were pleased to find that none of those apps shared data with Facebook, including Clue, which changed its practices after we called them out in our first round of checks.
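Our interception environment captures the full HTTPS traffic each app generates while we use it. A minimal sketch of the kind of post-processing involved, assuming the captured requests are available as a list of URLs (the domain list and URLs below are illustrative, not our actual tooling):

```python
from urllib.parse import urlparse

# Illustrative set of third-party endpoints to look for in captured traffic.
THIRD_PARTY_DOMAINS = {"graph.facebook.com", "facebook.com"}

def third_party_requests(captured_urls):
    """Return the captured URLs whose host is (or is under) a listed domain."""
    flagged = []
    for url in captured_urls:
        host = urlparse(url).hostname or ""
        if any(host == d or host.endswith("." + d) for d in THIRD_PARTY_DOMAINS):
            flagged.append(url)
    return flagged

# Hypothetical capture: one first-party sync call, one Graph API call.
capture = [
    "https://api.example-period-app.com/v1/sync",
    "https://graph.facebook.com/v3.2/123456789/activities",
]
print(third_party_requests(capture))  # only the Graph API request is flagged
```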
But what about other apps? The ones that may not be the biggest players but still boast millions of users? We set out to look at apps we noticed were popular in different parts of the world and decided to look at Maya by Plackal Tech, MIA by Mobapp Development Limited, My Period Tracker by Linchpin Health, Ovulation Calculator by Pinkbird, Period Tracker by GP International LLC and Mi Calendario by Grupo Familia.
Period Tracker by GP International LLC did not appear to share any data with Facebook. The other apps we looked at, on the other hand, turned out to be a little more indiscreet.
As we will expose in this report, Maya by Plackal Tech and MIA by Mobapp Development Limited conducted – at the time of the research – what we believe to be extensive sharing of sensitive personal data with third parties, including Facebook. However, we are pleased to note that, after we shared this report with Plackal Tech, they responded to our findings.
The full responses from all the companies we contacted and which responded to us are available in the annex.
Linchpin Health did not respond. MIA did not wish to have their response published.
Feeling anxious? Got lucky last night? Having some health issues? Tell Maya and they’ll let Facebook and others know (oh, and they’ll share your diary too…)
Medical data is among the most sensitive data one can collect. Confidentiality is at the heart of medical ethics, and countries that have data protection laws traditionally have a separate regime for health data, which is considered sensitive data. Thus, when Maya asks you to enter how you feel and offers suggestions of symptoms you might have - suggestions like blood pressure, swelling or acne - one would hope this data would be treated with extra care. But no: that information is shared with Facebook.
There is a reason why advertisers are so interested in your mood; understanding when a person is in a vulnerable state of mind means you can strategically target them. Knowing when a teenager is feeling low means an advertiser might try and sell them a food supplement that is supposed to make them feel strong and focused. Understanding people’s mood is an entry point for manipulating them. And that is all the more worrying in an age when Facebook is having so much impact on our democracies, as the Cambridge Analytica scandal revealed. Indeed, it is not just advertisers that will want to know how we feel; as elections approach, political parties may want to know if we feel anxious, stressed or excited so that they can adapt their narratives accordingly.
Like other menstruation apps, Maya is also gathering data about our intimate life - requesting information about when you have had sex and whether the intercourse was protected or not.
In their response to us Maya states:
“All data accessed by Maya are also essential to the proper functioning of the product. Predicting information pertaining to menstrual cycles is complex and dependent on thousands of variables.” (See Annex 2)
We understand that certain personal data is necessary to provide the service to users. It is hard to see, however, how whether you’ve had unprotected sex or not is relevant to predicting menstruation cycles.
Maya is not just asking you to click to enter information. It’s also encouraging you to enter your own notes and comments in sections like “Reminders” or in a diary-like section. Considering the nature of the app, we would expect the information users enter to be of a sensitive nature. As we conducted traffic analysis, we entered “something very sensitive entered here” in the diary section of the app to see what would happen. The result? What we wrote was shared with Facebook.
So far, we have highlighted what we perceive as the most sensitive data that Maya shares with Facebook. But it is worth remembering that it is not just your mood, medical data, sexual intercourse and personal notes that get shared with Facebook. In fact, it is every single interaction between you and the app: when you open the app, how you navigate through it, the dates of your menstruation cycle, and so on.
But as we did the traffic analysis, we noticed Facebook was not the only one getting that data. Everything was also shared with another third party that appeared as “wzrkt.com”.
The picture above shows what data sharing with wzrkt.com looks like. Here, we went to the “symptoms” section of the app and clicked on “diarrhea” and “nausea.”
So, who is “wzrkt.com”? Wzrkt stands for “Wizard Rocket”, the former name of a company now known as CleverTap. In their response to our report, CleverTap describe themselves as “a customer retention platform that helps consumer brands maximize user lifetime value, optimize key conversion metrics, and boost retention rates.”
“We may share Your information with our sponsors, and/or business partners. Your Information could be shared so that you may receive newsletters, offers, information about new services, and other information, if applicable. The information collected from You and other users may be analysed in different manners.”
Besides this general information, no other information is provided to users about the exact recipients with whom data might be shared, what this data could entail, or what these “different manners” are. Users are not even told whether this data is anonymised or not.
If you have unprotected sex, MIA will tell you what to do. And share it with Facebook and others
MIA Fem by Mobapp Development Limited (over 1 million downloads on Google Play) was the next app we looked at. Maya, MIA - different companies, similar names and – at the time of research – similar practices.
And, just as with Maya, Facebook is not the only third party that will get access to your data if you use MIA. Everything that is shared with Facebook is also shared with AppsFlyer. AppsFlyer is “a service that enables app owners to analyse and interpret the performance of their marketing efforts.” (cf. AppsFlyer’s response, Annex 6).
Before you start, MIA wants to know if you intend to use the app as a regular period tracker, or if you are trying to get pregnant and using it to maximise your chances. In practice, this makes no difference in terms of how you use the app or what the app has to offer. The big difference, of course, is for the advertisers. The moment you click on the icon to let the app know you are trying to get pregnant, you are immediately targeted with an ad for a premium version of the app to help you conceive. The information is also shared with Facebook.
The data of pregnant women is particularly valuable to advertisers: expecting parents are consumers who are likely to change their purchasing habits. In the US, for instance, an average person’s data is worth $0.10, while a pregnant woman’s is worth $1.50.
Like all menstruation apps, MIA starts by asking you for the date of your last period, the duration of your periods and the duration of your cycle. This is all shared with Facebook and AppsFlyer.
When you click on your “Personal feed”, you will be presented with a collection of articles that have been tailored for you based on what you have selected, and occasionally based on other information that MIA has inferred about you, like your menstruation cycle phase. For instance, we selected ‘masturbated’ in the section on sex and were recommended an article called “Masturbation: What You Want to Know But Are Ashamed to Ask.”
And this is how MIA gets to share your most intimate data with Facebook. See below how MIA shares information about your alcohol consumption, the ups and downs of your sex life, or when you experience cramps during your ovulation phase (you may not even know you are ovulating, but MIA has figured that out for you and is letting Facebook know).
Beyond the health and lifestyle questions that shape your “Personal feed,” MIA has a separate section for “Reminders” - by reminders, they mean reminders to take your birth control pill. By asking people to enter this data, MIA is, again, collecting medical data. Far from being treated with the utmost care, this data is shared with Facebook and AppsFlyer. Here, we entered “Name of my Pill” where MIA asked us to enter the name of our pill. In their response to these findings, AppsFlyer stated:
“In this case we have reached out to the app developer and reminded them of this and will work with them to ensure our services are not used to collect any such personal information.” (See Annex 6)
It should be underlined that the observations above refer to the practices of MIA at the time of the research, namely in May 2019.
What about the other apps?
We also looked at the following apps and found that they all informed Facebook when you open the app:
- My Period Tracker by Linchpin Health (over 1 million downloads on Google Play),
- Ovulation Calculator by Pinkbird (over 500,000 downloads on Google Play),
- Mi Calendario by Grupo Familia (over 1 million downloads on Google Play)
As we highlighted earlier, this alone reveals information that could potentially be used for advertising purposes, and it is all the more worrying that this happens without the users’ consent.
Mi Calendario by Grupo Familia was also using an outdated version of the Facebook SDK, which presents security concerns.
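An outdated SDK can carry known vulnerabilities that later releases have patched, which is why checking its version against a current baseline matters. A simple sketch of such a check (the version strings here are hypothetical, not the actual versions involved):

```python
def is_outdated(observed_version, minimum_safe_version):
    """Compare dotted version strings numerically, e.g. '4.34.0' < '5.0.0'."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(observed_version) < as_tuple(minimum_safe_version)

# Hypothetical baseline: any build below 5.0.0 would be flagged.
print(is_outdated("4.34.0", "5.0.0"))  # -> True
print(is_outdated("5.1.0", "5.0.0"))   # -> False
```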
What does the law say about all this?
When it comes to data protection, the key divide is whether an app is based in the European Union or offers services to users in the European Union, or whether it is based outside the European Union and not meant for users in the European Union. If you are in the EU, you are protected by the General Data Protection Regulation (GDPR).
Privacy International has been calling out the practices of companies that set different standards for their EU customers and their non-EU customers, as we believe everyone should benefit from the high standards of protection GDPR has set.
GDPR obliges data controllers (in this case, the company that owns the app) to provide adequate information to data subjects (the users) so that they are properly informed about the use of their personal data. In practice, this is mostly done through privacy policies, which provide some basic information to users regarding the possible uses, purposes and transfers, among other things, of people’s personal data. Those policies need to be written in concise, plain, understandable and user-friendly language. The problem, most of the time, is that these policies contain vague and generic wording or merely provide indicative or non-exhaustive lists of what the company can do with your data.

The GDPR obliges controllers to provide data subjects with information relating, at least, to: the contact details of the controller; the purposes and legal bases under which their personal data will be processed; the recipients to which their personal data will be disclosed, including third-country transfers; and basic information regarding the exercise of their data protection rights, such as the right to access their personal data, request their erasure or lodge a complaint with their regulator. This information needs to be provided at the point of collection of personal data from users.
Maya by Plackal Tech
Maya, like every app we have reviewed for this research, processes large amounts of personal data, including data relating to health, which could be deemed special category data (sensitive data) under EU data protection laws, as we highlighted before.
Plackal Tech also states that they “may also collect the precise location of your device when the app is running in the foreground or background”. They “may also derive your approximate location from your IP address”.
It is questionable whether this extensive data collection is strictly necessary for providing the service requested by users and, accordingly, raises a series of questions regarding the compatibility of these apps with EU data protection law. For example, the principle of data minimisation requires controllers to process the minimum amount of personal data that is necessary for providing the service.
MIA by Mobapp Development Limited
GDPR applies to MIA as the data controller is based in the EU (Cyprus) and the app is available for download on the Google Play Store UK. In other words, as EU users located in the UK are able to download and use the app, MIA seems to be offering its services to EU users and therefore needs to abide by its GDPR obligations.
My Period Tracker by Linchpin Health
The wide reach of the apps that our research has looked at might mean that intimate details of the private lives of millions of users across the world are shared with Facebook and other third parties without those users’ free, unambiguous and informed consent (or, in the case of special-category (sensitive) personal data, such as data relating to a user's health or sex life, their explicit consent).
Our research highlights that the apps we have exposed raise serious concerns when it comes to their compliance with their GDPR obligations, especially around consent and transparency. Indeed, EU data protection law seeks to ensure that users maintain control over their personal data at all times and are aware of the exact and specific purposes for which these data might be used by controllers, namely companies. The GDPR applies equally to controllers that process data within the EU/EEA and to controllers that might be based outside the EU/EEA but still target EU users with their services.
This raises interesting points. First, even when GDPR applies, for example in EU/EEA countries, this does not mean that controllers abide by the regulation. As our research illustrates, apps targeting EU users need to comply with, among others, strict consent and transparency obligations regarding the processing of personal data, but they often fail to do so. This should lead to a call for stronger enforcement: EU data protection laws have long been in place; what is needed is effective investigation and enforcement by regulators.
Secondly, while apps located in Europe might be failing to meet their GDPR obligations, EU users are at least provided with an appropriate right of redress: they can raise the issue with the controller directly, file a complaint with their national supervisory authority, or even bring a case against the controller before national courts. The same is not true for users based in countries without proper data protection laws, or with data protection laws that lack effective enforcement. The practices highlighted by this research are examples of abuse that should prompt law-makers and regulators to uphold users’ rights.
Companies should also not escape their responsibilities. Facebook has announced it will launch a tool that will enable its users to stop apps and businesses sharing their data with the social network, which will address the problem for some users. However, it is insufficient, as it will fail to protect app users who do not have a Facebook profile.
The responsibility should not be on users to worry about what they are sharing with the apps they have chosen. The responsibility should be on the companies to comply with their legal obligations and live up to the trust that users will have placed in them when deciding to use their service. In order to guide best practices, we are suggesting the following recommendations:
Recommendations for menstruation apps
- Undertake in-depth privacy and risk impact assessments when designing their applications with consideration for their users and the potential harms they could experience.
- Limit the data collected: many menstruation apps appear to request superfluous data - including sensitive personal data - to build a profile of their users. Only data that is necessary for the purpose the app states should be collected.
- Limit data sharing to what is strictly necessary for the purpose of providing the service. This requires checking the default data-sharing settings of tools provided by third parties, such as Facebook's SDK or third-party data management tools.
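As a concrete example of checking such defaults: Facebook's Android SDK documents manifest settings that disable its automatic event logging and advertiser-ID collection. A sketch of the relevant AndroidManifest.xml entries (the flag names come from Facebook's developer documentation; the surrounding manifest is omitted):

```xml
<application>
    <!-- Turn off the SDK's automatic logging of app events -->
    <meta-data
        android:name="com.facebook.sdk.AutoLogAppEventsEnabled"
        android:value="false" />
    <!-- Stop the SDK from collecting the device's advertising ID -->
    <meta-data
        android:name="com.facebook.sdk.AdvertiserIDCollectionEnabled"
        android:value="false" />
</application>
```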
Recommendations for non-EU governments
- Implement effective data protection legislation which complies with internationally recognised data protection standards and aligns with their national and international human rights obligations to protect people's dignity and autonomy, in order to ensure that the processing of personal data by public and private entities is effectively regulated.
Recommendations for Facebook
- Facebook needs to better explain how it uses the data that it automatically receives through the Facebook SDK, how long the data is stored and if it is being shared.
- Facebook should do more to offer products and services that make it as easy as possible for developers to protect the privacy of their users by design and by default. For instance, the default implementation of the SDK should not automatically transmit data the second an app is launched.
- Facebook should take steps to make it easier for people to exercise their data rights on all personal data that Facebook stores, whether they have a Facebook account or not.
Recommendations for regulators
- Ensure data protection laws are properly enforced.
- Give extra scrutiny to apps that under the pretence of necessity disproportionately collect vast amounts of health data (including sexual health data) and share it without the explicit consent of users.
- Ensure app developers abide by the transparency requirements of EU data protection laws.
- Make sure users maintain control over their data and can meaningfully exercise their data protection rights.
Recommendations for users
Even if they will not affect the kind of tracking that we have described in this report, we recommend that people make full use of all existing privacy settings, including:
- Resetting your advertising ID regularly. This can be found on most Android devices under Settings > Google > Ads > Reset advertising ID.
- Limiting ad personalisation by opting out of it in the Android settings. This can be found on most Android devices under Settings > Google > Ads > Opt out of Ads Personalisation.
- Regularly reviewing the permissions you have given to different apps and limiting them to what is strictly necessary for the way in which you want to use that app. This can be found on most Android devices under Settings > Apps or Application Manager (depending on your device, this may look different) > tap the app you want to review > Permissions.