We don’t want to sell our data, we want data rights!


Dear will.i.am,

We saw your piece in The Economist and were very excited to learn that you care about privacy as much as we do. At PI we expose government and corporate bad behaviours, we disrupt their plans, and we identify a hopeful path forward.

That’s why we very much agree with you that people need much more protection, transparency and control over their personal data. Cheers for: “I want to have it clearly explained in plain language who has access to my camera, to my photos, who’s listening to my microphone, and who gets to use this information.” Just like you, we also believe that people urgently need real choice and the ability to restrict how others use and abuse their data. 

But we also think that putting a price tag on people’s data is a very bad idea. Monetization and personal data just don’t go well together!

  • Data rights offer something that property rights do not: governance. 

Data rights give people authority over their data. These rights can be quite powerful: they offer the right to access, to change, to move or to delete data; and the right to know who's collecting it, where it is, where it's going, who has access to it, and for what purposes.

We just used the expression “your data”, but what we really mean is any data that can be linked to a unique individual. That includes the data you knowingly generate, such as the photos or posts on your social media accounts, but also data you indirectly generate and that is automatically collected about you, like your location history and your browsing history. It also includes information about you that has been derived, inferred or predicted from other sources. For example, data rights allowed one of our staff members to get access to all the data that an online tracking company had collected from one of the browsers they used. This included data about the websites they had visited, but also the many things this company predicted about them: their gender, their income, their number of children, as well as the many so-called “consumer segments” that other companies had placed them in: their shopping habits, their interests, and even (wrongly!) how much alcohol they consume at home.

Data rights offer a system of control and protection that is much more comprehensive than ownership, and these rights continue to exist even after you share your data with others. They apply to data that others collect about you with or without your knowledge, and they also apply to the insights and conclusions that others draw about you.

As an analogy, think of the data you generate as forming your informational body. Just as you control your own physical body, you want control and authority over your informational body: the autonomy to decide what to do with it. You don't want to be financially dependent on companies paying you for your data body, just as you wouldn't want to sell a kidney for money.

  • Let's not get confused about what we want to demand: people no longer believe in "free" services, and they want an end to exploitative data processing systems.

You are right, though: many people feel helpless in fighting “the data monarchs”, as you call them. People have lost control over the data they generate, and they often don't know what others collect about them, how it is used, and with whom it is shared.

That's why Privacy International filed complaints against data brokers, ad tech and credit scoring companies. It's also why we uncovered that mobile apps automatically send data to Facebook without your knowledge, regardless of whether you have an account.

You’re demanding that data ownership become a human right for everyone, but it’s important to note that both privacy and data protection are already fundamental rights in the European Union. Data rights are at the very core of data protection regimes that already exist around the world, but as is so common in many domains of law and tech, not all of these regimes are strong enough to protect people from new threats. That’s why the EU (and thereby also the UK) adopted a new data protection law that came into effect in May 2018. This new law isn’t perfect. Nonetheless, it’s one of the strongest data protection laws in the world, and in many ways we’re standing at a crucial crossroads: now is the time to make sure that these rules are meaningfully and fiercely enforced, so that those who exploit our data actually comply with them.

  • A different take on addressing the data monopolies

European privacy rules offer mechanisms for addressing data monopolies. One of them is the right to data portability. If users can easily pull their data from one company and move it to another, they won’t get trapped in the walled data farms of the big corporations, and more innovation and healthy competition can emerge.

Europe is also discussing a legislative proposal (the ePrivacy Regulation) that could boost the GDPR and increase individual protection, provided that big industry’s efforts to weaken it do not prevail. Strong privacy rules, together with a modern, reformed antitrust framework, provide a better foundation for addressing the “data monarchs”.

That's what we should be aiming for, that’s what we’re working on, and that’s why we feel that demands for data ownership and monetization, while well-intended, drive the conversation away from our preferred solutions. We already have a strong data rights system in place; let’s channel our efforts into making it easy to use and widely adopted before reaching for free market rhetoric as a universal panacea.

  • We’re sceptical that the data market will sort itself out

We also have some very practical concerns that make us sceptical about data monetization. Your individual data is actually not that valuable. While the entire data market might be worth $3trn, as you quote in your piece, it is access to huge aggregated datasets that is valuable.

In fact, the Cambridge Analytica scandal is a perfect example of the kinds of inequalities that this introduces.

According to records of the transaction, Cambridge Analytica paid about US$0.75 per US voter profile. Each profile contained at least the following data points: forename, surname, gender and location, along with four GS-modelled components: “big five personality scores” according to OCEAN (openness, conscientiousness, extraversion, agreeableness, and neuroticism); a Republican party support score; a political involvement/enthusiasm score; and a political volatility score.

US$0.75 is a small amount for such comprehensive, intimate data, especially compared to the outrage and distress that many people felt when they learned that the company had used their data to meddle in elections.

Fundamental data rights, not a hypothetical data market, enabled the UK regulator to issue significant fines against Facebook and a number of other actors that were involved in the scandal. Data rights are also what allowed people outside the EU to investigate the scandal and hold the company to account. For instance, David Carroll, an associate professor at Parsons School of Design in New York, successfully sued Cambridge Analytica and its parent company SCL Elections Ltd to recover his data and reveal its source. His motivation is very similar to what you are asking for in your article: Carroll wants to “ensure that my personal data was not used for purposes I consider unsettling or unlawful”. The only reason this lawsuit by an American is possible is that both Cambridge Analytica and SCL Elections were based in London, and thus subject to European data protection laws.

So much remains to be done. As powerful as data rights are, they are not a silver bullet. For instance, market dominance and other distortions are a growing concern, and we’re encouraging regulators to take action. But data rights are fundamental to redressing the imbalance of power. Why don’t we join forces to ensure that everyone around the world can enjoy strong and enforceable rights over their data? Let’s also make sure that governments don’t abuse their powers or misuse privacy to hide in secrecy. Let’s build a world where people are back in charge, where individuals, journalists and other watchdogs master these powerful tools to hold those who exploit our data to account.

This letter was written by Privacy International Mozilla Fellow Valentina Pavel.

Photo credit: Owni /-), Data money, CC BY NC

A previous version of the piece implied that Cambridge Analytica had been fined for its involvement in this scandal. The piece was updated on 7.02.2019 to make the text less ambiguous. The company has been fined for failing to respond to an access request by the Information Commissioner’s Office (ICO). SCL Elections also pleaded guilty to a charge of breaching the Data Protection Act by failing to comply with an enforcement notice from the ICO, at a hearing at Hendon Magistrates’ Court. There are ongoing investigative matters.