Competition and Data
In the digital economy there is a trend towards corporate concentration. This is true for social media platforms, search engines, smart phone operating systems, digital entertainment, or online retailers. Meanwhile, the way in which market dominance is measured traditionally does not always capture the extent of their control: firstly, their products and services are often “free” and secondly, it’s often not clear in which “markets” and “sectors” these companies operate, since there is so much horizontal and vertical integration. These factors pose serious challenges to antitrust and competition authorities. They also threaten privacy and data protection.
Facebook, Google, Apple, Netflix, and Amazon are often collectively referred to as FANG or GAFA and, together with Microsoft and a few others, dominate sectors of the global digital market, such as social media, search, and video content. Their dominance often cuts across different markets.
For example, around 80% of smart mobile devices globally run on the Android operating system, controlled by Google. Nearly all of the remaining 20% run on Apple's iOS. Facebook's global social media market share is over 60% on all devices (and up to 70% on mobile devices). Google dominates the global market for general internet search services with over 85% of the market.
With the public spotlight on Facebook as a result of the Cambridge Analytica scandal, scrutiny of its market dominance has increased, and so have the calls to break up the company. But the trend towards concentration is also increasingly true of the less public-facing companies that make up the hidden digital ecosystem, such as data brokers.
It is also true for companies other than those based in Silicon Valley. In China, for example, Alibaba (e-commerce), Tencent (social media) and Baidu (online search) dominate the national market and are moving beyond China.
This trend is fuelled by the increasing reliance of many sectors of the economy on data, particularly personal data. Personal data is increasingly valuable in the digital economy. Why, for example, would Facebook pay 21.8 billion USD for a company like WhatsApp, which was losing money?
The value of personal data increases as more and more data is combined with it, and this incentivises companies to pursue business strategies aimed at collecting as much data as possible. With the development and integration of artificial intelligence technologies, it is likely that users’ data will become even more important for these companies, since their data is an essential input to train AI models. And given the growing importance of data across all sectors of the economy, the concentration is likely to continue and expand to other markets.
The effects of this concentration of power are significant, and they are not limited to online and offline privacy. These companies can act as gatekeepers, for example by regulating how we can access information on the web, including in some cases (e.g. Google, Apple, Amazon) which applications we can install on our devices. And they can track and profile us across devices to predict and influence our behaviour. This no longer affects 'just' the realm of digital advertising. Increasingly, corporate powers encroach on the functioning of democracy and have profound societal impacts.
For example, profiling is increasingly used by political parties to identify and target potential supporters. While data-driven campaigning has been deployed for decades, the granularity of the data that is available and the complexity of the data processing are something new. Recently, the practice of targeting voters by profiling them and exploiting their personal data has raised concerns (and in some cases led to the opening of investigations) about political manipulation and the impact of such profiling on the democratic process in countries such as the US, the UK, and Kenya.
Such manipulation places a handful of corporate actors on the frontline of preserving our democracies. We need to question whether this power should be wielded by so few companies.
Policy makers and regulators have been very slow to react to this concentration of corporate power fuelled by our personal data. Recent scandals, such as the Facebook/Cambridge Analytica affair, have led to some soul searching by politicians and renewed calls to curb the powers of digital corporate giants, but the urgency of the moment can also lead to a rush into ill-informed or narrow-minded bills or policies.
Significantly, antitrust regulators in the US and the EU and several competition experts have begun to reflect on the need to reform or at least re-interpret laws and policies to address the challenges posed by the exploitation of data by big corporations. Those challenges include the exclusion of new actors from the market due to dominant platforms refusing to allow users to move their data to new competitors, the leveraging of personal data across different platforms owned by the same company (Facebook owns Instagram and Whatsapp, for instance), and the lack of transparency on the data that is observed and inferred from users, which leads to unfair competitive advantages. The lack of transparency exists at three levels: users’ level, process level, and market level. Opacity at each level has implications for competition, consumer protection, and privacy.
Privacy harms that result from lack of competition
Information and power asymmetries between companies and users have significant negative implications for competition. Dominant companies are able to impose unfair conditions on users. Because users’ data is a valuable commodity (a “proxy for price”), these unfair conditions tend to include excessive and exploitative collection and processing of users’ personal data.
This lack of choice is caused by a combination of factors: the significant role of network effects in these markets - where the utility of a service increases the more people use it, meaning that entrants require a 'critical mass' of users in order to compete, while users may only adopt a competing service once it has been generally adopted; the lock-in of users; the lack of alternatives; and the imposition of terms and conditions with poor privacy safeguards. One example is WhatsApp forcing its users to accept new terms and conditions that led to the sharing of personal data with Facebook.
Companies such as Google, Facebook, Twitter and others continue to impose terms and conditions on users which allow them to collect, analyse and share personal data in ways that people do not understand (or cannot genuinely consent to). For example, an Associated Press investigation found that many Google services on Android devices and iPhones store people's location data even when they have opted out of such tracking through their phone's privacy settings. Dominant companies also continue to find ways to obtain yet more data in order to maintain and expand their control of the market. For example, Alphabet Inc.'s Google and Mastercard made a deal giving Google access to Mastercard transactions, strengthening Google's market power in digital advertising and potentially excluding competitors. The deal also raises complex competition questions related to the vertical integration of personal data.
These privacy harms are directly caused by the business models of companies in dominant positions, which increasingly rely on the availability of users’ data, and can impose excessive collection of data on people who have become “captive users” to their providers, given their lack of genuine choice.
Lack of competition on privacy
People demand both confidentiality and security of their digital communications and the protection of their personal data. In a competitive market, it should be expected that the level of privacy and data protection offered to individuals would be subject to genuine competition, i.e. companies would compete to offer privacy friendly services.
Instead, in digital markets characterised by increased corporate concentration, companies in a dominant position have no incentive to adopt business models and practices that enhance people's privacy. Rather, they may seek to exclude privacy-enhancing players from any of the markets where they can exert market power.
For example, Google's ban of the mobile ad-blocker Disconnect (among other services) from the Google Play Store led to a complaint before the European anti-trust authority, which recently imposed a record $5.1 billion fine on the company.
Companies that exploit personal data often view privacy and data protection legislation as a threat to their business models. They fought against strong data protection provisions in the EU General Data Protection Regulation, and are still fighting against data protection reforms in the USA and elsewhere. Tellingly, their annual reports to shareholders and investors paint data protection laws as harmful to their businesses.
For example, Alphabet Inc.’s 2017 Annual Report notes in relation to data protection regulation that “these legislative and regulatory proposals […] could, in addition to the possibility of fines, result in an order requiring that we change our data practices, which could have an adverse effect on our business and results of operations. Complying with these various laws could cause us to incur substantial costs or require us to change our business practices in a manner adverse to our business.”
It is not that users don't care about privacy. Rather, it is the market conditions that prevent companies from competing to offer the most privacy-friendly services. More than a self-fulfilling prophecy, it is a reality imposed on users.
Privacy as a right – data is more than a mere ‘economic asset’
Privacy is a fundamental human right, recognised by numerous international human rights instruments, including Article 17 of the International Covenant on Civil and Political Rights, which has been ratified by 172 countries around the world. As a result, governments have obligations to enact laws that effectively protect individuals' personal data.
As individuals have a fundamental right to the protection of their personal data, personal data "cannot be conceived as a mere economic asset". Privacy and data protection laws must be applied in order to limit corporate activities which infringe individuals' privacy. This applies even when such activities are not technically anti-competitive.
This is not such a new or revolutionary proposition. For example, it is already recognised in EU competition law that the protection of media pluralism can and should trump merely economic considerations in order to avoid excessive media concentration, with its negative effects on democracy as a whole.
Similar considerations should apply in relation to the protection of privacy, particularly as the effects of tracking individuals, profiling, and harvesting of personal data have significant implications on democratic institutions.
What are antitrust regulators doing?
When assessing market power, competition authorities have tended to focus on price and outputs, giving little to no consideration to other factors affecting competition, such as consumer welfare, quality, innovation, and privacy, as well as the different relevant markets at play in these cases (e.g. advertising, social media, search engines, online entertainment, etc.)
This narrow approach misses the increasingly important competition implications of the collection of personal data, particularly when done at scale. In turn, it fails to take into consideration the multiple effects that gaining personal data has on certain types of digital services. The network effects of online markets can magnify the significance of gaining or losing a user, because personal data at scale is essential to the functioning of certain algorithms, such as those that underpin the effectiveness of targeted advertising.
An example of this narrow approach was the European Commission's review of the proposed merger between Facebook and WhatsApp in 2014. The Commission noted that Facebook and WhatsApp had different business models, including notably different privacy policies, with WhatsApp's strong commitment to user privacy standing in contrast to Facebook's ubiquitous tracking and profiling practices. However, the Commission wrongly assumed that consumers would easily detect any degradation of privacy protection after the merger, and it failed to consider how the merger increased entry barriers for new competitors, given the strong data-driven network effects that Facebook indeed went on to leverage in several ways to drive its growth.
Two years after the merger, Facebook was fined for misleading the EU during the merger probe: it had told the regulators it could not combine WhatsApp data with its other services, but moved to do so shortly after the deal was finalised. The warning signs should have been picked up during the merger review. After all, why would Facebook spend over 21 billion USD on a company operating in a market which, according to the Commission's review, had low entry barriers, little likelihood of market power from network effects, and only a small data advantage?
Since then, there are signs that some competition authorities are recognising the need to consider privacy and data protection implications. For example, the German competition authority has noted that “where access to personal data of users is essential for the market position of a company [here, Facebook], the question of how that company handles the personal data of its users is no longer only relevant for data protection authorities. It becomes a relevant question for the competition authorities, too.”
Unsurprisingly, data protection authorities have warned of the privacy threats posed by increased market concentration, and they have argued for the need to integrate data protection into the assessment of potential abuses of dominant position and of mergers of companies operating in the digital market.
Meanwhile, antitrust regulators in the US and the EU have recently opened consultations to consider the challenges posed by the exploitation of data by big corporations. Privacy International submitted initial views to the US Federal Trade Commission.