PI calls for urgent action following UK Parliament 'fake news' report
Privacy International welcomes the focus on data and privacy contained in the final report by the UK House of Commons Digital, Culture, Media and Sport Committee (DCMS) on Disinformation and ‘fake news’. Beyond our control, companies and political parties have banded together to exploit our data. This report establishes essential steps to remedy this downward spiral. Freedom of expression and the right to political participation, including the right to form views, are essential parts of the democratic process; the focus should therefore not be on measures that risk impinging on these rights, but on limiting the exploitation of data.
Privacy International supports a number of demands in the DCMS report and will work towards:
- Reviewing and strengthening electoral law - in particular the powers of the Electoral Commission.
- Applying competition law to address abuse of power, starting with the comprehensive audit by the CMA.
- Strengthening of current data protection law as it relates to political campaigning, through the implementation of a Code of Practice on political campaigning and restrictions on the exemption for political parties.
- Pressuring regulators to enforce the law, including to make clear that inferred data is personal data.
- Pressuring data brokers and ad tech companies to comply with the law by uncovering the ways in which the hidden data ecosystem routinely exploits people's data.
Urgent reform is needed
The DCMS report found that "electoral law is not fit for purpose and needs to be changed to reflect changes in campaigning techniques", which include online microtargeting made possible by profiling. It rightly recommends that there should be total transparency in political campaigning, and that the Electoral Commission should have stronger powers (and higher fines at its disposal) to investigate and sanction abuses.
As Privacy International has often noted in recent years (including during the 2017 election in Kenya), while data-driven campaigning has been deployed for decades, the granularity of data that is available and the complexity of the data processing is new and raises serious concerns about political manipulation and the impact of such profiling on democratic processes.
Facebook's dominance and data exploitation
It is important that the report recognises the role of personal data and criticises a business model that aims to gain market dominance by exploiting people's data. Much of the Committee's inquiry focused on the practices of Facebook, finding categorically that "Facebook intentionally and knowingly violated both data privacy and anti-competition laws". Privacy International remains concerned that other major tech companies also rely on personal data to build their market power, at the expense of an open market and ultimately of individuals' rights.
Large tech companies are able to impose conditions on users that tend to include excessive and exploitative data collection and processing. In other words, the privacy harms are directly caused by the business models of companies in dominant market positions, which can impose excessive collection of data on people who have become “captive users.” Privacy International supports the Committee's recommendation that the Competition and Markets Authority (CMA) conduct "a comprehensive audit of the operation of the advertising market on social media." As recently noted, antitrust authorities in Europe are beginning to pay due attention to the distortion of competition caused by the exploitation of personal data by companies like Facebook, Google, Amazon, and other tech giants. More inquiries, and eventually more remedial action, are needed to address companies' abuse of power.
Privacy International welcomes the Committee's insistence that inferred data should be treated just like any other personal data. Inferred data can reveal shockingly invasive insights. In contrast to the information that users (more or less) knowingly share, the vast majority of people have no idea that such additional data even exists. Nonetheless, such profiling is the very essence of the contemporary data industry. That is also why profiling plays a central role in Privacy International's complaints against data brokers, credit reference agencies and ad tech companies, filed in November 2018.
The European data protection rules enshrined in the GDPR unambiguously define profiling. What we need now is enforcement action that sends a clear signal to companies that they can no longer pretend that inferred data falls entirely outside the scope of data protection law.
As the Parliamentary report makes clear, profiling and other abuses of personal data are no longer solely a concern of commercial advertising. These practices are now encroaching on the democratic process, in the UK as well as across the world. The Committee echoes concerns already expressed by the Information Commissioner's Office (ICO) - the data protection regulator - and the Electoral Commission that these practices have been employed by political parties and political organisations to target individuals.
Exempting themselves: political parties are the pushers
Privacy International has long raised these concerns, and we regret that Parliament missed the opportunity to regulate these practices during the adoption of the Data Protection Act 2018. The Act includes a provision – not surprisingly, little contested by any party in Parliament – which permits political parties to process personal data ‘revealing political opinions’ (without the individual’s consent) for the purposes of their own political activities. There is nothing in the provision to prohibit the delegation of such activities to third parties specialising in profiling, for example. As such, this provision is open to the very abuses identified in this report and may be used to facilitate targeted and exploitative political advertising. Privacy International wrote to the main UK political parties asking for a commitment not to use profiling and targeting techniques in their future political campaigns; we are still awaiting responses.