In big tech we trust: is Apple's and Google's COVID-19 reputation laundering enough to make us forget about their past?
While more and more governments and companies are asking us to trust them with their COVID-19 solutions, the potentially unlawful data exploitation practices now coming to light do not engender trust.
This week, we read that a former Apple contractor who blew the whistle on the company’s programme to listen to users’ Siri recordings has decided to go public, in protest at the lack of action taken as a result of the July 2019 disclosures. The news adds to a series of revelations that have been reported over the past months.
While the issue raises serious questions about the compatibility of such practices with data protection laws, it also highlights a wider problem that becomes extremely relevant in times of public health crisis: the race to the bottom that big tech is running with regard to health data.
This is not the first time a tech company has been caught red-handed listening in on its users. On 11 July 2019, the Irish Data Protection Commission (DPC) received a data breach notification from Google, following reports that contractors could listen to recordings of people’s conversations with their Google Assistant. In April 2019, a similar investigation revealed that thousands of Amazon employees around the world were listening in on Amazon Echo users.
What these examples suggest is that our personal data is being treated just like any other commodity. The way in which dominant players currently amass, generate and analyse data often lacks transparency and seeks to maximise the amount of data available, often through unfair means.
At the same time, there is a push to negotiate trade agreements covering digital trade, with the aim of setting global rules friendly to corporate interests: low standards for cross-border data flows, combined with further restrictions on reviewing the algorithms that big tech companies are building with our data and using in various ways to rule our lives.
Our personal data, including data about our health, is at the very core of these companies’ business model, since it is an essential input for training AI models, including those behind search engine results. Our data is the product they trade. And given the growing importance of data across all sectors of the economy, this concentration is expanding into other sectors, such as health.
Just a few months ago, it was revealed that Google was collecting health data records as part of a project it has named “Project Nightingale”. This was part of an agreement Google had with a health care provider, the immense scope of which purportedly allowed Google to amass data for about a year on patients in 21 US states. The data included lab results, doctor diagnoses and hospitalisation records, among other categories, which amount to a complete health history, including patient names and dates of birth.
But Google is not the only big tech company wishing to enter health-related markets. In July 2019, the UK NHS announced that it was teaming up with Amazon “to allow elderly people, blind people and other patients who cannot easily search for health advice on the internet to access the information through the AI-powered voice assistant Alexa”. While the largely redacted deal between Amazon and the Department of Health stated that no patient records are shared with Amazon, Amazon might still be able to amass large quantities of health-related data through user-generated queries about symptoms and health conditions.
Big tech's greedy hand probably won't stop here. According to the former contractor, Apple's data collection involved vast quantities of data, including sensitive personal data. As he put it: "I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs with no intention to activate Siri whatsoever."
All of this is happening at a time when we are being asked to trust industry and governments with their contact tracing apps. The resulting data, recording all of our interactions, would be a goldmine for exploitation. Governments and companies claim they will not exploit it. Their track records do not engender trust.