Apple opens the door to mass surveillance
Apple has announced that it will start scanning photos in the US to improve child safety, but its plans will ultimately endanger everyone.
Today Apple announced a set of measures aimed at improving child safety in the USA. While well-intentioned, its plans risk opening the door to mass surveillance around the world while arguably doing little to improve child safety.
Among the measures, Apple has announced that it will introduce “on-device machine learning” to analyse message attachments for sexually explicit material and warn users, and that it will begin scanning every photo its customers store in iCloud in order to detect child abuse imagery.
Apple, which has for years marketed itself as a global leader on privacy, is at pains to reassure users that these measures are “designed with user privacy in mind”, because, as it explains, all the processing is done on each user’s device. But make no mistake: no matter how Apple describes it, these measures undermine encryption and threaten everyone’s privacy and security.
As one of the world’s biggest tech companies, the decisions Apple makes matter. This is a clear signal to every government around the world that Apple, and inevitably the entire industry, has the technology and the will to carry out mass surveillance. By opening the floodgates, even for something as important as protecting children, Apple and the rest of the industry will inevitably be unable to resist doing the same for other reasons and for other governments.
While these plans are ostensibly aimed at improving child safety in the US, it is undeniable that the same technical approach can be widened to include other categories of images and content. Indeed, it has already been applied for counter-terrorism purposes, and there are consistent calls for tech companies to use such an approach to identify copyright infringements.
At the same time as opening the door to global mass surveillance, such client-side scanning technology will also arguably do little to improve child safety. It is an approach that is easy to circumvent: criminals will either simply not use iCloud, or over time produce false negatives by modifying content to ‘game’ the scanning techniques and avoid detection. Conversely, such an approach could also produce false positives, allowing malicious actors to craft images or other content that the scanning technology misclassifies as child sexual abuse material. This is of particular interest to those wishing to silence whistleblowers, investigative journalists or political opponents: simply send them illegal content. In short, client-side scanning weakens the security of communications and opens the door to abuses.
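To illustrate why such scanning can be gamed, here is a toy sketch in Python. It uses a deliberately simple “average hash” (not Apple’s actual NeuralHash, and not any real scanning system) on a synthetic 8x8 grayscale image, and shows that a small, visually insignificant edit is enough to flip bits of the hash, which is the kind of effect attackers exploit to evade or trigger matching:

```python
# Toy illustration of perceptual-hash fragility. This is NOT Apple's
# NeuralHash; it is a minimal "average hash" over a synthetic image,
# used only to show that small edits change the fingerprint.

def average_hash(pixels):
    """Hash an 8x8 grayscale image: one bit per pixel, set to '1' if
    the pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

# A synthetic 8x8 "image": a bright square in the top-left corner.
image = [[200 if r < 4 and c < 4 else 50 for c in range(8)]
         for r in range(8)]

# Nudge four pixels by a modest amount - a change a viewer would
# barely notice, but enough to push them over the mean brightness.
modified = [row[:] for row in image]
for c in range(4):
    modified[4][c] = 130

h1, h2 = average_hash(image), average_hash(modified)
differing = sum(a != b for a, b in zip(h1, h2))
print(f"{differing} of 64 hash bits changed by a small edit")
```

Running this reports that 4 of the 64 hash bits changed. Real systems use far more robust fingerprints, but the underlying tension is the same: a hash loose enough to survive benign edits invites false positives, and one strict enough to avoid them invites false negatives.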
There is no doubt that child safety needs to be protected and that big tech companies like Apple must do everything they can to help. But as with any interference with the right to privacy, the risks of introducing measures that weaken the privacy and security of communications need to be clearly spelled out and addressed. Any such interference must be justified and balanced against those risks. Unfortunately, we just haven’t seen enough evidence to suggest that this is the case.
There are also alternatives which wouldn’t inevitably open the door to mass surveillance. With reports of police forces being overwhelmed by the scale of child abuse, it is easy for governments - some of which have deliberately cut funding to their criminal justice systems - to point to tech companies as the problem. But ultimately the responsibility rests with governments to ensure that they invest enough resources to deal with the scale of the problem, something they have been more than willing to do on issues such as terrorism.
Outsourcing the blame and responsibility to companies, while at the same time creating further risks, protects no one.