How facial recognition is spreading in Italy: the case of Como
The municipality of Como, Italy, bought a facial recognition system, then installed and tested it for months with little transparency and despite the lack of a clear legal framework. In doing so, it embraced a narrative of technological innovation pushed by Huawei, but was forced to suspend the system after the intervention of the Italian Data Protection Authority.
- Como spent public money on a system that can’t be lawfully used
- Como used the Data Protection Impact Assessment as a seal of approval after having already bought the facial recognition system, and failed to acknowledge the differences between a standard video-surveillance system and a facial recognition one
- Huawei approached the municipality of Como in order to sell its facial recognition technologies as part of a broader pitch around smart cities and tech innovation
Como is one of the most advanced cities in Italy in the use of facial recognition technology (FRT). An investigation published in June 2020 by the Italian edition of Wired exposed how the system had been bought, installed, and tested for months with little transparency and despite the lack of a clear legal framework.
The investigation was based entirely on tools available to everyone, such as Freedom of Information (FOI) requests. Similar to PI’s campaign 'Unmasking policing, inc', it sought to expose the role of private companies, in particular Huawei, in pushing for the deployment of a facial recognition system in Como, despite the lack of adequate fundamental rights guarantees and with local authorities failing to take adequate stock of the serious implications that facial recognition technology has for people’s rights.
As described in the tender documents provided by the Comune di Como, the “innovative video surveillance system” is composed of six cameras covering a public green area close to the city’s railway station. The system is capable of capturing the faces of passers-by, as well as detecting loitering, sending an alert when a package is left unattended, and offering a tripwire function (i.e. detecting when someone enters a restricted area). It is far more advanced than a traditional CCTV system. It also allows operators to search for any individual on a given “blacklist” or to set up a “red list” of people deemed VIPs whose images will not be recorded.
These rather intrusive functions are missing from the Data Protection Impact Assessment (DPIA) we obtained from the Municipality of Como via a FOIA request sent through the FOIA4journalists project (Transparency International Italy) but are clearly highlighted in the decision issued by the Italian Data Protection Authority (DPA).
All these features are part of Huawei’s Matrix Intelligence cloud platform. The company that won the tender and then installed the Huawei technologies was A2A Smart City, part of the A2A Group, one of the leading energy companies in Italy, which also operates in the field of smart city technologies.
In September 2018, Huawei approached the municipality of Como to showcase its smart city products, highlighting its success stories and other installations in Italy. Afterwards, the municipality of Como set up a public tender for the facial recognition system. A2A Smart City won the tender, as the only company to participate.
The technical report written by A2A Smart City uses language that is inappropriate to describe ethnic groups. For example, it refers to Asian people as “yellow.” This racist language is particularly unsettling coming from a company peddling facial recognition, and it was entirely ignored by the Municipality. Facial recognition systems have consistently been shown to be less accurate in recognising people of colour, women, and minorities, leading to discrimination.
At the same time, being subject to the constant and pervasive surveillance enabled by facial recognition systems poses a major threat not only to our privacy but to our society as a whole: we cannot be free if we are constantly monitored by those in power. Despite the global debate on these issues, the DPIA produced by the Municipality of Como does not take into account, for example, the accuracy of the facial recognition algorithm used or the risks to the right to privacy.
On February 26, following our investigation, the Italian DPA issued a decision against the Municipality of Como mandating the suspension of the FRT system. According to the DPA, the processing of biometric data carried out by the Municipality of Como is unlawful.
Como spent public money on a system that cannot be lawfully used. This could have been avoided by producing a proper DPIA, analysing all the aspects of facial recognition, before actually publishing the call for tender. In fact, the DPIA obtained through FOI requests was produced only after the tender had been awarded to A2A Smart City.
As the screenshot below illustrates, in an email exchange of 17 January 2019 between A2A Smart City and the Municipality of Como, the company explains that it is contacting the DPA and is “defining an activity list”. However, as can be inferred from the Italian DPA’s decision, the Authority was never contacted by the company, and its investigation started only following our request for information.
The project devised by the Municipality, again obtained through Freedom of Information requests, is very similar to the one presented by Huawei. The municipality adopts Huawei’s wording, boasting about the “innovative” technology and presenting it as a core part of its smart city project.
In November 2019, the municipality launched a new tender, this time for the enhancement of the video surveillance system. The documents published on the municipality’s website also highlight the need to equip some of the existing cameras with facial recognition technology. The tender, worth €261,777 per year, was again awarded to A2A Smart City in January 2020. The project provides for the installation of new “intelligent” cameras in other areas of the city as well. Despite the measure issued by the DPA, the municipality installed this new series of facial recognition cameras in May 2020. According to Como, the new cameras, now switched off, had been installed for future use. Essentially, the cameras were installed “just in case” their use should be deemed legal in the future.
The installation of this facial recognition system in Como is the latest in a series of security measures taken by the current city government, led by a right-wing coalition since 2017. 2016 was a difficult year for Como: hundreds of migrants were stranded in the park outside the station (the same park where the facial recognition system has been installed) after being stopped at the Swiss-Italian border, only 4 km away.
The documents obtained through the FOI requests contained various references to those events of four years earlier. Although the 2016 crisis was over within three months, a document written by the head of the city’s Municipal Police stated that those events created “inevitable degradation problems and spread a sense of insecurity among citizens”.
However, when the local Questore (the provincial Chief of Police) was asked to comment, for our Wired investigation, on the crime situation in the city and in the park, they said that the Province of Como is one of the safest in the country and that the area outside the station is not affected by any specific crime problem, apart from the management of the minor risks common to urban transit areas.
In August 2020, local newspapers from the Como area reported the poor results of the testing of the facial recognition system. According to news reports, the cameras did not pass the tests run by the municipality and failed to actually recognise faces, contrary to the technical specifications of the technology. Again according to news reports, the Municipality blamed these poor results on a software update that had not originally been envisioned. It is unclear when, for how long, and under what circumstances this testing phase took place. Local newspapers called this latest development “a new embarrassment”.
The case of the use of facial recognition in Como is a clear sign of how controversial technologies with profound social implications are spreading unchecked in urban spaces, even in quiet contexts such as Como, a middle-sized town in Northern Italy known for its lake and its low crime rates. The case is also a reminder of how fundamental transparency is when it comes to the introduction of security practices and surveillance technologies. Finally, it shows the power of accountability tools such as FOI requests, and the crucial role of investigative reporting in shedding light on the implications of surveillance technology in the datafied society.
LINK TO ALL FOIA DOCS: https://www.documentcloud.org/search/projectid:49022-Como-Tokamachi
This piece was written by:
Laura Carrer – Head of FOIA4journalists project, Transparency International Italy, and Hermes Center for Transparency and Digital Human Rights Fellow
Riccardo Coluccini – Freelance Journalist, Vice-President of Hermes Center for Transparency and Digital Human Rights
Philip Di Salvo – Post-doctoral researcher at Università della Svizzera italiana (USI), Lugano, Switzerland, and freelance journalist