The Italian police also consulted the databases of the most controversial facial recognition company in the world

This is revealed by a BuzzFeed News investigation: between 101 and 500 searches were conducted in the archives of Clearview AI, which uses photos trawled from the net without consent

The Italian State Police also made use of the services of Clearview AI, the most talked-about US company specializing in facial recognition. Its technology can identify a person by comparing their photo against an archive of images trawled from the web and social networks. According to the New York startup, its databases hold about 3 billion photos of faces, used to train an algorithm sold to police forces and private companies.

At the end of a long investigation based on internal Clearview AI data, the news site BuzzFeed News revealed that police departments, prosecutors, universities and ministries in 24 countries beyond the United States performed more than 14,000 searches with Clearview AI's facial recognition software between 2018 and 2020. Moreover, according to interviews conducted by the journalists, many of these searches were carried out by police officers without authorization or oversight from their superiors.



The European market

The New York-based company has tried to make inroads across Europe and even in some countries known for their repressive governments, such as Saudi Arabia and the United Arab Emirates. In the European Union, BuzzFeed News has confirmed the use of the software in Denmark, Finland, Latvia, Sweden, the Netherlands, Spain, Portugal, France, Slovenia, Ireland and Malta, as well as by Interpol, the international police organization, and in Italy. In Italy, the searches conducted on Clearview would have numbered between 101 and 500, but the State Police declined to answer any of BuzzFeed News's questions on the matter.

Clearview AI marketed its facial recognition system in Europe by offering free trials at police conferences, where it was often presented as a tool for tracking down victims of sexual abuse. According to BuzzFeed News, in October 2019 law enforcement officers from 21 different nations, along with Interpol, gathered at the European Cybercrime Centre in The Hague, in the Netherlands, to sift through millions of files and identify victims of child abuse. At the meeting, some external participants who were not Europol staff members presented Clearview AI as an investigative support tool. After the conference, according to a Europol spokesperson, many specialists decided to put what they had learned to use and began using the software in their home countries.

Images collected without consent

The use of Clearview's software has been strongly contested by the privacy authorities of several countries and was outlawed in Canada last February. According to the Canadian privacy commissioner, the company has created a system that "inflicts large-scale harm on all members of society, who find themselves constantly in a police file", lending itself to unlimited surveillance.

This is because, to train its algorithms, the company used billions of face images available on the internet, taking them from festival programs, academic publications and other public sources. All without asking for the consent of the people concerned, putting anyone at risk of ending up in one of the largest surveillance databases in the world. Last March, Wired Italia site coordinator Luca Zorloni revealed that he had ended up in this database with 13 of his photos, taken from various sites including Twitter.

Clearview managed to stockpile billions of images taken from the big social networks before they became aware of it and demanded that it stop. Facebook, for example, intervened, stating that "scraping personal information violates our policies, so we have asked Clearview AI to stop accessing or using data from Facebook or Instagram". Like the Menlo Park company, LinkedIn and YouTube have also taken measures to combat the collection of their users' profile images.

An invasion of privacy

The use of Clearview AI's software is so controversial that the LAPD has distanced itself from the company and promised to use only proprietary technologies for facial recognition, while in the European Union its use is under scrutiny by the privacy authorities. Investigations into the company have already been carried out in France and Germany. In June last year, the body that brings together Europe's privacy regulators stated that "the use of a service such as Clearview AI by law enforcement agencies in the European Union would, as it stands, likely not be consistent with the EU data protection regime". Australia and Great Britain have also opened a joint investigation into Clearview over its use of personal data.



Topics

Cybersecurity, Europe, GDPR, Italy, Legal, Startup, Surveillance, United States

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.



