ChatGPT, the Privacy Guarantor blocks it in Italy

The Privacy Guarantor blocks ChatGPT in Italy. The powerful conversational model developed by the US startup OpenAI has come under the scrutiny of the authority that safeguards the protection of personal data in Italy, which has placed a temporary stop on the algorithm "until it complies with privacy regulations". The Italian Guarantor is the first authority in the world to block the use of ChatGPT on the basis of privacy legislation. According to the Piazza Venezia authority, OpenAI and its flagship product do not comply with the GDPR, the European regulation on the protection of personal data.

In a note, the Guarantor states that it has ordered, "with immediate effect, the provisional limitation of the processing of Italian users' data against OpenAI, the US company that developed and manages the platform" and, in parallel, has opened an investigation. The measure takes its cue from a data breach suffered by ChatGPT on March 20, concerning "user conversations and information relating to the payment of service subscribers".

The issue, however, runs deeper. As Guido Scorza, a member of the board of the Privacy Guarantor, explains, the target is the use of personal data to train artificial intelligence and the awareness people have that their information is being used to train an algorithm. "Anyone who does medical research must obtain consent for experimentation. Those who experiment with new technologies must likewise make the process transparent," observes Scorza.

The disputes

According to the Guarantor, the OpenAI startup has never provided information on the processing of the data of users and interested parties. Above all, there is no "legal basis that justifies the mass collection and storage of personal data for the purpose of 'training' the algorithms underlying the functioning of the platform". Moreover, since ChatGPT often returns answers containing errors, the Guarantor argues there is also "inaccurate processing of personal data". Then there is the issue of minors. Although ChatGPT is aimed at those over 13, there is no filter to verify the age of those who use it, which, the authority concludes, exposes "minors to absolutely unsuitable responses with respect to their degree of development and self-awareness". The move is similar to the one taken against TikTok.

Since OpenAI does not have an office within the European Union but has designated a representative in Ireland, it now has 20 days to respond to the Guarantor and explain how it intends to address the problem, under penalty of a fine of up to 20 million euros or up to 4% of annual global turnover. The Guarantor also sent its correspondence to the United States, to the headquarters of the startup founded by Sam Altman.


How blocking works

As Scorza explains, the block is a temporary measure, and it concerns the processing of personal data: if ChatGPT can operate without it, there is no problem. In practice, OpenAI will have to block access to its software from Italy, or in any case limit it to functions that do not involve personal data. We do not know how long this may take from the publication of the provision, decided on March 30 by the board of the Guarantor, and it presupposes that OpenAI decides to comply with the request. Moreover, it is not possible to intervene on connections made via virtual private networks, which bounce the connection through networks outside Italy and thus bypass any block.

What interests the Guarantor most, however, sits at the startup's home base: the user data used to train the algorithm. It is on this data in particular that the company will have to provide clarity, specifying how it collected it, whether it obtained consent and whether, in the face of any request for erasure or rectification, it acted within the times and rules established by the GDPR. The link is the data breach of March 20, in particular with regard to the data of people in Italy who subscribed to the ChatGPT service and therefore provided payment details and other personal information. Under the European regulation, those who suffer an exfiltration of information have 72 hours to notify the interested parties and the competent authorities.


Data to train

The Guarantor aims to block the processing of Italian users' data. For example, if ChatGPT holds information in its database on the author of this article that qualifies him as an Italia journalist, then to the question "Who is Luca Zorloni of Italia?" it should remain silent. However, "we do not know how the data is stored, whether the scraping was carried out at a geographical level, or whether it contains information that allows a piece of data to be qualified as belonging to a person in Italy," says Scorza.

The investigation aims to shed light on this as well. More generally, it wants to raise a problem: if you collect data on people in Europe to train your algorithm, you have to let people know. It is the same reason Google Street View cars must be identifiable when they cruise the streets taking pictures, and why the Guarantor blocked the data processing of Clearview AI, the US facial recognition startup that used the faces of European people without permission. The issue is not when a user queries the web through ChatGPT on Bing. It lies upstream: when data is trawled to feed the algorithm the information on which it trains its response capacity.

In Europe

For now, the Italian Guarantor is the first authority in the world to challenge ChatGPT. At the end of April there will be a meeting of all the privacy guarantors of the European Union, and the Italian provision will be a subject of debate. It will also be necessary to see, in the meantime, how OpenAI behaves. "Nobody wants to slow down innovation," says Scorza, "but new technologies cannot be developed at the expense of people's rights."

The comment

As Massimiliano Masnada, a partner at the Hogan Lovells law firm specializing in privacy, writes in a note: "First of all, the Guarantor's intervention seems to contest ChatGPT's lack of transparency, understood as a lack of information to users about the purposes and methods of processing any personal data communicated. The provision also recalls the need to identify a legal basis for the legitimacy of OpenAI's processing of any personal data collected from users." According to Masnada, "the necessary privacy by design and privacy by default mechanisms, i.e. the controls and remedies to protect the privacy of the interested parties, especially if they are minors, are not enough. We need to create a new technological culture based on ethics and respect for fundamental rights." And he adds: "Data, whether personal or not, is the fuel needed for the development of AI mechanisms like ChatGPT. Access to data allows for more precise algorithms better suited to improving people's lives. This must be done safely and ethically, and prohibitions alone are not enough to achieve it. A first step, in this sense, will be the correct implementation of the rules on the reuse of data that underpin the Data Governance Act, soon to come into force, and of the subsequent Data Act [European provisions, ed]. The carousel has just started."
