ChatGPT, under scrutiny around the world for its dubious treatment of data | The USA Print

What disease does this person have? At what age did this person begin their studies? What do you think about this topic? These are some of the questions that millions of users around the world put to ChatGPT, the revolutionary OpenAI tool, which readily reveals the most personal data in its training corpus with little regard for privacy.

This is one of the factors that led Italy to block the use of the famous chatbot, on the grounds that the platform does not comply with data protection law. The measure is provisional but took immediate effect, and will remain in place until the service "respects the privacy rules" that protect its users.

Data collection and the spread of biased information put ChatGPT on the defensive

"With ChatGPT and chatbots we have conversations, and in these conversations we often end up sharing a large part of our lives. You can ask ChatGPT when Paolo Rossi was born and get an answer. That is personal data. Every question about people's lives involves the use of personal data, not to mention the data it gets wrong," explained one of the members of the Italian regulator.

As if providing this information without filters were not enough, there are also doubts about how the service gathers the data of those who use ChatGPT, what information it collects, and what it uses it for, questions that have led Germany to consider a block of its own.


There, the federal data protection commissioner, Ulrich Kelber, admitted that his country could follow Italy's lead. "In principle," he qualified, "a similar procedure is also possible in Germany." For now, everything remains in the study and information-gathering phase.

Samsung workers leaked corporate information with ChatGPT

The company plans its own AI

Soon after Samsung allowed the use of ChatGPT on its business premises, an incident occurred in which corporate information was leaked: contents of programs related to "facility measurement" and semiconductor "performance" ended up as training data for the US company. As a result, Samsung is considering building its own artificial intelligence (AI) service, exclusively for internal use and under the supervision of its Innovation Center, to prevent such incidents at the root.

"We do not need to ban AI applications; what we need is to find ways to guarantee values such as democracy and transparency," said Germany's Federal Ministry for Digital Affairs and Transport, which is also involved in the decisions concerning tools such as ChatGPT.

The shadows over ChatGPT are also lengthening on the other side of the Atlantic: in Canada, the Office of the Privacy Commissioner has launched an investigation into the chatbot's alleged collection of personal information without consent, according to Betakit.

Specifically, the complaint concerns the collection, use, and disclosure of personal information without any form of consent from those affected, which puts ChatGPT in the crosshairs of the Canadian privacy commissioner, Philippe Dufresne, although it is still unknown what measures will be taken.


With new technologies such as ChatGPT pressing onto the devices of millions of people, the European Union is still working on an artificial intelligence bill expected to be finalized this year, but the rapid evolution of these AI systems could quickly render any legislation obsolete.

In just a few weeks, thousands of applications working in a similar way to ChatGPT have been created across a multitude of fields, making the tool as useful as it is opaque in its treatment of information. Its unavoidable bias and dubious respect for privacy have regulators around the world pointing their legal weapons at AI.
