An Australian mayor is considering legal action against the creator of the AI chatbot ChatGPT, an unprecedented move, after the program falsely claimed he was involved in an overseas bribery scandal.
Councillor Brian Hood is contemplating a lawsuit over ChatGPT's incorrect depiction of him as being implicated in a bribery scandal from the early 2000s.
Cr Hood has sent a letter of concern to OpenAI, the company that created ChatGPT. Legal experts describe the area of law at issue as ‘uncharted territory’, an unprecedented case with little precedent to draw on.
Brian Hood, mayor of the Hepburn Shire Council west of Melbourne, recently learned that ChatGPT had falsely described his role in a foreign bribery case from the early 2000s.
Cr Hood did previously work at Note Printing Australia, but his role was the opposite of what ChatGPT described: he did the right thing and alerted authorities to bribes being paid to foreign officials to win contracts.
Jurisdiction complicates any attempt to sue an “online intermediary” for defamation, according to media law specialist Professor David Rolph.
Any legal action by Cr Hood could set an Australian precedent on whether companies behind artificial intelligence chatbots can be sued over false claims their chatbots produce.
Cr Hood told the ABC today that he was the one who uncovered and exposed Note Printing Australia's conduct.
Cr Hood said:
“[I] became a prosecution witness and went through all of that process through numerous court cases.”
“According to ChatGPT, I was one of the offenders, that I got charged with all sorts of serious criminal offences. I was never charged with anything.”
When asked about the bribery case and Brian Hood's part in it, ChatGPT incorrectly described him as a guilty party rather than the whistleblower.
Cr Hood has issued a “concerns letter” to OpenAI, the company behind ChatGPT. Under Australian defamation law, such a notice is a required first step for anyone who feels aggrieved, and the recipient has 28 days to respond.
OpenAI has yet to respond. Cr Hood said he was dismayed when he discovered ChatGPT had inaccurately characterised him as a perpetrator in the early-2000s foreign bribery case; he was so astonished that he read the output several times.
Brian Hood says:
“What was disturbing was that some paragraphs were correct and precise. They had the right facts, figures, names, dates, places, etc.
“It was always very black and white as to what my role was. And I gave evidence in a whole number of court cases and a parliamentary inquiry.”
OpenAI has been contacted for comment.
Legal Burdens Of Suing Chatbots: What You Need To Know
Professor Rolph, a media law specialist at the University of Sydney, said Cr Hood would likely face several “different issues” if he were to bring legal proceedings.
Professor Rolph says:
“One of the issues that we have with a lot of online intermediaries is the basic question of jurisdiction … can you actually bring a proceeding against them in an Australian court?”
“A lot of these internet intermediaries are based offshore, a lot of them in the United States, which will often raise all sorts of problems.”
Professor Rolph said the law around AI technologies such as ChatGPT remains uncharted. For his part, Cr Hood maintains that the answers ChatGPT gave about him were incorrect.
Professor Rolph went on to say:
“Those sorts of technologies do pose a lot more burdens on people who want to sue for defamation.”
“And I think that’s a function of the nature of the technologies.
“Now, I think it’s much more difficult because these technologies are so dynamic, and so sort of variable … And they create more forensic burdens on people who want to sue to protect their reputations.”
Legal experts indicate that if Cr Hood moves forward with a defamation case, there will be challenges he must overcome.
OpenAI's own website states:
“Our text models are advanced language processing tools that can generate, classify, and summarise text with high levels of coherence and accuracy.”
Researchers at Georgetown University's Center for Security and Emerging Technology have written an academic paper exploring how language models like ChatGPT could be used to produce deceptive text.
The paper recommends that AI developers, social media companies, and government agencies collaborate more closely to help prevent chatbots from distributing inaccurate information.
As AI makes its way into the legal system, the benefits these systems offer will need to be balanced against the risks they pose. Addressing these concerns will help ensure that AI's use in the legal system is transparent, ethical, and accountable, and that justice is served fairly and impartially.