How YouTube’s Case At The Supreme Court Could Shape Protections For ChatGPT Users

The Supreme Court’s upcoming decision on whether to weaken the internet liability shield could drastically affect ChatGPT and other increasingly advanced AI technologies. The ruling will help determine the scope of legal protections available to internet companies.

The U.S. Supreme Court is expected to rule by the end of June on whether Alphabet Inc’s YouTube can be sued over its video recommendations.

The case tests whether the U.S. law that shields technology platforms from legal responsibility for content posted by their users also applies when companies use algorithms to recommend content to users.

What the court decides matters well beyond social media platforms. Its ruling could shape the emerging debate over whether companies that build generative AI chatbots, such as OpenAI’s ChatGPT, should be held responsible for the content their products generate.

Technology and legal experts say the decision could influence whether generative AI companies, such as OpenAI, which is backed by Microsoft Corp (MSFT.O), or Alphabet’s Google, maker of Bard, should be shielded from legal claims like defamation or privacy violations.

The algorithms behind generative AI tools like ChatGPT and its successor GPT-4 operate in much the same way as the algorithms powering YouTube’s video suggestion tools. Experts say this shared approach is one of the key reasons these tools are so effective.

Cameron Kerry is an expert on AI and a visiting fellow at the Brookings Institution in Washington, D.C.

Cameron Kerry says:

“The debate is really about whether the organization of information available online through recommendation engines is so significant to shaping the content as to become liable,”

“You have the same kinds of issues with respect to a chatbot.”

During arguments at the Supreme Court in February, the justices expressed considerable uncertainty about whether to weaken the safeguards provided by Section 230 of the Communications Decency Act of 1996.

Justice Neil Gorsuch noted that AI tools that generate “poetry” and “polemics” would likely not receive such legal protections. The case itself is unrelated to generative AI, but the court’s reasoning may well extend to it.

The case has drawn attention to an emerging conversation about whether Section 230 immunity should apply to AI models that are trained on large troves of online data and produce original works.

Generally, Section 230 protections apply to content originating from a platform’s third-party users, not to information the company itself creates. Courts have yet to determine whether those protections extend to responses generated by AI chatbots.

Consequences Of Their Own Actions

Democratic Senator Ron Wyden, who helped draft the law during his tenure in the House of Representatives, has said the liability shield should not extend to generative AI tools because they “create content.”

Wyden says:

“Section 230 is about protecting users and sites for hosting and organizing users’ speech. It should not protect companies from the consequences of their own actions and products,”

Despite bipartisan opposition to its immunity, the technology industry has stood firm in favor of preserving Section 230, arguing that tools such as ChatGPT operate much like search engines, directing users to existing content in response to their queries.

Carl Szabo is vice president and general counsel of NetChoice, a tech industry trade group.

Carl Szabo says:

“AI is not really creating anything. It’s taking existing content and putting it in a different fashion or different format,”

Szabo warned that a weakened Section 230 would create immense obstacles for AI developers, potentially exposing them to an overwhelming number of legal claims that would handcuff innovation.

Some experts suggest that courts may take a middle-ground approach when weighing potentially harmful AI-generated responses, looking closely at the context in which those responses were created.

If an AI model appears to be borrowing from existing sources, the shield may still apply. But ChatGPT and other chatbots have been known to produce responses that appear unconnected to any online source, a situation that experts agree would likely not be protected.

Hany Farid, a technologist and UC Berkeley professor, is unconvinced that AI developers should be exempt from lawsuits over models they designed, trained, and deployed. He considers it illogical to argue that they bear no responsibility for what those models produce.

Farid says:

“When companies are held responsible in civil litigation for harms from the products they produce, they produce safer products. And when they’re not held liable, they produce less safe products.”

The family of Nohemi Gonzalez, a 23-year-old student from California who was fatally shot by Islamist militants in a 2015 Paris rampage, is appealing the dismissal of its lawsuit against YouTube to the Supreme Court.

The lawsuit alleges that Google, through YouTube’s algorithms, unlawfully recommended videos from the Islamic State militant group, which claimed responsibility for the Paris attacks, to certain users, thereby providing “material support” for terrorism.

The legal landscape for AI and language models is clearly evolving fast, and companies and developers must be mindful of the legal implications of their products and services. As AI and language models become increasingly integrated into daily life, we must strike a balance between protecting people from harm and fostering innovation and creativity in AI. The Supreme Court’s decision in the YouTube case will be a significant step toward defining the legal parameters of this rapidly evolving field.

Source: Reuters
