
Reddit sues Anthropic for ‘scraping’ user comments to train Claude

SAN FRANCISCO (AP): Social media platform Reddit sued the artificial intelligence giant Anthropic on Wednesday, claiming that it is illegally “scraping” the comments of millions of Reddit users to train its chatbot Claude.

Reddit claims that Anthropic has used automated bots to access Reddit’s content despite being asked not to do so, and “intentionally trained on the personal data of Reddit users without ever requesting their consent.”

Anthropic said in a statement that it disagreed with Reddit’s claims “and will defend ourselves vigorously.”

Reddit filed the lawsuit Wednesday in California Superior Court in San Francisco, where both companies are based.

“AI companies should not be allowed to scrape information and content from people without clear limitations on how they can use that data,” said Ben Lee, Reddit’s chief legal officer, in a statement Wednesday.

Reddit has previously entered licensing agreements with Google, OpenAI and other companies that are paying to be able to train their AI systems on the public commentary of Reddit’s more than 100 million daily users.

Those agreements “enable us to enforce meaningful protections for our users, including the right to delete your content, user privacy protections, and preventing users from being spammed using this content,” Lee said.

The licensing deals also helped the 20-year-old online platform raise money ahead of its Wall Street debut as a publicly traded company last year.

Anthropic was formed by former OpenAI executives in 2021, and its flagship Claude chatbot remains a key competitor to OpenAI’s ChatGPT. While OpenAI has close ties to Microsoft, Anthropic’s primary commercial partner is Amazon, which is using Claude to improve its widely used Alexa voice assistant.

Much like other AI companies, Anthropic has relied heavily on websites such as Wikipedia and Reddit, which are deep troves of written materials that can help teach an AI assistant the patterns of human language.

In a 2021 paper co-authored by Anthropic CEO Dario Amodei – cited in the lawsuit – researchers at the company identified the subreddits, or subject-matter forums, that contained the highest quality AI training data, such as those focused on gardening, history, relationship advice, or thoughts people have in the shower.

Anthropic in 2023 argued in a letter to the U.S. Copyright Office that the “way Claude was trained qualifies as a quintessentially lawful use of materials,” by making copies of information to perform a statistical analysis of a large body of data. It is already battling a lawsuit from major music publishers alleging that Claude regurgitates the lyrics of copyrighted songs.

But Reddit’s lawsuit differs from others brought against AI companies because it doesn’t allege copyright infringement. Instead, it focuses on the alleged breach of Reddit’s terms of use and the unfair competition it says that breach created.
