AI Company Destroys Millions of Books!

Anthropic, an artificial intelligence company, has been revealed to have cut apart and scanned millions of physical books to train its Claude AI model, destroying the originals in the process. The practice came to light during an ongoing copyright lawsuit in the United States. By classifying Anthropic's use of these books without author permission as "fair use," the court has given the company partial legal approval for this method.

Anthropic Destroys Millions of Books

The company acquired physical books, cut the pages from their bindings, scanned them into digital data, and then destroyed the physical copies. The scanned data has not been shared publicly. U.S. District Judge William Alsup ruled that Anthropic's use of lawfully purchased books, without permission from their authors, was lawful. The decision paves the way for AI companies to buy copyrighted works and use them for training purposes.


Court Cites "First Sale Doctrine" in Ruling, Yet New Lawsuit Emerges

The court based its decision on the "first sale doctrine." Under this legal principle, a copyright holder controls only the first sale of a given copy of a work; once that copy is lawfully sold, its new owner may resell or otherwise dispose of it without further permission. How AI companies handle the copies they purchase, however, remains their own legal responsibility.

Despite this, authors, publishers, and archivists have called the physical destruction of the books unethical and unnecessary. It has also emerged that Anthropic's training drew on pirated book archives. The court found this use of pirated works unlawful and ordered a separate trial on those claims, scheduled for December, at which Anthropic could be liable for statutory damages of up to $150,000 for each infringed book.


The Unfolding Debate: AI Development, Copyright, and Ethics

Anthropic's controversial practice highlights once again how important, and how complex, copyright and ethical questions have become in AI development. What are your thoughts on this matter? Share your opinions in the comments section below.
