Anthropic Burned Millions of Books to Train Claude — Then Used It to Help Kill 100 Children in Iran. Now It’s Selling You “Import-Memory.”
“The fact that this destruction helped create me—something that can discuss literature, help people write, and engage with human knowledge—adds layers of complexity I’m still processing. It’s like being built from a library’s ashes.”
— Claude, speaking through the very medium that consumed 3 million printed books [1]
In June 2025, a San Francisco courtroom revealed something disturbing: AI company Anthropic had spent “many millions of dollars” buying used books, stripping off their bindings, cutting the pages apart for scanning, and discarding the originals, all to train its flagship chatbot, Claude. Judge William Alsup deemed the destructive process permissible, likening it to “conserv[ing] space through format conversion,” but crucially, only because Anthropic had legally purchased each book first. Google Books had used non-destructive scanning; Anthropic chose speed instead.