The FSF received notice of the Bartz v. Anthropic copyright infringement settlement, in which Anthropic is accused of downloading books from the Library Genesis and Pirate Library Mirror datasets to train LLMs. The FSF's book 'Free as in Freedom' was found in those datasets. Rather than seeking monetary compensation, the FSF is using the opportunity to call on Anthropic and other LLM developers to release their training data, model weights, configuration settings, and source code to users — framing open model distribution as the proper remedy for using freely licensed works in AI training.