Anthropic's law firm blames Claude hallucinations for errors


An attorney representing Anthropic in a music copyright case apologized for erroneous citations in a court document. The errors were attributed to hallucinations by Anthropic's AI tool, Claude, which had been used to format the citations. The incident highlights ongoing concerns about verifying AI-generated information.

6m read time · From go.theregister.com
