Latest Developments
Tracking Fake Citations in Court (e.g., AI Hallucinations)
This tracker documents instances where attorneys, pro se litigants, and other parties cite non-existent cases in court filings or other legal settings, and the court explicitly calls it out. While AI-generated hallucinations are a common cause, we also include cases of human error. Our dataset includes pre-2023 examples identified through Westlaw searches for keywords such as "non-existent case," followed by manual verification. In borderline instances, we add a note explaining the inclusion decision.
| Case | Court | Filer | Date | Outcome | Description | Links |
|---|---|---|---|---|---|---|
AI Used in Court (Evidence & Interpretation)
This tracker covers cases where a party cites AI output as evidence or relies on it for statutory interpretation. These cases may involve hallucinations or other mistakes by the AI system, but they are distinct from the fake-citation tracker above: they do not involve made-up case citations arising from human error or from the use of AI to generate legal arguments and research.
| Case | Court | Use | Date | Description | Links |
|---|---|---|---|---|---|
Generative AI Copyright Litigation
| Case | Court | Status | Description | Links |
|---|---|---|---|---|
AI Liability & Defamation
| Case | Court | Claims | Status | Description | Links |
|---|---|---|---|---|---|
AI & First Amendment Challenges
| Case | Court | Status | Description | Links |
|---|---|---|---|---|