The creators of ChatGPT are in a bit of legal trouble, and probably in even bigger PR trouble, after the AI software invented false allegations about a radio host out of whole cloth.
OpenAI is being sued by the host, Mark Walters, in Georgia, in a first-of-its-kind lawsuit. Probably not the last of its kind, though.
A journalist asked ChatGPT about the DJ, and the chatbot returned information alleging that Walters had embezzled money from a non-profit, something that is completely untrue.
Walters' case was filed on June 5 in the Superior Court of Gwinnett County, Georgia, and he is seeking unspecified monetary damages from OpenAI.
The case is notable given widespread complaints about false information generated by ChatGPT and other chatbots. These systems have no reliable way to distinguish fact from fiction, and when asked for information — particularly if asked to confirm something the questioner suggests is true — they frequently invent dates, facts, and figures.
Yeah, ChatGPT is cool, but you probably (definitely) shouldn't trust it without verification.
ChatGPT is a long way from perfect, and several cases of real-world harm caused by AI have popped up in the last few months.
Eugene Volokh, a law professor who has written on the legal liability of AI systems, noted in a blog post that although he thinks "such libel claims [against AI companies] are in principle legally viable," this particular lawsuit "should be hard to maintain." Volokh points out that Walters never notified OpenAI of the false statements, which would have given the company a chance to remove them, and that ChatGPT's output caused no actual damages. "In any event, though, it will be interesting to see what ultimately happens here," Volokh says.
Welcome to the age of AI, ladies and gents.