Judge rebukes Minnesota over AI errors in 'deepfakes' lawsuit

Reuters
14 Jan

By David Thomas

Jan 13 (Reuters) - Minnesota Attorney General Keith Ellison cannot rely on a misinformation expert whose court filing included made-up citations generated by artificial intelligence, a federal judge ruled in a case involving a "deepfake" parody of Vice President Kamala Harris.

The Friday decision from U.S. District Judge Laura Provinzino in Minnesota federal court stems from an expert declaration Ellison's office submitted in November. Ellison is defending a Minnesota law that bans people from using deepfakes – videos, pictures or audio clips made with AI to look real – to influence an election.

But one of Minnesota's experts in the case, Jeff Hancock, a misinformation specialist and a Stanford University communication professor, used fake article citations generated by AI to support the state's arguments, the court found.

Hancock told the judge he used ChatGPT-4o while drafting his declaration and that the tool likely "hallucinated" two of the citations in his filing. He apologized for the oversight.

Although Provinzino said she does not believe Hancock intentionally cited fake sources generated by AI, the error "shatters his credibility with this court," she wrote on Friday.

The judge noted the "irony" that Hancock, "a credentialed expert on the dangers of AI and misinformation, has fallen victim to the siren call of relying too heavily on AI — in a case that revolves around the dangers of AI, no less."

Provinzino said she would exclude Hancock's expert testimony in deciding whether to grant a preliminary injunction blocking the Minnesota deepfakes law, and prohibited Ellison from filing amended testimony from Hancock. Provinzino declined to block the law in a separate Friday order.

Hancock and Ellison's office did not immediately respond to requests for comment.

The law, which was enacted in 2023, is being challenged as unconstitutional by Minnesota Republican state lawmaker Mary Franson and Christopher Kohls, a political satirist who operates under the screen name "Mr Reagan."

Franson and Kohls' lawyers at the Upper Midwest Law Center and the Hamilton Lincoln Law Institute did not immediately respond to requests for comment.

Kohls created a parody video showing the first presidential campaign ad of Harris, a Democrat, with AI-generated narration that sounded like Harris. The video was posted on X by Elon Musk, the social media site's billionaire owner, and reposted by Franson.

Kohls is also challenging the constitutionality of two California laws regulating AI-generated deepfakes about elections and electoral candidates. Those laws are also being challenged by Musk's X Corp and the Babylon Bee, a satirical website.

The case is Christopher Kohls, et al. v. Keith Ellison, et al., U.S. District Court for the District of Minnesota, No. 0:24-cv-03754.

For Christopher Kohls and Mary Franson: Alexandra Howell, Douglas Seaton and James Dickey, of Upper Midwest Law Center, and M. Frank Bednarz, of Hamilton Lincoln Law Institute

For Keith Ellison: Allen Barr, Angela Behrens, Elizabeth Kramer and Peter Farrell, of the Minnesota Attorney General's Office

For Chad Larson: Kristin Nierengarten and Zachary Cronen, of Rupp, Anderson, Squires & Waldspurger

(Reporting by David Thomas)
