    AI hallucination in Mike Lindell case serves as a stark warning : NPR

By Sports News | July 10, 2025

    MyPillow CEO Mike Lindell arrives at a gathering of supporters of Donald Trump near Trump’s residence in Palm Beach, Fla., on April 4, 2023. On July 7, 2025, Lindell’s lawyers were fined thousands of dollars for submitting a legal filing riddled with AI-generated mistakes.

    Octavio Jones/Getty Images



    A federal judge ordered two attorneys representing MyPillow CEO Mike Lindell in a Colorado defamation case to pay $3,000 each after they used artificial intelligence to prepare a court filing filled with a host of mistakes and citations of cases that didn’t exist.

    Christopher Kachouroff and Jennifer DeMaster violated court rules when, in February, they filed a document containing more than two dozen mistakes — including hallucinated cases, meaning fake cases made up by AI tools, Judge Nina Y. Wang of the U.S. District Court in Denver ruled Monday.


    “Notwithstanding any suggestion to the contrary, this Court derives no joy from sanctioning attorneys who appear before it,” Wang wrote in her decision. “Indeed, federal courts rely upon the assistance of attorneys as officers of the court for the efficient and fair administration of justice.”

    The use of AI by lawyers in court is not, in itself, illegal. But Wang found the attorneys violated a federal rule that requires lawyers to certify that claims they make in court are “well grounded” in the law. Turns out, fake cases don’t meet that bar.

    Kachouroff and DeMaster didn’t respond to NPR’s request for comment.

    The error-riddled court filing was part of a defamation case involving Lindell, the MyPillow creator, President Trump supporter and conspiracy theorist known for spreading lies about the 2020 election. Last month, Lindell lost this case being argued in front of Wang. He was ordered to pay Eric Coomer, a former employee of Denver-based Dominion Voting Systems, more than $2 million after claiming Coomer and Dominion used election equipment to flip votes to former President Joe Biden.


    The financial sanctions, and reputational damage, for the two lawyers are a stark reminder for attorneys who, like many others, are increasingly using artificial intelligence in their work, according to Maura Grossman, a professor at the University of Waterloo’s David R. Cheriton School of Computer Science and an adjunct law professor at York University’s Osgoode Hall Law School.

    Grossman said the $3,000 fines “in the scheme of things was reasonably light, given these were not unsophisticated lawyers who just really wouldn’t know better. The kind of errors that were made here … were egregious.”

    There have been a host of high-profile cases where the use of generative AI has gone wrong for lawyers and others filing legal cases, Grossman said. It’s become a familiar trend in courtrooms across the country: Lawyers are sanctioned for submitting motions and other court filings filled with case citations that are not real and created by generative AI.

    Damien Charlotin tracks court cases from across the world where generative AI produced hallucinated content and where a court or tribunal specifically levied warnings or other punishments. There are 206 cases identified as of Thursday — and that’s only since the spring, he told NPR. There were very few cases before April, he said, but for months since there have been cases “popping up every day.”

    Charlotin’s database doesn’t cover every single case where there is a hallucination. But he said, “I suspect there are many, many, many more, but just a lot of courts and parties prefer not to address it because it’s very embarrassing for everyone involved.”

    What went wrong in the MyPillow filing

    The $3,000 fine for each attorney, Judge Wang wrote in her order this week, is “the least severe sanction adequate to deter and punish defense counsel in this instance.”

    The judge wrote that the two attorneys didn’t provide any proper explanation of how these mistakes happened, “most egregiously, citation of cases that do not exist.”

    Wang also said Kachouroff and DeMaster were not forthcoming when questioned about whether the motion was generated using artificial intelligence.


    Kachouroff, in response, said in court documents that it was DeMaster who “mistakenly filed” a draft version of this filing rather than the right copy that was more carefully edited and didn’t include hallucinated cases.

    But Wang wasn’t persuaded that the submission of the filing was an “inadvertent error.” In fact, she called out Kachouroff for not being honest when she questioned him.

    “Not until this Court asked Mr. Kachouroff directly whether the Opposition was the product of generative artificial intelligence did Mr. Kachouroff admit that he did, in fact, use generative artificial intelligence,” Wang wrote.

    Grossman advised other lawyers who find themselves in the same position as Kachouroff not to attempt a cover-up, and to fess up to the judge as soon as possible.

    “You are likely to get a harsher penalty if you don’t come clean,” she said.

    An illustration picture shows ChatGPT artificial intelligence software, which generates human-like conversation, in February 2023 in Lierde, Belgium. Experts say AI can be incredibly useful for lawyers — they just have to verify their work.

    Nicolas Maeterlinck/BELGA MAG/AFP via Getty Images



    Trust and verify

    Charlotin has found three main issues when lawyers, or others, use AI to file court documents. The first is fake cases created, or hallucinated, by AI chatbots.

    The second is when AI fabricates a quote from a real case.

    The third is harder to spot, he said: the citation and case name are correct, but the legal argument being cited is not actually supported by the case that is sourced.

    This case involving the MyPillow lawyers is just a microcosm of the growing dilemma of how courts and lawyers can strike a balance between welcoming life-changing technology and using it responsibly in court. The use of AI is growing faster than authorities can build guardrails around it.

    It’s even being used to present evidence in court, Grossman said, and to provide victim impact statements.

    Earlier this year, a judge on a New York state appeals court was furious after a plaintiff, representing himself, tried to use a younger, more handsome AI-generated avatar to argue his case for him, CNN reported. That was swiftly shut down.

    Despite the cautionary tales that make headlines, both Grossman and Charlotin view AI as an incredibly useful tool for lawyers and one they predict will be used in court more, not less.

    Rules over how best to use AI differ from one jurisdiction to the next. Judges have created their own standards, requiring lawyers and those representing themselves in court to submit disclosures when AI has been used. In a few instances, judges in North Carolina, Ohio, Illinois and Montana have established various prohibitions on the use of AI in their courtrooms, according to a database created by the law firm Ropes & Gray.


    The American Bar Association, the national representative of the legal profession, issued its first ethical guidance on the use of AI last year. The organization warned that because these tools “are subject to mistakes, lawyers’ uncritical reliance on content created by a [generative artificial intelligence] tool can result in inaccurate legal advice to clients or misleading representations to courts and third parties.”

    It continued, “Therefore, a lawyer’s reliance on, or submission of, a GAI tool’s output—without an appropriate degree of independent verification or review of its output—could violate the duty to provide competent representation …”

    The Advisory Committee on Evidence Rules, the group responsible for studying and recommending changes to the national rules of evidence for federal courts, has been slow to act and is still working on amendments covering the use of AI for evidence.

    In the meantime, Grossman has this suggestion for anyone who uses AI: “Trust nothing, verify everything.”
