“Nobody could convince me it wasn’t her.” AI voice cloning scam costs Florida mom $15K.
A Florida woman was conned out of $15,000 by scammers who used artificial intelligence to clone her daughter’s voice, convincing her that her daughter had killed someone in a car accident. The sophisticated scam highlights the growing threat of AI-powered voice cloning technology in criminal schemes, where scammers can now replicate loved ones’ voices with alarming accuracy to exploit family bonds and trigger panic-driven financial decisions.

How the scam unfolded: Sharon Brightwell received a call last Wednesday from what appeared to be her daughter April Monroe’s phone number, with an AI-cloned voice claiming she had hit a pregnant woman while texting and driving.

  • “There is nobody that could convince me that it was not [her voice],” Brightwell told WFLA. “I know my daughter’s cry, even though she’s an adult, I still know my daughter’s cry.”
  • A man then posed as Monroe’s attorney, demanding $15,000 in cash for bail and warning that telling the bank what the withdrawal was for would hurt her daughter’s credit.
  • After Brightwell withdrew the money and gave it to a driver, scammers called again claiming to be relatives of the “victim,” demanding an additional $30,000 for the supposed death of an unborn baby.

The emotional toll: Monroe’s son was with Brightwell during the entire ordeal and experienced severe distress until Monroe texted him during her lunch break, revealing the scam.

  • “My voice was AI cloned and sounded exactly like me,” Monroe wrote on a GoFundMe page. “After you hear your child in distress, all logic is out the window.”
  • Monroe’s son “hunched over to throw up” when he first saw her and realized she was safe.
  • “To tell you the trauma that my mom and son went through that day makes me nauseous and has made me lose more faith in humanity,” Monroe wrote.

What experts recommend: The family now encourages proactive measures to prevent similar AI voice cloning scams.

  • They suggest establishing a “code word” to use in emergency situations to verify identity.
  • Monroe has filed a police report and an investigation is underway with the Hillsborough County Sheriff’s Department.

Why this matters: This case demonstrates how AI voice cloning technology is being weaponized by criminals to exploit emotional vulnerabilities, making traditional phone scams far more convincing and dangerous for unsuspecting families.