this post was submitted on 20 Jul 2025
94 points (100.0% liked)

Article

DOVER, Fla. (WFLA) — A Florida woman is warning others after falling victim to an elaborate AI-powered scheme that used cloned audio of her daughter’s voice to demand thousands of dollars in fake bond money.

Sharon Brightwell told Nexstar’s WFLA that the ordeal began on July 9 when she received a call from a number that looked like her daughter’s. On the other end of the line, a young woman was sobbing, claiming to have been in a car crash.

“There is nobody that could convince me that it wasn’t her,” Sharon said. “I know my daughter’s cry.”

The caller said she had hit a pregnant woman while texting and driving and claimed her phone had been taken by police. A man then got on the line, claiming to be an attorney representing her daughter. He told Sharon that her daughter was being detained and needed $15,000 in bail money in cash.

“He gave very specific instructions,” Sharon said. “He told me not to tell the bank what the money was for, that it could affect my daughter’s credit.”

Following his instructions, she withdrew the money and placed it in a box as directed. A driver showed up to her house to pick up the package.

But it didn’t stop there.

Sharon received another call saying the unborn child had died and that the family, described as “Christian people,” had agreed not to sue her daughter if she provided another $30,000.

That’s when her grandson stepped in. He reached a family friend, who quickly called Sharon directly, this time with her real daughter on the line.

“I screamed,” Sharon said. “When I heard her voice, I broke down. She was fine. She was still at work.”

The family believes the suspects used videos from Facebook or other social media to create a convincing AI-generated replica of her daughter’s voice.

“I pray this doesn’t happen to anyone else,” Sharon said. “My husband and I are recently retired. That money was our savings.”

Now, the family is urging others to take precautions, including creating a private “code word” to verify identities over the phone in emergency situations.

“If they can’t give it to you,” Sharon said, “hang up and call them directly.”

A report has been filed with the Hillsborough County Sheriff’s Office. The family also launched a GoFundMe campaign to help recover their financial losses.


top 25 comments
[–] came_apart_at_Kmart@hexbear.net 6 points 7 hours ago

A man then got on the line, claiming to be an attorney representing her daughter. He told Sharon that her daughter was being detained and needed $15,000 in bail money in cash.

“He gave very specific instructions,” Sharon said. “He told me not to tell the bank what the money was for, that it could affect my daughter’s credit.”

Following his instructions, she withdrew the money and placed it in a box as directed. A driver showed up to her house to pick up the package.

"make the check out to Ice Station Zebra. that's my loan out. it's totally legit."

i literally read this article to find out if they asked for gift cards, because that was what i was expecting.

... anyway, i love how the US has some of the least sophisticated infrastructure for end users' secure payment processing in the G20 (people STILL use personal paper checks), and the most underfunded/nonexistent investigative resources for these types of financial crimes, but some of the most sophisticated phone scam technology and social engineering rings. when people call the cops about this shit, they're just like, "unless you are calling me to give me the address of some immigrants to shoot, we can't do anything for you. don't bother filing a report, because it fucks with our stats." it's fucking christmas for this shit here.

also, i don't care who has the emergency, if a "lawyer" tells me to put any amount of cash in a box and leave it somewhere, i am hanging up and will deal with the fallout at the next family get-together.

[–] Assian_Candor@hexbear.net 11 points 10 hours ago

Reason 3,344,583 to shut the fuck up and not spray your braindead takes on tiktok using your real name

[–] queermunist@lemmy.ml 30 points 15 hours ago (4 children)

We're going to have to start using pass phrases when we call our families.

[–] TraschcanOfIdeology@hexbear.net 11 points 13 hours ago

I already do. It started as a cute inside joke that makes no sense and has to be used in a very specific way, so we know we're actually talking to each other, and we discovered it was useful for confirming everything was good.

[–] blobjim@hexbear.net 3 points 10 hours ago

Or just use something like Signal where spoofing doesn't happen.

[–] Cysioland@lemmygrad.ml 18 points 15 hours ago

That's been common advice here for years because of the grandparent scams.

I was thinking, families need to start using safe words to confirm identity.

[–] tricerotops@hexbear.net 43 points 16 hours ago (2 children)

never use voice fingerprinting that banks offer btw

[–] Aradino@hexbear.net 15 points 13 hours ago

The Australian welfare agency, Centrelink, has it as a mandatory "feature".

You have to say "In Australia, my voice identifies me" every time you call.

[–] FALGSConaut@hexbear.net 11 points 13 hours ago (1 children)

Oh god, that's a thing? I thought the fingerprinting/facial recognition stuff was bad enough (you won't catch me using that shit), but now it's voice recognition too?

[–] tricerotops@hexbear.net 4 points 6 hours ago

it's been around for a while. it's a popular feature for banks and credit card issuers so you can "save time" when you call them. anyway, it was a horrible idea 10 years ago when they introduced it, and it's an extraordinarily horrible idea today when voice duplication tech is just out there.

[–] PoY@lemmygrad.ml 18 points 16 hours ago (3 children)

I used to wonder about all those spam calls from 7-8 years ago where no one was ever on the other end but they kept calling incessantly. Now I'm pretty sure it had to do with fingerprinting voices.

[–] PorkrollPosadist@hexbear.net 17 points 14 hours ago

I assumed the purpose of these was to see if the person being called would answer in the first place, to filter long lists of potentially spam-able numbers to a shorter list of active ones.

[–] Aradino@hexbear.net 14 points 13 hours ago (1 children)

PorkrollPosadist is right. It's just to check who actually answers the phone because most people don't these days

[–] PoY@lemmygrad.ml 7 points 13 hours ago (1 children)
[–] Aradino@hexbear.net 9 points 13 hours ago

Could be, but it's not likely. They'd get mostly just confused "hello?" And not much else so I doubt it'd be worth it compared to other ways to get someone's voice.

[–] Le_Wokisme@hexbear.net 12 points 15 hours ago (1 children)

doubt you get much fingerprinting out of hello. HELLO? (unintelligible) click

[–] PoY@lemmygrad.ml 7 points 15 hours ago

iirc it was just dead silence on the other end.. and more than once I cursed whoever was calling and told them to stop calling and whatnot before I finally just stopped saying anything at all

[–] SorosFootSoldier@hexbear.net 21 points 16 hours ago

Starting to think this whole ai thing was a bad idea.

[–] Maeve@kbin.earth 11 points 16 hours ago

Don't put your family in your phone as: Mom, Dad, Sissy, Unc. Just use their first names.

[–] frogbellyratbone_@hexbear.net 8 points 15 hours ago

just sent this to my family so they know. my mom is still pretty coherent but sometimes i worry, less about this and more about other scams that i regularly warn her about.

this shit is wild.

[–] Gorillatactics@hexbear.net 4 points 13 hours ago (1 children)

Where do they get the samples from to build a model that can fool close relatives?

[–] WaterBowlSlime@lemmygrad.ml 4 points 13 hours ago

The article said Facebook or other social media. So if you have anything with your voice online, the potential to mimic you is there.

I have some speech quirks (not sure how to word it) that I'll use mostly around family members. After this stuff started happening I made sure to tell my mom to use it as a key word.

[–] Anissem@lemmy.ml 11 points 17 hours ago

So amazingly fucked up