Thelma – The Real-Life Voice Scam That Made It into the Movies
This has to be a first. Something from our blogs got made into a movie.
We’re talking about voice scams, the soundalike calls that rip people off. One such call sets the action in motion for a film released this weekend, “Thelma.”
The synopsis of the comedy reads like this …
“When 93-year-old Thelma Post gets duped by a phone scammer pretending to be her grandson, she sets out on a treacherous quest across the city to reclaim what was taken from her.”
What times we live in, where a voice scam forms the premise of a film. In fact, writer/director Josh Margolin based the film on a phone scam that targeted his own grandmother (one his family was fortunately able to shut down). It's a reminder that voice scams like these occur, and occur often.
What are voice scams?
Voice scams have been around for some time. They play out like an email phishing attack, where scammers try to trick people into forking over sensitive info or money — just in voice form over the phone. The scammer poses as someone the victim knows, like a close family member.
Yet the arrival of AI has made voice scams far more convincing. Cheap and freely available AI voice cloning tools have flooded the online marketplace in the past couple of years. They’re all completely legal as well.
Some cloning tools come in the form of an app. Others offer cloning as a service, where people can create a clone on demand by uploading audio to a website. The point is, practically anyone can create a voice clone. And those clones sound uncannily real, certainly real enough over the phone. It only takes a small sample of the target's voice to create one.
Our own labs found that just a few seconds of audio was enough to produce a clone with an 85% voice match to the original. That number bounced up to 95% when they trained the clone further on a small batch of audio pulled from videos.
How do voice scammers create voice clones?
As for how scammers get hold of the audio they need, they have a ready source. Social media. With videos harvested from public accounts on YouTube, Instagram, TikTok, and other platforms, scammers have little trouble creating clones that say whatever they want. All it takes is a script.
That’s where the attack comes in. It typically starts with a distress call, just like in the movie.
For example, a grandparent gets an urgent message on the phone from their grandchild. They’re stuck in the middle of nowhere with a broken-down car. They’re in a hospital across the country with a major injury. Or they’re in jail overseas and need to get bailed out. In every case, the solution to the problem is simple. They need money. Fast.
Sure, it’s a scam. Yet in the heat of the moment, it all sounds terribly real. Real enough to act right away.
Fearing the worst and unable to confirm the situation with another family member, the grandparent shoots the money off as instructed. Right into the hands of a scammer. More often than not, that money is gone for good because the payment was made with a wire transfer or through gift cards. Sometimes, victims pay out in cash.
Enter the premise for the movie. Thelma gets voice-scammed for thousands, then zips across Los Angeles on her friend’s mobility scooter to get her money back from the voice scammers.
The reality is, of course, more chilling. According to the U.S. Federal Trade Commission (FTC), nearly a million people reported a case of imposter fraud in 2023, with total reported losses close to $2.7 billion. While the FTC doesn't track voice clone attacks as a category of their own, they certainly figure into that overall mix.
Voice scams target everyone. Not just Thelma
Even as the movie focuses on the character of Thelma, voice clone attacks target people of all ages. Parents have reported cases involving their children. And married couples have told of scams that impersonate their older in-laws.
Common to each of these attacks is one thing: fear. Something horrible has happened. Or is happening. Here, scammers look to pull an immediate emotional trigger. Put plainly, they want to scare their victim. And in that fear, they hope that the victim immediately pays up.
It’s an odds game. Plenty of attacks fail. A parent might be sitting at the dinner table with their child when a voice clone call strikes. Or a grandchild might indeed be out of town, yet traveling with their grandmother when the scammer gives her a ring.
Yet if even a handful of these attacks succeed, a scammer can quickly cash in. Consider one attack for hundreds, if not thousands, of dollars. Multiply that by five, ten, or a dozen times over, and a few successful voice clone scams can rack up big returns.
How to protect your family from voice scams
Yet you can protect yourself from these attacks. A few steps can make it more difficult for scammers to target you. A few others can prevent you from getting scammed if a voice clone pops up on the other end of the phone.
Make it tougher for scammers to target you:
Clear your name from data broker sites. How’d that scammer get your phone number anyway? Chances are, they pulled that info off a data broker site. Data brokers buy, collect, and sell detailed personal info, which they compile from several public and private sources, such as local, state, and federal records, in addition to third parties. Our Personal Data Cleanup scans some of the riskiest data broker sites, shows you which ones are selling your personal info, and helps you remove your data.
Set your social media accounts to private. Scammers sift through public social media profiles in search of info on their targets. In some cases, an account can provide them with everything they need to launch an attack. Family names, family interests, where the family goes for vacation, where family members work — and videos that they can use for cloning. By making your accounts private, you deny scammers the resources they require. Our Social Privacy Manager can do this for you across all your accounts in only a few clicks.
Avoid getting scammed:
Recognize that voice clone attacks are a possibility. As we're still in the relatively early days of AI tools, not everyone is aware that this kind of attack is possible. Keeping up to date on what AI can do and sharing that info with your family and friends can help them spot an attack. As we've reported here before, voice clones are only the start. Other imposter scams run on video calls, where a scammer takes on someone else's voice and appearance, all in real time.
Always question the source. In addition to voice cloning tools, scammers have other tools that can spoof phone numbers so that they look legitimate. Even if it’s a voicemail or text from a number you recognize, stop, pause, and think. Does that really sound like the person you think it is? Hang up and call the person directly or try to verify the info before responding.
Set a verbal codeword with kids, family members, or trusted close friends. Even in the most high-tech of attacks, a low-tech precaution can keep everyone safe. Have a codeword. Save it for emergencies. Make sure everyone uses it in messages and calls when they ask for help. Further, ensure that only you and those closest to you know what the codeword is. This is much like the codewords that banks and alarm companies use to help ensure that they’re speaking to the proper account holder. It’s a simple, powerful step. And a free one at that.