As technology becomes more sophisticated, scammers are updating their tactics to steal your personal and financial information.
A viral TikTok posted in April claims one type of scam uses artificial intelligence (AI) technology to target victims. The poster says a scammer used AI on a phone call to trick her grandfather into thinking his grandson was killed in a car crash.
A VERIFY reader shared the viral video with us and asked if scammers can use AI to impersonate someone you know.
THE QUESTION
Are scammers using AI to impersonate people over the phone?
THE SOURCES
- Federal Trade Commission (FTC)
- Haverford Township Police Department in Pennsylvania
- Steve Weisman, a law lecturer at Bentley University with expertise in scams and identity theft
- ElevenLabs, an AI speech software company
- Voice.ai, a free voice-changer software
THE ANSWER
Yes, scammers are using AI to impersonate people over the phone.
WHAT WE FOUND
The scenario shared in the viral TikTok video is an example of what authorities call the “grandparent scam.” Scammers call a grandparent and impersonate their grandchild, or another close relative, claiming to be in a crisis such as a car accident and asking for immediate financial assistance.
With the rise of AI technology, it’s easier to impersonate someone, and scammers are using that to their advantage, the Federal Trade Commission (FTC) and other experts say.
Scammers are using AI “voice cloning” technology to trick people into thinking a family member is calling them, according to the FTC.
Voice cloning is an AI technology that replicates a person’s voice. ElevenLabs, an AI speech software company, says its voice cloning technology produces voices that “sound similar but are not technically identical” to the audio samples that people upload.
Though ElevenLabs has not specifically addressed phone scams, the company said in a January 2023 tweet that it was seeing “an increasing number of voice cloning misuse cases.”
More from VERIFY: Have you received a text from the wrong number? It could be a scam
According to the FTC, a scammer can impersonate your family member by using a voice cloning program and a short clip of their voice.
Steve Weisman, a scams expert and law lecturer at Bentley University, explained that scammers can get a recording of someone’s voice from videos posted on popular online platforms such as YouTube, TikTok, Instagram or Facebook.
A scammer may need as little as 30 seconds of someone’s audio shared online to create an AI-generated call to their family member, he added.
Some scammers also use AI voice cloning to create fake voicemails from trusted sources, Pennsylvania’s Haverford Township Police Department said in a press release.
These voice cloning scammers may also “spoof” the caller ID to trick you into thinking the call is coming from a trusted source.
So how can you avoid falling victim to these scams? Here are some tips from our experts:
- Agree on a secret code word or phrase that your family members or business associates will use when they are in trouble, Haverford Township police recommend. You can then ask someone for the code word before providing any information.
- Hang up and call the person or company who supposedly contacted you. Use a phone number that you know is theirs.
- Ask questions that only the real person would know the answers to, such as anecdotes or details from your last conversation. You can also ask about private information that could not be gathered from a social media account.
- If the scammer asks you to wire money, send cryptocurrency or buy gift cards, hang up immediately.
You can report scams to the FTC or file a complaint with the Federal Communications Commission (FCC).