[Image: silver robots. PhonlamaiPhoto / Getty Images]

Protecting Against AI Voice Scams

Voices created with artificial intelligence (AI) can sound convincingly human. To protect against the scams this enables, consumer advocates recommend taking a few precautionary steps.

NEW YORK – Think of a robot's voice. What does it sound like? Is it stilted and stuttering? Perhaps it sounds like Arnold Schwarzenegger in the popular Terminator movies?

Believe it or not, AI can replicate your voice almost perfectly. It can mimic your inflections and intonations, making the copy nearly indistinguishable from your real voice.

Imagine someone using your voice to ask a loved one for emergency money. This is a real scenario playing out worldwide as AI voice scams become more prevalent.

Fortunately, there are ways to defend yourself against AI-powered schemes. Here are four ways to protect yourself from AI voice scams:

  1. Fact-check and verify.
  2. Listen for unusual cues.
  3. Avoid sharing your voice online.
  4. Have a family password.

1. Fact-check and verify

NBC News reported that AI-generated voice calls discouraged New Hampshire residents from voting in the presidential primary last January.

That is why CNBC reminded the public to fact-check and verify phone calls to ensure they aren't scams. For example, be skeptical if a high-ranking official calls you directly. The same goes if you get a call from a relative requesting a huge sum of money. Ask yourself if your loved one is truly in danger and if the emergency is real.

If you suspect something is wrong, hang up and reach your relative directly through the contact details you already have for them. They can confirm whether they are actually in danger.

2. Listen for unusual cues

Recall that the introduction says artificial intelligence can mimic your voice only 'almost perfectly.' That qualifier matters: AI voice calls may still carry subtle signs of artificial generation.

For example, you might notice robotic speech patterns, unnatural pauses or alterations in speaking tone. You might suspect something's off if the distressed person suddenly becomes monotone. Also, you may hear pronunciation errors, especially for non-English words.

These cues can be hard to notice when you're panicking over a loved one who seems to be in trouble. Nonetheless, listening for them can help you avoid AI voice scams.

3. Avoid sharing your voice online

Reem Alattas, the Director of Value Advisory for Spend Management at the German tech firm SAP, posted on LinkedIn about avoiding AI voice scams. She warned people not to post videos or audio of their voices online as much as possible.

Strangers could take those samples and feed them to an AI voice generator to clone your voice. Of course, avoiding this entirely is difficult, since many of us post videos of ourselves on social media for fun and work. At minimum, reduce the risk by restricting your profiles to people you know.

4. Have a family password

The Electronic Frontier Foundation is a nonprofit organization 'defending civil liberties in the digital world,' and it shared advice on avoiding AI voice-cloning scams, too.

It suggests having a family password: agree with your family on a specific word or phrase that you can all remember, and that everyone will ask for whenever a family member claims to be in an emergency.

You can get creative by turning it into a spy-thriller-style challenge and response. The key is to make the family password unique yet easy to remember.

However, this method may not work well for elderly relatives, who might forget the password, especially under the stress of an apparent emergency.

©Philippine Daily Inquirer