How Scammers Use Artificial Intelligence to Clone Voices and Steal Money

Sabiq Mirzai
Published in TechCrate
Sep 24, 2024 · 2 min read


[Image: AI-driven voice cloning scams on the rise, warns McAfee | Editorji]

Voice cloning is one of the more dangerous applications of artificial intelligence. Scammers use deepfake and voice cloning tools to imitate the voice of a victim’s relative or close friend, then place phone calls in that voice and ask for money under the pretense of an emergency. As voice cloning technology improves, this method is becoming more widespread, and it is increasingly hard for the average person to recognize the fraud.

Here’s how this scam works:

Collecting a voice sample


Criminals gather a short sample of a relative’s voice, often from videos or voice recordings shared on social networks.

Cloning the voice


Using artificial intelligence, they clone the voice from that sample. The result can sound so realistic that even close family members may not notice the difference.

Requesting help


The scammers then use the cloned voice to call the victim and, posing as the relative, urgently ask for money under the guise of needing help. These calls rely on emotional manipulation: the sense of panic and urgency is designed to confuse victims into paying before they can think clearly.

Detecting such scams can be difficult, but a few precautionary measures are recommended:

  • In any urgent call requesting help, even if the voice sounds familiar, ask additional questions — ideally ones only the real person could answer.
  • Contact the relative or friend directly through another channel (such as a text message or a separate phone call) to verify the story.
  • Always double-check the information before sending any money.

To combat the dangerous use of such technologies, law enforcement, technology companies, and individual users must stay vigilant and informed.
