
Fraudsters imitate the voice of your loved ones with AI

MarianneF
Information Security Analyst


 

Artificial intelligence (AI) has taken a new step forward . . . and unfortunately, so have fraudsters. With AI, scammers can now realistically reproduce anyone’s voice: a friend’s, a colleague’s, or even a family member’s. By impersonating someone you trust, fraudsters get you to disclose sensitive information or transfer money, which greatly increases the success rate of their scams. Find out how to recognize vishing (voice phishing) and protect yourself from it.

 

How does vishing work?

Fraudsters only need a few seconds of audio to reproduce someone’s voice. They can easily obtain these samples from:

 

  • Telephone calls from unknown or unverified numbers
  • Social media videos
  • Simple voice messages

Then, with AI tools, they recreate an almost identical version of the voice and use it to trap their victims.

 

The numbers speak for themselves

On average, vishing victims lose about $1,400 per incident. Fraudsters exploit trust and panic to extract money from them. Fraud in Canada has nearly doubled in 10 years, from 79,000 cases in 2012 to 150,000 in 2022, demonstrating how technologies like AI enable fraudsters to refine their techniques.

 

A chilling example

Tech reviewer Bruno Guglielminetti tested a generative artificial intelligence tool that can reproduce voices, even those of the deceased. The tool was able to imitate the voice of Apple founder Steve Jobs by analyzing the speeches, lectures, and interviews he gave throughout his life. The reviewer then asked a few questions of this “Steve Jobs” recreated by AI, and the result is disturbingly realistic. Listen to Bruno Guglielminetti’s podcast (in French).

 

How can we protect ourselves?

Vishing scams are not flawless. Follow these tips to detect them:

 

Don’t rely on the voice – If you receive an unusual or unexpected request over the phone, such as someone asking for money or a password, always verify the caller’s identity through another channel (email, text message, or other phone line). You can also ask a specific question, such as: “What meal did you order the last time we went to a restaurant?” The AI, programmed to deliver its fraudulent scenario, may have difficulty answering such a specific question.

 

Take your time – Scammers play on urgency and panic to get you to act quickly. Analyze the request carefully. If you’re being asked for money in the form of a gift card or a cryptocurrency, for example, there’s something fishy going on.

 

Listen for abnormalities – If the caller takes long pauses, uses unusual phrases, or if you hear strange or echoey sounds on the line, you may be dealing with an AI fraud.

 

Vigilance is your best defence!

 

What to do in the event of fraud

When you become aware of fraud, whether or not you have been a victim, it is recommended that you report the incident to the Canadian Anti-Fraud Centre. In addition, if money or personal information has been stolen from you through vishing fraud, contact your local police service.

 

Trust your instinct

During a vishing call, any information you give without caution can expose you to fraud. Exercise caution and trust your instincts. In the age of AI, only constant vigilance can keep your information protected.

 
