Scammers can fake voice messages using AI: how to avoid becoming a victim

Modern technologies not only open up new opportunities, but also become tools for scammers.

Details

In particular, the State Special Communications Service warns that criminals are increasingly using artificial intelligence to imitate people's voices in fraudulent schemes.

Scammers hack accounts in messengers or social networks and use the victim's previously sent voice messages to clone their voice with AI.

They then contact the account owner's friends or relatives, asking to borrow money.

To be convincing, attackers send voice messages that sound like a real person.

Victims may not even realize that it is not their acquaintance writing to them, and they easily fall into the trap.

If you receive a request for money, be sure to call the person personally.

Ask questions that only your acquaintance could answer.

For example, ask about a detail related to your shared memories: “What did we do together the last time we met?”

Set up two-factor authentication. This feature makes it much more difficult for fraudsters to access your account.

You will need an additional code to log in, which is sent to your phone or email.

Even if your password is stolen, it will be impossible to log in to your account without the code.
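To illustrate the idea, here is a minimal sketch, not any particular service's implementation, of how such a one-time login code could be issued and checked. The code length, expiry time, and function names are assumptions made for the example.

import secrets
import time

def issue_code():
    # A 6-digit one-time code plus the time it was issued; a real service
    # would also record which login attempt it belongs to and deliver the
    # code out of band (SMS or email).
    return f"{secrets.randbelow(1_000_000):06d}", time.time()

def verify_code(expected, issued_at, submitted, ttl_seconds=300):
    # Reject expired codes and compare in constant time.
    if time.time() - issued_at > ttl_seconds:
        return False
    return secrets.compare_digest(expected, submitted)

# Example flow: even with the correct password, a login attempt fails
# unless the attacker also sees the code on the victim's phone or email.
code, issued_at = issue_code()
print("Code sent to user's phone/email:", code)
print("Attacker guesses 000000:", verify_code(code, issued_at, "000000"))
print("User enters the real code:", verify_code(code, issued_at, code))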

Do not send voice messages in chats that could be compromised.

If a message looks suspicious, for example if it contains errors or sounds unnatural, it is worth pausing and verifying the information.


By Natasha Kumar

Natasha Kumar has been a reporter on the news desk since 2018. Before that she wrote about young adolescence and family dynamics for Styles and was the legal affairs correspondent for the Metro desk. Before joining The Times Hub, Natasha Kumar worked as a staff writer at the Village Voice and a freelancer for Newsday, The Wall Street Journal, GQ and Mirabella. To get in touch, contact me at natasha@thetimeshub.in or 1-800-268-7116.