Modern technologies not only open up new opportunities but also become tools for scammers.
Details
In particular, the State Special Communications Service warns that criminals are increasingly using artificial intelligence to imitate people's voices in fraudulent schemes.
Scammers hack accounts in messengers or social networks and, using voice messages the victim previously sent, clone their voice with AI.
They then contact the account owner's friends or relatives asking to borrow money.
To be convincing, attackers send voice messages that sound like a real person.
Victims may not even realize that it is not their acquaintance who is writing to them, and they easily fall into the trap.
If you receive a request for money, be sure to call the person directly.
Ask questions that only your acquaintance would know the answer to.
For example, ask about a detail tied to your shared memories: “What did we do together the last time we met?”
Set up two-factor authentication. This feature makes it much more difficult for fraudsters to access your account.
You will need an additional code to log in, which is sent to your phone or email.
Even if your password is stolen, it will be impossible to log in to your account without the code.
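Such codes are either sent to you by SMS or email, or generated locally by an authenticator app. Purely as an illustration, here is a minimal sketch of how an authenticator-style time-based one-time code (TOTP, as described in RFC 6238) can be derived from a secret shared between you and the service; the secret and timestamps below are hypothetical:

```python
import hmac
import hashlib
import struct
import time

def totp(secret, timestep=30, digits=6, now=None):
    """Derive a time-based one-time code from a shared secret (RFC 6238 style)."""
    # The current time is divided into fixed windows (30 s by default);
    # both the server and the app compute the same counter value.
    counter = int(now if now is not None else time.time()) // timestep
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"example-shared-secret"  # hypothetical secret, normally set up when enabling 2FA
print(totp(secret, now=59))        # fixed timestamp so the demo is reproducible
```

Because the code changes every 30 seconds and depends on a secret that never travels with your password, a stolen password alone is not enough to log in.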
Do not send voice messages in chats that could be compromised.
If a message looks suspicious, for example if it contains errors or sounds unnatural, stop and verify the information.