Mon. Nov 18th, 2024

An AI "grandmother" has been created to talk to phone scammers

Image: Unsplash

Despite all the shortcomings of artificial intelligence (such as encouraging people to eat deadly mushrooms), it can sometimes be put to good use. O2, the largest mobile operator in the UK, has deployed an AI-based voice chatbot.

The chatbot, called Daisy, or "dAIsy," mimics the voice of an elderly person, a common target for phone scammers.

Daisy's goal is to automate "scambaiting," the practice of deliberately wasting phone scammers' time to keep them away from potential real victims for as long as possible. Scammers use social engineering to exploit the naivety of elderly people, convincing them, for example, that they owe taxes and will be arrested if they don't transfer funds immediately.

However, when scammers call Daisy, they end up in a long conversation that ultimately leads nowhere. If a scammer gets to the point of asking for personal information, such as bank details, Daisy makes up fake answers. O2 says it gets scammers to call in the first place by adding Daisy's phone number to the "soft target" lists scammers use to find potential victims.


Image: O2/Daisy AI

In a video demonstrating Daisy, excerpts from real conversations show scammers becoming increasingly frustrated as they stay on the phone for up to 40 minutes hoping to get a credit card number or bank details. The AI model O2 has built sounds very convincing and does all its processing in real time; fortunately, the fact that older people tend to speak quite slowly makes that easier.

Of course, the problem with chatbots like Daisy is that the same technology can be used for the opposite purpose: we've already seen the voices of real people, such as CEOs of large companies, cloned to trick others into sending money to scammers. Elderly people are already quite vulnerable. If they get a call from someone claiming to be their grandchild, they'll almost certainly believe it's a real voice.

Ultimately, the ideal solution would be to block scam calls and shut down the organizations that run these scams. Operators have gotten better at identifying scammers and blocking their numbers, but it's still a cat-and-mouse game. Scammers use automated dialing tools that call numbers in rapid succession and only alert them when someone answers. An AI bot that frustrates scammers by answering and wasting their time is better than nothing.

By Natasha Kumar

Natasha Kumar has been a reporter on the news desk since 2018. Before that she wrote about young adolescence and family dynamics for Styles and was the legal affairs correspondent for the Metro desk. Before joining The Times Hub, Natasha Kumar worked as a staff writer at the Village Voice and a freelancer for Newsday, The Wall Street Journal, GQ and Mirabella. To get in touch, contact her at natasha@thetimeshub.in or 1-800-268-7116.
