
This AI-generated grandmother thwarts scammers by telling tall tales about her cat

Scammers around the world are having great success. Last year alone, the Federal Trade Commission reported that U.S. consumers lost a record $10 billion to fraud, up 14% from just a year earlier. Scammers are increasingly targeting elderly and vulnerable people by phone. More than two thirds of UK residents over 75 surveyed in a recent research article said they had been subject to at least one fraud attempt in the previous six months, and 40% of respondents reported facing fraud attempts frequently.

Now an AI-generated British grandmother named “Daisy” is trying to make the scammers’ jobs a little more tedious. UK mobile operator Virgin Media O2 created Daisy to talk to bad actors and waste as much of their time as possible. Powered by large language models similar to those behind ChatGPT, Daisy will talk about her passion for knitting and tell long-winded, made-up stories about family members in an attempt to keep scammers on the line. In theory, every minute a scammer spends chatting with Daisy about her invented family or her chores is one less minute spent targeting a real person.

“The newest member of our fraud prevention team, Daisy, is turning the tables on scammers – outsmarting and outmaneuvering them at their own ruthless game,” Murray Mackenzie, director of fraud at Virgin Media O2, said in a blog post.

AI grandma trained on real scam calls

O2 says it is working with professional scam baiters to seed AI-linked phone numbers into lists of numbers known to be targeted by fraudsters. If a scammer calls one of those numbers, they immediately start interacting with Daisy. Recordings of the conversations released by O2 show Daisy trolling scammers by rambling about her fictional cat, Fluffy, and generally dancing around their questions. Daisy will also hand scammers fake personal information and fake bank details, leading them to believe they are fleecing a real person. These conversations can clearly wear scammers down: O2 provided clips in which frustrated fraudsters can be heard shouting expletives at the AI on the other end of the line.

“Stop calling me ‘honey,’ you idiot (expletive),” one scammer is heard saying.

“I understand, dear,” replied Daisy.

To pull all this off, Daisy first uses a speech-to-text AI model to transcribe the fraudster’s speech. That transcript is then fed to another AI model, which drafts a relevant response. A text-to-speech model then voices that response as an elderly woman. Because all of these steps complete within seconds, fraudsters believe they are talking to a real person. Daisy was trained using real recordings of scam calls collected by O2.
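The three-stage loop described above can be sketched in a few lines of Python. This is a minimal illustration, not O2’s implementation: the model calls are hypothetical stand-ins (real systems would invoke actual speech-to-text, language, and text-to-speech models), and the canned replies are invented for the example.

```python
# Sketch of a Daisy-style conversational turn: speech-to-text -> LLM -> text-to-speech.
# All three "models" below are hypothetical placeholders for illustration only.

def speech_to_text(audio: bytes) -> str:
    # Stand-in for a real STT model; here we simply pretend the
    # incoming audio is UTF-8 text.
    return audio.decode("utf-8")

def draft_reply(transcript: str) -> str:
    # Stand-in for the LLM step: produce a rambling, time-wasting reply
    # loosely relevant to what the scammer said.
    if "bank" in transcript.lower():
        return "Oh, the bank, dear? Let me just find my glasses first..."
    return "That reminds me of my cat Fluffy, let me tell you all about her..."

def text_to_speech(reply: str) -> bytes:
    # Stand-in for a TTS model voicing the reply as an elderly woman.
    return reply.encode("utf-8")

def handle_turn(audio_in: bytes) -> bytes:
    # One full conversational turn, chaining the three stages.
    transcript = speech_to_text(audio_in)
    reply = draft_reply(transcript)
    return text_to_speech(reply)
```

The key design point the article describes is latency: the whole chain has to finish within a couple of seconds per turn, or the illusion of a live human on the line breaks.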

O2 deliberately modeled Daisy on an older woman because older women are disproportionately targeted by scams. Daisy was programmed to give convoluted, long-winded answers designed to keep the scammers talking. The model has reportedly already kept multiple scammers on the line for more than 40 minutes at a time.

People in the United Kingdom who are targeted by fraudsters can send their attackers to the AI by forwarding the suspicious call to the number 7726, which then routes it to Daisy’s line. O2 says it hopes Daisy can make a meaningful difference amid a rise in fraudulent phone activity: nearly a fifth of Britons surveyed in O2’s latest research reported being targeted by a scam every week.

Artificial intelligence is also contributing to new scams

While Daisy’s mission is to stop fraud, scammers are using similar AI tools to launch a variety of new attacks. AI “voice clones,” which use short voice snippets to imitate a person’s voice, have been used to commit bank and wire fraud in recent years. In some extreme cases, scammers have even used AI to convince people that a loved one has been kidnapped or held hostage. Believing a son or daughter is in imminent danger, victims pay fake ransoms to the scammers. Scams like these are becoming increasingly common: one in four respondents to a recent survey by cybersecurity firm McAfee said that they or someone they know had been targeted by an AI voice-clone scam. Tools like Daisy could theoretically help counter this trend by sending other AI scam bots down winding rabbit holes.

“Let’s face it, darling,” Daisy said in a recording. “I have all the time in the world.”