Fraudsters extracted nearly HUF 10 million from an elderly man using artificial intelligence

In the so-called grandparent scam, criminals deceive parents and grandparents using AI-based voice generators.

Although the AI services celebrated by tech companies (chatbots, image, video, and voice generators, etc.) offer many advantages, efforts to eliminate, or at least mitigate, the security risks they pose are far from sufficient. Criminals also exploit the opportunities inherent in the technology. They are particularly fond of AI-based voice generators, which allow them to deceive unsuspecting victims far more easily than even a few years ago.

This has taken the grandparent scam, which is widespread in Hungary as well and typically targets elderly people, to a new level. The perpetrators usually pose as an acquaintance of the victim's child or grandchild, or perhaps as a police officer, and try to convince their chosen victim that a family member has been involved in an accident or taken into police custody, and that the relative's prompt treatment or release can only be arranged if the victim hands over a specified sum immediately. With AI voice generators, however, they can impersonate the relative in trouble directly.

The most recent example is a California case reported by the news site ABC7. A man named Anthony, who asked to remain otherwise anonymous, lost $25,000 (roughly HUF 9.2 million) of his savings after fraudsters called him pretending to be his son and asked him to transfer the money immediately, claiming he had caused an accident; specifically, that he had run over a pregnant woman.

Anthony believed it was his son on the other end of the line, whose voice the fraudsters' AI voice generator imitated with complete believability. The criminals left nothing to chance: they followed up with a second call, in which an unknown person posing as his son's lawyer demanded $9,200. The caller, who introduced himself as Michael Roberts, threatened Anthony that if he did not pay the requested bail, his son would remain behind bars for 45 days.

Anthony did not give in right away; he tried to call his son, but kept reaching voicemail. Fearing the worst, he rushed to the bank, where he lied that he wanted to install solar panels so the staff would not get suspicious. He then asked his daughter, who amid the growing panic also failed to realize they were being scammed, to call Michael Roberts back. The alleged lawyer said someone would arrive shortly in an Uber to collect the money.

The envelope changed hands, but the story was not over: the fraudsters wanted more. Another phone call followed, this time from a second "lawyer", Mark Cohen, who delivered the sad news that the pregnant woman allegedly run over by Anthony's son had died of her injuries, and that the bail had consequently been raised by $15,800. Anthony handed over this amount as well, repeating the procedure described above, leaving him a total of $25,000 poorer.

Anthony later told police that the first caller sounded exactly like his son, so he suspected nothing. According to Chelsea Sager, a detective with the Los Angeles Police Department, such scams are not new at all, but the use of AI-based voice generators makes it much more likely that family members will believe the perpetrators.

“Fraudsters are getting smarter and more sophisticated. They are using social media and technology to create believable and convincing stories, tricking people into thinking they are talking to a family member or a government official.”

Anthony's case is therefore far from unprecedented, and as advancing technology plays into the hands of criminals, even more people may fall victim to similar scams. That is why he wants as many people as possible to know his story and learn from his mistake.

Source: www.pcwplus.hu