The PDI has warned about a new phone scam that uses artificial intelligence to impersonate relatives and acquaintances. No incidents of this kind have yet been reported in Chile, but the civilian police advised people not to fall for this type of deception.
Artificial intelligence (AI) is in the news every day as the technology advances into our daily lives. However, some people use it to deceive others and demand large sums of money.
Recently, there have been incidents in other countries in which scammers used AI to imitate the voices of friends and family members in order to commit fraud.
According to the Factchequendo portal, the technology allows machines to mimic human voices and be trained to say things the person never actually said.
Experts say that although these tools are still within the reach of only a few people, you should be wary of suspicious calls from unknown numbers claiming to come from people you know.
For example, a woman in Arizona, USA, reported an attempted fraud in which criminals used artificial intelligence to imitate her daughter’s voice, making it appear that she had been kidnapped, and then demanded a ransom.
The woman, Jennifer DeStefano, described the unusual situation to local media outlet Arizona’s Family. There she asserted that the scammers had simulated her daughter’s voice perfectly, and that the voice even appeared to be crying during the call.
The Investigative Police said that, so far, they have not received any complaints involving the misuse of artificial intelligence.
“However, we are monitoring the situation, and if you are the victim of this crime, we ask that you report it to your local police station so we can begin the investigation process,” said Julio Vargas, deputy director of the agency’s Metropolitan Cybercrime unit.
Regarding the incident in the United States, the PDI official said: “This crime was committed abroad, but it may one day reach this country.”
In a conversation with BioBioChile, Vladimir Garay, advocacy and communications director for the Derechos Digitales organization, pointed out: “All AI systems need to be trained on data. That is the key. A child can look at a few pictures of cats and horses, learn what cats and horses are, and then recognize any horse on sight. An AI needs millions of pictures of cats to learn what a cat is.”
“And the same is true for voice-imitation software. To ‘learn’ to copy a voice, it needs recordings of that person’s voice,” the expert added.
Along these lines, Garay noted that “it is relatively easy to copy the voices of celebrities because of the large number of recordings” available of them.
“Technology may change, but the logic of the scam seems to be the same as it has always been with phone calls and text messages,” Garay said. “When someone contacts you to ask for money, the best response is to stay alert, especially if a sense of urgency accompanies the request.”
“The easiest way to dispel suspicion is to contact the person who supposedly reached out to you and confirm the veracity of the situation,” the expert said.
In short, “the answer is not technical; rather, it is to make the public aware that this is a fraudster’s modus operandi so that people can be prepared,” Garay said.
“It is also true that before this kind of technology became available to everyone, scammers were already making phone calls and pretending to be other people. They may not even need such an accurate simulation, which is why it’s important to know about this kind of scam and stay vigilant,” the expert said.
“Shouldn’t be criminalized”
“Artificial intelligence should not be criminalized, because it is designed to improve certain technological processes,” Vargas said.
However, he said, “artificial intelligence has become a catalyst for certain computer crimes, such as spam and phishing campaigns, where people can be victimized through email.”
“Crimes can also be reinforced by manipulating and altering people’s images and voices so that victims are deceived, for example through malicious impersonation,” he added.
“Emulate human voice”
“Today there are artificial intelligence tools that can emulate the human voice better and better,” Vargas said.
“Criminals use any tool in malicious ways to deceive people and generate financial gain, and that is what they are doing with artificial intelligence, ultimately exploiting and misusing these tools,” the deputy director said.
How to avoid new phone scams using artificial intelligence
“If you suspect your voice or a loved one’s voice is being impersonated, the first thing you should do is try to get in touch with that person directly, through traditional means, WhatsApp, or their usual number,” Vargas said of the precautions you can take in such cases.
“In the case abroad, it turned out that the contact was made from a number different from the supposed victim’s, so the usual communication channel was still available to confirm the situation,” Vargas said.
Another recommendation is that “family groups cooperate with each other and agree on code words in advance, in anticipation of situations where someone is coerced or becomes a victim of crime.”
“It’s always important to protect our information and make sure it is shared only with trustworthy sites,” he stressed.
Along those lines, Vargas said, “any information scattered across the internet, social networks, and online forums can easily be gathered with artificial intelligence and exploited, using social engineering and other techniques, to deceive people.”
“In case you become a victim, have some kind of protocol in place to confirm that you are actually communicating with the person. Even a transfer that appears to go to that person’s account can be a warning sign,” Vargas stressed.
Finally, the cybercrime unit advises: “Be wary of urgent messages that describe an emergency or a dangerous situation involving a loved one. They may also indicate that you are the target of a crime.”