Eleven teenage girls in Spain have reported that naked photos of themselves created with an artificial intelligence application have been circulated, sparking intense debate in the country.
“Currently, we have received 11 statements from victims,” all minors from the town of Almendralejo (in the southwestern region of Extremadura), a police spokesperson told AFP.
He added that the alleged authors of these images, some of whom have been identified, had manipulated photos of “girls and minors,” superimposing their faces onto the nude bodies of “others.”
Another law enforcement official said the fake images were generated using an artificial intelligence (AI) application that can create highly realistic photo montages.
Acting Justice Minister Pilar Llop told reporters: “These are very serious acts, and there should be no impunity, even if the images have been manipulated.”
An investigation has been opened on suspicion of violating privacy. Prosecutors said that child pornography charges may also be considered, given the victims’ ages and other factors.
According to Spanish media, as many as 20 girls may have been targeted in the doctored photos.
Miriam Al Adib, the mother of one of the girls, condemned the “atrocity” on Instagram.
“When I got home, one of my daughters, deeply upset, said to me: ‘Look what they’ve done’ (…) They had taken a photo of her and made it look as if she were naked,” she explained.
Al Adib said police had warned her that the photos might have been posted on a pornographic website.
Another mother told public broadcaster TVE that someone had tried to blackmail her daughter, demanding money in exchange for not circulating the images.
Artificial intelligence has raised concerns around the world over its potential for malicious use, notably in the creation of pornographic “deepfakes.”
A 2019 study by Sensity, a Dutch artificial intelligence company, found that 96% of fake videos online were non-consensual pornography, the majority of them featuring women.