Spain in shock as schoolboys create fake nudes using generative models

Summary:

Ever since the first deepfakes appeared in 2017, generative models have been used to artificially undress women in photos and to create realistic pornographic videos. The problem has been known for just as long. Yet instead of being stopped, the tools built for this purpose are only becoming more accessible. So much so that ten schoolboys in Almendralejo, a town in southwestern Spain, are under investigation after creating and disseminating fake nudes of their fellow pupils.

The case became public after the mother of one of the victims posted a video denouncing the situation on her Instagram account, where she has over 136,000 followers. The gynecologist Miriam Al Adib described how her 14-year-old daughter came to her after learning that pictures in which she appeared to be naked were circulating among her classmates. Once the incident became known, several parents from the town came forward, all reporting that their daughters said the photographs circulating of them were fake.

Although the case is ongoing and the details of the investigation remain confidential, the boys behind the scheme allegedly used an online app that relies on generative models to create nudes from photographs in which the girls were clothed. Several Spanish media outlets have named specific applications with which the nude pictures could have been created, but none of them has been confirmed by the police as the tool used in this case. All the platforms they mention require users to be over 18, yet, as elDiario.es pointed out, there is clearly no safeguard in place to prevent minors from using them.

Violating women’s bodies

That this happened in a town of 34,000 inhabitants shows once more that new technology makes violating women’s bodies even easier. When deepfake technologies became popular, many expected disinformation and propaganda to be the main use cases, bringing about the collapse of institutions and even war. Instead, women have been the main target.

This has been going on for a while: “Surprisingly, since the first fake pornographic videos and photos were created in 2017 using these techniques, this has been the main domain of application. Today, most studies estimate that around 90% of the fake content published online is pornographic,” writes Marta Beltrán, who holds a doctorate in computer science and mathematical modeling, in her book Mr. Internet.

The distressing case in Spain has put the spotlight back on a never-ending problem. Back in 2019, an application of the same kind, DeepNude, allowed users to remove clothes from women’s pictures for 50 dollars. One year later, a Telegram bot appeared that sent users fake nudes in return for pictures of girls and women.

Both services were taken down after media reports sparked public uproar. But similar tools keep popping up like mushrooms.

The problem goes further, as Marta Beltrán pointed out to AlgorithmWatch: “Some of these services advertise themselves directly with slogans like ‘strip whoever you want.’ Moreover, they are offered on certain TikTok channels, video game chat rooms, and similar platforms where minors and very young people are active. It is clear that they are the target audience, and they offer tutorials so that they can learn quickly and get an idea of what can be done. They even present it to them as something fun.”

Only a few legal avenues

These platforms are no longer for adults only. Instead, they are being advertised as an “open bar” for youngsters, Beltrán says. Creating artificial photos of someone we know is something that, with a little effort, has long been possible with well-known tools such as Photoshop. But an automated service that strips women of their clothes and can easily be used by 13-year-olds is something entirely different, and regulators should treat it accordingly.

In such cases, it is not the use of a particular technology that is punished but the outcome. Undressing underage girls with generative models is not in itself a criminal offense, which makes the legal case even more difficult.

Link:

https://algorithmwatch.org/en/spain-schoolboys-create-fake-nudes-ai/

Authors:

Naiara Bellio

Date published:

09/26/2023, 20:59