New report finds generative machine learning exacerbates online sexual exploitation

newsletter via Feeds on Inoreader 2023-06-28

Summary:


Generative machine learning (ML) models have been rapidly adopted by large numbers of users. Large language model (LLM) chatbots and generative media tools can produce human-like responses or render almost any imagined scene in increasingly short timeframes. In just the past few months, some of this text and imagery has become so realistic that it is difficult to distinguish from reality. The public release of these tools has also spawned a thriving open-source community dedicated to expanding their capabilities.

Tools for generating realistic images and video are advancing rapidly in the open-source ML community, driven by a combination of advances in generative ML techniques and increasingly powerful hardware available to consumers. However, this technology has also been misused to create deceptive accounts and content for government propaganda, and to produce non-consensual explicit media used for harassment and extortion.

A new report from researchers at the Stanford Internet Observatory and Thorn, a nonprofit working to address the role of technology in facilitating child sexual exploitation, highlights how this rapidly advancing technology poses a threat of child sexual exploitation, namely the production of increasingly realistic computer-generated child sexual abuse material (CSAM). The report outlines a number of potential technical mitigations and areas for industry and policy collaboration on AI ethics and safety measures. However, the researchers warn that the use of generative ML tools to create realistic non-consensual adult content and CSAM is growing and likely to worsen without intervention by a broad array of stakeholders.

Link:

https://cyber.fsi.stanford.edu/io/news/ml-csam-report?ref=disinfodocket.com

From feeds:

Everything Online Malign Influence Newsletter » Newsletter

Tags:

newsletter

Date tagged:

06/28/2023, 14:29

Date published:

06/28/2023, 14:02