Dramatic rise in publicly downloadable deepfake image generators
beSpacific 2025-05-07
New Oxford study uncovers an explosion of easily accessible deepfake AI image-generation models intended for the creation of non-consensual, sexualised images of women. “Researchers from the Oxford Internet Institute (OII) at the University of Oxford have uncovered a dramatic rise in easily accessible AI tools specifically designed to create deepfake images of identifiable people, finding nearly 35,000 such tools available for public download on one popular, globally accessible online platform. The study, led by Will Hawkins, a doctoral student at the OII, and accepted for publication at the ACM Conference on Fairness, Accountability, and Transparency (FAccT), reveals these deepfake generators have been downloaded almost 15 million times since late 2022, primarily targeting women. The data point towards a rapid increase in AI-generated non-consensual intimate imagery (NCII). Key findings:
- Massive scale: Nearly 35,000 publicly downloadable “deepfake model variants” were identified. These are models that have been fine-tuned to produce deepfake images of identifiable people, often celebrities. Other variants depict less prominent individuals, many of them based on social media profiles. They are primarily hosted on Civitai, a popular open database of AI models.
- Widespread use: Deepfake model variants have been downloaded almost 15 million times cumulatively since November 2022. Each downloaded variant can be used to generate a limitless number of deepfake images.
- Overwhelmingly targeting women: A detailed analysis revealed 96% of the deepfake models targeted identifiable women. Targeted women ranged from globally recognised celebrities to social media users with relatively small followings. Many of the most popular deepfake models target individuals from China, Korea, Japan, the UK and the US.
- Easily created: Many deepfake model variants are created using a technique called Low-Rank Adaptation (LoRA), requiring as few as 20 images of the target individual, a consumer-grade computer, and 15 minutes of processing time (see the sketch after this list for what LoRA does).
- Intended to generate NCII: Many models carry tags such as ‘porn’, ‘sexy’ or ‘nude’, or descriptions signalling an intent to generate NCII, despite such uses violating the hosting platforms’ Terms of Service and being illegal in some countries, including the UK…”
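For context on the technique named above: LoRA is a general-purpose fine-tuning method, not something specific to deepfakes. The snippet below is a minimal, generic sketch of the core mechanism in PyTorch (an assumption on our part; the study publishes no code, and this is not an image-generation pipeline). It shows why fine-tuning is so cheap: the pretrained weight matrix stays frozen, and only a small low-rank correction is trained.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer with a trainable low-rank update.

    Rather than retraining the full weight matrix W, LoRA learns two
    small matrices A (r x in) and B (out x r), so the effective weight
    becomes W + (alpha / r) * B @ A. Only A and B receive gradients,
    which is why fine-tuning needs so little data, memory, and time.
    """

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad = False
        # A gets a small random init; B starts at zero, so the
        # low-rank update contributes nothing until training begins.
        self.lora_a = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus the scaled low-rank correction.
        return self.base(x) + self.scale * ((x @ self.lora_a.T) @ self.lora_b.T)

# For a 4096x4096 layer, full fine-tuning would update ~16.8M weights;
# a rank-8 LoRA trains only 2 * 8 * 4096 = 65,536 of them (~0.4%).
layer = LoRALinear(nn.Linear(4096, 4096))
print(sum(p.numel() for p in layer.parameters() if p.requires_grad))  # 65536
```

The tiny trainable-parameter count is the whole story behind the study’s “consumer-grade computer, 15 minutes” finding: adapting a large pretrained model no longer requires retraining it.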