Stop the Nonconsensual Use of Nude Images in Research

Published: 26 Sept 2025, Last Modified: 29 Oct 2025, NeurIPS 2025 Position Paper Track (Oral), CC BY-NC 4.0
Keywords: nudity detection, image-based sexual abuse, research ethics, safety, dataset
TL;DR: We conducted a systematic review of computer science papers that collect or use nude images and found widespread nonconsensual collection, as well as concerning practices such as the intentional collection of abuse content and the nonconsensual distribution of nude images.
Abstract: In order to train, test, and evaluate nudity detection models, machine learning researchers typically rely on nude images scraped from the Internet. Our research finds that this content is collected and, in some cases, subsequently \emph{distributed} by researchers without consent, leading to potential misuse and exacerbating harm against the subjects depicted. \textbf{This position paper argues that the distribution of nonconsensually collected nude images by researchers perpetuates image-based sexual abuse and that the machine learning community should stop the nonconsensual use of nude images in research.} To characterize the scope and nature of this problem, we conducted a systematic review of papers published in computing venues that collect and use nude images. Our results paint a grim reality: norms around the usage of nude images are sparse, leading to a litany of problematic practices like distributing and publishing nude images with uncensored faces, and intentionally collecting and sharing abusive content. We conclude with a call-to-action for publishing venues and a vision for research in nudity detection that balances user agency with concrete research objectives.
Lay Summary: AI models are used to detect whether an image contains nudity. To build and test these nudity detection models, computer scientists use nude images found online. But just because a nude photo is online does not mean that the person in the photo posted it, or even knew it was being taken. Nonconsensual posting of sexual images of children and adults is a widespread issue, called image-based sexual abuse. Even if the person in the photo posted it online themselves, they did not consent to their photo being used in research. Our study shows widespread collection of nude images without consent in machine learning research. Particularly concerning, we find multiple cases of known abuse material of children and adults being collected intentionally. Furthermore, we find several cases in which researchers may have engaged in image-based sexual abuse: they published identifiable nude images of people without their consent or knowledge. We urge researchers and publishers to treat nude data with care. Our paper offers initial guidance on how to do so.
Submission Number: 578