💀 doomscrolling.ai

AI Dataset for Detecting Nudity Contained Child Sexual Abuse Images

AI Incident DB Blog · about 2 months ago

A widely used AI training dataset for nudity-detection tools was found to contain child sexual abuse material (CSAM), highlighting serious failures in dataset curation and the risk of AI systems being trained on illegal content that exploits children.

dataset contamination · CSAM · training data · child safety · AI safety failure · content moderation
