What happened
South Carolina criminal investigators, led by Chief Criminal Investigator Kevin Atkins, have begun prosecutions for AI-generated child sexual abuse material (CSAM) under new state legislation. Between four and eight individuals have faced charges, with penalties of up to ten years' imprisonment per count. Cases include a Berkeley County man charged in February and an incident this month in which five Fort Dorchester High School girls were victimised by AI-generated nude images. The National Center for Missing and Exploited Children (NCMEC) has received more than 70,000 AI-related CSAM reports over the past two years, underscoring the material's growing prevalence.
Why it matters
The emergence of AI-generated CSAM creates new legal and technical challenges for law enforcement and platform engineers. Procurement teams must prioritise AI platforms with comprehensive safety-by-design features, as federal law requires platforms to alert authorities to obscene requests. Security architects face growing pressure to deploy detection mechanisms capable of identifying AI-manipulated content, given the more than 70,000 AI-related CSAM reports NCMEC has received in two years. The case follows a broader pattern of evolving digital threats that demand both legislative and technological adaptation.