Reports of AI-generated child abuse material rocketed by 1,325 per cent across the world in just one year, a new study has found.
Childlight Global Child Safety Institute’s ‘Into the Light Index 2025’ revealed that reports involving child sexual abuse material (CSAM) generated by artificial intelligence rose from 4,700 in 2023 to more than 67,000 in 2024.
Childlight’s study, which analysed data from 33 countries in Western Europe and eight in South Asia, documented the creation and sharing of child abuse photos and videos, as well as online grooming and coerced sexual activity.
‘Severe’
AI-generated CSAM is material produced by using technology to create fictionalised images of children, or to ‘nudify’ real images in order to sexualise them.
Childlight warned that such content is “increasing across all data sources that track this as a specific category of material”, while recent analysis suggests it is “often of the more severe categories, almost completely depicting female children”.
Paul Stanfield, Chief Executive Officer of Childlight, commented: “Emerging threats, like AI-generated CSAM, are being used to create new forms of harm, but AI, with adequate regulation, can also be harnessed to protect millions of children.
“With proper safeguards, AI can help remove abusive content faster than ever, ending the online re-victimisation of children whose images have circulated for years.”
Deepfakes
Earlier this year, the Children’s Commissioner called for AI-based nudification tools to be banned.
Dame Rachel de Souza warned that widely available apps are causing children to fear that “anyone – a stranger, a classmate, or even a friend – could use a smartphone as a way of manipulating them by creating a naked image”.
She added: “The online world is revolutionary and quickly evolving, but there is no positive reason for these particular apps to exist. They have no place in our society. Tools using deepfake technology to create naked images of children should not be legal and I’m calling on the government to take decisive action to ban them”.