AI-made videos of child sex abuse rocket online

The number of videos depicting child sexual abuse online has risen dramatically due to a proliferation of AI-generated content, a charity has warned.

Between 1 January and 30 June of this year, the Internet Watch Foundation (IWF) discovered 1,286 AI-produced films of child sexual abuse compared to just two over the same period last year.

The IWF also revealed that confirmed reports of AI-generated sexualised images of children increased by 400 per cent in the first six months of 2025 compared with the same period in 2024.

AI regulation

IWF interim Chief Executive Derek Ray-Hill warned that “the way this technology is evolving, it is inevitable we are moving towards a time when criminals can create full, feature-length synthetic child sexual abuse films”.

He added: “A UK regulatory framework for AI is urgently needed to prevent AI technology from being exploited to create child sexual abuse material.”

Responding to the data, Dame Chi Onwurah MP, Chair of the Science, Innovation and Technology Committee, said: “We must act now to ensure safety-by-design is not an afterthought, but a foundational principle in the development of emerging technologies”.

In May, England’s Children’s Commissioner, Dame Rachel de Souza, called on the UK Government to ban tools “using deepfake technology to create naked images of children”.

Online safety

Social media platforms will soon be required to block children from accessing “harmful content”, such as pornography and the promotion of self-harm, or face hefty fines.

From 25 July, under Ofcom’s new Protection of Children Codes, user-to-user services must implement “highly effective age assurance” measures to identify under-18s. Such checks could involve facial age estimation or ID.

Ofcom, appointed by the Government to enforce the Online Safety Act, has power to fine companies in breach of their duties up to £18 million or 10 per cent of their qualifying worldwide revenue, “whichever is greater”. In “extreme cases”, a court could block a website or app in the UK.

Earlier this year, Fenix International Limited, which runs OnlyFans, was fined just over £1 million by Ofcom for failing to keep children safe from pornography.

Also see:

Teaching union: ‘Smartphones give kids unfettered access to porn’

UK Govt urged to ‘stem alarming tide’ of deepfake pornography

Labour think tank backs ban on AI ‘nudification’ software