UK Govt to criminalise ‘disgusting’ AI-generated nudes

Creating explicit deepfake images of people without their consent is to become a criminal offence.

Sharing non-consensual intimate images is already illegal under the Online Safety Act 2023, but the implementation of the Data (Use and Access) Act 2025 is set to criminalise the creation of such content as well. The Government also plans to ban dedicated ‘nudification’ tools under its Crime and Policing Bill, which is currently progressing through Parliament.

Ofcom recently launched an investigation into social media platform X after it was reported that its AI tool ‘Grok’ had been used to digitally ‘undress’ images of women and children. Although X has now blocked such activity in countries where it is illegal, Ofcom can fine X up to £18 million or 10 per cent of its qualifying worldwide revenue, “whichever is greater”, if it is found to have already breached UK law.

‘Child abuse’

Prime Minister Sir Keir Starmer cautiously welcomed X’s announcement, emphasising that “we’re not going to back down, and they must act”.

He previously told MPs: “The actions of Grok and X are absolutely disgusting and shameful. Protecting their abusive users rather than the women and children who are being abused shows a total distortion of priorities.”

Speaking in Parliament, Liz Kendall, the Secretary of State for Science, Innovation and Technology, said: “The content that has circulated on X is vile. It is not just an affront to decent society — it is illegal. The Internet Watch Foundation reports ‘criminal imagery’ of children as young as 11, including girls sexualised and topless. This is child sexual abuse.

“Lives can and have been devastated by this content, which is designed to harass, torment and violate people’s dignity. They are not harmless images; they are weapons of abuse disproportionately aimed at women and girls, and they are illegal.”

Protect women’s dignity

The Christian Institute’s Head of Communications Angus Saul welcomed the further safeguards for adults and children. “No-one should be in a situation where a stranger can find an image of them and have it turned into an indecent image.

“Women and girls have suffered, and it is a disgrace that technology companies have taken so long to act. It is also disappointing that Grok will not be prevented from generating these images, only that the feature won’t be accessible to people in this country. Determined individuals will find ways around such restrictions, and so it would be much better if the feature did not exist at all.

“And while Grok has at least made a move in the right direction, there has been no confirmation from other AI-generation tools whether they too will be preventing such content from being created. Tech companies need to act now to protect the dignity of women.

“Ultimately though, these kinds of issues will continue to arise if the widespread objectification of women is not eradicated from society. There is a great deal of material that is sexualised on the internet, social media, as well as in films and TV.

“Such content is often not illegal, but is widely available for young people to find and view, and only serves to present the warped idea that all consensual sexual activity is good, and it obscures God’s good design for relationships: namely that while sex is a good gift, its only proper place is within the lifelong marriage of one man and one woman.”

Cyberflashing

Last week, the Government strengthened laws to ensure that social media users are not exposed to cyberflashing.

Sending such images became illegal under the Online Safety Act, but social media platforms are now required to take proactive steps to ensure that users do not encounter them. This could involve automated systems that block such content before it is uploaded.

It has been estimated that around one in three teenage girls has received unsolicited sexual images.

Also see:

AI ‘nudification’ apps to be made illegal in move to protect children online

Strictly winner: ‘Porn addiction began at 9 years old’

TikTok blasted for suggesting hardcore porn to 13-year-olds