AI ‘nudification’ apps to be made illegal in move to protect children online

The Government has announced it will ban ‘nudification’ apps, as part of its strategy to tackle violence against women and girls.

While it is already a criminal offence to create explicit deepfake images of people without their consent, the new laws will go further by banning the AI tools that enable this, termed ‘nudification’ or ‘de-clothing’ apps.

In April, the Children’s Commissioner Dame Rachel de Souza, who has been campaigning for these new laws, stated: “The act of making such an image is rightly illegal – the technology enabling it should also be”.

Technology weaponised

Technology Secretary Liz Kendall said: “Women and girls deserve to be safe online as well as offline”.

She added: “We will not stand by while technology is weaponised to abuse, humiliate and exploit them through the creation of non-consensual sexually explicit deepfakes.”

Kendall warned that people who profit from, or enable the use of, ‘nudification’ apps “will feel the full force of the law”.

No reason to exist

Kerry Smith, Chief Executive of the Internet Watch Foundation, which works to stop child sexual abuse online, welcomed the measures, saying: “We are also glad to see concrete steps to ban these so-called nudification apps which have no reason to exist as a product”.

“Apps like this put real children at even greater risk of harm, and we see the imagery produced being harvested in some of the darkest corners of the internet.”

Also see:

Strictly winner: ‘Porn addiction began at 9 years old’

Online ‘strangulation porn’ set to be made illegal

Deepfake porn victim calls for new legislation