TikTok’s explicit search suggestions can push 13-year-olds towards pornographic content, a new report has revealed.
Campaign group Global Witness created seven accounts on the platform, registered each as belonging to a 13-year-old and turned on ‘restricted mode’, which is meant to exclude “sexually suggestive content”. Although the accounts had no search history, initial search suggestions included “hardcore pawn clips”.
The group reported that each user found pornographic material “just a small number of clicks after setting up the account”. One user found such content after clicking the search bar and selecting a suggested search.
‘Clear breach’
In every instance, the hardcore pornography had been manipulated to evade restrictions: some videos were embedded within another picture or video, and search suggestions used word replacements such as “corn” for “porn”.
Global Witness explained: “Our point isn’t just that TikTok shows pornographic content to minors. It is that TikTok’s search algorithms actively push minors towards pornographic content. In other words, what we find here is not just a problem with content moderation, but also a problem with algorithmic content recommendation.”
“The Online Safety Act requires TikTok to protect minors from pornographic content – in this case, the platform wasn’t just showing such content to a minor, but actively directing them to it when the account user had zero previous search or watch history.”
Media lawyer Mark Stephens CBE added: “In my view these findings represent a clear breach of the Online Safety Act. It’s now on Ofcom to investigate and act swiftly to make sure this new legislation does what it was designed to do.”
Online Safety Act
Following the investigation, TikTok said it had taken action on more than 90 pieces of content and removed some of the search suggestions highlighted by the group.
Since July, social media platforms have been required to block children from accessing ‘harmful content’, such as pornography and the promotion of self-harm, or face hefty fines.
Under Ofcom’s Protection of Children Codes, user-to-user services must implement “highly effective age assurance” measures to identify under-18s. Such checks could involve facial age estimation or ID.
Ofcom, appointed by the Government to enforce the Online Safety Act, has the power to fine companies in breach of their duties up to £18 million or 10 per cent of their qualifying worldwide revenue, “whichever is greater”. In “extreme cases”, a court could block a website or app in the UK.