Children as young as 13 are able to access virtual strip clubs using the Meta Quest headset made by Facebook parent company Meta, leaving them exposed to sexual material, grooming and even rape threats.
BBC News researcher Jess Sherwood posed as a young teen on the VRChat app, which allows users of all ages to mix in a variety of settings, ranging from restaurants to pole dancing clubs.
The app has a minimum age rating of 13, but as Sherwood’s identity and age were not checked, the lack of verification raises the prospect of younger children lying about their age to gain access and being exposed to harmful content and behaviour.
While wearing the headset, the researcher was able to visit virtual-reality rooms where she saw other users’ digital 3D avatars simulating sexual acts. She was approached by a number of adult men and was also shown sex toys and condoms.
Some users spoke to Sherwood about “erotic role play”, while one man told her the app allowed avatars to “get naked and do unspeakable things”.
One user attempted to perform sexual acts on her avatar, and another said he would have sex with her, and then “rape your little sister”.
Sherwood said: “I was surprised how totally immersed in the spaces you are. I started to feel like a child again. So when grown men were asking why I wasn’t in school and encouraging me to engage in VR sex acts, it felt all the more disturbing.”
She continued: “Everything about the rooms feels unnerving. There are characters simulating sex acts on the floor in big groups, speaking to one another like children play-acting at being adult couples.
“It’s very uncomfortable, and your options are to stay and watch, move on to another room where you might see something similar, or join in – which, on many occasions, I was instructed to do.”
‘Incredibly harmful experiences’
Following the investigation, Andy Burrows, head of child safety online policy at the NSPCC, said: “It’s children being exposed to entirely inappropriate, really incredibly harmful experiences,” adding that changes needed to be made as a matter of urgency.
He continued: “This is a product that is dangerous by design, because of oversight and neglect. We are seeing products rolled out without any suggestion that safety has been considered.”
In a statement, Meta, which did not create the VRChat app but allows it to be downloaded from its app store, said players have the ability to block other users, and that it would attempt to make safety improvements “as it learns how people interact in these spaces”.