Risks to children playing Roblox 'deeply disturbing', researchers say
The gaming platform has been accused of a 'troubling disconnect between child-friendly appearance and reality'

New research has raised concerns about how easily children can access inappropriate content and interact unsupervised with strangers on the popular gaming platform Roblox.
These concerns come as parents voice growing fears about children experiencing addiction, seeing traumatising content and being approached by strangers on the hugely popular website and app, reports The Guardian.
Roblox has acknowledged that children may encounter harmful content and "bad actors" on its platform. While the company says it is making efforts to address these problems, it stresses that broader industry cooperation and government involvement are essential for real progress.
Promoting itself as "the ultimate virtual universe", Roblox hosts millions of user-created games and interactive environments, known collectively as "experiences". As of 2024, it reported over 85 million daily active users, with an estimated 40% under the age of 13.
In response to the growing concerns, the company stated it "deeply sympathises" with affected families but emphasised that "tens of millions of people have a positive, enriching, and safe experience on Roblox every day".
However, an investigation by digital behaviour specialists Revealing Reality, shared with The Guardian, paints a more troubling picture. The researchers found a "deeply disturbing" gap between the platform's child-friendly image and the real experiences of young users.
To conduct the study, researchers created several Roblox accounts registered to fictional users aged five, nine, 10, 13, and over 40. These accounts only interacted with each other to ensure controlled conditions and to prevent outside influence on avatar behaviour.
Despite the recent introduction of new parental control tools, the study concluded that existing safety features remain limited in effectiveness. Significant risks to children persist on the platform, the researchers warned.
The report found that children as young as five were able to communicate with adults while playing games on the platform, and found examples of adults and children interacting with no effective age verification. This was despite Roblox changing its settings in November 2024 so that accounts listed as belonging to under-13s can no longer directly message others outside of games or experiences, instead having access only to public broadcast messages.
The report also found that the avatar belonging to the 10-year-old account could access "highly suggestive environments". These included a hotel space where they could view a female avatar wearing fishnet stockings gyrating on a bed and other avatars lying on top of each other in sexually suggestive poses, and a public bathroom space where characters were urinating and avatars could choose fetish accessories to dress up in.
Researchers also found that their test avatars overheard conversations between other players verbalising sexual activity, as well as repeated slurping, kissing and grunting noises, when using the voice chat function. Roblox says that all voice chat – which is available to phone-verified accounts registered as belonging to users aged 13 and above – is subject to real-time AI moderation.
They further found that a test avatar registered to an adult was able to ask for the five-year-old test avatar's Snapchat details using barely coded language. Though Roblox says in-game text chat is subject to built-in filters and moderation, the report says this is an example of how easily such measures can be circumvented, creating opportunities for predatory behaviour.
Roblox said it recognised "there are bad actors on the internet" but added this was "an issue that goes beyond Roblox and needs to be addressed through collaboration with governments and an industry-wide commitment to strong safety measures across all platforms". It also acknowledged that age verification for under-13s "remains an industry challenge".
Stories shared by parents following a Guardian Community callout include that of a 10-year-old boy who was groomed by an adult he met on the platform, and a nine-year-old girl who started having panic attacks after seeing sexual content while gaming.
Damon De Ionno, research director of Revealing Reality, told The Guardian, "The new safety features announced by Roblox last week don't go far enough. Children can still chat with strangers not on their friends list, and with six million experiences [on the platform], often with inaccurate descriptions and ratings, how can parents be expected to moderate?"
The crossbench peer and internet safety campaigner Beeban Kidron said the research exposed the platform's "systematic failure to keep children safe", adding, "This kind of user research should be routine for a product like Roblox."
Matt Kaufman, chief safety officer at Roblox, said, "Trust and safety are at the core of everything we do. We continually evolve our policies, technologies, and moderation efforts to protect our community, especially young people. This includes investing in advanced safety tools, working closely with experts and empowering parents and caregivers with robust controls and resources.
"In 2024 alone, we added more than 40 new safety enhancements, and we remain fully committed to going further to make Roblox a safe and civil place for everyone."