TikTok recommends porn, other highly sexualised content to children: Report
The report says that even after activating child safety tools, test accounts designed to mimic 13-year-olds were still shown inappropriate search prompts that led to pornographic videos.

A new report by a human rights group claims TikTok's recommendation system is directing children's accounts towards pornography and other sexually explicit content.
The report by Global Witness says that even after activating child safety tools, test accounts designed to mimic 13-year-olds were still shown inappropriate search prompts that led to pornographic videos, including footage of penetrative sex, reports the BBC.
TikTok insists it is committed to safeguarding younger users and said it acted quickly once alerted to the issue.
How the research was conducted
Between late July and early August, researchers from Global Witness created four accounts using false dates of birth that identified the users as 13 years old. They enabled TikTok's "restricted mode," a setting designed to filter out mature themes, but the platform required no age verification beyond the birthday entered at sign-up.
Even though the dummy accounts performed no searches of their own, they received sexualised keyword suggestions through TikTok's "you may like" feature.
Clicking on those suggestions surfaced clips of women simulating masturbation and exposing their underwear in public, and, at the most extreme, explicit pornographic material.
Researchers found that some of the most graphic videos were embedded inside seemingly ordinary content, suggesting attempts to bypass moderation.
'Huge shock'
The findings came as a "huge shock" to researchers, Ava Lee of Global Witness said.
"TikTok isn't just failing to prevent children from accessing inappropriate content – it's suggesting it to them as soon as they create an account."
Global Witness, which focuses on tech's impact on democracy, human rights and climate change, stumbled on the problem while working on another project in April.
The organisation reported its findings to TikTok, which said it removed offending content and adjusted its search recommendation system.
However, when the experiment was repeated in July and August, the problem persisted.
TikTok maintains it uses more than 50 tools and features aimed at keeping teenagers safe and claims that 90% of guideline-violating content is removed before it is ever viewed.
The company said, "We are fully committed to providing safe and age-appropriate experiences."
Regulations
The revelations come as stricter online safety regulations are being rolled out. The UK's Online Safety Act, which took effect on 25 July, requires platforms to employ "highly effective age assurance" to block minors from viewing pornography, as well as to filter content related to self-harm, suicide or eating disorders.
Global Witness conducted its second round of testing after these new rules came into force.
Lee urged stronger oversight, saying, "Everyone agrees that we should keep children safe online… Now it's time for regulators to step in."
During their testing, researchers also observed how other users reacted to the sexualised search terms being recommended to them.
One commenter asked if someone could explain why their search suggestions looked like this, while another asked what was wrong with the app.