A new investigation claims TikTok recommends pornography and sexualised clips to children. Researchers created fake child accounts, turned on safety settings, and were still shown sexually explicit search suggestions. These led to videos ranging from simulated masturbation to explicit pornography. TikTok says it acted immediately after being alerted and insists it prioritises safe, age-appropriate use.
Fake profiles highlight hidden dangers
In July and August, researchers from Global Witness created four TikTok accounts, posing as 13-year-olds with false birth dates. The platform did not request any extra identification. The investigators enabled TikTok’s “restricted mode”, a feature the company advertises as a barrier against sexual or mature themes. Even so, the accounts received sexualised search suggestions in the “you may like” section, which led to videos showing underwear flashing, breast exposure and masturbation. At the most extreme, researchers found full pornography disguised inside harmless-looking clips.
Campaign group sounds alarm
Ava Lee from Global Witness described the results as a “huge shock”. She argued that the platform not only fails to shield minors but actively promotes harmful content to them. Global Witness usually investigates the impact of technology on democracy, human rights and climate change; the organisation first came across TikTok’s explicit material by accident during earlier research in April.
TikTok responds to findings
Researchers informed TikTok of their findings earlier this year. The company said it removed the inappropriate material and made fixes. Yet when Global Witness repeated the test weeks later, sexualised videos reappeared. TikTok insists it provides more than 50 safety features for young users and claims nine out of ten violating videos are removed before anyone views them. Following the report, TikTok said it had upgraded its search tools and deleted further harmful clips.
New laws tighten platform duties
On 25 July, the Children’s Codes of the Online Safety Act came into force. Platforms must now enforce strict age checks and prevent minors from accessing pornography, and their algorithms must filter content relating to suicide, self-harm and eating disorders. Global Witness conducted its second round of research after the rules took effect. Ava Lee called on regulators to act, stressing that children’s safety must be enforced online.
Users express confusion
During the investigation, researchers observed user reactions. Some questioned why sexualised content appeared in their feeds. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”

