TikTok Under Fire for Pushing Sexual Content to Kids

TikTok’s powerful algorithm is once again under scrutiny after a human rights watchdog revealed it recommended sexual and pornographic material to accounts posing as children. Researchers from Global Witness, a campaign group that investigates the role of big tech in human rights and democracy, created four TikTok profiles in July and August, all pretending to be 13-year-olds.

Despite enabling the platform’s “restricted mode”—designed to block mature themes such as sexual content—the researchers quickly stumbled across troubling suggestions. Without the researchers performing any searches, the “children’s” accounts were shown explicit search terms in the app’s “you may like” section.

Hidden in Plain Sight

The suggested terms led to videos of women exposing underwear in public places, flashing their breasts, and in more extreme cases, engaging in intimate activity. Many of these clips were embedded within seemingly harmless content, making them harder to detect and remove.

Ava Lee of Global Witness called the findings a “huge shock,” adding: “TikTok isn’t just failing to block harmful material—it’s actively recommending it to children as soon as they sign up.”

TikTok’s Response

TikTok insists it offers more than 50 safety features aimed at protecting teenagers, claiming it removes nine out of ten violating videos before anyone sees them. After Global Witness first reported the problem in an April investigation, the company said it had taken steps to fix it. Yet when the watchdog repeated the test in late summer, the same problems resurfaced.

In a statement, TikTok said it had acted swiftly to remove inappropriate material and improve its recommendation system.

Regulatory Pressure Mounts

The revelations come as stricter online safety laws take effect. On July 25, new “Children’s Codes” under the Online Safety Act became legally binding, requiring platforms to use robust age verification and block harmful content—including pornography, self-harm, and eating disorder material.

Global Witness argues the timing makes the findings even more urgent. “Everyone agrees kids deserve protection online,” Lee stressed. “Now it’s time for regulators to step in.”

Meanwhile, frustrated TikTok users have voiced concerns of their own. One comment read: “Can someone explain what’s going on with these search suggestions, please?” Another asked bluntly: “What’s wrong with this app?”
