They also turned on the platform’s “restricted mode”, which TikTok says prevents users from seeing “mature or complex themes, such as… sexually suggestive content”.
Without doing any searches themselves, investigators found overtly sexualised search terms being recommended in the “you may like” section of the app.
Those search terms led to content of women simulating masturbation.
Other videos showed women flashing their underwear in public places or exposing their breasts.
At its most extreme, the content included explicit pornographic films of penetrative sex.
These videos were embedded within otherwise innocuous content, apparently in an effort to evade content moderation.
Ava Lee from Global Witness said the findings came as a “huge shock” to researchers.
“TikTok isn’t just failing to prevent children from accessing inappropriate content – it’s suggesting it to them as soon as they create an account,” she said.
Global Witness is a campaign group that usually investigates how big tech affects discussions about human rights, democracy and climate change.
Its researchers stumbled across the problem while conducting other research in April this year.
