TikTok has announced a set of features to help users struggling with mental health issues and thoughts of suicide.
A policy director with TikTok announced the features Tuesday, explaining that one of them, a warning screen, will appear “when a user searches for terms that may bring up content that some may find distressing.” The notice will appear over the pages containing the search results, temporarily blocking the content from view until users choose to opt in by clicking “show results.”
In a press release, TikTok offered “scary makeup” as a search term that would prompt such a warning, but did not elaborate on how it would determine or categorize potentially distressing search terms. When contacted, a representative for the platform offered additional examples including “blood” and “sfx makeup” as searches that would cue the warning.
The features include well-being guides and support for people struggling with eating disorders.
There is also a search intervention feature that directs users to support resources if they look up terms such as “suicide”.
The move comes as rival platform Instagram comes under fresh scrutiny over its impact on users’ wellbeing.
In the announcement TikTok said: “We care deeply about our community, and we always look for new ways in which we can nurture their well-being.
“That’s why we’re taking additional steps to make it easier for people to find resources when they need them on TikTok.”
Last month, TikTok removed videos of people taking part in social media’s “milk crate challenge,” in which users attempted to climb up and down makeshift staircases made of unsecured milk crates. At the time, the trend had been called out as dangerous by local police and health departments across the country, and even by the FDA.
The new resources, which the company said will be rolled out globally in the coming months, include an expanded guide on eating disorders and a feature that directs users to local support, such as Crisis Text Line, if they search for terms such as “suicide”.
TikTok, like its rival social media platforms, has come under intense scrutiny over the impact it can have on the mental health of its users, especially teenagers.
TikTok’s announcement came as The Wall Street Journal reported that Instagram has repeatedly found that in certain situations its platform could be harmful to the mental health of its teenage users.
According to the report, the company has been studying the app’s impact on its younger users’ mental wellbeing for at least two years, and that research has repeatedly found the app to be harmful to a large proportion of those users, especially teenage girls.