TikTok Self-Harm Study is ‘Every Parent’s Nightmare,’ Say Researchers

Image: Center for Countering Digital Hate

TikTok has been found to promote videos about self-harm and eating disorders to vulnerable teens, according to a report the nonprofit Center for Countering Digital Hate (CCDH) published on Wednesday (via Associated Press News).

Social media algorithms try to identify what a user likes and can relate to, using that information to curate and serve content that might interest them. The more a user interacts with a specific type of content, the more of it they are shown.
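To make that feedback loop concrete, here is a minimal, hypothetical sketch in Python (in no way TikTok's actual recommender): each "like" boosts a topic's weight, and recommendations are drawn in proportion to those weights, so whatever a user engages with quickly crowds out everything else.

```python
import random
from collections import defaultdict

# Hypothetical illustration only -- not TikTok's real system.
# A "like" multiplies a topic's weight; recommendations are sampled
# in proportion to those weights, so liked topics soon dominate the feed.

class ToyFeed:
    def __init__(self, topics):
        # Every topic starts with the same small weight.
        self.weights = defaultdict(lambda: 1.0, {t: 1.0 for t in topics})

    def recommend(self):
        topics = list(self.weights)
        w = [self.weights[t] for t in topics]
        return random.choices(topics, weights=w, k=1)[0]

    def like(self, topic, boost=2.0):
        # Engagement increases how often this topic is served.
        self.weights[topic] *= boost


feed = ToyFeed(["cooking", "sports", "weight loss", "music"])

# Simulate a user who "likes" every weight-loss video they are shown.
for _ in range(30):
    topic = feed.recommend()
    if topic == "weight loss":
        feed.like(topic)

# After a handful of likes, most recommendations are weight-loss content.
counts = defaultdict(int)
for _ in range(100):
    counts[feed.recommend()] += 1
print(dict(counts))
```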

CCDH researchers conducted an experiment where they created TikTok accounts for fictional teenagers in Canada, the U.S., the U.K., and Australia. Using these accounts, the researchers “liked” videos about self-harm and eating disorders.

TikTok’s algorithms did their thing, and within minutes the researchers started seeing videos about losing weight and self-harm. Some of the videos they were served featured pictures of models and unrealistic body types that could fuel body image issues, as well as discussions of suicide.

The study highlights just how double-edged the algorithms used by social networks like TikTok can be. Interact with any kind of harmful content and you’ll be shown more of it, indiscriminately, with no filter that checks whether you might be vulnerable to the subject.

What’s even more alarming is that when the researchers created accounts with user names that indicated a particular vulnerability to eating disorders by including words like “lose weight,” for example, they were served even more potentially harmful content.

“It’s like being stuck in a hall of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said CCDH CEO Imran Ahmed. “It is literally pumping the most dangerous possible messages to young people.”

TikTok isn’t the only social media platform whose algorithms struggle with what they shouldn’t promote, though. Last year, a Wall Street Journal report called out Instagram for being “toxic” to teen girls and promoting body image issues.

Josh Golin, executive director of Fairplay, a nonprofit advocating for better online protections for children, said TikTok isn’t the only social network failing to protect its young and impressionable users from harmful content and aggressive data collection.

“All of these harms are linked to the business model,” said Golin. “It doesn’t make any difference what the social media platform is.”

Responding to the report, a TikTok spokesperson disputed the findings, noting that the CCDH researchers didn’t use the platform the way typical users would and that the results were therefore skewed. TikTok added that user names should not affect the kind of content the platform’s algorithms serve.

“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” TikTok said in its statement.

TikTok doesn’t allow anyone under the age of 13 to create an account, and its official guidelines prohibit videos that encourage eating disorders or suicide. What’s more, U.S. users who search for content pertaining to eating disorders receive a prompt offering mental health resources and contact information for the National Eating Disorders Association.

Facebook and Instagram recently introduced new privacy settings for teens. TikTok, meanwhile, added an automated break prompt and other digital well-being features earlier this year, but it also came under scrutiny over reports that U.S. user data had been accessed from China.

For more information, download the CCDH’s full report here.
