TikTok starts recommending content tied to eating disorders and self-harm to 13-year-olds within 30 minutes of their joining the platform, according to a report by a UK-based non-profit called the Center for Countering Digital Hate.

TikTok tailors a stream of short videos to each user based on their interests, view times and the accounts they follow. In some cases, the report found, the harmful recommendations surfaced in as little as three minutes.

To test the app, the researchers set up eight accounts in August in the United States, Britain, Australia and Canada, posing as 13-year-olds, the minimum age for TikTok users. Once those accounts viewed and liked content about body image and mental health, TikTok automatically recommended related videos to them every 39 seconds.

“The pathways into extreme content were so innocuous,” Imran Ahmed, chief executive of the Center for Countering Digital Hate, said in an interview. “Your eye might be caught by a video of an aspirational body in beautiful clothes and very quickly the algorithm realizes you’re interested in body image.”

Some of the test accounts saw videos promoting “junkorexia,” a slang term for people with anorexia who eat only junk food, and others from users talking about suicide or featuring razor blades. The researchers found that many videos promoted eating disorders through hashtags using code words in an effort to avoid moderation and that harmful videos sat alongside more positive ones about recovery. For example, people have used #EdSheeranDisorder to tag posts about eating disorders while appearing to talk about the pop singer.

TikTok pushed back on the findings of the report.

“This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people,” said Mahsau Cullinane, a spokeswoman for TikTok. “We regularly consult with health experts, remove violations of our policies and provide access to supportive resources for anyone in need.”

Ms. Cullinane added that the company’s aim was to build a service for all people, including “people who choose to share their recovery journeys or educate others on these important topics.”

Concerns have been growing about the content that TikTok serves to teenagers. A Wall Street Journal investigation in 2021 found that teenagers were inundated with dangerous weight-loss videos, including “tips” on how to consume fewer than 300 calories a day. And the news program 60 Minutes recently reported that young users of Douyin, ByteDance’s version of TikTok available in China, are served educational and patriotic content and are limited to 40 minutes on the app per day.

TikTok is not the only social media platform facing scrutiny over its influence on young people. This year, a coroner in England ruled that Instagram and other social media platforms contributed to the 2017 suicide of Molly Russell, a 14-year-old girl. Last year, documents leaked by Frances Haugen, a former employee of Meta, which owns Instagram, detailed internal research suggesting that Instagram worsened body image issues for some teenage girls. Instagram has also struggled to rein in content promoting disordered eating.

Mr. Ahmed said TikTok needed more oversight from lawmakers, greater awareness from parents and transparency about how its algorithms work. The Center for Countering Digital Hate created a parents’ guide to the platform with Ian Russell, Molly Russell’s father, who oversees a foundation set up in her name.

Source: https://www.nytimes.com/2022/12/14/business/tiktok-safety-teens-eating-disorders-self-harm.html