Recent technical research conducted by Amnesty International in collaboration with the Algorithmic Transparency Institute and AI Forensics highlights concerns about TikTok's 'For You' feed, which risks pushing children and young people toward harmful mental health content. The research found that after five to six hours on the platform, almost half of the videos served were mental health-related and potentially harmful, roughly ten times the volume served to accounts with no indicated interest in mental health.
Risks of the ‘Rabbit Hole’ Effect:
A faster "rabbit hole" effect emerged when researchers manually rewatched mental health-related videos suggested to "sock puppet" accounts mimicking 13-year-old users in Kenya, the Philippines, and the USA. Within 3 to 20 minutes of this manual intervention, more than half of the videos in the 'For You' feed related to mental health struggles, and a single hour included multiple recommended videos romanticizing, normalizing, or encouraging self-harm and suicide.
TikTok’s Business Model and Content Recommender System:
The reports argue that TikTok's business model, which prioritizes engagement to keep users on the platform and collect more user data, poses inherent risks to young users. The platform's content recommender system and data collection practices raise concerns about amplifying depressive and suicidal content in ways that could worsen users' existing mental health challenges.
Two Companion Reports:
Amnesty International released two companion reports: “Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation” and “I Feel Exposed: Caught in TikTok’s Surveillance Web.” These reports highlight the abuses experienced by children and young people using TikTok, linking these issues to TikTok’s recommender system and its underlying business model.
Conclusion:
TikTok's design practices are criticized as manipulative and addictive, aiming to keep users engaged for extended periods. The platform's algorithmic content recommender system is seen as exposing young people with pre-existing mental health challenges to serious risks of harm. These findings underscore the need for TikTok to address the dangers posed by its content and its business model, especially with respect to young users' mental health.