Content moderators in Hyderabad, India, have spoken out about the significant strain on their mental health from reviewing harmful online content, including graphic violence and sexual material, sometimes involving trafficked children. Many social media platforms have outsourced content moderation work to countries like India and the Philippines.
Key points from the report:
Emotional Distress: Content moderators reported emotional distress, depression, and sleep difficulties as a result of their work.
Disturbing Content: One moderator recalled having to watch a particularly disturbing video of a girl being stabbed. He expressed deep distress at being made to view such material, which he said often leaves a lasting impact.
Low Pay: The average pay for moderators in this region was less than £8 per day.
Support Services: Moderators said that although they were encouraged to seek help in stressful situations, the support provided was often unhelpful or insufficient.
Disability Challenges: A moderator with mobility issues said his disability left him feeling isolated, as all communication with clients and management took place online.
Concerns About Content: Some moderators raised concerns about material such as child nudity or videos of violent and sadistic acts, but said supervisors discouraged them from flagging it.
Call for Regulation: Some moderators argued that there is a need for stricter regulation of online content, especially on livestream apps, to protect both users and the emotional well-being of content moderators.
Outsourcing to India: The shift of content moderation work to countries like India has raised concerns about the impact on the mental health of the young workforce involved.
Advocacy for Better Protection: Advocacy organizations call for better protection and support for content moderators who face traumatic content daily. They stress the importance of acknowledging the critical role these moderators play in maintaining a safer online environment.
AI Limitations: While AI is seen as a tool to assist content moderation, it may not be a complete solution, and human moderators remain necessary.
This report sheds light on the challenges faced by content moderators in countries where this work is outsourced and highlights the need for better support and protection for these workers.