In the heart of Jharkhand, India, a group of resilient women are quietly bearing the weight of an invisible yet crucial role in shaping global artificial intelligence. A new documentary, *Humans in the Loop*, casts a spotlight on the harrowing experiences of female data workers who sift through disturbing content to help train AI systems. Their stories reveal a chilling intersection of technology and trauma, challenging us to reconsider the human cost of our digital age.
The Daily Grind of Content Moderation
On a sun-drenched veranda, 26-year-old Monsumi Murmu juggles the demands of family life with a job most would find unimaginable. From her mud-walled home, in one of the few spots in her village with reliable mobile reception, she moderates content for a major tech company. Yet while her family bustles within earshot, her screen fills with the violence and abuse she must confront daily.
Murmu’s role demands acute attention; she is tasked with reviewing up to 800 flagged videos and images each day. These assessments are critical for training algorithms to identify harmful content, yet the psychological toll is significant. “In the beginning, I couldn’t sleep,” she recalls, haunted by the imagery that seeped into her dreams. “By the end, you don’t feel disturbed – you feel blank.”
The Emotional Cost of Digital Labour
This emotional desensitisation is not unique to Murmu. Research indicates that content moderation is fraught with risks, placing it alongside some of the world’s most hazardous professions. Milagros Miceli, a sociologist leading the Data Workers’ Inquiry, states, “In terms of risk, this work belongs in the category of dangerous jobs.”
Studies have shown that those in content moderation frequently grapple with intrusive thoughts, anxiety, and sleep disruptions. A recent investigation into the experiences of Indian moderators laid bare the pervasive psychological strain, revealing that many suffer from traumatic stress even in workplaces that offer support.
Women at the Forefront of AI Training
As the demand for data annotation surges, women constitute a significant portion of this workforce, often referred to as “ghost workers.” In 2021 alone, around 70,000 people in India were engaged in data annotation, a market valued at approximately $250 million. Notably, 80% of these workers hail from rural or marginalised backgrounds, where opportunities for gainful employment are scarce.
The shift towards remote work has offered these women a rare pathway to financial independence. Many come from Dalit and Adivasi communities, for whom digital roles signify a step up from traditional agricultural or mining jobs. However, the very nature of this work can reinforce their marginalisation. As Priyam Vadaliya, an AI and data labour researcher, points out, the perceived respectability of these roles can create an unspoken expectation of gratitude, which may discourage workers from voicing concerns about their mental health.
The Ambiguity of Job Descriptions
Raina Singh, a former data worker, shares her own harrowing experience. Initially enticed by the promise of a stable income, she was unprepared for the graphic nature of her assignments. After being shifted to projects involving adult content, she found herself overwhelmed. “I can’t even count how much porn I was exposed to,” she admits. The work not only took a toll on her mental state but also affected her personal relationships, leading to a profound sense of disconnection.
Vague job descriptions often obscure the realities of these roles. Many workers are unaware of the psychological challenges they will face until they are entrenched in the job. “People are hired under ambiguous labels, but only after contracts are signed do they realise what the actual work is,” says Vadaliya.
The Silence Surrounding Mental Health
Despite the evident risks, many companies provide little to no psychological support for their workers. Out of eight data annotation firms contacted, only two acknowledged offering mental health resources. This lack of care places an additional burden on workers, many of whom may lack the language to articulate their distress.
Moreover, strict non-disclosure agreements prevent workers from discussing their experiences, compounding their sense of isolation. Murmu, concerned about her family’s perception of her job, often feels trapped. “Finding another job worries me more than the work itself,” she confesses, as she navigates the mental health impacts of her role.
Why It Matters
The stories of these women are a stark reminder of the hidden costs behind our increasingly digital world. As we continue to rely on AI systems, it is imperative to acknowledge and address the psychological toll on those who make this technology possible. The resilience of these workers deserves recognition and respect, prompting a crucial conversation about the ethics of AI and the human lives intertwined in its development. Their courage in the face of adversity not only sheds light on the darker corners of the tech industry but also calls for urgent reforms to protect and support the unseen labourers shaping our future.