In a troubling turn of events, over 1,000 employees in Kenya have lost their jobs at Sama, an outsourcing firm that provided content moderation and AI training services for Meta. The abrupt dismissals follow Meta’s decision to end its partnership with Sama amid allegations that workers were exposed to private and inappropriate content captured by Meta’s AI smart glasses. The situation exposes the precarious nature of tech employment, particularly in developing regions.
Mass Layoffs Shake Kenyan Workforce
On Thursday, Sama, headquartered in Nairobi, announced the layoffs, which have ignited outrage amongst activists advocating for workers’ rights. The affected individuals, primarily low-wage earners involved in data annotation and AI training, were given a mere six days’ notice before their terminations. The Oversight Lab, an organisation dedicated to promoting equitable technology practices across Africa, confirmed the layoffs and is currently advising the former employees on potential legal recourse.
This is not the first instance of large-scale job losses at Sama. In 2024, a lawsuit revealed that numerous content moderators had suffered severe mental health issues, including PTSD and anxiety, stemming from their exposure to disturbing online material. The Oversight Lab has described the latest layoffs as “devastating”, pointing to the broader implications of such corporate actions for Kenya’s youth and economy and suggesting that current strategies are detrimental to the country’s participation in the AI sector.
Allegations of Inappropriate Content Exposure
The controversy emerged after reports surfaced that certain workers were instructed to view content captured by Meta’s Ray-Ban smart glasses, which included private moments involving users. Meta’s chief executive, Mark Zuckerberg, frequently wears the glasses himself, intensifying scrutiny of the situation. Following these revelations, Meta publicly stated, “Photos and videos are private to users,” affirming that human reviewers are essential to improving product performance, and emphasised that its decision to sever ties with Sama was due to the latter’s failure to meet the company’s standards.
In a bid to quell discontent, Sama released a statement acknowledging the significant impact of these layoffs on its workforce. The company maintained that it is a “responsible corporate citizen,” claiming to provide living wages, comprehensive benefits, and wellness resources to its employees, including on-site counselling support.
A Broader Industry Concern
The plight of the former Sama workers raises critical questions about the nature of employment in the tech industry, especially for outsourced workers in developing nations. Kauna Malgwi, a former Sama employee, highlighted the systemic issues within the global AI sector, stating, “This issue is not confined to one company or contract. It shows how the global AI industry is shaped. Power sits with large technology companies. Risk flows downward, affecting outsourced workers, often in the global south, who have the least protection and highest exposure.”
This incident follows a recent verdict from a jury in Los Angeles, which concluded that major platforms such as Meta’s Instagram and Google’s YouTube had deliberately designed their products to be addictive, leading to harmful consequences for young users.
Why it Matters
The layoffs of over 1,000 Kenyan workers underscore the fragility of employment in the tech industry, particularly for those in developing regions. As major corporations like Meta navigate challenges and controversies, the repercussions of their decisions disproportionately affect vulnerable workers. With the spotlight on the ethical implications of outsourcing and the mental health impacts of content moderation, this situation serves as a stark reminder of the urgent need for accountability and fair treatment within the global technology landscape. As discussions around AI and its implications continue to evolve, it is crucial to ensure that the voices of those most affected are heard and prioritised in the ongoing dialogue.