TikTok moderators in the UK have accused the social media giant of “oppressive and intimidating” union busting tactics after the company fired hundreds of workers just before they were due to vote on forming a union.
The moderators, who were responsible for vetting content to ensure compliance with TikTok’s rules, claim the platform is guilty of unfair dismissal and breaching trade union laws. About 400 moderators in London were let go before Christmas, with the process initiated a week before the planned union vote.
TikTok, which boasts around 30 million monthly users in Britain, has strongly denied the legal claims, describing them as “baseless”. The company says the layoffs were part of a global restructuring, as it increasingly relies on AI to automate the removal of posts that violate content rules.
However, John Chadfield, the national officer for tech workers at the Communication Workers Union, which represented around 250 of the affected moderators, says the case is about “holding TikTok to account for union busting”.
He added: “Content moderators have the most dangerous job on the internet. They are exposed to child sex abuse material, executions, war and drug use. Their job is to make sure this content doesn’t reach TikTok’s 30 million monthly users. It is high pressure and low paid. They wanted input into their workflows and more say over how they kept the platform safe. They said they were being asked to do too much with too few resources.”
According to TikTok, its increasing use of AI has reduced moderators’ exposure to graphic content by 76% in the past year. However, Ros Curling, a co-executive director of the tech justice non-profit Foxglove, which is backing the legal action, called the platform’s treatment of its content moderators “appalling”.
She said by “laying off essential safety workers they are putting the platform’s users at risk”, adding: “TikTok has made its position clear: union busting and trampling on our labour laws comes first – the safety of its users, including millions of children, and the wellbeing of its essential safety workers, comes last. We hope the employment tribunal will force them to change course.”
Michael Newman, a partner at the law firm Leigh Day, said: “This case is an important example of how individuals who band together can stand up to the might of big tech firms, and especially of how the fig leaf of AI cost savings should not be allowed to obscure vital safety concerns.”