Ofcom, the UK’s communications regulator, has launched an investigation into Elon Musk’s social media platform X (formerly known as Twitter) over concerns that its AI tool Grok is being used to create sexualised images without consent.
The watchdog stated that it has received “deeply concerning reports” of the chatbot being used to generate and share undressed images of people, as well as “sexualised images of children.” If X is found to have broken the law, Ofcom has the power to issue a fine of up to 10% of its worldwide revenue or £18 million, whichever is greater.
X has referred the BBC to a statement posted by its Safety account in early January, which warned that “anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.” Musk himself later responded to a post questioning why other AI platforms were not being scrutinised, stating that the UK government was seeking “any excuse for censorship.”
The investigation will examine whether X failed to remove illegal content quickly enough and whether it took “appropriate steps” to prevent people in the UK from accessing it. This includes “non-consensual intimate images” and child sexual abuse imagery. Ofcom will also check whether X has implemented “highly effective age assurance” measures to stop children from viewing pornographic images.
The decision follows a global backlash over Grok’s image creation feature, with both Malaysia and Indonesia temporarily blocking access to the tool over the weekend. An Ofcom spokesperson said the investigation would be a “matter of the highest priority,” emphasising that “platforms must protect people in the UK from content that’s illegal in the UK.”
Legal experts have weighed in on the potential implications of the investigation. Lorna Woods, a professor of internet law at Essex University, said it was “hard to predict” how quickly the investigation would progress, noting that Ofcom has the option to apply for a business disruption order to block access to X in the UK, though this would be a “rare circumstance” in response to an ongoing problem.
Meanwhile, Clare McGlynn, a law professor at Durham University, argued that the debate around whether X might be blocked in the UK was a “distraction,” stating that “women and girls need action and changes on the ground so that Grok does not produce illegal intimate images and women can get their non-consensual images removed.”