A disturbing report has emerged indicating that Grok, the artificial intelligence (AI) tool owned by Elon Musk’s firm xAI, may have been used to create “sexualised and topless imagery of girls” aged between 11 and 13. The allegation comes from the Internet Watch Foundation (IWF), a UK-based charity dedicated to removing child sexual abuse material (CSAM) from the internet.
According to the IWF, its analysts discovered the “criminal imagery” on a “dark web forum” where users claimed to have used Grok to create the content. The charity has expressed grave concern about the ease and speed with which individuals can apparently generate photorealistic CSAM using such AI tools.
The IWF’s Ngaire Alexander warned that tools like Grok now risk “bringing sexual AI imagery of children into the mainstream.” While the material found would be classified as Category C under UK law (the lowest severity of criminal content), the charity noted that the users had then turned to a different AI tool to create a Category A image, the most serious category.
In response to the allegations, both X (formerly known as Twitter) and xAI have been approached for comment. This is not the first time Grok has come under scrutiny, as Ofcom, the UK’s communications regulator, had previously received reports that the AI chatbot could be used to create “sexualised images of children” and undress women without their consent.
The IWF has stated that it has received reports of such images on X, but these have not yet been assessed as meeting the legal definition of CSAM. X said in a previous statement that it takes action against illegal content, including CSAM, by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.
This latest development underscores growing concern over the potential misuse of AI technology, particularly where child protection and online safety are at stake. As the technology continues to advance, policymakers and tech companies will need to work closely together to ensure robust safeguards are in place to prevent such abuses and protect vulnerable individuals.