Ofcom Launches Investigation into Telegram for Potential Child Abuse Material Violations

Alex Turner, Technology Editor

In a significant move to enhance online safety, Ofcom has opened an investigation into the messaging app Telegram amid concerns about the sharing of child sexual abuse material (CSAM) on its platform. The inquiry follows evidence suggesting the app may not be adequately preventing the distribution of harmful content, raising questions about its compliance with stringent UK regulations.

Telegram Under Scrutiny

On Tuesday, Ofcom announced a formal probe into Telegram following allegations that CSAM was being circulated among its users. The UK's communications regulator is taking a hard line to ensure that user-to-user services meet their legal obligations to protect people from encountering illegal content. Under current legislation, platforms like Telegram must operate robust systems for identifying and removing CSAM or risk substantial fines for non-compliance.

Telegram has denied any wrongdoing. In a statement, the company said it has made significant strides in combating the spread of CSAM since 2018. "We have virtually eliminated the public dissemination of CSAM on our platform through advanced detection algorithms and collaboration with non-governmental organisations," Telegram said. The firm expressed surprise at the investigation, suggesting it could be part of a broader narrative targeting online platforms that prioritise freedom of speech and privacy rights.

A Wider Crackdown on Online Safety

This investigation forms part of Ofcom’s broader efforts to enforce the UK’s rigorous online safety framework, which includes stringent rules aimed at tech companies to combat CSAM. Suzanne Cater, Ofcom’s director of enforcement, reiterated the regulator’s commitment to addressing child exploitation and abuse, stating, “Child sexual exploitation and abuse causes devastating harm to victims, and ensuring that sites and apps address this issue is one of our highest priorities.”

The inquiry into Telegram follows outreach from the Canadian Centre for Child Protection, which raised alarms about the presence of CSAM on the platform. Ofcom is also probing other platforms, including Teen Chat and Chat Avenue, due to concerns regarding their susceptibility to grooming.

Under the Online Safety Act, whose illegal content duties came into force in March 2025, user-to-user services like Telegram must demonstrate concrete measures against "priority illegal content", which encompasses CSAM, grooming, and other serious offences. Failure to comply can result in fines of up to £18 million or 10% of a company's global revenue, whichever is greater.

While Ofcom has previously issued fines to service providers for failing to meet these obligations, some companies have responded with defiance. For instance, the US message board 4chan has openly mocked Ofcom’s regulatory efforts, demonstrating the challenges the regulator faces in enforcing compliance.

Support from Child Protection Advocates

The investigation has garnered support from child welfare organisations. Rani Govender, associate head of policy at the NSPCC, welcomed Ofcom’s actions. “Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded

Alex Turner has covered the technology industry for over a decade, specializing in artificial intelligence, cybersecurity, and Big Tech regulation. A former software engineer turned journalist, he brings technical depth to his reporting and has broken major stories on data privacy and platform accountability. His work has been cited by parliamentary committees and featured in documentaries on digital rights.

© 2026 The Update Desk. All rights reserved.