Ofcom Launches Investigation into Telegram Following Canadian Alert on Child Abuse Material

Liam MacKenzie, Senior Political Correspondent (Ottawa)
5 Min Read

A tip-off from Canada alleging the sharing of child sexual abuse material on Telegram has prompted Ofcom, the UK’s online safety regulator, to open a formal investigation into the messaging platform. The development underscores the ongoing battle against online exploitation and the responsibilities of tech companies to keep their users safe.

Child Protection Concerns

The alarm was raised by the Canadian Centre for Child Protection, which informed Ofcom that Telegram was potentially being used to disseminate illegal content. Under both British and Canadian law, the possession and distribution of child sexual abuse material is strictly prohibited. The UK’s Online Safety Act mandates that providers of user-to-user services, which includes messaging applications, must actively assess and mitigate risks associated with such criminal activities.

In a statement released on Tuesday, Ofcom confirmed it had received evidence regarding the alleged sharing of child sexual abuse material on Telegram and conducted its own assessment of the platform. The regulator stated, “In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content.”

Telegram’s Response

Telegram, which boasts over one billion users, provides a platform where individuals can exchange messages, share files, and participate in both private and group voice and video calls, all with minimal content restrictions. While the app is popular among various user groups, including activists and journalists, it has also faced criticism for being exploited by criminals.

In response to Ofcom’s announcement, Remi Vaughn, a spokesperson for Telegram, categorically denied the accusations. “Telegram has virtually eliminated the public spread of CSAM [child sexual abuse material] on its platform through world-class detection algorithms and cooperation with NGOs,” Vaughn stated. He expressed concern that the investigation might represent a broader effort to undermine online platforms that champion freedom of expression and privacy rights.

Ongoing Challenges in Child Safety

The Canadian Centre for Child Protection has gained international recognition for its commitment to combating child exploitation online. Through initiatives like Project Arachnid, the Centre employs web crawlers to identify and report child abuse material across various online spaces, actively monitoring forums and chat groups that are known to be frequented by offenders.

Lloyd Richardson, the Centre’s director of technology, expressed his concerns that child exploitation has resurged on Telegram, despite numerous notifications sent to the company over the past year. “Although not directly related to the information provided to Ofcom, in the last year we have sent thousands of notifications to Telegram related to content and accounts on their service,” he remarked.

Moreover, Ofcom has indicated that it is also investigating other chat services that allow open chatrooms and private messaging, which are reportedly being used by predators to groom minors.

Regulatory Powers and Future Implications

Ofcom holds significant enforcement powers, including the ability to impose fines of up to £18 million or 10% of a company’s global revenue, whichever is greater, if a platform is found to be in breach of the law. This investigation could set a critical precedent for how tech companies are held accountable for the safety of their platforms.

In parallel, Canadian Identity Minister Marc Miller is currently consulting on an online safety act, taking cues from the UK’s legislative framework. A previous iteration of an online harms bill in Canada sought to require platforms to swiftly remove child sexual abuse material and address other serious online harms. The federal government is expected to incorporate similar measures into its forthcoming bill, which may be released as soon as June. Additionally, consultations are ongoing regarding regulations for children’s use of AI chatbots and potential age restrictions for social media use.

Why it Matters

This investigation into Telegram highlights a pivotal moment in the ongoing struggle to safeguard children from online exploitation. As regulators in both the UK and Canada seek to strengthen their frameworks for online safety, the outcome of this inquiry could influence legislative approaches worldwide. The necessity for tech companies to take robust action in preventing their platforms from being misused for criminal purposes is more urgent than ever, as is the need for protective measures that ensure the digital landscape is safe for all users, particularly the most vulnerable.

Covering federal politics and national policy from the heart of Ottawa.

© 2026 The Update Desk. All rights reserved.