Ofcom Launches Investigation into Telegram
The UK media regulator, Ofcom, has initiated an investigation into the messaging service Telegram amid concerns that the platform may be failing to prevent the sharing of child sexual abuse material (CSAM).
On Tuesday, Ofcom announced it was examining evidence indicating that CSAM was present and being distributed on Telegram.
Under current UK legislation, user-to-user services operating within the country are required to implement systems designed to prevent users from encountering CSAM and other illegal content. These services must also have mechanisms to address such content, or they risk substantial fines for non-compliance.
Telegram responded to the investigation with a statement categorically denying Ofcom's allegations.
"Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations]," the company told the BBC.
"We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy."
This inquiry is part of a broader enforcement effort by Ofcom targeting services suspected of violating the UK's comprehensive online safety regulations. These include enhanced rules obligating technology firms to combat CSAM, which is illegal to possess or share in the UK.
"Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities,"said Suzanne Cater, director of enforcement at Ofcom.
She added that while progress has been made in addressing CSAM on smaller platforms, including file-hosting and sharing services, the problem "extends to big platforms too."
The children's charity NSPCC has welcomed Ofcom's investigation into Telegram.
"Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day,"said Rani Govender, associate head of policy at NSPCC.
"The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram."

Broader Enforcement Actions by Ofcom
Ofcom stated that its investigation into Telegram was prompted after the Canadian Centre for Child Protection alerted the regulator to the alleged presence and sharing of CSAM on the messaging app.
In addition to Telegram, Ofcom has commenced investigations into the services Teen Chat and Chat Avenue due to potential grooming risks identified through collaboration with child protection agencies.
"Teen-focused chat services are too easily being used by predators to groom children,"Cater said.
"These firms must do more to protect children, or face serious consequences under the Online Safety Act."
The Online Safety Act's illegal content duties, effective from March 2025, require user-to-user services such as messaging apps and social networks to demonstrate active measures against "priority illegal content," including CSAM, terrorism, grooming, and extreme pornography.
Ofcom has previously issued fines to providers found in breach of their duties related to illegal content or age verification.
The regulator has the authority to impose fines of up to £18 million or 10% of a company's global revenue, whichever is greater, for non-compliance.
However, some firms have resisted Ofcom's rules and enforcement actions. For example, the US message-board 4chan recently mocked the regulator's threats of penalties with hamster memes.
Despite this, Ofcom reported on Tuesday that one file-sharing service it contacted regarding concerns about its illegal content controls had made "material improvements" to comply with its obligations.