
Ofcom Investigates Telegram Over Sharing of Child Sexual Abuse Material; Company Denies Allegations
Ofcom opened its probe on Tuesday, citing evidence suggesting that CSAM is present and being shared on Telegram's platform. Under current UK law, user-to-user services operating in the country must have systems in place to prevent users from encountering CSAM and other illegal content, and to remove such material when it appears, or face substantial financial penalties.
Telegram issued a statement, asserting, "Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations]." The company expressed surprise at the investigation, suggesting it might be "part of a broader attack on online platforms that defend freedom of speech and the right to privacy."
This investigation forms part of a broader regulatory effort by Ofcom to enforce the UK's online safety requirements, which include strengthened rules requiring technology companies to address CSAM. Suzanne Cater, Ofcom's director of enforcement, said that child sexual exploitation and abuse inflict "devastating harm" and that tackling it is of the "highest priority."
The children's charity NSPCC welcomed the investigation, with Rani Govender, associate head of policy, noting that police record approximately 100 child sexual abuse image offences every day. Ofcom said its investigation into Telegram followed contact from the Canadian Centre for Child Protection. Separately, Ofcom has opened investigations into Teen Chat and Chat Avenue over potential grooming risks.
The Online Safety Act's illegal content duties, in force since March 2025, require user-to-user services to demonstrate their efforts in tackling "priority illegal content," which encompasses CSAM, terrorism, grooming, and extreme pornography. For non-compliance, Ofcom can impose fines of up to £18 million or 10% of a company's global revenue, whichever is greater.

