

Ofcom investigating Telegram over child sexual abuse material concerns

April 21, 2026 | Source: BBC World



The popular messaging service told the BBC in a statement it "categorically denies Ofcom's accusations".

The UK media regulator has launched an investigation into Telegram over concerns it may be failing to prevent child sexual abuse material (CSAM) from being shared.

Ofcom said on Tuesday it was probing the popular messaging service after gathering evidence suggesting CSAM was present and being shared on the platform.

Under the current law, user-to-user services operating in the UK must have systems in place to prevent people from encountering CSAM and other illegal content, as well as mechanisms to tackle it, or risk huge fines for breaches.

Telegram said in a statement that it "categorically denies Ofcom's accusations".

"Since 2018, Telegram has virtually eliminated the public spread of CSAM on its platform through world-class detection algorithms and cooperation with [non-governmental organisations]," it told the BBC.

The company added: "We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy."

The investigation is part of a wider crackdown from Ofcom on services it suspects could be flouting the UK's sweeping online safety requirements, including toughened-up rules for tech firms to tackle CSAM, which is illegal to possess or share in the UK.
"Child sexual exploitation and abuse causes devastating harm to victims, and making sure sites and apps tackle this is one of our highest priorities," said Suzanne Cater, director of enforcement at Ofcom.

She added that while there had been progress in tackling CSAM on smaller services, including file-hosting and sharing platforms, the issue "extends to big platforms too".

Children's charity the NSPCC welcomed Ofcom's Telegram probe.

"Recent NSPCC research revealed around 100 child sexual abuse image offences are being recorded by police every day," said Rani Govender, its associate head of policy. "The scale of this abuse is stark and we strongly welcome Ofcom ramping up action to tackle it, including opening this investigation into Telegram."

The probe was also welcomed by the Internet Watch Foundation (IWF), which works to identify and remove CSAM online, including on Telegram.

IWF communications director Emma Hardy said the organisation shared concerns about "bad actor networks" on the platform, and "that not enough is being done to prevent known, detected, child sexual abuse imagery from being distributed".

She said that while the company had taken some steps, for these "to be truly effective, they need to do more". This, Hardy said, should see safeguards expanded across Telegram, including to chats users can protect with end-to-end encryption.

Ofcom said it launched its probe into Telegram after it was contacted by the Canadian Centre for Child Protection over the alleged presence and sharing of CSAM on the messaging app.

It said it had also begun investigations into the services Teen Chat and Chat Avenue over potential grooming risks raised through its work with child protection agencies.

"Teen-focused chat services are too easily being used by predators to groom children," Cater said.
"These firms must do more to protect children, or face serious consequences under the Online Safety Act."

The Act's illegal content duties, which took effect in March 2025, require so-called user-to-user services like messaging apps and social networks to prove they are tackling "priority illegal content". This includes CSAM, terrorism, grooming and extreme pornography.

Ofcom has issued several fines to providers accused of failing to comply with its duties for illegal content or age checks. It has the power to fine companies £18m or 10% of their global revenue, whichever is higher, where it finds non-compliance.

But its rules and enforcement actions have been met with resistance by some firms. US message board 4chan has recently mocked the regulator's threats of action and fines with hamster memes.

However, Ofcom said on Tuesday that one file-sharing service it contacted with concerns about its systems for dealing with illegal content had made "material improvements" to comply with its duties.