The Ofcom investigation into major online platforms has widened as the UK regulator examines whether services such as Telegram, Teen Chat, and Chat Avenue are doing enough to prevent child sexual abuse and online grooming. The action comes under the Online Safety Act, which requires platforms to assess and reduce risks related to illegal content, including child sexual abuse material (CSAM).
The UK’s communications watchdog said the investigation was launched after it received evidence suggesting that harmful content and predatory behavior may be occurring across these platforms, raising serious concerns about user safety, especially for children.
Ofcom Investigation Into Telegram Over CSAM Risks
A key part of the Ofcom investigation focuses on Telegram and its potential exposure to child sexual abuse material. Authorities confirmed they received intelligence from the Canadian Centre for Child Protection indicating the suspected presence and sharing of CSAM on the platform.
Following this, Ofcom conducted its own assessment and decided to formally investigate whether Telegram has failed to meet its legal obligations under the Online Safety Act. In the UK, both the possession and distribution of such material are criminal offenses, placing significant responsibility on platforms to actively detect and remove it.
Regulators stated that platforms offering user-to-user communication must implement systems to identify and mitigate risks. The Ofcom investigation will assess whether Telegram has adequate safeguards in place or if gaps in enforcement have allowed illegal content to circulate.
Teen Chat Platforms Under Scrutiny for Grooming Risks
The Ofcom investigation also extends to Teen Chat and Chat Avenue, which are being examined for their potential role in enabling online grooming. These platforms offer features such as open chatrooms, private messaging, and media sharing, which regulators say can be misused by predators.

Online grooming can involve coercing minors into sharing explicit content, engaging in sexual conversations, or arranging offline meetings. Ofcom said it has been working with child protection agencies to identify services where such risks are higher.
Despite prior engagement with the companies, the regulator said it remains unconvinced that sufficient protections are in place. The Ofcom investigation will determine whether these platforms are properly assessing risks and taking steps to prevent children from being exposed to harmful or illegal activity. In the case of Chat Avenue, the probe will also examine whether adequate safeguards exist to block minors from accessing explicit content.
File-Sharing Platforms Show Mixed Progress
Alongside messaging and chat services, Ofcom has reviewed file-sharing platforms, which have historically been used to distribute CSAM. The regulator noted some progress in this area.
For instance, Pixeldrain has implemented perceptual hash-matching technology, allowing automated detection and removal of known abusive content. This came after Ofcom raised concerns about the platform’s initial lack of safeguards.
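Perceptual hash-matching works by reducing each uploaded file to a compact fingerprint that survives minor edits such as resizing or recompression, then comparing that fingerprint against a database of hashes derived from known abusive images. The sketch below is only a minimal illustration of the general idea, using a simple difference hash and a Hamming-distance comparison; the function names and threshold are illustrative assumptions, not Pixeldrain's or Ofcom's actual system, and real deployments rely on far more robust, proprietary tools and hash lists supplied through child-protection bodies.

```python
# Illustrative sketch of perceptual hash-matching (not a real detection system).
# Requires Pillow (pip install Pillow).
from PIL import Image


def dhash(path: str, size: int = 8) -> int:
    """Compute a 64-bit difference hash: grayscale, shrink, compare adjacent pixels."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_known_hash(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag an upload if its hash is within `threshold` bits of any known hash."""
    h = dhash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

The useful property is that near-duplicate images produce hashes differing in only a few bits, which is what allows previously identified material to be detected automatically even after minor alterations.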
Another service, Yolobit, has restricted access to users in the UK, leading Ofcom to close its investigation. Several other file-sharing providers have taken similar steps, either blocking UK access or deploying detection technologies following enforcement action.
These developments suggest that regulatory pressure is pushing some platforms to improve, though the Ofcom investigation indicates that broader risks remain across different types of online services.
Enforcement Powers and Next Steps
Under the Online Safety Act, Ofcom investigations follow a structured process. The regulator gathers and analyzes evidence before determining whether a platform has breached its legal duties, and companies are given a chance to respond before any final decision is made.
If violations are confirmed, Ofcom has the authority to impose strict penalties, including fines of up to £18 million or 10 percent of global annual revenue, whichever is greater. In more serious cases, courts can order business disruption measures, such as requiring internet providers to block access to a platform in the UK or cutting off its payment and advertising services.
Suzanne Cater, Director of Enforcement at Ofcom, emphasized that tackling child exploitation remains a top priority. She noted that while some progress has been made, especially among file-sharing services, risks persist across larger platforms and youth-focused chat services.
Growing Pressure on Platforms to Comply
The Ofcom investigation highlights increasing regulatory scrutiny on online platforms operating in the UK. Under the Online Safety Act, any service accessible to UK users must comply with local laws, regardless of where the company is based.
With investigations now underway across messaging apps, chat platforms, and file-sharing services, the regulator is signaling that failure to protect users, particularly children, will carry serious consequences.
As the Ofcom investigation continues, further updates are expected on whether these platforms will face enforcement action or be required to strengthen their safety measures.

