The regulators accused tech platforms of ‘failing to put children’s safety at the heart of their products’
Nicole Wootton-Cane Thursday 12 March 2026 00:11 GMT
Protesters accuse Meta of fuelling youth mental health crisis
Two UK watchdogs have issued an urgent plea to social media companies to strengthen their age-checking processes, saying they must “act now” to keep children safe online.
In an open letter, the Information Commissioner’s Office (ICO) and Ofcom, which oversee data and communications regulation in the UK, accused tech platforms including Meta, Snap and TikTok of “failing to put children’s safety at the heart of their products”.
It comes after protesters gathered at Meta’s London headquarters on Wednesday to accuse the social media giant of designing “addictive” and “dangerous” algorithms that damaged their mental health as teens. In the US the platform is facing legal action over similar claims.
Tech companies must do more to block children from using social media platforms, the regulators said (AFP/Getty)
The letter, sent to Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube, asks the platforms to explain what actions they are taking on age checks and grooming protections before the end of April.
The regulators warn children can “easily bypass” existing age barriers, saying there is “no excuse” for not having effective age gates in place. ICO chief executive Paul Arnold said most services use self-declaration to identify whether children are 13 or over and that this method can be easily bypassed and is therefore ineffective.
“With ever-growing public concern, the status quo is not working and industry must do more to protect children,” he added.
Tech firms are coming under increasing pressure to strengthen protections for children online, with critics warning they are too often leaving young people “open to harm and exploitation”.
A recent study found British children are potentially being exposed to guns, self-harm, misogyny and sex within minutes of creating social media profiles, while others have said they believed social media’s “addictive” design played a role in their child’s suicide.
Ofcom said that without proper protections such as strict age checks, children are being “routinely exposed to risks”.
A recent study found British children are potentially being exposed to guns, self-harm, misogyny and sex within minutes of creating social media profiles (PA)
Ofcom’s chief executive, Dame Melanie Dawes, said: “These online services are household names, but they’re failing to put children’s safety at the heart of their products. There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.
“Without the right protections, like effective age checks, children have been routinely exposed to risks they didn’t choose, on services they can’t realistically avoid. That must now change quickly, or Ofcom will act.”
Ofcom said it will publicly report in May on the responses from the platforms it has contacted, and alongside this will publish new research on how much or how little children’s online experiences have changed during the first year of the Online Safety Act being in force.
The regulator said if it is not satisfied with the platforms’ responses, “we will be ready to take enforcement action” and could consider strengthening the regulatory requirements under existing industry codes “to ensure further change”.
In February, the ICO fined Reddit more than £14m over claims it failed to protect the children who use its platform. The watchdog’s investigation found that Reddit had failed to check the age of users on its platform, which put children at risk.
The government has launched a consultation into whether to ban social media for under 16s, but on Monday rejected a vote to bar them completely. MPs instead supported a government bid to give additional, more flexible powers to ministers following the consultation.
A Roblox spokesperson said: “Roblox is deeply committed to safety, and we are in regular dialogue with Ofcom about how we protect our community of players. In the past year alone, we have launched more than 140 new safety features, including the introduction of new mandatory age checks that all players must complete in order to access chat features on Roblox. These age checks are designed to limit communication between adults and kids, ensuring players can chat only with others of a similar age – addressing a key concern highlighted by Ofcom today.
“Age checks are just one in a suite of safeguards on Roblox, including age-appropriate content restrictions, chat monitoring and 24/7 content moderation. While no system is ever perfect, we continue to strengthen protections designed to keep players safe and look forward to demonstrating our efforts in our ongoing dialogue with Ofcom.”
Meta, Snap, TikTok, and YouTube have been contacted for comment.