UK Regulators Demand Stronger Age Verification on Social Media for Under-13s

UK regulators Ofcom and ICO urge major social media platforms to adopt stronger age verification to protect under-13s, highlighting current safeguards' shortcomings and calling for robust, legally compliant measures.

4 min read
[Image: a boy in a blue hoodie leaning against a white wall, looking down at the phone in his hands (Getty Images)]


Major technology companies have been urged to implement more stringent age verification measures for users under 13 in the UK, comparable to those currently mandated for adult-oriented services.

The media regulator Ofcom and the Information Commissioner's Office (ICO) have contacted platforms including Facebook, Instagram, Snapchat, TikTok, YouTube, Roblox, and X, emphasizing the need to improve protections for younger children online.

Ofcom Chief Executive Melanie Dawes criticized current practices, stating that services are "failing to put children's safety at the heart of their products." The companies, however, have defended their existing safeguards. Google, which owns YouTube, expressed surprise at Ofcom's approach and suggested the regulator should prioritize higher-risk services instead.

Despite these defenses, both Ofcom and the ICO insist that social media firms must reinforce their efforts to prevent children under 13 from registering on their platforms.

Currently, many platforms depend on self-reported ages during sign-up processes. The ICO highlighted the limitations of this approach in an open letter to social media and video platforms:

"As self-declaration is easily circumvented, this means underage children can easily access services that have not been designed for them,"

Most social media platforms set a minimum age limit of 13. However, Ofcom research indicates that 86% of children aged 10 to 12 have their own social media profiles.

Ofcom advocates the adoption of "highly effective" age checks, a requirement currently enforced only for certain services offering over-18 content, such as pornography. Because that requirement does not yet extend to general social media, applying similar verification to platforms used by young children would depend on major technology companies adopting robust measures voluntarily.

The ICO's concern centers on the processing of young children's data. Its letter, signed by Chief Executive Paul Arnold, states:

"Where services have set a minimum age - such as 13 - they generally have no lawful basis for processing the personal data of children under that age on their service,"

Technology Secretary Liz Kendall affirmed that no platform would receive leniency when it comes to protecting children:

"No company should need a court order to act responsibly to protect children," she said, adding that Ofcom has her full backing in holding platforms accountable.

Responses from Technology Companies

YouTube stated it was surprised by Ofcom's "move away from a risk-based approach, particularly given that we routinely update them and other regulators on our industry-leading work on youth safety." The company urged regulators to focus on high-risk services failing to comply with the Online Safety Act.

Meta, owner of Facebook and Instagram, indicated that many of Ofcom's recommendations are already implemented, including the use of artificial intelligence to estimate users' ages based on activity and facial age estimation technology. Meta also noted that implementing age verification for app stores could streamline the process, allowing parents and teens to provide personal information only once.

Snapchat reported it is currently testing age verification tools.

TikTok described its use of "enhanced technologies" to detect and remove underage accounts. The company also claimed to be the only major platform to transparently publish the number of suspected under-13 accounts it removes, reporting over 90 million suspected under-13 accounts removed between October 2024 and September 2025.

Roblox highlighted additional protections for under-13 users and noted the release of 140 new safety features in the past year, including "the introduction of new mandatory age checks that all players must complete in order to access chat features." A spokesperson said the company looked forward to demonstrating its efforts in ongoing discussions with Ofcom.

X was contacted for comment but did not respond.

Expert Opinions on Regulatory Measures

Professor Amy Orben, a digital mental health expert at Cambridge University, welcomed the regulators' actions but emphasized that this should be the beginning of stronger regulation:

"Safety must be built into products by design rather than treated as an afterthought, with regulators showing more strength in holding companies to account,"

Social media analyst Matt Navarra pointed out that the "real risk" lies in algorithms and recommendation systems, another area Ofcom identified as needing attention:

"Knowing a user is a child is step one, but designing a platform that doesn't exploit their attention is the next step - and that step is actually much harder,"

Additional reporting by Chris Vallance.


This article was sourced from the BBC.
