Technology

Social media firms asked to toughen up age checks for under-13s

UK regulators urge Instagram, Snapchat, TikTok, YouTube, and Roblox to strengthen age checks for under-13s to prioritize children's safety.

Two children engaged with a smartphone in an indoor educational setting.

Image: GlobalBeat / 2026

UK regulators are demanding that major social media platforms enforce stricter age checks to protect children under 13 from potential online threats.

With the digital world expanding, the safety of young children online is a growing concern. Regulators have criticized platforms for not prioritizing child safety in their product design.

ONGOING CONCERNS

The UK’s Office of Communications (Ofcom) has identified a significant gap in the protection offered by social media platforms to children under 13. Instagram, Snapchat, TikTok, YouTube, and Roblox are among the platforms that have been singled out for not having age verification processes robust enough to keep under-13s out.

“We have seen a concerning failure to prioritise child safety,” Ofcom said, emphasizing the platforms’ responsibility to protect young users.

While platforms have implemented some measures, Ofcom found that most still allow under-13s to create accounts with ease. The issue grows more pressing as children engage with social media at ever younger ages.

REGULATORY PRESSURE

Ofcom’s call for stricter age verification comes as part of a broader push to ensure that social media platforms adhere to the UK’s online safety laws. The new rules, if enacted, could result in hefty fines or even bans for platforms that fail to comply.

“Platform designers often overlook the fact that under-13s are using their products,” said Ofcom’s Chief Executive, Melanie Dawes.

The regulator is particularly concerned about the potential for grooming, bullying, and exposure to inappropriate content that under-13s may face on these platforms.

TECHNICAL CHALLENGES

Implementing effective age verification systems is not without its challenges. Platforms must balance user privacy against the need for accurate age checks, a balancing act that has proved difficult even for the largest tech companies.

“Age verification is a complex issue, and while we have made progress, there is more to be done,” a spokesperson for TikTok acknowledged.

Technological solutions, such as AI-based age estimation and stricter sign-up processes, are being explored by some platforms. However, concerns about data privacy and the potential for misuse of such technologies persist.

INTERNATIONAL IMPLICATIONS

The UK’s stance on age verification for under-13s could have international implications, as other countries grapple with similar issues. The European Union, for instance, has been working on digital services regulations that include provisions for the protection of minors online.

“The UK is taking a leading role in setting standards for online safety,” said a digital policy analyst at a London-based think tank.

As the global conversation on digital safety evolves, the UK’s approach to age verification on social media platforms may serve as a model for other nations.

PARENTAL CONCERNS

Parents have expressed growing concerns about their children’s safety on social media platforms. The lack of effective age verification is seen as a significant gap that exposes children to potential harm.

“As a parent, I want to know that the platforms my child uses are doing everything they can to protect them,” said Sarah Thompson, a mother of two teenagers.

The call for stricter age verification is not just about legal compliance; it’s about rebuilding trust between platforms and the parents of their youngest users.

IMPACT

The impact of stricter age verification measures will be felt across the industry, with platforms needing to invest in new technologies and processes to comply. This could lead to changes in how social media is designed and used, particularly for younger audiences.

WHAT’S NEXT

Ofcom’s demands are a clear signal that the UK is serious about holding social media platforms accountable for protecting children online. The next steps will involve closely monitoring how these platforms respond and whether they can effectively implement the required changes.

SOURCE MATERIAL:
Instagram, Snapchat, TikTok, YouTube and Roblox are among the platforms UK regulators say aren’t putting children’s safety at the heart of their products.