UK Pressures Social Media Giants to Tighten Under‑13 Age Checks
UK regulators urged platforms including Instagram, TikTok and YouTube to strengthen age‑verification systems to prevent children under 13 from accessing their services.
UK regulators have called on major social media and video‑sharing platforms to introduce stronger age‑verification measures to protect children online. In an open letter, the Information Commissioner’s Office (ICO) urged companies to improve age‑assurance systems so that younger children cannot easily access services that are not designed for them.
The request was sent to several large technology platforms including Instagram, Snapchat, TikTok, YouTube, Roblox, Facebook and X. Regulators said many platforms are not putting children’s safety at the center of product design and that existing checks are often insufficient to prevent under‑age users from creating accounts.
The move comes as the UK steps up enforcement under the Online Safety Act 2023, which introduces a duty of care for digital platforms to protect users—particularly children—from harmful or inappropriate content. The law encourages companies to deploy effective age‑assurance methods such as age verification or age‑estimation technologies.
Regulators also emphasized that platforms must comply with data‑protection rules governing children’s personal information. The ICO said it will continue working with media regulator Ofcom to ensure companies implement stronger safeguards and design features that prioritize child safety across digital services.