UK Rejects Social Media Ban for Teens, Urges Stronger Child Safety
The UK rejected a blanket social media ban for under‑16s, urging platforms to improve age‑verification systems. Regulators say tech firms must do more to protect children online.
The United Kingdom has rejected a proposal to impose a blanket ban on social media access for users under the age of 16, opting instead to pressure technology companies to strengthen online child safety measures. Lawmakers voted against the proposal, which would have barred teenagers from major social media platforms.
Following the decision, the UK’s data protection watchdog, the Information Commissioner’s Office (ICO), together with communications regulator Ofcom, called on social media companies to significantly improve their age‑verification systems. Regulators said many platforms still rely on self‑declared age checks, which are easily bypassed and allow children to access services intended for older users.
Major platforms including Meta, TikTok and Snap have been urged to place children’s safety at the center of their product design. Authorities argue that recommendation algorithms and engagement‑focused features can expose younger users to harmful or inappropriate content if adequate safeguards are not in place.
Instead of imposing an outright ban, the UK government is focusing on stricter enforcement of existing regulation. Under the Online Safety Act 2023, social media companies operating in the country must take steps to protect children from harmful content and ensure stronger safeguards, including more effective age‑verification tools.