UK under-16 social media ban is wrong, says Molly’s dad

You’re going to hear a lot this week about banning under‑16s from social media. Ian Russell, father of Molly Russell, says that would be wrong. Speaking to the BBC’s Newscast, he urged ministers to enforce existing laws rather than, in his words, “sledgehammer” bans. A joint statement from the Molly Rose Foundation, the NSPCC and Childnet calls a blanket ban the wrong solution and warns it could create a false sense of safety as harms shift elsewhere online. (sg.news.yahoo.com)

Before we decide what’s best for young people, it helps to be clear about what a ban actually means. Australia’s new rules, in force since 10 December 2025, require platforms to take “reasonable steps” to stop under‑16s holding accounts, backed by large fines. Companies can use age checks such as ID verification or AI estimation, but they are not required to verify every single user. This distinction matters when you think about privacy, feasibility and classroom impact. (abc.net.au)

The Australian rollout is already shaping the global conversation. In the first days, platforms reported removing or restricting more than 4.7 million accounts believed to belong to under‑16s. Meta alone said it blocked around 550,000 across Instagram, Facebook and Threads. Supporters see rapid action; critics point to evasion and potential displacement to lesser‑known apps. Both dynamics can be true at once, and that’s why evidence, not slogans, should guide policy. (apnews.com)

In the UK, the politics are moving quickly. Prime Minister Sir Keir Starmer has kept “all options on the table”, while Health Secretary Wes Streeting has asked US academic Jonathan Haidt to brief officials on tougher curbs. Conservative leader Kemi Badenoch says her party would bring in a ban if it won the next election. Meanwhile, peers resume report stage of the Children’s Wellbeing and Schools Bill on Monday 19 January and Wednesday 21 January, when amendments on social media access could be debated and voted on. (ft.com)

Russell’s worry is about unintended outcomes. If you ban access to major platforms without fixing product design and enforcement, some young people will migrate to harder‑to‑monitor spaces. We’ve already seen early reports in Australia of teens seeking alternatives and using workarounds, which is why charities argue for smarter rules rather than a simple on/off switch. Our job as educators, parents and students is to ask whether a proposal reduces risk in practice, not just on paper. (theguardian.com)

So what are the charities asking for instead? First, enforce today’s minimum age robustly so under‑13s can’t sign up to social media, games or AI chatbots. Second, force platforms to apply age‑appropriate design, switching off high‑risk features for younger teens by default. Third, strengthen the Online Safety Act so companies must deliver safer experiences for different age groups, with platforms rated like films to reflect their risk profile. This is the “broader and more targeted” path the signatories set out. (sg.news.yahoo.com)

Let’s pause on age checks, because you’ll hear the terms used loosely. Age assurance covers a range of methods, from privacy‑preserving estimation to verified ID, each with trade‑offs for accuracy, inclusion and data security. Australia’s model rests on “reasonable steps”, not checking every user, which is why effectiveness will depend on product changes as much as paperwork. If you’re discussing this in class, compare the methods and ask which are fair, proportionate and workable. (abc.net.au)

For teachers, this is a teachable moment. You can help students weigh up two ideas at once: social media can expose children to serious harms, and yet a blunt ban can miss where risk really lives, in product features and weak enforcement. Bring students into the evidence, use real examples from trusted sources, and keep the focus on practical safeguards at home and in school routines, not scare stories. (sg.news.yahoo.com)

For families, simple steps matter while Parliament debates. Talk together about minimum ages and why they exist, switch off features that nudge constant scrolling, and agree screen‑free times so focus and sleep improve. If a young person does see harmful content, the goal is not blame but support: save evidence, report it in‑app, and speak to a trusted adult or helpline. Change will come from law and design, but care at home still counts every day.

What happens next? Peers return to the bill on Monday 19 January and Wednesday 21 January. If an amendment on under‑16s passes in the Lords, the question comes to the Commons. The government says it is reviewing options while relying on the Online Safety Act for now. As readers of The Common Room, we’ll keep you focused on the facts, the trade‑offs, and what any change would mean for your classroom and your family. (bills.parliament.uk)