UK Tech Crackdown: Platforms Must Overhaul Algorithms and Age Checks or Face Major Fines
Websites and online platforms will be required to overhaul their content recommendation algorithms for young users and implement stronger age verification measures—or risk facing substantial fines—according to the UK’s media regulator, Ofcom.

The final version of Ofcom’s Children’s Safety Codes has been released, promising “transformational new protections” for minors online. The codes, drawn up under the Online Safety Act, still require parliamentary approval.
Under the codes, platforms hosting pornography or content promoting self-harm, suicide, or eating disorders must take stronger steps to prevent children from accessing such material.
Ofcom Chief Executive Dame Melanie Dawes called the move a “gamechanger,” emphasizing that platforms must adapt or lose the ability to serve users under 18 in the UK. “Unless you know where children are, you can’t give them a different experience to adults,” she told BBC Radio 4’s Today programme. While she acknowledged that enforcement won’t be foolproof, she stressed that these rules have legal weight.
However, critics argue the measures don’t go far enough. Ian Russell, chair of the Molly Rose Foundation—named after his daughter who died by suicide at 14—voiced his disappointment at what he described as a lack of ambition in the codes.
Former Facebook safety officer Prof. Victoria Baines told the BBC the move is “a step in the right direction,” noting that tech firms are beginning to invest more resources and personnel into online safety.
Technology Secretary Peter Kyle underscored that one of the main targets is the algorithms feeding harmful content to children. “Most kids don’t seek out this material—it finds them,” he told BBC Radio 5 Live. He also revealed that the government is considering a social media curfew for under-16s, but said any such move would be made only on the basis of supporting evidence.
Key Measures in the Children’s Safety Codes:
- Adjusting algorithms to filter harmful content from children’s feeds
- Implementing robust age verification for age-restricted content
- Removing harmful material immediately once it is identified
- Simplifying terms of service so that children can easily understand them
- Allowing children to opt out of potentially harmful group chats
- Providing direct support for children exposed to harmful content
- Appointing a named person responsible for child safety
- Requiring senior management to review risks to children annually
If platforms fail to comply, Ofcom has the authority to impose fines or, in severe cases, seek court orders to block sites or apps in the UK.
Children’s charity NSPCC welcomed the codes as “a pivotal moment for children’s online safety,” but urged Ofcom to go further—particularly concerning encrypted messaging apps, which often evade monitoring.