New laws on children’s safety online are to be introduced, regulating what material they are able to see when accessing the internet.
This new legislation comes in the wake of increased concern over child safety both online and in day-to-day life, following the government’s recent U-turn over a public inquiry into grooming gangs, as well as reports that one Bradford school has recently faced issues with pupils being sent inappropriate images via the social media service Snapchat. Meanwhile, dramas such as Netflix’s Adolescence have helped to raise the profile of these issues globally.
Findings show that children and young people are increasingly being exposed to a wide range of sensitive and inappropriate material aimed at adults. The media regulator Ofcom recently found that children as young as eight may be accessing pornography online, whilst as many as 16% of teenagers report having seen material that promotes harmful attitudes towards body image and eating habits.
Ofcom also reported that the average age of users first accessing online pornography was just 13, whilst over half of boys aged 11-14 have engaged with influencers connected with the so-called ‘manosphere’.

The new Child Safety Codes aim to prevent anyone under 18 from accessing inappropriate material by requiring sites that show such content to introduce secure identity checks, verifying that users are old enough to view it.
These age verification measures will be applied to sites hosting pornography, as well as any other content deemed harmful for minors, including material related to self-harm, suicide, or eating disorders. Age-appropriate limits will also be placed on other potentially harmful content, such as bullying, abuse or hate speech, and content relating to violence or encouraging physical danger.
Users wishing to access such sites will be required to verify themselves using either facial age recognition software, photo ID matching, or credit card checks.
Websites and social media platforms will also be required to introduce more stringent content moderation systems, ensuring that search engines filter out potentially harmful content, with safe search settings that can’t be switched off by anyone under 18, and combating toxic algorithms that place harmful content across social media feeds. Sites will have a duty to take steps against such content if it is being shared across their algorithms, and to make it easier for young people to report harmful content.
The Child Safety Codes will be introduced into UK law as part of the Online Safety Act 2023, which had already taken steps to enforce age verification for certain sites. As a result, as many as a thousand platforms have already implemented many of these checks, including the UK’s most frequented pornography site, PornHub.
Numerous other sites have now committed to more stringent checks in line with the new law, including Reddit, Grindr, and the social media giant X.

“We’ve drawn a line in the sand,” the Technology Secretary Peter Kyle has stated. “This Government has taken one of the boldest steps anywhere in the world to reclaim the digital space for young people – to lay the foundations for a safer, healthier, more humane place online. The time for tech platforms to look the other way is over. They must act now to protect our children, follow the law, and play their part in creating a better digital world. If they fail to do so, they will be held to account.”
This new legislation from the government has been welcomed by charities and other child support services. “Children, and their parents, must not solely bear the responsibility of keeping themselves safe online,” states Chris Sherwood, Chief Executive at the National Society for the Prevention of Cruelty to Children. “It’s high time for tech companies to step up. If enforcement is strong, these Codes should provide children and young people with a vital layer of protection when they go online. If tech companies fail to comply, Ofcom must show its teeth and fully enforce the new codes. It must use its wide range of powers, including fines, to protect children from harm and maintain public confidence in this new regulatory regime for online safety.”
The new Child Safety Codes come into force tomorrow. Platforms that fail to comply with the new legislation will face serious enforcement measures, including fines of up to £18m or 10% of their worldwide revenue, whichever is greater.
Questions have been raised over whether the use of VPNs will allow some users to bypass the new regulations. Ofcom has, however, maintained that a majority of parents believe the measures it has set out will nevertheless improve children’s safety: seven in ten parents reportedly state that the measures will make a positive difference to children’s safety online, while nine in ten agree that it is important that tech firms follow Ofcom’s rules.
“Prioritising clicks and engagement over children’s online safety will no longer be tolerated in the UK,” stated Ofcom Chief Executive Dame Melanie Dawes. “Our message to tech firms is clear – comply with age-checks and other protection measures set out in our Codes or face the consequences of enforcement action from Ofcom.”
Any concerns regarding a child’s safety online can be raised with the police, or with the Bradford Families and Young Persons Information Service, which can be reached on 0800 953 0966.



