The protection of children on the internet has been a huge focus in the UK media lately, from bereaved parents travelling to the United States to call for greater protection of children online, to calls from the Duke and Duchess of Sussex for more to be done to protect children online.
Last Thursday (April 24, 2025), Ofcom published a major policy statement containing six volumes and eight finalized pieces of guidance as part of its Phase 2 implementation of the Online Safety Act (OSA) for protecting children from harm online. Given the media coverage and Ofcom’s policy statement, the protection of children online is under even greater scrutiny than before.
As part of OSA Phase 1 implementation, Ofcom has shown it is serious about enforcing platforms’ OSA duties. More information can be found in our previous article, Ofcom ramps up pressure for platforms under UK Online Safety Act.
In this article, part of a wider series about the Online Safety Act, we explore what additional measures will be implemented to keep children safe online.
Children’s Access Assessment
From April 16, 2025, all platforms regulated under the OSA are expected to have completed their Children’s Access Assessment. Platforms need to assess (at a minimum every 12 months) whether their platform is likely to be accessed by children, by considering:
1. Is it possible for children to normally access the service? AND
2. Either:
   - Are there a significant number of children who are users of the service? OR
   - Is the service of a kind likely to attract a significant number of children?
Satisfying both Questions 1 and 2 will result in the outcome that children are likely to access the assessor’s platform. Whilst this assessment is relatively easy to complete, it demonstrates the expectation from the OSA and Ofcom that all providers consider the impact and risk of access by children on their platform, even if they provide adult-only content (such as pornography).
Ofcom has made this clear by citing evidence that children may be attracted to dating and pornography services. Unless platforms have implemented “highly effective age assurance,” Ofcom expects the assessment to conclude that children are likely to access the platform.
Be sure to look out for our upcoming article in June on ‘highly effective age assurance’, which adult content providers are required to have in place by July 2025.
Children’s Risk Assessment
The Children’s Risk Assessment must be conducted every 12 months by all platforms likely to be accessed by children. It sorts content harmful to children into three categories:
- Primary Priority Content (PPC): i) pornography; ii) content that encourages, promotes, or provides instructions for suicide; iii) content that does the same for deliberate self-injury; and iv) content that does the same for behaviors associated with an eating disorder (four types of PPC).
- Priority Content (PC): eight types of content outlined in the Ofcom guidance, similar in nature to the 17 priority illegal harms, such as abuse, hate, bullying and violence.
- Non-Designated Content (NDC): content which presents material risks of significant harm to children in the UK, such as body-shaming or body-stigmatizing content, or content promoting depression, hopelessness and despair (Ofcom has identified at least two types of NDC).
Platforms must then assess i) the likelihood of a child encountering the harm; and ii) the impact on children of that kind of content, for each of the four PPCs, the eight PCs and the at least two NDCs identified by Ofcom, as well as any additional NDC identified by the platform. The deadline for completing the Children’s Risk Assessment is July 24, 2025.
Non-Designated Content
Identifying and assessing NDC may be challenging because it is non-specific: Ofcom expects platforms to review their services in depth and identify NDC themselves. This means platforms cannot rely on Ofcom to outline the risks they need to consider and must identify and assess additional risks unique to their platforms. Platforms are likely to require expert help in assessing these risks.
Platforms are also under an obligation to report identified NDC to Ofcom at nondesignatedcontent@ofcom.org.uk. Whilst there is no specified timeframe for reporting, newly identified NDC is most likely to arise during the most recent Children’s Risk Assessment and should therefore be reported upon its conclusion.
Illegal Harms Risk Assessment
The Children’s Risk Assessment employs the same methodology as the Illegal Harms Risk Assessment, which should have been in place for all platforms from March 16, 2025. Platforms are expected to feed evidence into the risk assessment, including core inputs such as user data and incident reviews, as well as enhanced inputs such as product testing data and consultations.
The same record-keeping requirements also apply, so the information captured in the Children’s Risk Assessment should largely be the same as the Illegal Harms Risk Assessment.
There is an inevitable overlap between the risks considered in the Illegal Harms Risk Assessment and the Children’s Risk Assessment. Ofcom still expects a separate assessment of those risks and harms specifically in the context of protecting children online; however, both sets of risk assessments should work alongside each other to outline risks specific to illegal harms and/or the protection of children.
Protection of Children Code
As with the recommended measures of the illegal content code of practice, Ofcom has published roughly 70 recommended measures for user-to-user services and search services to implement following completion of the Children’s Risk Assessment. The recommended measures under the Protection of Children Code are broadly similar to those in the illegal content code, such as requirements for governance and accountability, content moderation, and reporting and complaints.
However, there are additional measures, such as age assurance processes and default settings for children. It is expected that the same ‘comply or explain’ approach and the ‘forbearance period’ of up to six months will apply, meaning platforms should have these measures implemented by February 2026. Those in default could face fines of up to £18m ($24m) or 10% of global turnover, whichever is higher.
Next steps
Our advice is that platforms should use their existing Illegal Harms Risk Assessment to aid construction of their Children’s Risk Assessment. Adult content providers that have yet to implement ‘highly effective age assurance’ will have to update their Children’s Access Assessment once it is in place by July 2025; as such, it is unlikely they will then need to conduct a Children’s Risk Assessment.
Ofcom is consulting on extending the requirements for blocking and muting user controls, as well as disabling comments, to services with between 700,000 and seven million monthly UK users, rather than only services with seven million monthly UK users. The consultation closes July 22, 2025. There is also an existing consultation on the draft guidance on how to protect women and girls online, which closes May 23, 2025.
Phase 2 implementation of the OSA is well and truly underway. Two additional risk assessments and over 70 recommended measures will add significant obligations for platform providers on top of their duties on illegal harms. The overlap of risks should make compliance easier for platforms, but care should be taken to ensure these overlapping risks and assessments complement each other and are consistent.
As Deputy Managing Partner of Katten’s London affiliate, Terry Green is a trusted adviser to international banks, social media platforms, investment funds, large hotel groups, family offices and luxury retailers. Larry Wong is a trainee solicitor in the Social Media Group at Katten.
