UK Online Safety Act CSEA reporting starts 7 April 2026
If you run or moderate an online space where people post and respond to each other, a new legal duty is about to land. The Home Office has made the Online Safety Act 2023 (Commencement No. 7) Regulations 2026, signed on 9 March 2026, which switch on the child sexual exploitation and abuse (CSEA) reporting duty on 7 April 2026. That date matters for compliance planning, and for how we expect platforms large and small to act when they find illegal child abuse content.
What exactly starts in April? Sections 66(1) and 66(2) of the Online Safety Act 2023 come into force for regulated user-to-user services. In plain English, if your service lets users upload or share content and other users can see or interact with it, you have to report detected CSEA content to the National Crime Agency (NCA). This sits alongside supporting provisions in sections 66(7) to (10) that explain how the duty operates.
There is also a criminal backstop. Section 69, the offence connected to failing to meet the CSEA reporting duty, is commenced at the same time for these services. The Act’s general offences framework and rules about extra‑territorial reach are also commenced in respect of section 69, which means the offence can apply to services based outside the UK where the content or users are UK‑linked. For everyday readers, that means a global platform with UK users cannot ignore these rules simply because it is incorporated elsewhere.
Let’s get clear on scope. These changes apply to regulated user‑to‑user services under the Act. Think social networks, forums, messaging features in games, and any app where one user’s content can be seen by others. If UK users can access your service, you should assume the duty may apply and check your legal classification against the Online Safety Act’s definitions. Ofcom will expect accurate record‑keeping to evidence how you meet the duty.
What counts as CSEA content? The Act points to specific criminal offences, not just ‘harmful’ material. We’re talking about content that amounts to child sexual abuse or exploitation in law, for example illegal images or grooming activity. The duty is about reporting detected and previously unreported CSEA content; it does not replace existing obligations to remove illegal content swiftly or to protect users through strong safety systems.
Who do you report to, and how fast? The destination is the National Crime Agency. The legislation is designed to route industry reports to the NCA so investigators can act. In practice, services will need clear internal playbooks so that when moderators or automated systems flag suspected CSEA content, staff can assess, preserve necessary data for law enforcement, and file a report without delay. Accuracy and timeliness both matter.
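To make that playbook concrete, here is a minimal sketch of what an internal escalation record might look like, assuming a Python-based moderation pipeline. The class name, field names and the file_with_nca placeholder are illustrative assumptions, not drawn from any official specification; the actual submission route and report format will be whatever the NCA and supporting regulations set out.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SuspectedCseaReport:
    """Internal record created when a moderator or automated system flags content.

    All field names here are illustrative; adapt them to your own systems and to
    whatever the NCA reporting route actually requires.
    """
    content_id: str
    detected_at: datetime
    detection_source: str            # e.g. "moderator", "hash-match", "classifier"
    assessed_by: str | None = None   # trained safeguarding lead who confirmed the assessment
    preserved_artifacts: list[str] = field(default_factory=list)  # references to preserved data
    reported_to_nca_at: datetime | None = None

def file_with_nca(report: SuspectedCseaReport) -> None:
    # Placeholder only: in practice this step uses the reporting mechanism the NCA specifies.
    print(f"Filing report for content {report.content_id} detected via {report.detection_source}")

def escalate(report: SuspectedCseaReport, assessor: str) -> SuspectedCseaReport:
    """Record the human assessment, preserve evidence references, then file without delay."""
    report.assessed_by = assessor
    report.preserved_artifacts.append(f"evidence-bundle:{report.content_id}")
    file_with_nca(report)  # hypothetical submission step
    report.reported_to_nca_at = datetime.now(timezone.utc)
    return report
```

The point of a structure like this is simply that assessment, preservation and filing are captured in one place, so nothing depends on an individual moderator remembering the steps.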
Regulation also needs teeth and transparency. The regulations bring into force powers for Ofcom, the independent regulator, to require information relevant to this duty and, where necessary, to commission ‘skilled person’ reports. Transparency reporting is in scope too, so Ofcom can ask providers for specific data about how they are meeting the CSEA reporting duty. That helps Parliament, teachers, parents and young people see whether platforms are doing the basics to protect children.
Dates to mark in your diary: the instrument was made on 9 March 2026 and the duty starts on 7 April 2026. Most of the supporting offence and enforcement provisions that start on that date do so only for regulated user‑to‑user services, which is where most peer‑to‑peer sharing happens. That targeted switch‑on is intentional: it focuses first on the spaces where children are most likely to interact and be at risk.
If you’ve followed the Online Safety Act roll‑out, you may remember an earlier timetable that slipped. A previous set of commencement regulations from 2025 was revoked before taking effect. This new commencement instrument resets the start date and makes the legal position clear for providers and for Ofcom. The message from government is simple: from April, reporting CSEA content is not optional.
What should service providers and community moderators do now? Start by confirming whether you are a regulated user‑to‑user service. Map your detection and escalation steps so that moderators know exactly when to preserve evidence and when to trigger an NCA report. Check who signs off a report, how you log it, and how you safeguard affected users. For small teams, that might mean nominating a trained safeguarding lead and documenting a short, plain‑English workflow.
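For the sign-off and logging piece of that workflow, a short illustration: an append-only audit trail with one entry per decision, so even a small team can later evidence how it handled each case. The function and field names below are hypothetical, not an official schema, and are only a sketch of the record-keeping idea.

```python
import json
from datetime import datetime, timezone

def log_report_decision(log_path: str, content_id: str, decision: str, signed_off_by: str) -> None:
    """Append one audit-trail entry per reporting decision (illustrative fields only)."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "content_id": content_id,
        "decision": decision,          # e.g. "reported_to_nca" or "assessed_not_csea"
        "signed_off_by": signed_off_by,  # the nominated safeguarding lead
    }
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```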
For learners and teachers, here’s the civic takeaway. This is an example of Parliament using commencement regulations to ‘turn on’ specific parts of an Act in stages. It shows how criminal law (the offence for failing to report) and regulatory oversight (Ofcom’s information powers and transparency requests) are designed to work together. The goal is swift reporting to specialists at the NCA when illegal child abuse content is found.
Finally, a note on culture as well as compliance. Technology and moderation choices are part of safeguarding. The law sets the floor; your service design sets the standard users actually feel. Clear reporting routes for your community, well‑trained staff, and steady collaboration with law enforcement are what move this beyond a tick‑box into real‑world protection for children. That’s something all of us, educators, students and platform teams alike, can stand behind.