UK sets 48-hour rule on non-consensual intimate images

Here’s the headline in plain English: the government plans to force tech platforms to remove any intimate image shared without consent within 48 hours of it being flagged. Ministers say this change will be written into the Crime and Policing Bill, with Ofcom able to fine companies up to 10% of global revenue or even block services that refuse to act. Announced on 19 February 2026, the proposal is framed as part of tackling violence against women and girls online. (gov.uk)

What does “48 hours” really mean? The clock starts when the image is reported to the platform. The rule would apply across mainstream social media and adult sites, not just one platform at a time, so survivors aren’t stuck repeating the same report over and over. Government briefings and independent reporting both emphasise the threat of service blocking as well as fines to drive compliance. (gov.uk)

The promise of “report once, remove everywhere” matters. Officials want platforms to coordinate removals so the same image comes down across multiple services in one go, and then stays down. Ofcom is exploring digital fingerprinting (hash‑matching) so reposts are automatically stopped, similar to how the internet already handles known child sexual abuse and terrorist content. (gov.uk)
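The "report once, remove everywhere" idea rests on sharing those fingerprints between platforms. Here is a minimal sketch of the workflow, assuming a shared blocklist of exact hashes; real systems such as StopNCII use perceptual hashes, which also catch resized or re-encoded copies, and the function names here are illustrative, not any platform's real API:

```python
import hashlib

# Simplified sketch of hash-matching ("digital fingerprinting").
# This uses an exact SHA-256 match purely to illustrate the workflow;
# production systems use perceptual hashing so altered copies still match.

blocked_hashes: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (exact hash in this sketch)."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """A survivor reports once; the hash joins the shared blocklist."""
    blocked_hashes.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Every partner platform checks new uploads against the shared list."""
    return fingerprint(image_bytes) not in blocked_hashes

# One report blocks identical re-uploads on every service sharing the list.
reported = b"\x89PNG...example image bytes"
report_image(reported)
print(allow_upload(reported))        # False: known image, blocked
print(allow_upload(b"other image"))  # True: unrelated image passes
```

The key property is that only the hash is shared, never the image itself, which is why survivors can use such systems without uploading the picture to anyone.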

If you’re wondering who actually polices this, it’s Ofcom. Under the Online Safety Act, Ofcom can set detailed codes of practice and investigate, then issue penalties or require access blocking for serial non‑compliance. In this plan, intimate image abuse would be treated with the same top‑tier seriousness as the worst illegal content, with technical measures to stop re‑uploads. (gov.uk)

Where are we in the law‑making process? As of 19 February 2026, the 48‑hour rule is a government amendment to the Crime and Policing Bill. The bill has finished committee stage in the House of Lords; report stage begins on 25 February 2026. That means these duties aren’t in force yet. If Parliament passes them, Ofcom will consult on the exact rules and timelines before enforcement starts. (parliament.uk)

Why now? Because the scale of abuse is rising. The UK’s Revenge Porn Helpline handled 22,275 reports in 2024, up 20.9% year on year. Campaigners and researchers also flag a surge in deepfake nudes and so‑called “nudification” tools that strip clothing from images, making rapid takedown and proactive blocking more urgent. (swgfl.org.uk)

Does this include AI‑generated images and chatbots? Yes: ministers have specifically pointed to deepfakes, “nudification” tools and chatbots such as Grok. Government statements say creating or sharing non‑consensual intimate images will be treated as a “priority offence” under the Online Safety Act, raising expectations on platforms to prevent and remove this material swiftly. (gov.uk)

What if the site is overseas or refuses to cooperate? The plan includes guidance for UK internet providers to block access to rogue sites hosting this content, even where the Online Safety Act’s usual reach is limited. That gives Ofcom a back‑stop if fines alone don’t change behaviour. (gov.uk)

A quick word on consent for classrooms and parents: consent must be clear, ongoing and freely given. Sharing or threatening to share someone’s intimate image without permission is abuse. If the person in the image is under 18, the law already treats those images as child sexual abuse material and platforms must act immediately, regardless of consent.

If this happens to you or someone you teach, prioritise safety and evidence. Save URLs, usernames, messages and timestamps. Use each platform’s in‑app reporting, and if there’s extortion or threats, contact the police. Adults can also use StopNCII to create a secure hash (a digital fingerprint) so partner platforms can block re‑uploads. The Revenge Porn Helpline can guide next steps and support you emotionally and practically. (stopncii.org)

For teachers and designated safeguarding leads, keep the focus on support over blame. Make space for the student to talk, avoid asking for the image, and agree an action plan together: immediate platform reports, StopNCII where appropriate, and contact with parents or carers where it’s safe to do so. Reinforce that being targeted is never the victim’s fault.

What comes next? Watch the bill’s progress in late February and into spring. If Parliament approves the amendment, Ofcom will publish draft guidance and timelines, then start enforcing, so expect platforms to update their reporting tools, takedown workflows and cross‑platform hashing systems before the rules take effect. We’ll keep this page updated as dates are confirmed. (bills.parliament.uk)
