UK deepfake intimate image law starts 6 Feb 2026
From Friday 6 February 2026, it becomes a criminal offence to create or to ask someone to create AI‑made or otherwise fabricated sexual images of an adult without their consent. Parliament put these offences on the statute book in the Data (Use and Access) Act 2025, which adds new provisions to the Sexual Offences Act 2003. (apnews.com)
Here’s the plain‑English version. Two offences are switching on: creating a purported intimate image of an adult (new section 66E) and requesting the creation of a purported intimate image of an adult (new section 66F). The Act also adds definitions (section 66G) and a special prosecution time limit (section 66H). These sit within the Sexual Offences Act 2003, which governs criminal sexual conduct in England and Wales. (legislation.gov.uk)
The law uses the term “purported intimate image”. In practice, that means an image that looks like a photo or video of an adult and appears to show them naked or engaged in sexual activity, even if the image is AI‑generated, edited or otherwise not genuine. This wording is designed for deepfakes and similar manipulations. (legislation.gov.uk)
Consent is central. You commit the offence if the person shown does not consent and you do not reasonably believe that they do. That second part matters in education settings: a casual "go on then" in a group chat is not reliable evidence of consent. The law also says "requesting" can include doing something that could reasonably be taken as a request, such as agreeing to an offer to make an image. (legislation.gov.uk)
Prosecutors get more time than usual to bring cases. For these two offences, a prosecution can be started up to three years after the alleged offence, provided it is brought within six months of the date on which, in the prosecutor's opinion, there is sufficient evidence to proceed. This is intended to reflect the fact that such images often only come to light long after they are made. (legislation.gov.uk)
Courts will be able to issue deprivation orders covering the image itself and anything containing it, such as a phone, laptop or hard drive. In plain terms, that enables the court to strip offenders of the unlawful images and the devices used. The Sentencing Code is amended for this purpose. (legislation.gov.uk)
These powers also extend into the Service Justice System. Where the conduct amounts to a service offence with an equivalent in England and Wales, service courts can order deprivation of the image and devices under the Armed Forces Act 2006. (legislation.gov.uk)
There’s a limit on “encouraging or assisting” liability for the requesting offence. The Serious Crime Act 2007 is amended so that people are not charged under the general inchoate offences for conduct already covered by the specific “requesting” offence. This keeps prosecutions focused on the tailored offences Parliament created for deepfakes. (bills.parliament.uk)
For schools, colleges and universities, this is a chance to update safeguarding and behaviour policies before 6 February. Make sure your community understands that using an app or chatbot to make a sexualised image of a classmate or staff member is not a prank; it’s a criminal offence. Build this into digital citizenship lessons, refresh reporting routes to the designated safeguarding lead, and be clear about how evidence is captured and stored if a report is made.
For students and families, the message is simple: do not create or ask for AI sexual images of anyone without clear permission. If you’re targeted, keep any evidence, avoid resharing, tell a trusted adult and report it to the police. Note that the new offences apply to adults; images of under‑18s are already illegal under separate child protection laws, which remain in force. The government has also trailed related reforms on taking intimate images and installing equipment to enable such abuse. (gov.uk)