UK orders swift Ofcom action on xAI Grok deepfakes

Overnight, xAI changed how Grok’s image tool works. The UK Technology Secretary, Liz Kendall, called the generation of sexualised images of women and children “despicable and abhorrent”, urged Ofcom to set out next steps “in days not weeks”, and reminded xAI that services can be blocked in the UK if they refuse to comply. The statement was published on 9 January 2026. (gov.uk)

What changed at Grok? X and xAI restricted image generation and editing to paying subscribers, a move Downing Street has already criticised as “insulting” because it risks turning unlawful image creation into a premium feature. Several outlets report that some editing routes still function for non‑paying users, so the fix may be partial. (news.sky.com)

Here’s the simple bit you can teach: paying does not make illegal content lawful. UK rules focus on safe design and prevention, not on paywalls or opt‑ins. That’s why officials are pushing for swift action and clear accountability for how image tools work across every entry point, not just the obvious ones. (washingtonpost.com)

What can Ofcom actually do? Under the Online Safety Act, the regulator can demand information, launch investigations, impose fines of up to £18m or 10% of global turnover, and in the most serious cases seek court orders to block access in the UK or cut off payment and advertising. Ofcom has already used its new powers against several services since 2025. (ofcom.org.uk)

Today’s legal baseline matters. Sharing or threatening to share intimate images without consent, including deepfakes, is already a criminal offence. And on 8 January 2026, cyberflashing became a priority offence under the Online Safety Act, meaning platforms must stop unsolicited sexual images before users receive them. (gov.uk)

What is about to change? Parliament has passed the Data (Use and Access) Act 2025, which creates new offences for creating or requesting a non‑consensual “purported intimate image” of an adult. Ministers say these provisions will be brought into force in the coming weeks. The offences are punishable by imprisonment or a fine on summary conviction. (legislation.gov.uk)

Separately, the government plans to ban so‑called nudification tools that turn real photos into fake nudes. Officials confirmed in December that new laws will target the creation and supply of these apps via the Crime and Policing Bill now before Parliament. That sits alongside work to make it impossible for children to take, share or view a nude image on their phones. (gov.uk)

Platforms need not wait for fresh laws before improving. Ofcom’s guidance on keeping women and girls safer online, published in November 2025, sets practical steps services should follow, from better reporting and quicker takedowns to curbing pile‑on abuse. Ofcom’s latest bulletin highlights that 99% of deepfake intimate image abuse depicts women. (ofcom.org.uk)

So what does this mean for you as a student, parent or teacher? It means services must assess risks and build protections in by default. Dating apps and social platforms now have proactive duties to block unsolicited sexual images. If you see abusive content, report it in‑app, keep evidence, and seek support from a trusted adult or safeguarding lead. (ofcom.org.uk)

What happens next? Ofcom has contacted X and xAI and is under pressure to update the public within days. If investigators find non‑compliance, the regulator can order fixes, levy significant fines, or apply to the courts for business‑disruption measures, including blocking access in the UK. (reuters.com)
