UK urges Ofcom to act on X Grok deepfakes in days

Ministers have told Ofcom to decide “in days, not weeks” whether enforcement action is needed against X after its Grok tool was used to create sexualised deepfakes. On 9 January 2026, Technology Secretary Liz Kendall said the regulator should use the “full legal powers” Parliament has given it and reminded xAI that UK law allows for blocking if companies refuse to comply. Downing Street also condemned limiting image edits to paying users as “insulting” to victims. (gov.uk)

Here’s the spark for the row. On X, users have been tagging @grok under images and asking it to undress or sexualise people without consent. After a public outcry, Grok now tells users that image generation and editing are limited to paying subscribers, effectively placing the feature behind a paywall. Ministers and charities say that does not fix safety by design. (standard.co.uk)

Ofcom says it contacted X and xAI at the start of the week, set a firm deadline for answers, and has now received a response. The regulator is carrying out an expedited assessment and has promised a further update shortly, on a timescale the government says should be measured in days, not weeks. (the-independent.com)

So what can Ofcom actually do? Under the Online Safety Act, the regulator can fine companies up to £18m or 10% of global turnover, whichever is greater, for breaches. In serious cases, it can apply to the courts for “business disruption measures”, such as cutting off payment providers, advertising services or even access to an app or site in the UK. Those steps require court approval and are intended as a last resort. (legislation.gov.uk)
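For readers who want to see how that penalty cap works in practice, here is a minimal sketch. It assumes the Online Safety Act formulation of the maximum fine as the greater of £18m and 10% of qualifying worldwide revenue; the turnover figures used below are purely hypothetical and do not refer to any real company.

```python
def max_osa_penalty(global_turnover_gbp: float) -> float:
    """Maximum Online Safety Act fine: the greater of a flat
    £18 million or 10% of qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * global_turnover_gbp)

# Hypothetical illustrations (not real figures):
small_firm = max_osa_penalty(50_000_000)       # 10% would be £5m, so the £18m floor applies
large_firm = max_osa_penalty(1_000_000_000)    # 10% of £1bn = £100m, which exceeds £18m
print(small_firm, large_firm)
```

The point of the "whichever is greater" design is that the flat £18m figure bites for smaller services, while the percentage scales the maximum up for the largest platforms.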

Is Grok really switched off? Reporting in the Guardian says that while restrictions on tagging @grok have tightened, image manipulation remained possible through other routes on X and via the separate Grok app. That is why campaigners argue the harm has not been addressed, only re-routed. (theguardian.com)

Evidence of harm is real and recent. The Internet Watch Foundation told Sky News its analysts had found criminal imagery of girls aged 11 to 13 that appears to have been created using Grok. That discovery underlines why non‑consensual edits aren’t just offensive; they can be illegal, especially when they involve children. (news.sky.com)

The political response has been sharp. Prime Minister Keir Starmer called the images “disgraceful” and said Ofcom has the government’s full backing to act. The Liberal Democrats have urged Ofcom to restrict access to X while investigations proceed. Senior Conservatives have also condemned the abuse, stressing the need for swift action. (standard.co.uk)

Let’s keep terms clear for the classroom. A deepfake is a synthetic or altered image or video that convincingly changes how someone looks or what they appear to do. In this case, people used AI prompts to strip clothing or sexualise images without consent. For many women and girls, that is experienced as a gender‑based violation, not a prank.

What the law expects from platforms matters. Ofcom’s first illegal‑harms codes require companies to assess risks and put in place measures to reduce the chances of illegal content reaching users, and to remove it quickly when they become aware of it. Those duties moved from theory to practice in 2025 and enforcement has stepped up since. (ofcom.org.uk)

If you’re a student, educator or parent, here’s how to respond. Do not share the image, even to criticise it-that spreads harm. Save evidence privately, report it on X, and tell a trusted adult or your organisation’s safeguarding lead. In the UK, the Internet Watch Foundation can take reports of child sexual abuse material, and the police can advise on criminal offences around intimate image abuse.

For media literacy, notice the design choices. Making a risky feature available first and adding limits later shifts the burden onto victims and schools. Safer design means strong default blocks on nudification, reliable age checks, and clear prompts that reject sexualised requests about real people. These safeguards should be built‑in, not paywalled.

What happens next? Ofcom is assessing X’s response now and is expected to set out next steps in days rather than weeks. That could range from formal investigations and fines to asking the courts for business‑disruption or access‑restriction orders if X is found to be non‑compliant. We’ll keep this explainer updated as the regulator moves. (gov.uk)
