Safe AI tutoring plan for disadvantaged pupils
If you have ever looked at private tutoring and thought it sounded useful but out of reach, this announcement speaks directly to that gap. In a government press release, the Department for Science, Innovation and Technology and the Department for Education said EdTech firms and AI labs are being invited to build safe, classroom-ready AI tutoring tools for pupils in Years 9 and 10. The plan is for up to eight organisations to begin testing tools in schools from summer 2026, with teachers supervising how they are used. The subjects named by ministers are English, maths, science and modern foreign languages, and the stated aim is to make extra help more available to pupils from disadvantaged backgrounds.
The pitch from government is clear: one-to-one help works, but it is often priced like a luxury. Ministers say private tutoring can add as much as five months of extra progress to a pupil's learning, yet it can cost families hundreds or even thousands of pounds a year. That is why this is being presented as an inequality story as much as a technology story. According to the same press release, successful tools could eventually support up to 450,000 disadvantaged pupils each year. The announcement also links back to the schools white paper Every Child Achieving and Thriving, published earlier in 2026, which set out an ambition to cut the attainment gap between children from lower-income backgrounds and their peers. Digital Government minister Ian Murray said AI could widen access to personalised support, while education minister Olivia Bailey stressed that safety and proper testing have to come first.
What would these tools actually do? The government says they should adjust to individual pupils, step in with extra guidance when someone gets stuck, and spot the areas where more practice is needed. Teachers would also get clearer information about what pupils have understood and where they are struggling, so lessons and support can be adapted more quickly. **What this means in plain English:** ministers are not describing a robot teacher taking over the room. They are describing an extra tutoring layer that sits alongside classroom teaching, with adults still in charge. Nav Sanghara, chief executive of Woodland Academy Trust, made that point in the government release too, arguing that technology should improve teaching rather than stand in for it.
Safety is the part the government keeps stressing, and with good reason. If an AI tool gives wrong answers, unsuitable advice or confusing feedback, pupils can lose trust quickly and teachers can lose time. Ministers say any product used in this programme must match the national curriculum, be age-appropriate, and meet the Department for Education's Generative AI Product Safety Standards. The Department for Science, Innovation and Technology also says new national benchmarks are being built to test whether these tools are accurate and safe. Hundreds of teachers are meant to help shape example classroom interactions and scoring criteria, which matters because classroom reality is much messier than a polished tech demo.
There is also a data protection point here, and it is not a small one. The notes to editors say no identifiable pupil data will be shared publicly, and pupil work will not be used to train AI models without parental permission. In other words, the government is trying to reassure schools and families that trialling these systems will not mean quietly handing children's work to developers. **What it means for families:** whenever schools and AI appear in the same sentence, the first questions should be about data. Who can see it, what is stored, what is reused, and who gives consent are not side issues. They are part of whether a tool deserves trust.
The money and timetable matter too, because this is still a pilot rather than a finished national service. The government says successful bidders will receive £300,000, and up to eight organisations could be selected for a Pioneer Group to design and test what these tutoring tools can actually do in real classrooms. Co-design with schools is due to begin in summer 2026, and ministers say wider national availability could follow from 2027 if the results are strong enough. That timetable is worth pausing on. If you see headlines suggesting every child is about to get an AI tutor, they are jumping ahead of the evidence. Right now, ministers are funding an experiment and asking schools to help test it. Proof still has to be earned.
There is a wider policy picture behind all this. The government says it is putting an extra £325 million into school connectivity by 2029/30 to narrow the digital divide, and up to £23 million into testing AI and EdTech products that might improve outcomes or reduce teacher workload. It is also opening access to its AI Content Store, a library of publicly available educational materials that developers can use for testing and evaluation. Even so, technology will not solve educational inequality by itself. Pupils who need the most support may also be dealing with housing pressure, caring duties, poor internet access at home or gaps that built up long before Year 9. An AI tutor could become a useful extra layer of help, but it cannot by itself remove the reasons some children fall behind in the first place.
This is why the next stage matters more than the announcement itself. If the tools are genuinely safe, properly tested and built with teachers rather than simply sold to them, they could widen access to the kind of tailored help that better-off families have long been able to buy. If they are rushed, badly monitored or treated as a shortcut, they could create new problems while claiming to fix old ones. For pupils, parents and teachers, the sensible response is neither hype nor panic. It is careful attention. Watch for the evidence from the school trials, watch for whether disadvantaged pupils and children with special educational needs and disabilities (SEND) are genuinely included, and watch for whether AI support really means better learning with a teacher still firmly in the loop.