DfE Press Release: 450,000 disadvantaged pupils could benefit from AI tutoring tools

“AI in the Classroom: Levelled Playing Fields or New Safeguarding Frontiers?”

The landscape of British education is shifting. Following the recent government announcement regarding a massive rollout of AI tutoring tools, the Association of Mental Health in Education is taking a closer look at what this means for our schools, our staff, and—most importantly—the emotional well-being of our pupils.

With the Department for Education (DfE) pledging that 450,000 disadvantaged pupils could benefit from these tools by 2027, we are entering a new era of “personalised learning.” But as we embrace the efficiency of algorithms, we must ensure we don’t lose the empathy of the classroom.

Here is everything you need to know, broken down into three key areas.

1. The Big News: A £23 Million Vision for Equity

The headline is bold: by the end of 2027, the government aims to provide high-quality, safe AI tutoring tools to every disadvantaged pupil in Years 9 to 11. This isn’t just about tech for tech’s sake; it is a mission to “break the link between background and destiny.”

Currently, only one in four disadvantaged children achieves a grade 5 or above in English and Maths GCSEs. The government’s plan involves a massive co-creation project starting this Summer term, where teachers, AI labs, and tech giants will build tools specifically aligned with the National Curriculum. Backed by a £23 million investment in “EdTech Testbeds,” the goal is to provide the kind of one-to-one support usually reserved for families who can afford private tutors.

2. The Practicalities: What Will Schools Actually Have to Do?

For school leaders and pastoral teams, this isn’t a “set and forget” update. It requires a fundamental rethink of school operations.

Firstly, schools will need to move from ad-hoc AI use to formal policy. This means updating Safeguarding and Data Protection policies to include AI-specific risks. Educators will be expected to act as “AI supervisors” rather than being replaced by machines. The DfE has been clear: the human teacher remains the “expert in the room.”

Practically, schools will need to:

  • Identify and Onboard: Pinpoint eligible students (those on Free School Meals) and integrate AI tutoring into their weekly schedules.
  • Invest in CPD: Staff will need to undergo training to understand “AI literacy”—learning how to spot “hallucinations” (where AI makes up facts) and how to use these tools to reduce their own administrative workloads.
  • Audit Infrastructure: With 1,000 schools set to become “Testbeds,” many will need to audit their Wi-Fi and device capacity to ensure no child is left behind due to a “digital divide” in hardware.

3. The Heart of the Matter: Implications for Mental Health

At the Association of Mental Health in Education, our primary focus is the psychological impact of this transition. There are both “sunny” and “shadow” sides to AI in education, as we learned at the BETT Conference on Friday.

The Benefits: AI can provide a shame-free learning environment. Many students experience maths anxiety or worry about how they might appear to their peers. An AI tutor doesn’t get frustrated; it allows for infinite repetition, which can significantly lower cortisol levels in anxious learners. A reduced administrative workload for teachers could also help tackle teacher burnout, a leading cause of secondary trauma in school environments.

The Risks: However, we must be wary of digital isolation. External research, such as reports from Oxford University Press, suggests that over 60% of students worry that AI might erode their independent thinking. From a mental health perspective, we must ensure that AI does not replace the “precious covenant” of the teacher-student relationship.

Crucially, the new safety standards mandate that AI tools must be mental health aware. If a student types something into an AI tutor that suggests self-harm or deep distress, the system must be programmed to immediately flag this to a human Safeguarding Lead. We cannot allow a child in crisis to be “counselled” by a chatbot; the technology must serve as a bridge to human intervention, not a barrier.

Final Thoughts

As we move toward 2027, our role as educators and mental health advocates is to ensure that technology serves the child—not the other way around. AI has the potential to level the playing field, but only if we maintain our focus on the “human in the loop.”

We encourage all our members to engage with the upcoming government consultations. Let’s ensure that as we sharpen our pupils’ minds with AI, we continue to protect their hearts with human compassion.

Would you like to be kept in the loop about UK legislation and #DfE and #Ofsted changes? Don’t forget you can join AMHIE for free as an educational setting or individual; you can find out more here.
