Navigating AI in School

So how do you navigate the minefield of AI in school? Daniel Emmerson explains all…

At the Good Future Foundation we focus on helping schools navigate AI in a way that is both practical and, most importantly, responsible. Our work is rooted in the reality that pupils are not only using these tools for homework and revision, but also for social and emotional support. That means teachers need to be equipped to guide responsible use that strengthens learning while also safeguarding wellbeing. We work alongside schools to build confidence, set clear principles, and ensure that AI is used to enhance connection rather than replace it.


AI is already part of school life

Pupils use it to revise, to plan homework, to translate and sometimes to ask for advice about life. For educators, that brings both opportunities and risks. The question, therefore, is how to use it in ways that support learning, protect wellbeing and strengthen the relationships that keep school communities safe.

A useful starting point is clarity of purpose 

Tools should serve a curriculum goal or a pastoral aim that staff already understand. That might mean reducing administrative tasks so teachers can focus on relationships, or it might mean improving access for learners who need scaffolds such as read-aloud, dictation or language support. When purpose is clear, decisions about what to use and what to avoid become simpler, and success is easier to judge.

Mental health considerations need to sit alongside teaching practice from the start 

Pupils are growing up with systems that seem responsive, available and non-judgemental. For some, that can feel safer than speaking to an adult. There may be a short-term benefit in pupils trying out how to put their feelings into words. There is also a risk that worries get processed in isolation rather than through trusted relationships. Schools can name this tension openly with pupils and frame AI as a tool for thinking, not as a source of personal advice. When questions cross into health or safety, pupils should be reminded to speak with a trusted adult and shown how to do so.

Boundaries for AI use are important 

In classrooms, explain what AI in school can be used for, what it should not be used for and what pupils should do if they receive content that feels troubling. Keep these instructions simple and consistent across subjects. Where a tool is used for planning or drafting, add reflection prompts that ask pupils to evaluate the quality of the output and explain the choices they made. This helps pupils to build judgement rather than dependency and supports the habits of reflection that underpin resilience.

Safeguarding processes should cover AI clearly

Staff need to know what to do if a pupil shares a concern while using an AI tool, shows harmful content generated by a system, or reveals that they have been seeking personal health advice in this way. A short flow is often enough. Pause and listen. Reassure. Record what has been shared. Escalate according to the school’s existing policy. Communicate with home if needed. Keep it simple so that action is consistent under pressure.

Staff wellbeing is a crucial factor 

AI can reduce workload in practical ways when used carefully. Examples include drafting outlines for lesson slides, generating success criteria from a specification, or translating letters to families. In each case, the teacher remains the author. Set a shared expectation that outputs are always reviewed and adapted and schedule short practice sessions where staff can try one use case at a time. Small successes help staff gain confidence and protect time without adding pressure.

Inclusion should run through every decision 

Some pupils will have easy access to devices and data at home while others will not. Some will take to new tools quickly while others will need careful introductions and clear modelling. Plan for this from the start. Offer in class use to reduce reliance on home access. Pair pupils thoughtfully. Keep language plain. Provide alternatives. Make sure families understand how tools are being used and how their child’s data is being protected.

Listening to pupils is essential

Young people notice what works and what does not. Invite them to help shape simple classroom guidelines. Ask them how AI changes the way they approach a task. Explore the difference between helpful assistance and over-reliance. Treat this as part of digital citizenship rather than an add-on. When pupils help to set the rules, they are more likely to follow them and more likely to speak up when something feels wrong.

Leaders set the tone

Senior Mental Health Leads, pastoral teams and curriculum leads should agree to a small number of principles that apply across the school. For example, AI should support human connection, not replace it. AI should reduce workload, not increase it. Advice on health, identity, or relationships should go to trusted adults. Data should be handled with care and reviewed regularly. Principles like these make everyday choices easier for teachers and clearer for families.

Keep the conversation open

This space is moving quickly and no school will get everything right at once. Short reviews each term help. What has saved time? What has improved access? What has raised a concern? What needs to change? Share findings across the staff body and with governors and celebrate examples where AI has freed time for connection, helped a pupil find their voice, or supported a teacher to plan with more focus. In the end, what matters is the difference it makes for pupils and staff.

All of the support Good Future Foundation provides to schools is free of charge. From professional development and policy guidance to student voice projects and our AI Quality Mark, our aim is to give teachers the confidence to use these tools safely and effectively. We work with schools of every type and stage, always keeping wellbeing and inclusion at the centre.

By Daniel Emmerson

September 2025

About the Author

The Good Future Foundation is a UK Registered Charity dedicated to helping schools use artificial intelligence responsibly and inclusively. Everything we do is free of charge, from our AI Quality Mark which now involves over 500 schools and trusts, to professional development days that have reached thousands of teachers across the UK. Our work is driven by school leaders and by student voice, with guidance from an advisory committee of educators and AI professionals. Daniel Emmerson, Executive Director, has more than a decade of school leadership experience and now leads the Foundation’s programmes internationally, supporting schools to put the strategic implementation of AI first.

www.goodfuture.foundation

daniel@goodfuture.foundation
