AI CPD for Schools: Building Staff Confidence [2026]

Teachers participating in a professional development workshop on AI in education

Updated on April 4, 2026 | February 19, 2026

A practical guide to AI professional development for schools, covering year-long CPD planning, hands-on session structures, and department-specific training.

Evidence Overview

Chalkface Translator: research evidence in plain teacher language


Evidence Rating: Load-Bearing Pillars

Emerging (d<0.2)
Promising (d 0.2-0.5)
Robust (d 0.5+)
Foundational (d 0.8+)

Key Takeaways

  1. Effective AI CPD must anchor new technologies in established pedagogical principles: Teachers are adult learners who benefit most when new tools are framed within their existing understanding of learning and teaching, rather than as standalone technological novelties (Knowles, 1980). This approach ensures AI is seen as an enhancement to practice, not a replacement for sound educational judgment.
  2. Sustained, hands-on engagement with AI tools is paramount for building teacher confidence and competence: Professional development is most impactful when it moves beyond passive reception of information to active experimentation and application of new skills (Guskey, 2000). Allocating significant time for guided practice allows teachers to explore AI's potential within their specific subject contexts, fostering genuine understanding and reducing anxiety.
  3. Cultivating internal AI champions is a critical first step for successful, school-wide technology integration: The diffusion of innovations theory highlights the vital role of early adopters in influencing their peers and demonstrating the practical benefits of new practices (Rogers, 2003). Empowering these enthusiastic educators to lead and support colleagues creates a more organic and sustainable model for AI adoption than external, one-off interventions.
  4. Establishing clear, school-wide AI usage policies *before* training is essential for fostering confident teacher experimentation: Ambiguity regarding acceptable use can significantly inhibit innovation and risk-taking among staff, hindering effective technology integration (Fullan, 2001). Providing explicit boundaries and expectations empowers teachers to explore AI tools confidently, knowing the parameters within which they can operate.

Most schools that attempt AI training make the same mistake: they book a one-off INSET day, invite an external speaker, and hope the technology sticks. Three weeks later, the same teachers are back to doing everything manually. The problem is not a lack of enthusiasm. The problem is that traditional CPD models were never designed for a technology that changes every few months.

Infographic showing a pyramid representing four stages of AI professional development confidence for teachers: Awareness, Exploration, Integration, and Mastery, building from base to apex. Each stage has a brief description.
AI CPD Confidence Stages

Effective AI professional development looks different from anything schools have done before. It requires ongoing, embedded practice rather than isolated events. It demands a shift from "here is a tool" to "here is how this tool supports what you already do well." And it works best when teachers learn from each other, not from external consultants who have never planned a lesson for Year 4.

Why Traditional CPD Fails for AI

Timperley et al. (2007) show that single CPD events rarely change teaching practice. The usual format involves an expert talking while teachers listen; everyone goes back to class, and practice stays the same. AI also updates every few months, so one-off training dates quickly.

Schools spent billions on interactive whiteboards, yet many teachers used them as little more than projectors. Researchers warn that AI adoption could repeat this pattern, and that practice-focused CPD is the key to avoiding it (Smith, 2024). Learners ultimately benefit when teachers use AI more effectively.

AI also demands teacher judgement in a way whiteboards never did. Using AI to generate resources means checking every output against what you know about your learners. That skill develops through practice and reflection, not a single twilight session (Holmes et al., 2023).

A Framework for AI CPD

Effective AI CPD follows a progression that mirrors how any complex skill develops. Staff move through stages of confidence, and training should be designed around these stages rather than delivered as a single event.

Stage | Teacher Behaviour | CPD Focus | Typical Duration
Awareness | Curious but unsure where to start | Demonstrate one classroom use case with live modelling | 1 session
Exploration | Trying AI for personal tasks (reports, emails) | Guided practice with prompts for planning and marking | 2-3 weeks
Integration | Using AI regularly for specific workflows | Peer observation, prompt sharing, quality evaluation | Half term
Mastery | Adapting AI use to specific learner needs | Action research, mentoring colleagues, refining school policy | 1-2 terms

Most schools skip Exploration, jumping from Awareness to Integration too fast. Confidence collapses and teachers revert to old methods. Low-stakes practice with regular check-ins (Fullan, 2016) supports adoption rather than abandonment (Rogers, 2003).

Identifying AI Champions

Every school already has teachers quietly using AI, and they are a valuable CPD resource (Holmes et al., 2023). They know your school and curriculum (Namey & Guest, 2016), and they can turn abstract AI ideas into real classroom uses (King & Brooks, 2018).

Identifying AI champions does not require a formal application process. Look for teachers who mention AI in staffroom conversation, who have adapted resources using ChatGPT, or who ask questions about school AI policy. These teachers do not need to be technology specialists. The best AI champions are strong practitioners who happen to be curious about new tools.

Give your AI champions three things: time (one hour per fortnight for experimentation), a small budget for premium tool access, and a platform to share what they discover. A standing slot in the weekly staff briefing or a shared channel on Teams works well. Their role is not to become AI trainers but to demonstrate what practical, time-saving AI use looks like from someone who teaches the same learners.

Building a Year-Long CPD Plan

Researchers find that a structured year-long plan combats the tool abandonment that typically follows one-off training. Each term focuses on a different skill, building on previous learning (Willingham, 2009), with each term's focus feeding into the next stage (Dweck, 2006; Ericsson et al., 1993). This keeps staff engaged throughout the year (Hattie, 2008).

Term | Focus | Activities | Success Measure
Autumn 1 | Foundations and policy | Draft AI policy, staff survey, first hands-on session | 100% of staff have used an AI tool at least once
Autumn 2 | Planning and resources | AI for lesson planning, prompt libraries, department-specific workshops | Each department has a shared prompt bank
Spring 1 | Assessment and feedback | AI-assisted marking workflows, moderation exercises, learner feedback quality audit | Teachers report 30+ minutes saved per marking cycle
Spring 2 | Differentiation and SEND | AI for differentiated resources, SEND adaptations, co-planning sessions | Differentiated resources produced in half the time
Summer 1 | Learner-facing AI | Academic integrity lessons, teaching learners to evaluate AI output, classroom trials | Learners can articulate when AI use is appropriate
Summer 2 | Review and sustainability | Impact review, policy update, plan for next year, share successes | AI champions identified for next year; policy updated

This model of CPD aligns with research on effective teacher training (Cordingley et al., 2015), and research by Sims & Fletcher-Wood (2021) supports spaced practice for knowledge retention. Teachers struggle to apply new skills without practice between sessions, so plan at least one hour of AI CPD per half-term, plus informal staff sharing in between.

Structuring Hands-On Sessions

The single biggest mistake in AI training is talking about AI instead of using it. Every CPD session should follow a 20/70/10 structure: 20% demonstration, 70% guided practice, 10% reflection and next steps.

A well-structured hands-on session looks like this. The facilitator opens their laptop and shares their screen. They model a complete workflow: taking a real lesson from next week, writing a prompt, evaluating the output, editing the result, and producing a finished resource. This takes about 12 minutes. Then teachers open their own devices with a specific task: "Use AI to create a differentiated worksheet for your next lesson on a topic you are actually teaching this week." The facilitator circulates, troubleshoots, and pairs teachers who are struggling with those who are confident.

The final 10% is where real learning happens. Teachers share one thing that worked and one thing that surprised them. This reflective practice, grounded in the experiential learning cycle, converts isolated practice into transferable understanding. It also surfaces the specific challenges that become the focus of the next session.

Department-Specific AI Training

Generic AI training frustrates experienced teachers because it misses subject-specific needs. Maths teachers use AI differently from English teachers, and training should reflect these differences.

Department | Primary AI Use Cases | Recommended First Session
English | Model answer generation, feedback on drafts, differentiated reading materials | Generate model paragraphs at three ability levels for a current text
Maths | Varied practice sets, worked examples, misconception-targeted questions | Create 10 questions targeting a specific misconception from the last assessment
Science | Practical risk assessments, retrieval quizzes, method scaffolds | Generate a retrieval quiz on last half-term's content with a mark scheme
Humanities | Source analysis scaffolds, essay planning frameworks, revision summaries | Create a source analysis scaffold using AI for a specific GCSE topic
Primary | Phonics resources, cross-curricular planning, parent communication | Use AI to plan a cross-curricular week linking maths and science topics
SEND | IEP drafting, communication supports, adapted resources | Generate a visual schedule or social story for a specific learner

Department sessions are best led by AI champions or subject teachers rather than outside trainers. Facilitators can draw on relevant examples from the department's own schemes of work, which keeps the training practical rather than theoretical (Holmes et al., 2023).

Overcoming Staff Resistance

Teachers worry about learner data privacy, job security, and academic honesty. CPD should address these concerns honestly (Zawacki-Richter et al., 2019) rather than simply pushing the technology.

The most common objections and productive responses:

"AI will replace teachers." Research does not support this fear. AI cannot build learner relationships, and teachers make vital, instant judgements every lesson. Use AI to reduce admin, giving teachers more time for questioning, scaffolding, and care (Luckin & Holmes, 2016).

"I do not have time to learn another tool." This is the most honest objection. Address it by showing time savings in the first session. If a teacher can see that AI saves 20 minutes on their next set of reports, they will make time. Start with the most time-consuming administrative tasks, not the most pedagogically interesting uses.

"The output is not good enough." This is often true when teachers first try AI with generic prompts. The solution is teaching prompt craft, not abandoning the tool. Show the difference between a one-line prompt and a detailed, context-rich prompt. When teachers see the quality leap, scepticism shifts to curiosity.

"What about data protection?" A legitimate concern that must be addressed before training, not during it. Schools need a clear AI ethics policy that specifies which tools are approved, what data can be entered, and what safeguards are in place. Teachers will not experiment if they fear they might breach GDPR.

Measuring Impact

Consider Kirkpatrick's (1994) four levels: reaction, learning, behaviour, results. Donohoo's (2017) work on collective teacher efficacy also matters. For AI CPD, track how teachers actually use their new skills (Guskey, 2000); this gives better insight than asking whether they enjoyed the session.

Metric | How to Collect | Target
Weekly AI usage | Anonymous fortnightly pulse survey (2 questions) | 80% of staff using AI at least weekly by Spring
Time saved per week | Self-reported estimate in pulse survey | Average 45 minutes saved per teacher per week
Prompt library growth | Count of shared prompts in department bank | Each department contributes 10+ prompts by Summer
Resource quality | Peer review of AI-assisted vs. manual resources | AI-assisted resources rated equal or higher quality
Confidence score | Likert scale in pulse survey | Average confidence increases from 2/5 to 4/5 by Summer

The pulse survey is critical. Two questions, sent fortnightly, take less than a minute to complete: "How many times did you use AI this week?" and "What was the most useful thing you used it for?" This generates a longitudinal dataset that shows genuine adoption trends, not post-session enthusiasm.
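The fortnightly pulse data lends itself to very simple trend analysis. A minimal sketch in Python, assuming each response is stored as a (fortnight, uses-this-week) pair; the sample data and function name here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical pulse-survey rows: (fortnight_number, AI uses reported that week)
responses = [
    (1, 0), (1, 2), (1, 1),
    (2, 3), (2, 1), (2, 0),
    (3, 4), (3, 2), (3, 5),
]

def adoption_trend(rows, weekly_threshold=1):
    """Per fortnight: share of respondents using AI at least `weekly_threshold` times."""
    counts = defaultdict(lambda: [0, 0])  # fortnight -> [users, respondents]
    for fortnight, uses in rows:
        counts[fortnight][1] += 1
        if uses >= weekly_threshold:
            counts[fortnight][0] += 1
    return {f: users / total for f, (users, total) in sorted(counts.items())}

# Fortnight -> proportion of staff using AI weekly; compare against the 80% target
print(adoption_trend(responses))
```

Even a spreadsheet can do this, but the point stands: two questions, collected consistently, produce an adoption curve you can show to SLT and governors.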

Prompt Libraries and Shared Resources

A shared prompt library is the single most practical output of AI CPD. When teachers discover a prompt that works brilliantly for generating Year 9 retrieval practice questions, that knowledge should not live in one person's chat history. It should be accessible to every teacher in the department.

A shared document per department and task type works well. Each entry records the prompt text, the AI tool used, a sample of its output, and any edits that were needed. This beats external prompt libraries because the prompts match your curriculum and your learners (Holmes et al., 2023).

For example, a Year 5 teacher shares the prompt: "Write five reading questions on [text name] for mixed-ability learners, covering retrieval, inference and evaluation." Another teacher adapts it and suggests pasting in the first paragraph of the text to improve the output. This cycle of refinement is where the real value lies (Brown & Jones, 2024).
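The entry structure described above is just a small record per prompt. A hedged sketch of one way to model it; the class and field names are illustrative, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class PromptEntry:
    """One row in a department prompt bank (illustrative fields only)."""
    prompt: str           # the exact prompt text as shared
    tool: str             # which AI tool it was run in
    sample_output: str    # a snippet of what came back
    edits_needed: str     # what the teacher changed before classroom use
    tags: list[str] = field(default_factory=list)

entry = PromptEntry(
    prompt="Write five reading questions on [text name] for mixed-ability learners.",
    tool="ChatGPT",
    sample_output="1. What does the main character find at the start? ...",
    edits_needed="Paste in the first paragraph of the text for sharper questions.",
    tags=["Year 5", "reading", "comprehension"],
)
```

In practice this is usually a table in a shared document rather than code; the value is agreeing the columns so every entry is comparable.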

Budget and Resource Planning

AI CPD does not require a large budget, but it does require intentional resource allocation. Many schools already have access to AI tools through existing subscriptions or free tiers. The real cost is time, not money.

Resource | Estimated Cost | Notes
AI champion time (1hr/fortnight x 3 champions) | Covered by existing PPA/directed time | Most impactful investment
Premium AI tool access for champions | £60-120/year (3 subscriptions) | Compare tools before committing
Whole-school AI tool (e.g. TeacherMatic) | £500-2,000/year depending on school size | Consider after Exploration stage, not before
Dedicated CPD time (6 sessions/year) | Reallocated from existing CPD budget | Replace one existing CPD strand, do not add

Schools often buy AI platforms before staff are ready to use them. Start with free tools such as ChatGPT or Google Gemini, give premium access to keen staff only, and consider a whole-school platform once roughly 60% of staff are using AI regularly.

Common Mistakes in AI CPD

Having observed dozens of schools attempt AI training, patterns emerge in what goes wrong. Avoiding these mistakes saves considerable time and staff goodwill.

Starting with the wrong audience. Do not begin with a mandatory whole-staff session. Start with volunteers. Five enthusiastic teachers who become genuine advocates are worth more than 60 teachers who attended because they had to. Scale outward from success, not downward from mandate.

Teaching AI theory instead of AI practice. Teachers do not need to understand how large language models work any more than they need to understand TCP/IP to use the internet. Spend zero time on "what is AI" and all of the time on "here is how AI saves you 30 minutes this week."

Ignoring safeguarding implications. Any school using AI with learner data must have clear protocols. What happens if a teacher accidentally enters a learner's name into ChatGPT? What if a learner uses an AI tool to generate harmful content? These scenarios need addressing in policy before they arise in practice.

No follow-up between sessions. The gap between CPD sessions is where habits form or fail. Build in lightweight follow-up: a weekly tip via email, a "prompt of the week" in the staff bulletin, a standing 10-minute slot in department meetings for AI sharing. Spaced practice applies to adult learning too.

Treating all staff the same. NQTs and experienced teachers need different AI CPD: their cognitive load when learning new technology differs (Kirschner, 1988), as does their prior practice (Ericsson et al., 1993; Berliner, 2004). Differentiate the training for each group.

Running a First INSET Session

If you are planning your school's first AI CPD session, here is a practical structure that works for a 90-minute INSET slot. This has been refined through use in primary and secondary settings and balances demonstration with hands-on practice.

Minutes 0-5: Set the frame. "Today we are going to save you time. By the end of this session, you will have created a resource for next week's teaching using AI. This is not about replacing what you do. It is about giving you back time for what matters."

Minutes 5-20: Live demonstration. The facilitator models a complete workflow on screen. Use a real upcoming lesson. Show the prompt, the output, the evaluation, and the final edit. Narrate your thinking: "I am checking this because AI sometimes gets the curriculum reference wrong."

Minutes 20-70: Guided practice. Teachers work on their own devices. Provide a prompt template with blanks to fill: "Create a [resource type] for [year group] on [topic] that includes [specific requirement]. Use British English. Pitch at [level]." Facilitators circulate. Pair confident users with nervous ones. Celebrate the first person who says "That is actually good."
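The fill-in-the-blanks template above can be handed out as a slip of paper, or wrapped in a tiny helper so a department produces consistent prompts. A hypothetical sketch; the function name and example values are illustrative only:

```python
def build_prompt(resource_type, year_group, topic, requirement, level):
    """Fill the session's prompt template with a teacher's own lesson details."""
    return (
        f"Create a {resource_type} for {year_group} on {topic} "
        f"that includes {requirement}. Use British English. "
        f"Pitch at {level}."
    )

# A teacher fills in the blanks for their actual lesson next week
print(build_prompt("differentiated worksheet", "Year 8", "fractions",
                   "three worked examples", "mixed ability"))
```

The structure matters more than the tooling: every blank forces the teacher to supply context, which is what lifts output quality above a one-line prompt.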

Minutes 70-80: Gallery walk. Teachers move around the room looking at what colleagues have created. This builds collective confidence and surfaces creative uses that the facilitator did not anticipate.

Minutes 80-90: Commitments. Each teacher writes one specific thing they will use AI for before the next session. Not "try AI more" but "use AI to create differentiated starters for my Year 8 class on Tuesday." Specific commitments are more likely to be followed through.

Sustaining Momentum

The hardest part of AI CPD is not the initial training. It is maintaining adoption three months later when the novelty has worn off and workload pressures return. Schools that sustain AI adoption share several characteristics.

They integrate AI into existing workflows rather than adding it as a separate task. AI is part of how the school plans, marks, and communicates, not an optional extra. This means updating templates, pro formas, and procedures to include AI where appropriate.

They celebrate small wins publicly. When a teacher shares that AI helped them produce better formative assessment questions in half the time, that story gets told in the staff meeting. Visible success stories normalise AI use and reduce the stigma some teachers feel about "using a machine to do my job."

They review AI policies termly. The technology changes quickly and staff confidence grows, so a September policy is out of date by January. Regular reviews keep policies useful (Holmes, 2023; Smith, 2024).

They link AI use to the school improvement plan. If the SIP calls for better differentiation or stronger assessment, AI training should serve those priorities. This gives the training a clear purpose rather than treating it as technology for its own sake (Holmes et al., 2023).

Getting Started This Week

You do not need a year-long plan to start. You need three things: one willing volunteer, one hour, and one specific task. Find a teacher who is curious. Sit with them for an hour. Help them use AI to create something they actually need for next week. When they see the result, they will tell someone. That conversation is where your AI CPD programme begins.

For a structured approach to selecting the right tools, see our independent comparison of AI tools. For guidance on establishing clear boundaries around AI use, read our article on AI ethics in education. And for the broader picture of how AI fits into teaching practice, explore our thorough guide to AI for teachers.

Written by the Structural Learning Research Team

Reviewed by Paul Main, Founder & Educational Consultant at Structural Learning

The 3-Term AI Implementation Roadmap for Schools

Rolling out AI across a school requires phased implementation, not a "Big Bang" launch where you hand all staff a ChatGPT account on a Monday morning. This roadmap ensures sustainability, addresses staff anxiety, and builds evidence of impact that governors will want to hear about.

Term 1: Foundation

Select five volunteer teachers across different year groups and subjects. These are your early adopters: genuinely curious and willing to experiment. Focus on ONE use case (lesson planning OR marking feedback, not both) to avoid overload. Provide two hours of CPD covering what AI is, how to use ChatGPT safely, and how to evaluate AI output against learning science criteria. Teachers document what works and what fails.

Example: a primary school identifies five volunteers, a Year 3 teacher, a Year 5 teacher, and three subject leaders (maths, English, PE). They choose lesson planning as the use case. In October, they learn to use ChatGPT to generate starter activities and trial it with their own classes. By half term they have identified: "AI is great for generating retrieval starters, but the worked examples often miss key steps we need to add."

Term 2: Expansion

Share successes at a whole-staff meeting, using examples from the pilot group rather than theoretical benefits: "Mrs. Chen used AI to generate 10 different retrieval starters in 5 minutes instead of 30; here is what they look like." Expand to all departments, but maintain the single use case so staff are not overwhelmed. Introduce the school's AI policy (see: Creating an AI Policy for Schools). Add a second use case for the original volunteers; they are ready.

Example: an academy shares Term 1 results at a January staff meeting: "Our five pilots generated AI-enhanced lessons. Here is the impact: time saved per teacher, and lesson quality scores before and after." Staff buy-in increases. All 45 teachers begin using AI for lesson planning, while the five pilots trial a second use case, automated formative feedback, in which AI helps speed up the marking of low-stakes quizzes.

Term 3: Embedding

AI becomes business-as-usual, incorporated into the planning cycle and assessment workflows. New staff are inducted with a 30-minute "AI basics for our school" session. Collect impact data: time saved, lesson quality, staff confidence, student engagement. Share it with governors. Celebrate the wins and identify what did not work (some tools may be discontinued).

Example: the academy completes Term 3 with all staff using AI and training built into induction. They measure average time saved per teacher per week (4.5 hours), lesson quality scores (up 18%), and staff confidence (up from 35% comfortable to 78%). They present this to governors, and SLT approves a budget for a site licence to a premium AI tool for next year.

Critical Success Factors

Make participation voluntary at first. Pink (2009) argues that autonomy improves performance, and Deci and Ryan's (1985) self-determination theory agrees: people engage more effectively when given genuine choices. Kohn (1993) warns that mandates can impede progress.

One use case at a time. Staff cognitive load is real. Asking them to learn "AI for planning, feedback, assessment, and admin" simultaneously causes burnout.

Share real examples, not theory. "AI can help with planning" means nothing. "Mrs. Chen generated 10 retrieval starters in 5 minutes instead of 30" means everything.

Governors and SLT need data, not feelings, to see the benefit. Show them time saved and better lessons. Include learner outcomes if you have them. This justifies future funding, as suggested by Hargreaves (2003) and Robinson (2009).

Expect 20% of staff to never adopt. That's normal. Focus on the 60% who are willing to try, not the 20% who resist.

The EEF's (2019) implementation guidance helps schools embed new practices, which aligns with the phased approach above, and Rogers (2003) shows why diffusion through peers is vital.

Link: Creating an AI Policy for Schools 2025

Frequently Asked Questions


What makes AI training effective for school staff?

AI training should focus on teaching practice, not just tech. Teachers need practical sessions to test tools for their lessons. Successful training builds confidence with ongoing, low-pressure exploration (Holmes et al., 2023; Jones, 2024).

How do school leaders implement AI professional development?

Find keen early adopters; Fullan (2007) says they will be your champions. Rogers (2003) shows that these champions model practical uses in your school. Before training, set AI policies so staff have clear, safe limits (Holmes et al., 2021).

What are the main benefits of AI training for teaching staff?

AI training reduces teacher workload by automating tasks. This gives teachers more time to engage with learners and give feedback. It also ensures consistent, safe technology use across departments (Holmes, 2024).

What does the research say about professional development for AI?

Isolated training days produce little change in practice (Penuel et al., 2021). Teachers need sustained practice and reflection to evaluate AI output well (Kerr & Nelson, 2023), and peer observation and shared resources help embed digital skills (Smith, 2024).

What are common mistakes when planning AI training for teachers?

INSET days often fail when they rely solely on external speakers with no hands-on practice. Schools also struggle when they introduce paid platforms before staff grasp the basics, and skipping the early Exploration stage means staff often abandon the tools (Fullan, 2001; Rogers, 2003).

How should schools structure AI professional development over a year?

Autumn term focuses on policy and the first practical sessions. Later terms address teaching workflows such as assessment and resource creation, which teachers then apply in class. This phased approach prevents overload and lets teachers consolidate skills (Smith, 2024; Jones, 2023).

Further Reading

Further Reading: Key Research on Professional Development

Hattie's (2008) Visible Learning synthesis shows which practices have the greatest impact on progress. Timperley et al. (2007) found that collaborative professional learning works well, and Fullan (2007) argues that professional development is what turns new technology into changed classroom practice.

Teacher Professional Learning and Development: Best Evidence Synthesis Iteration
402 citations

Timperley et al. (2007)

Timperley et al. (2007) found that sustained professional development improves results. Cordingley et al. (2015) state that participants gain from active involvement and practical application, and Darling-Hammond et al. (2017) show that integrating training into daily teaching boosts success.

Research suggests experience may improve teaching (Kraft & Papay, 2014). Some studies show experienced teachers continue to gain skills over time (Hanushek, 2009), while others find effectiveness plateaus after a few years (Rice, 2003; Wiseman & Hunt, 2013). Contextual factors such as school support affect a teacher's progress (Blömeke et al., 2015).

Kini and Podolsky (2016)

Ericsson et al. (1993) showed that deliberate practice builds expertise more effectively than single training days. Darling-Hammond et al. (2017) state that continuous development aids teacher improvement, and Hattie (2012) argues that skilled teachers, developed through practice and reflection, are what ultimately boost learners.

Teachers integrate technology better with good professional development. Hew and Brush (2007) found that modelling, practice and feedback are key; Mouza (2008) says programmes need to suit teachers' needs; and Koehler and Mishra's (2009) TPACK framework helps blend content, pedagogy and technology.

Ertmer and Ottenbreit-Leftwich (2010)

Ertmer and Ottenbreit-Leftwich (2010) show that teachers need more than access to technology for meaningful integration. Confidence, pedagogical beliefs, and ongoing support matter more than one-off training, and these factors ultimately help learners use technology better.

The TPACK Framework for Teacher Knowledge
15,000+ citations

Mishra and Koehler (2006)

Mishra and Koehler (2006) explain the TPACK framework for technology integration, which links technical skill, subject knowledge and pedagogy. It builds on Shulman's (1986) finding that effective links between content and pedagogy improve learner outcomes.

Artificial Intelligence in Education: Current State and Future Prospects
400+ citations

Holmes, Bialik, and Fadel (2019)

Holmes, Bialik, and Fadel (2019) argue that AI should support teachers, not replace them. Teachers' professional judgement is essential for effective AI use, and continued professional learning sustains it.

Loading audit...

Evidence Overview

Chalkface Translator: research evidence in plain teacher language

Academic
Chalkface

Evidence Rating: Load-Bearing Pillars

Emerging (d<0.2)
Promising (d 0.2-0.5)
Robust (d 0.5+)
Foundational (d 0.8+)

Key Takeaways

  1. Effective AI CPD must anchor new technologies in established pedagogical principles: Teachers are adult learners who benefit most when new tools are framed within their existing understanding of learning and teaching, rather than as standalone technological novelties (Knowles, 1980). This approach ensures AI is seen as an enhancement to practice, not a replacement for sound educational judgment.
  2. Sustained, hands-on engagement with AI tools is paramount for building teacher confidence and competence: Professional development is most impactful when it moves beyond passive reception of information to active experimentation and application of new skills (Guskey, 2000). Allocating significant time for guided practice allows teachers to explore AI's potential within their specific subject contexts, fostering genuine understanding and reducing anxiety.
  3. Cultivating internal AI champions is a critical first step for successful, school-wide technology integration: The diffusion of innovations theory highlights the vital role of early adopters in influencing their peers and demonstrating the practical benefits of new practices (Rogers, 2003). Empowering these enthusiastic educators to lead and support colleagues creates a more organic and sustainable model for AI adoption than external, one-off interventions.
  4. Establishing clear, school-wide AI usage policies *before* training is essential for fostering confident teacher experimentation: Ambiguity regarding acceptable use can significantly inhibit innovation and risk-taking among staff, hindering effective technology integration (Fullan, 2001). Providing explicit boundaries and expectations empowers teachers to explore AI tools confidently, knowing the parameters within which they can operate.

Most schools that attempt AI training make the same mistake: they book a one-off INSET day, invite an external speaker, and hope the technology sticks. Three weeks later, the same teachers are back to doing everything manually. The problem is not a lack of enthusiasm. The problem is that traditional CPD models were never designed for a technology that changes every few months.

Infographic showing a pyramid representing four stages of AI professional development confidence for teachers: Awareness, Exploration, Integration, and Mastery, building from base to apex. Each stage has a brief description.
AI CPD Confidence Stages

Effective AI professional development looks different from anything schools have done before. It requires ongoing, embedded practice rather than isolated events. It demands a shift from "here is a tool" to "here is how this tool supports what you already do well." And it works best when teachers learn from each other, not from external consultants who have never planned a lesson for Year 4.

Why Traditional CPD Fails for AI

Timperley et al. (2007) show single CPD events rarely change teaching. The usual CPD format involves experts talking and learners listening. Everyone goes back to class, but practice stays the same. AI updates quickly, so training struggles to keep up.

Interactive whiteboards cost billions, yet teachers often used them as projectors. Researchers suggest AI adoption may repeat this issue. Good CPD, focused on practice, is key (Smith, 2024). Learners benefit if teachers use AI more effectively.

Unlike whiteboards, AI demands ongoing teacher judgement. Using AI to generate resources means checking every output against what you know about your learners, a skill built through practice and reflection, not a single twilight session (Holmes et al., 2023).

A Framework for AI CPD

Effective AI CPD follows a progression that mirrors how any complex skill develops. Staff move through stages of confidence, and training should be designed around these stages rather than delivered as a single event.

| Stage | Teacher Behaviour | CPD Focus | Typical Duration |
| --- | --- | --- | --- |
| Awareness | Curious but unsure where to start | Demonstrate one classroom use case with live modelling | 1 session |
| Exploration | Trying AI for personal tasks (reports, emails) | Guided practice with prompts for planning and marking | 2-3 weeks |
| Integration | Using AI regularly for specific workflows | Peer observation, prompt sharing, quality evaluation | Half term |
| Mastery | Adapting AI use to specific learner needs | Action research, mentoring colleagues, refining school policy | 1-2 terms |

Most schools skip Exploration, jumping from Awareness straight to Integration. Confidence collapses and teachers revert to old methods. Low-stakes practice and regular check-ins (Fullan, 2016) support adoption rather than abandonment (Rogers, 2003).

Identifying AI Champions

Teachers already experimenting with AI are your most valuable CPD resource (Holmes et al., 2023). They know your school and curriculum (Namey & Guest, 2016), and they can turn AI ideas into real classroom uses (King & Brooks, 2018).

Identifying AI champions does not require a formal application process. Look for teachers who mention AI in staffroom conversation, who have adapted resources using ChatGPT, or who ask questions about school AI policy. These teachers do not need to be technology specialists. The best AI champions are strong practitioners who happen to be curious about new tools.

Give your AI champions three things: time (one hour per fortnight for experimentation), a small budget for premium tool access, and a platform to share what they discover. A standing slot in the weekly staff briefing or a shared channel on Teams works well. Their role is not to become AI trainers but to demonstrate what practical, time-saving AI use looks like from someone who teaches the same learners.

Building a Year-Long CPD Plan

A structured year-long plan combats the abandonment that typically follows one-off training. Each term focuses on different skills, building on previous learning (Willingham, 2009), and each term's focus feeds into the next stage (Dweck, 2006; Ericsson et al., 1993). This keeps staff engaged throughout the year (Hattie, 2008).

| Term | Focus | Activities | Success Measure |
| --- | --- | --- | --- |
| Autumn 1 | Foundations and policy | Draft AI policy, staff survey, first hands-on session | 100% staff have used an AI tool at least once |
| Autumn 2 | Planning and resources | AI for lesson planning, prompt libraries, department-specific workshops | Each department has a shared prompt bank |
| Spring 1 | Assessment and feedback | AI-assisted marking workflows, moderation exercises, learner feedback quality audit | Teachers report 30+ minutes saved per marking cycle |
| Spring 2 | Differentiation and SEND | AI for differentiated resources, SEND adaptations, co-planning sessions | Differentiated resources produced in half the time |
| Summer 1 | Learner-facing AI | Academic integrity lessons, teaching learners to evaluate AI output, classroom trials | Learners can articulate when AI use is appropriate |
| Summer 2 | Review and sustainability | Impact review, policy update, plan for next year, share successes | AI champions identified for next year; policy updated |

This model aligns with research on effective teacher training (Cordingley et al., 2015), and Sims & Fletcher-Wood (2021) support spaced practice for knowledge retention: teachers struggle to apply new skills without practice between sessions. Plan at least one hour of AI CPD per half-term, plus informal staff sharing in between.

Structuring Hands-On Sessions

The single biggest mistake in AI training is talking about AI instead of using it. Every CPD session should follow a 20/70/10 structure: 20% demonstration, 70% guided practice, 10% reflection and next steps.

A well-structured hands-on session looks like this. The facilitator opens their laptop and shares their screen. They model a complete workflow: taking a real lesson from next week, writing a prompt, evaluating the output, editing the result, and producing a finished resource. This takes about 12 minutes. Then teachers open their own devices with a specific task: "Use AI to create a differentiated worksheet for your next lesson on a topic you are actually teaching this week." The facilitator circulates, troubleshoots, and pairs teachers who are struggling with those who are confident.

The final 10% is where real learning happens. Teachers share one thing that worked and one thing that surprised them. This reflective practice, grounded in the experiential learning cycle, converts isolated practice into transferable understanding. It also surfaces the specific challenges that become the focus of the next session.

Department-Specific AI Training

Generic AI training frustrates experienced teachers because it misses subject-specific needs. Maths teachers use AI differently from English teachers, and training should reflect these differences.

| Department | Primary AI Use Cases | Recommended First Session |
| --- | --- | --- |
| English | Model answer generation, feedback on drafts, differentiated reading materials | Generate model paragraphs at three ability levels for a current text |
| Maths | Varied practice sets, worked examples, misconception-targeted questions | Create 10 questions targeting a specific misconception from last assessment |
| Science | Practical risk assessments, retrieval quizzes, method scaffolds | Generate a retrieval quiz on last half-term's content with mark scheme |
| Humanities | Source analysis scaffolds, essay planning frameworks, revision summaries | Create a source analysis scaffold using AI for a specific GCSE topic |
| Primary | Phonics resources, cross-curricular planning, parent communication | Use AI to plan a cross-curricular week linking maths and science topics |
| SEND | IEP drafting, communication supports, adapted resources | Generate a visual schedule or social story for a specific learner |

Department sessions are best led by AI champions or subject teachers rather than outside facilitators, using examples drawn from current schemes of work. This grounds the training in practice rather than theory (Holmes et al., 2023).

Overcoming Staff Resistance

Teachers worry about learner data privacy, job security, and academic honesty. CPD should address these concerns honestly (Zawacki-Richter et al., 2019) rather than simply pushing the technology.

The most common objections and productive responses:

"AI will replace teachers." Research does not support this fear. AI cannot build learner relationships or make the vital, instant judgements teachers make every lesson. Respond by positioning AI as a way to reduce admin, giving teachers more time for questioning, scaffolding, and care (Luckin & Holmes, 2016).

"I do not have time to learn another tool." This is the most honest objection. Address it by showing time savings in the first session. If a teacher can see that AI saves 20 minutes on their next set of reports, they will make time. Start with the most time-consuming administrative tasks, not the most pedagogically interesting uses.

"The output is not good enough." This is often true when teachers first try AI with generic prompts. The solution is teaching prompt craft, not abandoning the tool. Show the difference between a one-line prompt and a detailed, context-rich prompt. When teachers see the quality leap, scepticism shifts to curiosity.

"What about data protection?" A legitimate concern that must be addressed before training, not during it. Schools need a clear AI ethics policy that specifies which tools are approved, what data can be entered, and what safeguards are in place. Teachers will not experiment if they fear they might breach GDPR.

Measuring Impact

Consider Kirkpatrick's (1994) four levels: reaction, learning, behaviour, results. Donohoo's (2017) work on collective teacher efficacy also matters. For AI CPD, track how teachers actually use new skills in the classroom (Guskey, 2000); this gives better insight than asking whether they enjoyed the session.

| Metric | How to Collect | Target |
| --- | --- | --- |
| Weekly AI usage | Anonymous fortnightly pulse survey (2 questions) | 80% of staff using AI at least weekly by Spring |
| Time saved per week | Self-reported estimate in pulse survey | Average 45 minutes saved per teacher per week |
| Prompt library growth | Count of shared prompts in department bank | Each department contributes 10+ prompts by Summer |
| Resource quality | Peer review of AI-assisted vs. manual resources | AI-assisted resources rated equal or higher quality |
| Confidence score | Likert scale in pulse survey | Average confidence increases from 2/5 to 4/5 by Summer |

The pulse survey is critical. Two questions, sent fortnightly, take less than a minute to complete: "How many times did you use AI this week?" and "What was the most useful thing you used it for?" This generates a longitudinal dataset that shows genuine adoption trends, not post-session enthusiasm.
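For schools that export pulse-survey responses to a spreadsheet, the adoption trend is simple to compute. A minimal Python sketch, assuming an illustrative three-column export (survey date, anonymous staff ID, AI uses that week); the column shape and data are hypothetical, not a prescribed format:

```python
from collections import defaultdict

# Illustrative pulse-survey rows: (survey date, anonymous staff id, AI uses that week).
# In practice these would come from a CSV export of the fortnightly form.
responses = [
    ("2026-09-15", "t01", 0), ("2026-09-15", "t02", 3), ("2026-09-15", "t03", 1),
    ("2026-09-29", "t01", 1), ("2026-09-29", "t02", 4), ("2026-09-29", "t03", 2),
]

def adoption_rate(rows):
    """Per survey round, the share of respondents using AI at least weekly."""
    totals, adopters = defaultdict(int), defaultdict(int)
    for survey_date, _staff, uses in rows:
        totals[survey_date] += 1
        if uses >= 1:
            adopters[survey_date] += 1
    return {d: adopters[d] / totals[d] for d in sorted(totals)}

print(adoption_rate(responses))
# prints {'2026-09-15': 0.6666666666666666, '2026-09-29': 1.0}
```

Tracked over a term, this single percentage per fortnight is usually enough to show governors whether adoption is genuinely rising or plateauing.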

Prompt Libraries and Shared Resources

A shared prompt library is the single most practical output of AI CPD. When teachers discover a prompt that works brilliantly for generating Year 9 retrieval practice questions, that knowledge should not live in one person's chat history. It should be accessible to every teacher in the department.

Use a shared document per department, organised by task type. Each entry records the prompt text, the AI tool used, a sample of its output, and any edits that were needed. This beats external prompt libraries because the prompts match your curriculum and your learners (Holmes et al., 2023).

A Year 5 teacher shares: "Write five reading questions on [text name] for mixed-ability learners, covering retrieval, inference and evaluation." Another teacher adapts it, suggesting the prompt should include the text's opening paragraph for context. This collaborative refinement is where prompt quality improves (Brown & Jones, 2024).
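For schools that prefer a structured file over a free-form document, one lightweight way to keep such a library consistent is a simple record per prompt with fill-in-the-blank placeholders. The field names below are a suggested shape, not a standard, and the example entry is hypothetical:

```python
from string import Template

# Hypothetical prompt-library entry: field names are a suggested shape, not a standard.
entry = {
    "department": "English (Year 5)",
    "task_type": "reading comprehension",
    "tool": "ChatGPT",
    "prompt": Template(
        "Write five reading questions on $text_name for mixed-ability learners. "
        "Include retrieval, inference and evaluation questions. "
        "Here is the opening paragraph for context: $first_paragraph"
    ),
    "notes": "Adding the opening paragraph keeps questions anchored to the actual text.",
}

# Each teacher fills in the blanks for their own class before pasting into the tool.
filled = entry["prompt"].substitute(
    text_name="'The Iron Man'",
    first_paragraph="The Iron Man came to the top of the cliff...",
)
print(filled)
```

The same placeholder idea works just as well in a shared Google Doc with square brackets; the point is that the reusable part and the class-specific part are clearly separated.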

Budget and Resource Planning

AI CPD does not require a large budget, but it does require intentional resource allocation. Many schools already have access to AI tools through existing subscriptions or free tiers. The real cost is time, not money.

| Resource | Estimated Cost | Notes |
| --- | --- | --- |
| AI champion time (1hr/fortnight x 3 champions) | Covered by existing PPA/directed time | Most impactful investment |
| Premium AI tool access for champions | £60-120/year (3 subscriptions) | Compare tools before committing |
| Whole-school AI tool (e.g. TeacherMatic) | £500-2,000/year depending on school size | Consider after Exploration stage, not before |
| Dedicated CPD time (6 sessions/year) | Reallocated from existing CPD budget | Replace one existing CPD strand, do not add |

Schools often buy AI platforms before staff are ready. Start with free tools such as ChatGPT or Google Gemini, give premium access to keen staff only, and invest in a whole-school platform once around 60% of staff use AI regularly.

Common Mistakes in AI CPD

Having observed dozens of schools attempt AI training, patterns emerge in what goes wrong. Avoiding these mistakes saves considerable time and staff goodwill.

Starting with the wrong audience. Do not begin with a mandatory whole-staff session. Start with volunteers. Five enthusiastic teachers who become genuine advocates are worth more than 60 teachers who attended because they had to. Scale outward from success, not downward from mandate.

Teaching AI theory instead of AI practice. Teachers do not need to understand how large language models work any more than they need to understand TCP/IP to use the internet. Spend zero time on "what is AI" and all of the time on "here is how AI saves you 30 minutes this week."

Ignoring safeguarding implications. Any school using AI with learner data must have clear protocols. What happens if a teacher accidentally enters a learner's name into ChatGPT? What if a learner uses an AI tool to generate harmful content? These scenarios need addressing in policy before they arise in practice.

No follow-up between sessions. The gap between CPD sessions is where habits form or fail. Build in lightweight follow-up: a weekly tip via email, a "prompt of the week" in the staff bulletin, a standing 10-minute slot in department meetings for AI sharing. Spaced practice applies to adult learning too.

Treating all staff the same. NQTs and experienced teachers need different AI CPD: they carry different cognitive load when learning new technology (Kirschner, 1988) and bring different prior practice (Ericsson, 1993; Berliner, 2004). Differentiate AI training for each group.

Running a First INSET Session

If you are planning your school's first AI CPD session, here is a practical structure that works for a 90-minute INSET slot. This has been refined through use in primary and secondary settings and balances demonstration with hands-on practice.

Minutes 0-5: Set the frame. "Today we are going to save you time. By the end of this session, you will have created a resource for next week's teaching using AI. This is not about replacing what you do. It is about giving you back time for what matters."

Minutes 5-20: Live demonstration. The facilitator models a complete workflow on screen. Use a real upcoming lesson. Show the prompt, the output, the evaluation, and the final edit. Narrate your thinking: "I am checking this because AI sometimes gets the curriculum reference wrong."

Minutes 20-70: Guided practice. Teachers work on their own devices. Provide a prompt template with blanks to fill: "Create a [resource type] for [year group] on [topic] that includes [specific requirement]. Use British English. Pitch at [level]." Facilitators circulate. Pair confident users with nervous ones. Celebrate the first person who says "That is actually good."

Minutes 70-80: Gallery walk. Teachers move around the room looking at what colleagues have created. This builds collective confidence and surfaces creative uses that the facilitator did not anticipate.

Minutes 80-90: Commitments. Each teacher writes one specific thing they will use AI for before the next session. Not "try AI more" but "use AI to create differentiated starters for my Year 8 class on Tuesday." Specific commitments are more likely to be followed through.

Sustaining Momentum

The hardest part of AI CPD is not the initial training. It is maintaining adoption three months later when the novelty has worn off and workload pressures return. Schools that sustain AI adoption share several characteristics.

They integrate AI into existing workflows rather than adding it as a separate task. AI is part of how the school plans, marks, and communicates, not an optional extra. This means updating templates, pro formas, and procedures to include AI where appropriate.

They celebrate small wins publicly. When a teacher shares that AI helped them produce better formative assessment questions in half the time, that story gets told in the staff meeting. Visible success stories normalise AI use and reduce the stigma some teachers feel about "using a machine to do my job."

They review AI policies termly. The technology changes quickly and staff confidence grows; a policy written in September is out of date by January. Regular reviews keep policies useful (Holmes, 2023; Smith, 2024).

They link AI use to the school improvement plan. If the SIP prioritises better differentiation or stronger assessment, AI training should serve those goals. This gives the training strategic purpose rather than treating it as a technology initiative (Holmes et al., 2023).

Getting Started This Week

You do not need a year-long plan to start. You need three things: one willing volunteer, one hour, and one specific task. Find a teacher who is curious. Sit with them for an hour. Help them use AI to create something they actually need for next week. When they see the result, they will tell someone. That conversation is where your AI CPD programme begins.

For a structured approach to selecting the right tools, see our independent comparison of AI tools. For guidance on establishing clear boundaries around AI use, read our article on AI ethics in education. And for the broader picture of how AI fits into teaching practice, explore our thorough guide to AI for teachers.

Written by the Structural Learning Research Team

Reviewed by Paul Main, Founder & Educational Consultant at Structural Learning

The 3-Term AI Implementation Roadmap for Schools

Rolling out AI across a school requires phased implementation, not a "Big Bang" launch where you hand all staff a ChatGPT account on a Monday morning. This roadmap ensures sustainability, addresses staff anxiety, and builds evidence of impact that governors will want to hear about.

Term 1: Foundation

Select five volunteer teachers across different year groups and subjects. These are your early adopters: genuinely curious and willing to experiment. Focus on ONE use case (lesson planning OR marking feedback, not both) to avoid overload. Provide two hours of CPD covering what AI is, how to use ChatGPT safely, and how to evaluate AI output against learning science criteria. Teachers document what works and what fails.

Example: a primary school identifies five volunteers: a Year 3 teacher, a Year 5 teacher, and three subject leaders (maths, English, PE). They choose lesson planning as the use case. In October, they learn to use ChatGPT to generate starter activities and trial it with their own classes. By half term, they have identified: "AI is great for generating retrieval starters, but the worked examples often miss key steps we need to add."

Term 2: Expansion

Share successes at a whole-staff meeting, using examples from the pilot group rather than theoretical benefits: "Mrs. Chen used AI to generate 10 different retrieval starters in 5 minutes instead of 30. Here's what they look like." Expand to all departments, but maintain the single use case so staff aren't overwhelmed. Introduce the school's AI policy (see: Creating an AI Policy for Schools). Add a second use case for the original volunteers; they're ready.

Example: an academy shares Term 1 results at a January staff meeting: "Our 5 pilots generated AI-enhanced lessons. Here's the impact: time saved per teacher, lesson quality scores before and after." Staff buy-in increases. All 45 teachers begin using AI for lesson planning, while the 5 pilots trial a second use case, automated formative feedback, where AI speeds up the marking of low-stakes quizzes.

Term 3: Embedding

AI becomes business as usual, incorporated into the planning cycle and assessment workflows. New staff are inducted with a 30-minute "AI basics for our school" session. Collect impact data: time saved, quality of lessons, staff confidence, student engagement. Share it with governors. Celebrate the wins and identify what didn't work (some tools may be discontinued).

Example: the academy completes Term 3 with all staff using AI and training built into induction. They measure average time saved per teacher per week (4.5 hours), lesson quality scores (up 18%), and staff confidence (up from 35% comfortable to 78%), then present this to governors. SLT approves a budget for a site licence to a premium AI tool for next year.

Critical Success Factors

Autonomy matters. Pink (2009) argues autonomy improves performance, and Deci and Ryan's (1985) self-determination theory agrees: adults engage more effectively when they have choices. Kohn (1993) warns that mandates can impede progress, so start with volunteers, not directives.

One use case at a time. Staff cognitive load is real. Asking them to learn "AI for planning, feedback, assessment, and admin" simultaneously causes burnout.

Share real examples, not theory. "AI can help with planning" means nothing. "Mrs. Chen generated 10 retrieval starters in 5 minutes instead of 30" means everything.

Governors and SLT need data, not feelings, to see the benefit. Show them time saved and better lessons. Include learner outcomes if you have them. This justifies future funding, as suggested by Hargreaves (2003) and Robinson (2009).

Expect 20% of staff to never adopt. That's normal. Focus on the 60% who are willing to try, not the 20% who resist.

The EEF's (2019) implementation guidance helps schools embed new practices rather than simply announce them, and Rogers (2003) shows why diffusion through early adopters is vital.


Frequently Asked Questions


What makes AI training effective for school staff?

AI training should focus on teaching practice, not just tech. Teachers need practical sessions to test tools for their lessons. Successful training builds confidence with ongoing, low-pressure exploration (Holmes et al., 2023; Jones, 2024).

How do school leaders implement AI professional development?

Find keen early adopters; Fullan (2007) calls these your champions, and Rogers (2003) shows how champions model practical uses within a school. Set AI policies before training so staff have clear, safe limits (Holmes et al., 2021).

What are the main benefits of AI training for teaching staff?

AI training reduces teacher workload by automating tasks. This gives teachers more time to engage with learners and give feedback. It also ensures consistent, safe technology use across departments (Holmes, 2024).

What does the research say about professional development for AI?

Isolated training days show little change in practice (Penuel et al., 2021). Teachers need practice and reflection to evaluate AI output well (Kerr & Nelson, 2023). Peer observation and shared resources help embed digital skills (Smith, 2024).

What are common mistakes when planning AI training for teachers?

INSET days often fail when they rely on external speakers without hands-on practice. Another common error is buying platforms before staff grasp the basics. Skipping the early exploration stage means staff often abandon tools (Fullan, 2001; Rogers, 2003).

How should schools structure AI professional development over a year?

The autumn term focuses on policy and first practical sessions. Later terms address teaching workflows such as assessment and resource creation, which teachers then apply in class. This phased approach prevents overload and lets teachers consolidate skills (Smith, 2024; Jones, 2023).

Further Reading

Further Reading: Key Research on Professional Development

Hattie's (2008) Visible Learning synthesis shows which practices affect learner progress, Timperley et al. (2007) found that collaborative professional learning works well, and Fullan (2007) argues that professional development is what changes how teachers use technology.

Teacher Professional Learning and Development: Best Evidence Synthesis Iteration
402 citations

Timperley et al. (2007)

Timperley et al. (2007) found that sustained professional development improves results. Teachers gain from active involvement and practice they can use, state Cordingley et al. (2015), and Darling-Hammond et al. (2017) show that integrating training into daily teaching boosts success.

Research suggests experience may improve teaching (Kraft & Papay, 2014). Some studies show experienced teachers gain skills over time (Hanushek, 2009); others find effectiveness plateaus after a few years (Rice, 2003; Wiseman & Hunt, 2013). Factors such as school support affect a teacher's progress (Blömeke et al., 2015).

Kini and Podolsky (2016)

Ericsson et al. (1993) showed that sustained, deliberate practice builds expertise better than single training days. Darling-Hammond et al. (2017) state that continuous development aids teacher improvement, and Hattie (2012) argues that practice and reflection by skilled teachers ultimately benefit learners.

Teachers integrate technology better with good professional development. Hew and Brush (2007) found modelling, practice and feedback are key; Mouza (2008) says programmes need to suit teachers' needs; and Koehler and Mishra's (2009) TPACK framework helps blend content, pedagogy and technology.

Ertmer and Ottenbreit-Leftwich (2010)

Ertmer and Ottenbreit-Leftwich (2010) show that teachers need more than access to technology for meaningful integration. Confidence, pedagogical beliefs, and ongoing support matter more than one-off training. These factors ultimately help learners use technology better.

The TPACK Framework for Teacher Knowledge
15,000+ citations

Mishra and Koehler (2006)

Mishra and Koehler (2006) explain TPACK, a framework for technology integration that links technical skill, subject knowledge and pedagogy. Building on Shulman's (1986) pedagogical content knowledge, it argues that effective links between these domains improve learner outcomes.

Artificial Intelligence in Education: Current State and Future Prospects
400+ citations

Holmes, Bialik, and Fadel (2019)

Holmes, Bialik, and Fadel (2019) argue that AI should support teachers, not replace them. Teachers' professional judgement is essential for effective AI use, and continued professional learning sustains that judgement.

