A practical guide to AI professional development for schools, covering year-long CPD planning, hands-on session structures, department-specific training, overcoming staff resistance, and measuring impact.
- Start with pedagogy, not technology: The most effective AI training begins with what teachers already know about learning, then shows how AI supports those principles.
- Practice trumps presentation: Staff need hands-on time with AI tools during CPD, not slide decks about AI theory. Budget 70% of each session for guided practice.
- Build internal expertise first: Identify two or three confident early adopters and develop them as AI champions before attempting whole-school rollout.
- Create a shared AI policy before training: Teachers need clear boundaries on what is and is not acceptable AI use before they will experiment confidently.
Most schools that attempt AI training make the same mistake: they book a one-off INSET day, invite an external speaker, and hope the technology sticks. Three weeks later, the same teachers are back to doing everything manually. The problem is not a lack of enthusiasm. The problem is that traditional CPD models were never designed for a technology that changes every few months.
Effective AI professional development looks different from anything schools have done before. It requires ongoing, embedded practice rather than isolated events. It demands a shift from "here is a tool" to "here is how this tool supports what you already do well." And it works best when teachers learn from each other, not from external consultants who have never planned a lesson for Year 4.
Why Traditional CPD Fails for AI
Traditional CPD follows a predictable pattern: an expert presents, teachers listen, everyone returns to their classrooms, and nothing changes. Research on professional development effectiveness (Timperley et al., 2007) consistently shows that one-off sessions produce negligible shifts in practice. AI makes this worse because the tools update faster than any training materials.
Consider what happened when schools introduced interactive whiteboards: billions were spent on hardware and training, yet most teachers used the boards as expensive projectors. The same pattern threatens AI adoption. Without sustained, practice-based CPD, teachers default to surface-level use: asking ChatGPT to write reports they then rewrite anyway.
The distinction matters because AI, unlike a whiteboard, requires ongoing professional judgement. A teacher using AI for differentiated resource creation needs to evaluate the output against their knowledge of each pupil. That skill cannot be taught in a 45-minute twilight session. It develops through repeated practice with structured reflection.
A Framework for AI CPD
Effective AI CPD follows a progression that mirrors how any complex skill develops. Staff move through stages of confidence, and training should be designed around these stages rather than delivered as a single event.
| Stage | Teacher Behaviour | CPD Focus | Typical Duration |
| --- | --- | --- | --- |
| Awareness | Curious but unsure where to start | Demonstrate one classroom use case with live modelling | 1 session |
| Exploration | Trying AI for personal tasks (reports, emails) | Guided practice with prompts for planning and marking | |
| Integration | | Action research, mentoring colleagues, refining school policy | 1-2 terms |
The key insight is that most schools attempt to jump from Awareness to Integration in a single INSET day. That gap is where confidence collapses. Teachers who feel overwhelmed retreat to familiar methods. Building in the Exploration stage, with low-stakes practice and regular check-ins, makes the difference between adoption and abandonment.
Identifying AI Champions
Every school has two or three teachers who are already experimenting with AI, even if nobody has asked them to. These early adopters are your most valuable CPD resource. They understand the school context, know the curriculum, and can translate abstract AI capabilities into specific classroom applications.
Identifying AI champions does not require a formal application process. Look for teachers who mention AI in staffroom conversation, who have adapted resources using ChatGPT, or who ask questions about school AI policy. These teachers do not need to be technology specialists. The best AI champions are strong practitioners who happen to be curious about new tools.
Give your AI champions three things: time (one hour per fortnight for experimentation), a small budget for premium tool access, and a platform to share what they discover. A standing slot in the weekly staff briefing or a shared channel on Teams works well. Their role is not to become AI trainers but to demonstrate what practical, time-saving AI use looks like from someone who teaches the same pupils.
Building a Year-Long CPD Plan
A structured year-long plan prevents the common pattern of initial excitement followed by gradual abandonment. Each term has a different focus, building on the previous one.
| Term | Focus | Activities | Success Measure |
| --- | --- | --- | --- |
| Autumn 1 | Foundations and policy | Draft AI policy, staff survey, first hands-on session | 100% of staff have used an AI tool at least once |
| Autumn 2 | Planning and resources | AI for lesson planning, prompt libraries, department-specific workshops | Differentiated resources produced in half the time |
| Summer 1 | Pupil-facing AI | Academic integrity lessons, teaching pupils to evaluate AI output, classroom trials | Pupils can articulate when AI use is appropriate |
| Summer 2 | Review and sustainability | Impact review, policy update, plan for next year, share successes | AI champions identified for next year; policy updated |
This plan assumes approximately one hour of dedicated AI CPD per half-term, plus informal sharing through staff briefings and department meetings. Schools that try to compress this into two INSET days typically find that teachers cannot retain or apply the skills without the spaced practice between sessions.
Structuring Hands-On Sessions
The single biggest mistake in AI training is talking about AI instead of using it. Every CPD session should follow a 20/70/10 structure: 20% demonstration, 70% guided practice, 10% reflection and next steps.
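For planning purposes, the 20/70/10 split converts directly into minutes. A minimal sketch (the function name and the use of simple rounding are illustrative assumptions, not part of any published CPD model):

```python
def session_plan(total_minutes: int) -> dict:
    """Split a CPD session by the 20/70/10 rule:
    demonstration / guided practice / reflection and next steps."""
    return {
        "demonstration": round(total_minutes * 0.20),
        "guided_practice": round(total_minutes * 0.70),
        "reflection": round(total_minutes * 0.10),
    }

# A 60-minute twilight gives 12 minutes of demonstration,
# 42 of guided practice and 6 of reflection.
print(session_plan(60))
```

The same split applied to a 90-minute INSET slot gives 18, 63, and 9 minutes respectively.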
A well-structured hands-on session looks like this. The facilitator opens their laptop and shares their screen. They model a complete workflow: taking a real lesson from next week, writing a prompt, evaluating the output, editing the result, and producing a finished resource. This takes about 12 minutes. Then teachers open their own devices with a specific task: "Use AI to create a differentiated worksheet for your next lesson on a topic you are actually teaching this week." The facilitator circulates, troubleshoots, and pairs teachers who are struggling with those who are confident.
The final 10% is where real learning happens. Teachers share one thing that worked and one thing that surprised them. This reflective practice, grounded in the experiential learning cycle, converts isolated practice into transferable understanding. It also surfaces the specific challenges that become the focus of the next session.
Department-Specific AI Training
Generic AI training frustrates experienced teachers because it fails to address the specific demands of their subject. A maths teacher's AI workflow differs fundamentally from an English teacher's. Training must reflect these differences.
| Department | Primary AI Use Cases | Recommended First Session |
| --- | --- | --- |
| English | Model answer generation, feedback on drafts, differentiated reading materials | Generate model paragraphs at three ability levels for a current text |
| Maths | Varied practice sets, worked examples, misconception-targeted questions | Create 10 questions targeting a specific misconception from the last assessment |
| SEND | | Generate a visual schedule or social story for a specific pupil |
Department-level sessions work best when led by AI champions within that department, or at least by someone who teaches the same subject. The facilitator can use real examples from upcoming schemes of work, which makes the training immediately applicable rather than theoretical.
Overcoming Staff Resistance
Resistance to AI is rational. Teachers are right to worry about data privacy, job displacement, and academic integrity. Effective CPD acknowledges these concerns directly rather than dismissing them with enthusiasm about the technology.
The most common objections and productive responses:
"AI will replace teachers." No credible evidence supports this. AI cannot build relationships, read a room, or make the hundreds of micro-decisions a teacher makes every lesson. Frame AI as reducing administrative burden so teachers can spend more time on the parts of teaching that require a human: questioning, responsive scaffolding, and pastoral care.
"I do not have time to learn another tool." This is the most honest objection. Address it by showing time savings in the first session. If a teacher can see that AI saves 20 minutes on their next set of reports, they will make time. Start with the most time-consuming administrative tasks, not the most pedagogically interesting uses.
"The output is not good enough." This is often true when teachers first try AI with generic prompts. The solution is teaching prompt craft, not abandoning the tool. Show the difference between a one-line prompt and a detailed, context-rich prompt. When teachers see the quality leap, scepticism shifts to curiosity.
"What about data protection?" A legitimate concern that must be addressed before training, not during it. Schools need a clear AI ethics policy that specifies which tools are approved, what data can be entered, and what safeguards are in place. Teachers will not experiment if they fear they might breach GDPR.
Measuring Impact
Schools often struggle to measure CPD impact because they rely on satisfaction surveys ("Was the session useful?") rather than behaviour change. For AI CPD, measure what teachers actually do differently.
| Metric | How to Collect | Target |
| --- | --- | --- |
| Weekly AI usage | Anonymous fortnightly pulse survey (2 questions) | 80% of staff using AI at least weekly by Spring |
| Time saved per week | Self-reported estimate in pulse survey | Average 45 minutes saved per teacher per week |
| Prompt library growth | Count of shared prompts in department bank | Each department contributes 10+ prompts by Summer |
| Resource quality | Peer review of AI-assisted vs. manual resources | AI-assisted resources rated equal or higher quality |
| Confidence score | Likert scale in pulse survey | Average confidence increases from 2/5 to 4/5 by Summer |
The pulse survey is critical. Two questions, sent fortnightly, take less than a minute to complete: "How many times did you use AI this week?" and "What was the most useful thing you used it for?" This generates a longitudinal dataset that shows genuine adoption trends, not post-session enthusiasm.
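The fortnightly returns convert into a simple adoption trend. A minimal sketch, with invented example numbers purely for illustration (the function name and the 40-teacher school are assumptions):

```python
def adoption_trend(weekly_users: list[int], staff_total: int) -> list[float]:
    """Percentage of staff reporting at least weekly AI use,
    one figure per fortnightly survey round."""
    return [round(100 * users / staff_total, 1) for users in weekly_users]

# Hypothetical returns from four survey rounds in a 40-teacher school.
trend = adoption_trend([12, 18, 26, 33], staff_total=40)
print(trend)  # [30.0, 45.0, 65.0, 82.5]
```

Tabling these figures each term shows at a glance whether the school is on track for the 80% weekly-use target.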
Prompt Libraries and Shared Resources
A shared prompt library is the single most practical output of AI CPD. When teachers discover a prompt that works brilliantly for generating Year 9 retrieval practice questions, that knowledge should not live in one person's chat history. It should be accessible to every teacher in the department.
The simplest approach is a shared document or folder organised by department and task type. Each prompt includes the original text, the AI tool used, an example of what it produced, and any modifications needed. This is more useful than any external prompt library because it contains prompts tested against your curriculum, your pupils, and your expectations.
For example, a Year 5 teacher shares this prompt: "Create five reading comprehension questions about [text name] for mixed-ability Year 5. Include two retrieval questions, two inference questions, and one evaluation question. Use British English. Pitch at National Curriculum expected standard." Another teacher in the same year group adapts it for a different text and adds a note: "Works better if you paste the first paragraph as context." This iterative refinement is where the real value lies.
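A prompt like the one above can be stored as a fill-in-the-blanks template so colleagues change only the bracketed parts. A minimal sketch, assuming a shared script or notebook (the function name and default values are illustrative, not part of any particular tool):

```python
TEMPLATE = (
    "Create five reading comprehension questions about {text_name} for "
    "mixed-ability {year_group}. Include two retrieval questions, two "
    "inference questions, and one evaluation question. Use British English. "
    "Pitch at {pitch}."
)

def build_prompt(text_name: str,
                 year_group: str = "Year 5",
                 pitch: str = "National Curriculum expected standard") -> str:
    """Fill the shared template so every teacher reuses the tested wording."""
    return TEMPLATE.format(text_name=text_name,
                           year_group=year_group,
                           pitch=pitch)

# A colleague reuses the tested wording for a different text.
print(build_prompt("The Iron Man"))
```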
Budget and Resource Planning
AI CPD does not require a large budget, but it does require intentional resource allocation. Many schools already have access to AI tools through existing subscriptions or free tiers. The real cost is time, not money.
The most common budgeting error is purchasing a whole-school AI platform before staff are ready to use it. A better approach: start with free tools (ChatGPT free tier, Google Gemini), invest in premium access for champions only, and only purchase a whole-school platform once at least 60% of staff are using AI regularly.
Common Mistakes in AI CPD
Across the dozens of schools we have observed attempting AI training, the same patterns emerge in what goes wrong. Avoiding these mistakes saves considerable time and staff goodwill.
Starting with the wrong audience. Do not begin with a mandatory whole-staff session. Start with volunteers. Five enthusiastic teachers who become genuine advocates are worth more than 60 teachers who attended because they had to. Scale outward from success, not downward from mandate.
Teaching AI theory instead of AI practice. Teachers do not need to understand how large language models work any more than they need to understand TCP/IP to use the internet. Spend zero time on "what is AI" and all of the time on "here is how AI saves you 30 minutes this week."
Ignoring safeguarding implications. Any school using AI with pupil data must have clear protocols. What happens if a teacher accidentally enters a pupil's name into ChatGPT? What if a pupil uses an AI tool to generate harmful content? These scenarios need addressing in policy before they arise in practice.
No follow-up between sessions. The gap between CPD sessions is where habits form or fail. Build in lightweight follow-up: a weekly tip via email, a "prompt of the week" in the staff bulletin, a standing 10-minute slot in department meetings for AI sharing. Spaced practice applies to adult learning too.
Treating all teachers the same. An NQT and a teacher with 25 years of experience have different needs, different levels of cognitive load when learning new technology, and different amounts of existing practice to protect. Differentiate your AI CPD just as you would differentiate a lesson.
Running a First INSET Session
If you are planning your school's first AI CPD session, here is a practical structure that works for a 90-minute INSET slot. This has been refined through use in primary and secondary settings and balances demonstration with hands-on practice.
Minutes 0-5: Set the frame. "Today we are going to save you time. By the end of this session, you will have created a resource for next week's teaching using AI. This is not about replacing what you do. It is about giving you back time for what matters."
Minutes 5-20: Live demonstration. The facilitator models a complete workflow on screen. Use a real upcoming lesson. Show the prompt, the output, the evaluation, and the final edit. Narrate your thinking: "I am checking this because AI sometimes gets the curriculum reference wrong."
Minutes 20-70: Guided practice. Teachers work on their own devices. Provide a prompt template with blanks to fill: "Create a [resource type] for [year group] on [topic] that includes [specific requirement]. Use British English. Pitch at [level]." Facilitators circulate. Pair confident users with nervous ones. Celebrate the first person who says "That is actually good."
Minutes 70-80: Gallery walk. Teachers move around the room looking at what colleagues have created. This builds collective confidence and surfaces creative uses that the facilitator did not anticipate.
Minutes 80-90: Commitments. Each teacher writes one specific thing they will use AI for before the next session. Not "try AI more" but "use AI to create differentiated starters for my Year 8 class on Tuesday." Specific commitments are more likely to be followed through.
Sustaining Momentum
The hardest part of AI CPD is not the initial training. It is maintaining adoption three months later when the novelty has worn off and workload pressures return. Schools that sustain AI adoption share several characteristics.
They integrate AI into existing workflows rather than adding it as a separate task. AI is part of how the school plans, marks, and communicates, not an optional extra. This means updating templates, pro formas, and procedures to include AI where appropriate.
They celebrate small wins publicly. When a teacher shares that AI helped them produce better formative assessment questions in half the time, that story gets told in the staff meeting. Visible success stories normalise AI use and reduce the stigma some teachers feel about "using a machine to do my job."
They update their AI policy termly. The technology changes, new tools emerge, and staff confidence grows. A policy written in September is outdated by January. Building in regular review keeps the policy practical rather than restrictive.
They connect AI use to the school improvement plan. If the SIP priorities include improving differentiation or strengthening assessment, AI CPD should directly support those priorities. This gives AI training strategic legitimacy rather than treating it as a technology initiative.
Getting Started This Week
You do not need a year-long plan to start. You need three things: one willing volunteer, one hour, and one specific task. Find a teacher who is curious. Sit with them for an hour. Help them use AI to create something they actually need for next week. When they see the result, they will tell someone. That conversation is where your AI CPD programme begins.
For a structured approach to selecting the right tools, see our independent comparison of AI tools. For guidance on establishing clear boundaries around AI use, read our article on AI ethics in education. And for the broader picture of how AI fits into teaching practice, explore our comprehensive guide to AI for teachers.
Further Reading: Key Research on Professional Development
These papers address what makes teacher professional development effective, with particular relevance to technology adoption and sustained practice change.
Timperley et al. (2007), "Teacher Professional Learning and Development: Best Evidence Synthesis Iteration". The definitive synthesis on what makes professional development effective. Extended timeframes, active teacher participation, and integration with daily practice consistently produced the strongest outcomes across all contexts studied.
Kini and Podolsky (2016), "Does Teaching Experience Increase Teacher Effectiveness? A Review of the Research". Demonstrates that teacher expertise develops through deliberate practice over time, not through isolated training events. The findings support sustained CPD models where teachers refine their practice through repeated cycles of experimentation and reflection.
Ertmer and Ottenbreit-Leftwich (2010), "Preparing Teachers for a Digital Age: Professional Development Programmes for Technology Integration". Identifies the gap between teachers having access to technology and actually integrating it meaningfully. The authors argue that confidence, pedagogical beliefs, and ongoing support matter more than technical training for sustained technology adoption.
Mishra and Koehler (2006), "The TPACK Framework for Teacher Knowledge". The most cited framework for understanding how teachers integrate technology. Technological Pedagogical Content Knowledge (TPACK) demonstrates that effective technology use requires the intersection of technical skill, subject knowledge, and pedagogical understanding.
Holmes, Bialik, and Fadel (2019), "Artificial Intelligence in Education: Current State and Future Prospects". A comprehensive examination of AI's role in education that emphasises teacher agency and professional judgement. The authors argue that AI should augment rather than replace teacher expertise, and that effective implementation requires sustained professional learning.
Start with pedagogy, not technology: The most effective AI training begins with what teachers already know about learning, then shows how AI supports those principles.
Practice trumps presentation: Staff need hands-on time with AI tools during CPD, not slide decks about AI theory. Budget 70% of each session for guided practice.
Build internal expertise first: Identify two or three confident early adopters and develop them as AI champions before attempting whole-school rollout.
Create a shared AI policy before training: Teachers need clear boundaries on what is and is not acceptable AI use before they will experiment confidently.
Most schools that attempt AI training make the same mistake: they book a one-off INSET day, invite an external speaker, and hope the technology sticks. Three weeks later, the same teachers are back to doing everything manually. The problem is not a lack of enthusiasm. The problem is that traditional CPD models were never designed for a technology that changes every few months.
Effective AI professional development looks different from anything schools have done before. It requires ongoing, embedded practice rather than isolated events. It demands a shift from "here is a tool" to "here is how this tool supports what you already do well." And it works best when teachers learn from each other, not from external consultants who have never planned a lesson for Year 4.
Why Traditional CPD Fails for AI
Traditional CPD follows a predictable pattern: an expert presents, teachers listen, everyone returns to their classrooms, and nothing changes. Research on professional development effectiveness (Timperley et al., 2007) consistently shows that one-off sessions produce negligible shifts in practice. AI makes this worse because the tools update faster than any training materials.
Consider what happened when schools introduced interactive whiteboards. Billions spent on hardware and training, yet most teachers used them as expensive projectors. The same pattern threatens AI adoption. Without sustained, practice-based CPD, teachers default to surface-level use: asking ChatGPT to write reports they then rewrite anyway.
The distinction matters because AI, unlike a whiteboard, requires ongoing professional judgement. A teacher using AI for differentiated resource creation needs to evaluate the output against their knowledge of each pupil. That skill cannot be taught in a 45-minute twilight session. It develops through repeated practice with structured reflection.
A Framework for AI CPD
Effective AI CPD follows a progression that mirrors how any complex skill develops. Staff move through stages of confidence, and training should be designed around these stages rather than delivered as a single event.
Stage
Teacher Behaviour
CPD Focus
Typical Duration
Awareness
Curious but unsure where to start
Demonstrate one classroom use case with live modelling
1 session
Exploration
Trying AI for personal tasks (reports, emails)
Guided practice with prompts for planning and marking
Action research, mentoring colleagues, refining school policy
1-2 terms
The key insight is that most schools attempt to jump from Awareness to Integration in a single INSET day. That gap is where confidence collapses. Teachers who feel overwhelmed retreat to familiar methods. Building in the Exploration stage, with low-stakes practice and regular check-ins, makes the difference between adoption and abandonment.
Identifying AI Champions
Every school has two or three teachers who are already experimenting with AI, even if nobody has asked them to. These early adopters are your most valuable CPD resource. They understand the school context, know the curriculum, and can translate abstract AI capabilities into specific classroom applications.
Identifying AI champions does not require a formal application process. Look for teachers who mention AI in staffroom conversation, who have adapted resources using ChatGPT, or who ask questions about school AI policy. These teachers do not need to be technology specialists. The best AI champions are strong practitioners who happen to be curious about new tools.
Give your AI champions three things: time (one hour per fortnight for experimentation), a small budget for premium tool access, and a platform to share what they discover. A standing slot in the weekly staff briefing or a shared channel on Teams works well. Their role is not to become AI trainers but to demonstrate what practical, time-saving AI use looks like from someone who teaches the same pupils.
Building a Year-Long CPD Plan
A structured year-long plan prevents the common pattern of initial excitement followed by gradual abandonment. Each term has a different focus, building on the previous one.
Term
Focus
Activities
Success Measure
Autumn 1
Foundations and policy
Draft AI policy, staff survey, first hands-on session
100% staff have used an AI tool at least once
Autumn 2
Planning and resources
AI for lesson planning, prompt libraries, department-specific workshops
Differentiated resources produced in half the time
Summer 1
Pupil-facing AI
Academic integrity lessons, teaching pupils to evaluate AI output, classroom trials
Pupils can articulate when AI use is appropriate
Summer 2
Review and sustainability
Impact review, policy update, plan for next year, share successes
AI champions identified for next year; policy updated
This plan assumes approximately one hour of dedicated AI CPD per half-term, plus informal sharing through staff briefings and department meetings. Schools that try to compress this into two INSET days typically find that teachers cannot retain or apply the skills without the spaced practice between sessions.
Structuring Hands-On Sessions
The single biggest mistake in AI training is talking about AI instead of using it. Every CPD session should follow a 20/70/10 structure: 20% demonstration, 70% guided practice, 10% reflection and next steps.
A well-structured hands-on session looks like this. The facilitator opens their laptop and shares their screen. They model a complete workflow: taking a real lesson from next week, writing a prompt, evaluating the output, editing the result, and producing a finished resource. This takes about 12 minutes. Then teachers open their own devices with a specific task: "Use AI to create a differentiated worksheet for your next lesson on a topic you are actually teaching this week." The facilitator circulates, troubleshoots, and pairs teachers who are struggling with those who are confident.
The final 10% is where real learning happens. Teachers share one thing that worked and one thing that surprised them. This reflective practice, grounded in the experiential learning cycle, converts isolated practice into transferable understanding. It also surfaces the specific challenges that become the focus of the next session.
Department-Specific AI Training
Generic AI training frustrates experienced teachers because it fails to address the specific demands of their subject. A maths teacher's AI workflow differs fundamentally from an English teacher's. Training must reflect these differences.
Department
Primary AI Use Cases
Recommended First Session
English
Model answer generation, feedback on drafts, differentiated reading materials
Generate model paragraphs at three ability levels for a current text
Maths
Varied practice sets, worked examples, misconception-targeted questions
Create 10 questions targeting a specific misconception from last assessment
Generate a visual schedule or social story for a specific pupil
Department-level sessions work best when led by AI champions within that department, or at least by someone who teaches the same subject. The facilitator can use real examples from upcoming schemes of work, which makes the training immediately applicable rather than theoretical.
Overcoming Staff Resistance
Resistance to AI is rational. Teachers are right to worry about data privacy, job displacement, and academic integrity. Effective CPD acknowledges these concerns directly rather than dismissing them with enthusiasm about the technology.
The most common objections and productive responses:
"AI will replace teachers." No credible evidence supports this. AI cannot build relationships, read a room, or make the hundreds of micro-decisions a teacher makes every lesson. Frame AI as reducing administrative burden so teachers can spend more time on the parts of teaching that require a human: questioning, responsive scaffolding, and pastoral care.
"I do not have time to learn another tool." This is the most honest objection. Address it by showing time savings in the first session. If a teacher can see that AI saves 20 minutes on their next set of reports, they will make time. Start with the most time-consuming administrative tasks, not the most pedagogically interesting uses.
"The output is not good enough." This is often true when teachers first try AI with generic prompts. The solution is teaching prompt craft, not abandoning the tool. Show the difference between a one-line prompt and a detailed, context-rich prompt. When teachers see the quality leap, scepticism shifts to curiosity.
"What about data protection?" A legitimate concern that must be addressed before training, not during it. Schools need a clear AI ethics policy that specifies which tools are approved, what data can be entered, and what safeguards are in place. Teachers will not experiment if they fear they might breach GDPR.
Measuring Impact
Schools often struggle to measure CPD impact because they rely on satisfaction surveys ("Was the session useful?") rather than behaviour change. For AI CPD, measure what teachers actually do differently.
Metric
How to Collect
Target
Weekly AI usage
Anonymous fortnightly pulse survey (2 questions)
80% of staff using AI at least weekly by Spring
Time saved per week
Self-reported estimate in pulse survey
Average 45 minutes saved per teacher per week
Prompt library growth
Count of shared prompts in department bank
Each department contributes 10+ prompts by Summer
Resource quality
Peer review of AI-assisted vs. manual resources
AI-assisted resources rated equal or higher quality
Confidence score
Likert scale in pulse survey
Average confidence increases from 2/5 to 4/5 by Summer
The pulse survey is critical. Two questions, sent fortnightly, take less than a minute to complete: "How many times did you use AI this week?" and "What was the most useful thing you used it for?" This generates a longitudinal dataset that shows genuine adoption trends, not post-session enthusiasm.
Prompt Libraries and Shared Resources
A shared prompt library is the single most practical output of AI CPD. When teachers discover a prompt that works brilliantly for generating Year 9 retrieval practice questions, that knowledge should not live in one person's chat history. It should be accessible to every teacher in the department.
The simplest approach is a shared document or folder organised by department and task type. Each prompt includes the original text, the AI tool used, an example of what it produced, and any modifications needed. This is more useful than any external prompt library because it contains prompts tested against your curriculum, your pupils, and your expectations.
For example, a Year 5 teacher shares this prompt: "Create five reading comprehension questions about [text name] for mixed-ability Year 5. Include two retrieval questions, two inference questions, and one evaluation question. Use British English. Pitch at National Curriculum expected standard." Another teacher in the same year group adapts it for a different text and adds a note: "Works better if you paste the first paragraph as context." This iterative refinement is where the real value lies.
Budget and Resource Planning
AI CPD does not require a large budget, but it does require intentional resource allocation. Many schools already have access to AI tools through existing subscriptions or free tiers. The real cost is time, not money.
The most common budgeting error is purchasing a whole-school AI platform before staff are ready to use it. A better approach: start with free tools (ChatGPT free tier, Google Gemini), invest in premium access for champions only, and only purchase a whole-school platform once at least 60% of staff are using AI regularly.
Common Mistakes in AI CPD
Across dozens of schools attempting AI training, the same patterns emerge in what goes wrong. Avoiding these mistakes saves considerable time and staff goodwill.
Starting with the wrong audience. Do not begin with a mandatory whole-staff session. Start with volunteers. Five enthusiastic teachers who become genuine advocates are worth more than 60 teachers who attended because they had to. Scale outward from success, not downward from mandate.
Teaching AI theory instead of AI practice. Teachers do not need to understand how large language models work any more than they need to understand TCP/IP to use the internet. Spend zero time on "what is AI" and all of it on "here is how AI saves you 30 minutes this week."
Ignoring safeguarding implications. Any school using AI with pupil data must have clear protocols. What happens if a teacher accidentally enters a pupil's name into ChatGPT? What if a pupil uses an AI tool to generate harmful content? These scenarios need addressing in policy before they arise in practice.
No follow-up between sessions. The gap between CPD sessions is where habits form or fail. Build in lightweight follow-up: a weekly tip via email, a "prompt of the week" in the staff bulletin, a standing 10-minute slot in department meetings for AI sharing. Spaced practice applies to adult learning too.
Treating all teachers the same. An NQT and a teacher with 25 years of experience have different needs, different levels of cognitive load when learning new technology, and different amounts of existing practice to protect. Differentiate your AI CPD just as you would differentiate a lesson.
Running a First INSET Session
If you are planning your school's first AI CPD session, here is a practical structure that works for a 90-minute INSET slot. This has been refined through use in primary and secondary settings and balances demonstration with hands-on practice.
Minutes 0-5: Set the frame. "Today we are going to save you time. By the end of this session, you will have created a resource for next week's teaching using AI. This is not about replacing what you do. It is about giving you back time for what matters."
Minutes 5-20: Live demonstration. The facilitator models a complete workflow on screen. Use a real upcoming lesson. Show the prompt, the output, the evaluation, and the final edit. Narrate your thinking: "I am checking this because AI sometimes gets the curriculum reference wrong."
Minutes 20-70: Guided practice. Teachers work on their own devices. Provide a prompt template with blanks to fill: "Create a [resource type] for [year group] on [topic] that includes [specific requirement]. Use British English. Pitch at [level]." Facilitators circulate. Pair confident users with nervous ones. Celebrate the first person who says "That is actually good."
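The fill-in-the-blanks template from the guided practice step can be handed out as a literal template. A minimal sketch; the placeholder names and the Year 8 example values are illustrative, not prescribed:

```python
from string import Template

# The session's prompt template, with the blanks as named placeholders
PROMPT = Template(
    "Create a $resource_type for $year_group on $topic that includes "
    "$requirement. Use British English. Pitch at $level."
)

# One teacher's filled-in version for next week's lesson
prompt = PROMPT.substitute(
    resource_type="set of differentiated starters",
    year_group="Year 8",
    topic="fractions of amounts",
    requirement="three difficulty tiers",
    level="expected standard",
)
print(prompt)
```

`Template.substitute` raises an error if any blank is left unfilled, which usefully mirrors the facilitation point: an incomplete prompt produces a poor resource.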
Minutes 70-80: Gallery walk. Teachers move around the room looking at what colleagues have created. This builds collective confidence and surfaces creative uses that the facilitator did not anticipate.
Minutes 80-90: Commitments. Each teacher writes one specific thing they will use AI for before the next session. Not "try AI more" but "use AI to create differentiated starters for my Year 8 class on Tuesday." Specific commitments are far more likely to be kept.
Sustaining Momentum
The hardest part of AI CPD is not the initial training. It is maintaining adoption three months later when the novelty has worn off and workload pressures return. Schools that sustain AI adoption share several characteristics.
They integrate AI into existing workflows rather than adding it as a separate task. AI is part of how the school plans, marks, and communicates, not an optional extra. This means updating templates, pro formas, and procedures to include AI where appropriate.
They celebrate small wins publicly. When a teacher shares that AI helped them produce better formative assessment questions in half the time, that story gets told in the staff meeting. Visible success stories normalise AI use and reduce the stigma some teachers feel about "using a machine to do my job."
They update their AI policy termly. The technology changes, new tools emerge, and staff confidence grows. A policy written in September is outdated by January. Building in regular review keeps the policy practical rather than restrictive.
They connect AI use to the school improvement plan. If the SIP priorities include improving differentiation or strengthening assessment, AI CPD should directly support those priorities. This gives AI training strategic legitimacy rather than treating it as a technology initiative.
Getting Started This Week
You do not need a year-long plan to start. You need three things: one willing volunteer, one hour, and one specific task. Find a teacher who is curious. Sit with them for an hour. Help them use AI to create something they actually need for next week. When they see the result, they will tell someone. That conversation is where your AI CPD programme begins.
For a structured approach to selecting the right tools, see our independent comparison of AI tools. For guidance on establishing clear boundaries around AI use, read our article on AI ethics in education. And for the broader picture of how AI fits into teaching practice, explore our comprehensive guide to AI for teachers.
Further Reading: Key Research on Professional Development
These papers address what makes teacher professional development effective, with particular relevance to technology adoption and sustained practice change.
Timperley et al. (2007). Teacher Professional Learning and Development: Best Evidence Synthesis Iteration.
The definitive synthesis on what makes professional development effective. Extended timeframes, active teacher participation, and integration with daily practice consistently produced the strongest outcomes across all contexts studied.
Kini and Podolsky (2016). Does Teaching Experience Increase Teacher Effectiveness? A Review of the Research.
Demonstrates that teacher expertise develops through deliberate practice over time, not through isolated training events. The findings support sustained CPD models where teachers refine their practice through repeated cycles of experimentation and reflection.
Ertmer and Ottenbreit-Leftwich (2010). Preparing Teachers for a Digital Age: Professional Development Programmes for Technology Integration.
Identifies the gap between teachers having access to technology and actually integrating it meaningfully. The authors argue that confidence, pedagogical beliefs, and ongoing support matter more than technical training for sustained technology adoption.
Mishra and Koehler (2006). The TPACK Framework for Teacher Knowledge.
The most cited framework for understanding how teachers integrate technology. Technological Pedagogical Content Knowledge (TPACK) demonstrates that effective technology use requires the intersection of technical skill, subject knowledge, and pedagogical understanding.
Holmes, Bialik, and Fadel (2019). Artificial Intelligence in Education: Current State and Future Prospects.
A comprehensive examination of AI's role in education that emphasises teacher agency and professional judgement. The authors argue that AI should augment rather than replace teacher expertise, and that effective implementation requires sustained professional learning.