10 AI Prompts Every Teacher Should Master

November 18, 2025

Proven AI prompts for differentiation, retrieval practice, feedback, and lesson planning. Ready-to-use templates that address real classroom challenges and save teacher time.

Teachers in England work an average of 50.9 hours per week, with 18.7 hours spent on tasks outside direct teaching time (Department for Education, 2024). Large language models like ChatGPT and Claude offer a practical solution. This is not about replacing teacher expertise. It's about reclaiming time for the work that matters: supporting students, refining pedagogy, and maintaining professional wellbeing.

This guide provides ten proven AI prompts that address real classroom challenges. Each prompt is ready to copy and adapt. The focus is on practical application, not theoretical possibility.

What Is an AI Prompt?


A prompt is the instruction you give to an AI language model like ChatGPT or Claude. Think of it as the brief you'd give a teaching assistant: the clearer and more specific your instructions, the more useful the output.

When you type "write me a lesson plan", the AI has to guess your year group, subject, curriculum requirements, lesson length, and teaching approach. The result is generic and needs extensive editing. When you write "create a 50-minute Year 8 history lesson on the causes of World War One, including a 10-minute retrieval starter and activities aligned to AQA GCSE", the AI has clear parameters to work within.


Research on prompt engineering shows that structured prompts consistently outperform vague requests across educational tasks (Woo et al., 2024). The difference isn't the AI's capability but the precision of your input.

Effective prompts typically include four elements:
Role: Who you are and what context you're working in (e.g., "You are a secondary science teacher in England")
Task: What you need created (e.g., "Generate five retrieval questions")
Context: Relevant details about your students and curriculum (e.g., "for Year 10 students studying AQA Biology, covering cell structure from last week's lesson")
Format: How you want the output structured (e.g., "with answers and mark schemes")

This is not programming. You don't need technical expertise. You're simply being explicit about requirements you'd naturally communicate to a colleague. The following ten prompts demonstrate this structure in action across common teaching tasks.
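For teachers comfortable with a little scripting, the four elements can also be assembled programmatically. A minimal sketch in Python (the function and example wording are illustrative, not part of any AI tool's API):

```python
def build_prompt(role: str, task: str, context: str, fmt: str) -> str:
    """Join the four prompt elements: role, task, context, format."""
    return " ".join([role, task, context, fmt])

prompt = build_prompt(
    role="You are a secondary science teacher in England.",
    task="Generate five retrieval questions",
    context=("for Year 10 students studying AQA Biology, "
             "covering cell structure from last week's lesson,"),
    fmt="with answers and mark schemes.",
)
print(prompt)
```

The same structure works typed directly into ChatGPT or Claude; the function simply makes the four parts explicit.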

Why Specific Prompts Produce Better Results

The quality of AI output depends entirely on the quality of your input. Generic requests produce generic results. Specific prompts that include role, context, task, and format constraints produce usable resources (Anthropic, 2024).

Consider the difference between "write me a lesson plan" and the Year 8 history prompt above. The second delivers a resource you can print and use; the first requires significant editing. Prompt engineering is a teachable skill that improves with practice, and understanding how AI tools work in educational contexts transforms their utility from experimental to essential.

1. Differentiated Reading Materials at Three Levels

The Problem: Creating multiple versions of the same text for different reading abilities takes hours. Your Year 6 class spans reading ages from Year 3 to Year 8.

The Prompt:


You are a primary teacher in England. Take this text about [TOPIC] and rewrite it at three reading levels:

1. Emerging (Year 3-4 reading age): Simple sentences, high-frequency vocabulary, one concept per sentence
2. Expected (Year 5-6 reading age): Mix of simple and compound sentences, age-appropriate vocabulary
3. Greater depth (Year 7-8 reading age): Complex sentences, subject-specific terminology, inference required

Maintain the same key facts in all versions. Length: 150-200 words per version.

Original text: [PASTE YOUR TEXT]

Why It Works: This prompt uses the Extract thinking skill (green card) from the Thinking Framework. You're asking the AI to identify core information and present it at varying complexity levels. The clear age bands and structural constraints ensure outputs match UK curriculum expectations.

When implementing differentiation strategies, the goal is maintaining cognitive challenge whilst adjusting access points. This prompt achieves precisely that balance.

Time Saved: Approximately 45 minutes per text.

2. Retrieval Practice Questions Aligned to Your Curriculum

The Problem: Effective retrieval practice requires carefully sequenced questions that revisit prior learning. Creating these questions from scratch is time-consuming.

The Prompt:


You are a [SUBJECT] teacher for Year [X] in England. Generate 10 retrieval practice questions based on this topic: [TOPIC]

Requirements:
- Questions 1-3: Recall of basic facts (What? When? Who?)
- Questions 4-7: Application and understanding (How? Why?)
- Questions 8-10: Links to prior units we studied: [LIST PREVIOUS TOPICS]

Format: Question followed by a concise answer (1-2 sentences). Avoid multiple choice. Focus on short written responses.

Why It Works: This structure follows the spacing and interleaving principles documented by Dunlosky et al. (2013) in their research on effective learning strategies. The prompt explicitly asks for connections to previous learning, strengthening long-term retention through spaced practice.

The final three questions require students to retrieve prior knowledge and connect it to new learning, which deepens understanding beyond surface recall.

Variation: Specify "include two questions that address common misconceptions about [TOPIC]" to target known areas of difficulty.

3. Constructive Marking Feedback That Reduces Workload

The Problem: Providing meaningful feedback on 30 books takes over two hours. You want feedback that improves learning without exhausting yourself.

The Prompt:


You are a [SUBJECT] teacher marking Year [X] work. A student has written this [TYPE OF WORK]:

[PASTE STUDENT WORK]

Their current target: [SPECIFIC TARGET]

Provide feedback using this structure:
1. One specific success (reference exact words or techniques they used)
2. One clear next step linked to their target (must be actionable in next lesson)
3. One question to extend their thinking

Tone: Encouraging but honest. Length: 3-4 sentences maximum. Avoid general praise like "good work" or "well done."

Why It Works: This prompt follows Hattie and Timperley's (2007) feedback model, balancing acknowledgment of progress with specific guidance on next steps. The closing question promotes metacognition, helping students monitor their own learning.

This approach aligns with effective formative assessment principles where feedback directly informs next learning steps rather than serving as end-point judgment.

Assessment Note: Use this prompt for formative assessments where feedback influences next steps, not for high-stakes summative marking that requires your professional judgment.

4. EAL and SEND Modifications for Mainstream Activities

The Problem: Your lesson plan works well for most students, but you need rapid modifications for learners with English as an additional language or special educational needs.

The Prompt:


You are a teacher in England with students who have [SPECIFIC NEEDS: e.g., dyslexia, limited English proficiency, working memory difficulties].

Original classroom task: [DESCRIBE ACTIVITY]

Create three scaffolded versions:
1. Visual scaffold: Add images, diagrams, or graphic organisers
2. Language scaffold: Simplified instructions, sentence stems, word banks
3. Cognitive scaffold: Break task into smaller steps, reduce cognitive load

Each version should achieve the same learning objective but make success more accessible.

Why It Works: This addresses learning needs through multiple access points while maintaining academic expectations. Research from the Education Endowment Foundation (2023) shows that scaffolding is most effective when it's temporary and gradually removed as competence develops.

The visual scaffold suggestion pairs naturally with Map It graphic organisers like the Fishbone or Flow-chart, depending on the task type.

Integration with Tools: For students requiring significant support, combine AI-generated scaffolds with adaptive teaching approaches that adjust challenge level based on real-time assessment.

5. Lesson Planning with Thinking Framework Integration

The Problem: You want to incorporate metacognitive strategies but lack time to plan which thinking skills fit specific lesson objectives.

The Prompt:


You are a teacher in England using the Thinking Framework (30 metacognitive strategies grouped by colour: green-Extract, blue-Categorise, yellow-Explain, orange-Target Vocabulary, red-Combine).

Plan a 50-minute lesson for Year [X] [SUBJECT] on [TOPIC].

Include:
- Learning objective and success criteria
- Starter (5 min): Retrieval activity
- Main teaching (15 min): Input with think-aloud modelling
- Guided practice (20 min): Specify which Thinking Framework card to use and why
- Independent application (8 min)
- Exit check (2 min): Assessment question

Also suggest one Map It graphic organiser (Fishbone, Cycle, Flow-chart, or Diamond 9) that supports the thinking required.

Why It Works: This prompt structures lesson outlines using the gradual release model (Fisher & Frey, 2013) while explicitly embedding metacognitive tools. The timing constraints force realistic planning rather than idealised sequences.

Example Output: For a Year 5 lesson on causes of coastal erosion, the AI might suggest the blue Categorise card to sort physical versus human factors, followed by a Fishbone organiser to map causes and effects. This combination makes thinking processes visible while organising content logically.

6. Discussion Questions That Generate Student Engagement

The Problem: Whole-class discussions stall because your prepared questions are too closed or too abstract. You need questions that spark genuine thinking.

The Prompt:


You are a [SUBJECT] teacher for Year [X]. Generate 5 discussion questions about [TOPIC] that promote deep thinking.

Requirements:
- No questions that can be answered with "yes/no" or single words
- Include one question that asks students to compare or contrast
- Include one question that asks "What if...?" or "How might...?"
- Include one question where students must provide evidence or examples
- One question should connect to students' own experiences

Specify which Thinking Framework card each question aligns with (Extract, Categorise, Explain, Target Vocabulary, or Combine).

Why It Works: High-quality questions drive student engagement more effectively than activities or technology (Quality Assurance Agency, 2023). This prompt ensures questions require genuine cognitive work rather than simple recall.

Effective metacognitive questioning pushes students beyond surface answers to articulate their reasoning processes, which strengthens both understanding and retention.

Classroom Task Extension: Pair these questions with Say It prompt cards (Starter, Tell-me-more, Challenger) to scaffold student responses during discussion. The oracy tools help students elaborate, challenge assumptions, and build on peers' contributions.

7. Success Criteria That Students Can Actually Use

The Problem: Your success criteria are either too vague ("write neatly") or too complex for students to self-assess against.

The Prompt:


You are a Year [X] teacher. Create success criteria for this learning objective: [STATE OBJECTIVE]

Format the criteria as:
- 3-5 "I can..." statements
- Each statement must be observable and checkable by the student
- Use specific verbs (identify, explain, compare, calculate) not vague ones (understand, know)
- Include one statement about the process or method, not just the outcome

Provide two examples: one showing work that meets all criteria, one showing work that meets some criteria.

Why It Works: Success criteria support self-regulation when they're specific and observable (Andrade, 2019). The "I can" format transfers ownership to students. Including examples makes abstract criteria concrete, addressing the gap between teacher expectations and student interpretation.

Link to Assessment: These criteria work for both formative assessments during lessons and for setting targets in marking feedback. When students understand exactly what success looks like, they can monitor their own progress and identify specific areas needing improvement.

8. Assessment Rubrics for Complex Tasks

The Problem: Creating a fair, detailed rubric for project work or extended writing takes hours. You need consistency across student work but want to avoid overly mechanical marking.

The Prompt:


You are a teacher creating a rubric for [TASK TYPE] in Year [X] [SUBJECT].

Task description: [BRIEF OVERVIEW]

Create a 4-level rubric (Emerging, Developing, Secure, Mastery) with 4 criteria:
1. [SPECIFIC CRITERION 1, e.g., use of evidence]
2. [SPECIFIC CRITERION 2, e.g., organisation]
3. [SPECIFIC CRITERION 3, e.g., subject terminology]
4. [SPECIFIC CRITERION 4, e.g., analysis depth]

For each level, describe what the work looks like. Use specific examples rather than general descriptors. Avoid comparative language like "better than" or "more than."

Why It Works: Well-designed rubrics improve both marking reliability and student understanding of expectations (Brookhart, 2018). The prompt's requirement for specific examples prevents vague descriptors that lead to inconsistent judgment.

When used alongside AI and student assessment tools, rubrics provide the structure needed for meaningful evaluation whilst maintaining professional oversight of complex judgments.

Time Saved: Approximately 60 minutes per rubric, with the added benefit of consistency across parallel classes.

9. Parent Communication Templates for Common Scenarios

The Problem: Drafting emails to parents about concerns, celebrations, or administrative matters consumes time and emotional energy. You want communication that's professional, clear, and constructive.

The Prompt:


You are a Year [X] teacher in England. Draft a brief email (150-200 words) to a parent about: [SITUATION]

Student context: [RELEVANT DETAILS]

Tone: Professional, solution-focused, partnership-oriented
Structure:
- Acknowledge the situation directly (no generic opening)
- Provide specific examples or evidence
- Explain what you're doing in school
- Suggest one clear action for home support
- End with an invitation for dialogue

Avoid educational jargon. Use everyday language a non-teacher would understand.

Why It Works: Effective parent communication builds partnerships that improve student outcomes (Goodall & Montgomery, 2014). This prompt ensures emails are specific rather than generic, action-oriented rather than complaint-focused.

The partnership framing is critical. Parents need to understand what you're doing in school before they can support at home effectively. This collaborative approach strengthens home-school relationships and creates consistency for students.

10. Professional Reflection Prompts for Teacher Growth

The Problem: End-of-term reflection feels like box-ticking rather than genuine professional learning. You want prompts that generate actionable insights.

The Prompt:


You are an experienced teacher mentor. Generate 5 reflection questions for a teacher who has just completed a half-term focusing on [SPECIFIC GOAL, e.g., improving questioning, managing transitions, supporting reluctant writers].

Requirements:
- Questions should prompt specific examples from lessons, not general feelings
- Include one question about a moment that surprised the teacher
- Include one question about what students' responses revealed about their thinking
- Include one question that asks "What would you do differently next time?"
- Final question should identify one precise next step for continued development

Avoid vague questions like "How did it go?" or "What went well?"

Why It Works: Structured reflection supports professional development more effectively than unguided self-assessment (Schön, 1983). This prompt generates questions that uncover evidence of practice rather than opinions about practice.

The surprise question is particularly valuable because unexpected moments often reveal assumptions or blind spots in our teaching. Similarly, asking what students' responses revealed shifts focus from teacher performance to student learning, which is the ultimate measure of teaching effectiveness.

Integration: These reflection questions support professional portfolios for career development or programmes like the NPQ Leadership qualification. They also pair well with teacher coaching conversations and contribute to overall teacher wellbeing by making professional growth feel purposeful rather than performative.

Making These Prompts Work in Your Context

Each prompt template requires customisation. Add your specific curriculum standards, student names, prior learning, or contextual details. The more specific your input, the more usable the output.

Three practical strategies to improve results:

1. Build a prompt library: Save your most-used prompts in a document with placeholders for variable information. This transforms one-off requests into reusable templates. Over time, you'll develop a personalised collection tailored to your teaching context.
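A prompt library with placeholders can be as simple as a dictionary of templates. A hypothetical sketch in Python (the template names and fields are invented for illustration):

```python
# A small prompt library: each template has named placeholders
# that are filled in per class or per topic.
LIBRARY = {
    "retrieval": (
        "You are a {subject} teacher for Year {year} in England. "
        "Generate 10 retrieval practice questions based on this topic: {topic}"
    ),
    "feedback": (
        "You are a {subject} teacher marking Year {year} work. "
        "Their current target: {target}"
    ),
}

def fill(name: str, **details: str) -> str:
    """Fill a named template; raises KeyError if a placeholder is missing."""
    return LIBRARY[name].format(**details)

print(fill("retrieval", subject="history", year="8",
           topic="causes of World War One"))
```

Leaving out a required detail fails loudly rather than producing a half-filled prompt, which is exactly the behaviour you want from a reusable template.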

2. Iterate on outputs: If the first result isn't quite right, ask follow-up questions: "Simplify the language in version 2" or "Add more challenge to questions 8-10." Generative AI improves through conversation. Think of it as refining drafts rather than expecting perfection immediately.
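Iteration maps onto how chat tools keep conversation history: each refinement is a new message appended to the running exchange. A sketch using the common role/content message convention (no real API call is made; the content strings are illustrative):

```python
# Conversation as a growing list of messages; each refinement request
# is appended rather than starting a fresh chat.
conversation = [
    {"role": "user", "content": "Rewrite this text at three reading levels: ..."},
    {"role": "assistant", "content": "(first draft of the three versions)"},
]

def refine(history: list, request: str) -> list:
    """Append a follow-up refinement request to the existing conversation."""
    return history + [{"role": "user", "content": request}]

conversation = refine(conversation, "Simplify the language in version 2.")
```

Because the earlier messages stay in the conversation, the model refines its own draft instead of starting from scratch.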

3. Combine with your pedagogy: AI tools generate content. You provide pedagogical judgment, knowledge of your students, and curricular expertise. The prompts here are starting points, not finished resources. Your professional insight remains essential at every stage.

Ethical Considerations and Digital Tools

When using AI tools for classroom tasks, three principles remain non-negotiable:

Never input personal student data into public AI models. Avoid names, assessment scores, or sensitive information. Use anonymous descriptors: "a Year 5 student working below age-related expectations in writing."
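If you prepare text for AI tools in bulk, a crude pre-paste check can swap known names for neutral descriptors. A hypothetical sketch in Python (the student name is invented; simple string replacement is not a complete anonymisation method):

```python
def anonymise(text: str, names: dict) -> str:
    """Replace known student names with neutral descriptors
    before pasting text into a public AI tool."""
    for name, descriptor in names.items():
        text = text.replace(name, descriptor)
    return text

safe = anonymise(
    "Amira is working below age-related expectations in writing.",
    {"Amira": "a Year 5 student"},
)
print(safe)
```

This catches only the names you list; always reread the text yourself before pasting it into a public tool.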

Verify factual claims, particularly in subjects like science or history where AI-generated content may contain errors. Large language models generate plausible text; they don't fact-check. Your subject knowledge is the quality control mechanism.

Maintain your professional judgment. AI assists with content generation and admin tasks. It cannot replace your understanding of individual students, classroom dynamics, or the contextual decisions that define effective teaching.

The Department for Education (2024) emphasises that artificial intelligence in modern education should complement, not direct, professional expertise. Teachers remain accountable for the resources they use and the feedback they provide.

What to Try Next

Start with one prompt from this list. Test it in your next planning session. Refine it based on the output quality. Add your own constraints or requirements that reflect your classroom context.

AI literacy begins with practical experimentation, not theoretical understanding. The more you use these tools, the more intuitive prompt crafting becomes. Each iteration teaches you something about what produces useful results versus generic filler.

Consider also how these prompts might support colleagues. Sharing effective templates within your department or school creates collective efficiency gains whilst maintaining teaching quality across classes.

Further Reading: Research on Prompt Engineering in Education

As teachers experiment with generative AI tools, one skill consistently shapes the quality of outcomes: prompt engineering. It's no longer about simply asking better questions. It's about structuring inputs to guide AI toward meaningful, accurate, and pedagogically sound responses. The following studies explore how prompt engineering is being taught, studied, and applied across schools and universities. Together, they show that mastering this emerging skill can enhance creativity, critical thinking, and efficiency in educational contexts.

1. Prompt Engineering in Higher Education

Prompt engineering in higher education: a systematic review to help inform curricula – Lee, D. & Palmer, E. (2025). International Journal of Educational Technology in Higher Education.

This systematic review synthesises current research on how prompt engineering is used in higher education. The authors find that specific, structured prompts can significantly improve the quality and reliability of AI-generated academic work. They call for prompt engineering to be formally taught as part of digital-skills and research-literacy programmes, helping students and staff use AI more critically and effectively.

2. Prompt Engineering as a New 21st-Century Skill

Prompt engineering as a new 21st-century skill – Federiakin, D. (2024). Frontiers in Education.

Federiakin argues that prompt engineering should be treated as a fundamental digital competency, similar to information literacy or coding. The study explores how crafting prompts involves cognitive framing, problem definition, and iterative refinement: skills essential for both AI collaboration and independent thinking. It suggests integrating prompt-engineering exercises into project-based and inquiry-led learning.

3. Prompt Engineering Intervention and AI Self-Efficacy

Effects of a prompt engineering intervention on undergraduate students' AI self-efficacy, AI knowledge, and prompt engineering ability – Woo, D. J., Wang, D., Yung, T., & Guo, K. (2024). arXiv (Preprint).

In this mixed-methods study, undergraduate students participated in a short course teaching structured prompting techniques. Post-course evaluations revealed marked increases in students' AI self-efficacy, technical knowledge, and ability to refine prompts to achieve academic goals. The authors argue that even brief, focused training can empower learners to collaborate with AI confidently and critically.

4. AI Literacy and Its Implications for Prompt Engineering

AI literacy and its implications for prompt engineering strategies – Knoth, N. (2024). Computers & Education: Artificial Intelligence.

This paper connects prompt engineering with AI literacy, showing that understanding AI's underlying mechanisms directly improves prompt quality. Knoth finds that educators who grasp AI's strengths, limitations, and biases are better able to design prompts that yield accurate and relevant outcomes. The study recommends embedding prompt engineering within broader AI-literacy and ethics training for teachers.

5. Prompt Engineering in K–12 STEM Education

A systematic review on prompt engineering in large language models for K–12 STEM education – Chen, E., Wang, D., Xu, L., Cao, C., Fang, X., & Lin, J. (2024). arXiv (Preprint).

This systematic review examines 30 studies applying prompt engineering to STEM subjects in K–12 contexts. It categorises common strategies such as zero-shot, few-shot, and chain-of-thought prompting, and finds that well-scaffolded prompts help students reason more effectively and avoid AI hallucinations. The authors identify a need for classroom-based studies to validate these methods in real teaching settings.

Step 1/6
Your free resource

Enhance Learner Outcomes Across Your School

Download an Overview of our Support and Resources

Step 2/6
Contact Details

We'll send it over now.

Please fill in the details so we can send over the resources.

Step 3/6
School Type

What type of school are you?

We'll get you the right resource

Step 4/6
CPD

Is your school involved in any staff development projects?

Are your colleagues running any research projects or courses?

Step 5/6
Priorities

Do you have any immediate school priorities?

Please check the ones that apply.

Step 6/6
Confirmation

Download your resource

Thanks for taking the time to complete this form, submit the form to get the tool.

Previous
Next step
Thanks, submission has been recieved.

Click below to download.
Download
Oops! Something went wrong while submitting the form

Educational Technology

Teachers in England work an average of 50.9 hours per week, with 18.7 hours spent on tasks outside direct teaching time (Department for Education, 2024). Large language models like ChatGPT and Claude offer a practical solution. This is not about replacing teacher expertise. It's about reclaiming time for the work that matters: supporting students, refining pedagogy, and maintaining professional wellbeing.

This guide provides ten proven AI prompts that address real classroom challenges. Each prompt is ready to copy and adapt. The focus is on practical application, not theoretical possibility.

What Is an AI Prompt?


A prompt is the instruction you give to an AI language model like ChatGPT or Claude. Think of it as the brief you'd give a teaching assistant: the clearer and more specific your instructions, the more useful the output.
When you type "write me a lesson plan", the AI has to guess your year group, subject, curriculum requirements, lesson length, and teaching approach. The result is generic and needs extensive editing. When you write "create a 50-minute Year 8 history lesson on the causes of World War One, including a 10-minute retrieval starter and activities aligned to AQA GCSE", the AI has clear parameters to work within.


Research on prompt engineering shows that structured prompts consistently outperform vague requests across educational tasks (Woo et al., 2024). The difference isn't the AI's capability but the precision of your input.

Effective prompts typically include four elements:
Role: Who you are and what context you're working in (e.g., "You are a secondary science teacher in England")
Task: What you need created (e.g., "Generate five retrieval questions")
Context: Relevant details about your students and curriculum (e.g., "for Year 10 students studying AQA Biology, covering cell structure from last week's lesson")
Format: How you want the output structured (e.g., "with answers and mark schemes")

This is not programming. You don't need technical expertise. You're simply being explicit about requirements you'd naturally communicate to a colleague. The following ten prompts demonstrate this structure in action across common teaching tasks.

Why Specific Prompts Produce Better Results

The quality of AI output depends entirely on the quality of your input. Generic requests produce generic results. Specific prompts that include role, context, task, and format constraints produce usable resources (Anthropic, 2024).

Consider the difference:

The second prompt delivers a resource you can print and use. The first requires significant editing. Prompt engineering is a teachable skill that improves with practice, and understanding how AI tools work in educational contexts transforms their utility from experimental to essential.

1. Differentiated Reading Materials at Three Levels

The Problem: Creating multiple versions of the same text for different reading abilities takes hours. Your Year 6 class spans reading ages from Year 3 to Year 8.

The Prompt:


You are a primary teacher in England. Take this text about [TOPIC] and rewrite it at three reading levels:

1. Emerging (Year 3-4 reading age): Simple sentences, high-frequency vocabulary, one concept per sentence
2. Expected (Year 5-6 reading age): Mix of simple and compound sentences, age-appropriate vocabulary
3. Greater depth (Year 7-8 reading age): Complex sentences, subject-specific terminology, inference required

Maintain the same key facts in all versions. Length: 150-200 words per version.

Original text: [PASTE YOUR TEXT]

Why It Works: This prompt uses the Extract thinking skill (green card) from the Thinking Framework. You're asking the AI to identify core information and present it at varying complexity levels. The clear age bands and structural constraints ensure outputs match UK curriculum expectations.

When implementing differentiation strategies, the goal is maintaining cognitive challenge whilst adjusting access points. This prompt achieves precisely that balance.

Time Saved: Approximately 45 minutes per text.

2. Retrieval Practice Questions Aligned to Your Curriculum

The Problem: Effective retrieval practice requires carefully sequenced questions that revisit prior learning. Creating these questions from scratch is time-consuming.

The Prompt:


You are a [SUBJECT] teacher for Year [X] in England. Generate 10 retrieval practice questions based on this topic: [TOPIC]

Requirements:
- Questions 1-3: Recall of basic facts (What? When? Who?)
- Questions 4-7: Application and understanding (How? Why?)
- Questions 8-10: Links to prior units we studied: [LIST PREVIOUS TOPICS]

Format: Question followed by a concise answer (1-2 sentences). Avoid multiple choice. Focus on short written responses.

Why It Works: This structure follows the spacing and interleaving principles documented by Dunlosky et al. (2013) in their research on effective learning strategies. The prompt explicitly asks for connections to previous learning, strengthening long-term retention through spaced practice.

The final three questions require students to draw on their working memory to connect new and existing knowledge, which deepens understanding beyond surface recall.

Variation: Specify "include two questions that address common misconceptions about [TOPIC]" to target known areas of difficulty.

3. Constructive Marking Feedback That Reduces Workload

The Problem: Providing meaningful feedback on 30 books takes over two hours. You want feedback that improves learning without exhausting yourself.

The Prompt:


You are a [SUBJECT] teacher marking Year [X] work. A student has written this [TYPE OF WORK]:

[PASTE STUDENT WORK]

Their current target: [SPECIFIC TARGET]

Provide feedback using this structure:
1. One specific success (reference exact words or techniques they used)
2. One clear next step linked to their target (must be actionable in next lesson)
3. One question to extend their thinking

Tone: Encouraging but honest. Length: 3-4 sentences maximum. Avoid general praise like "good work" or "well done."

Why It Works: This prompt generates feedback that creates a positive feedback loop (Hattie & Timperley, 2007). It balances acknowledgment of progress with specific guidance for improvement. The question at the end promotes metacognition, helping students develop metacognitive strategies for monitoring their own learning.

This approach aligns with effective formative assessment principles where feedback directly informs next learning steps rather than serving as end-point judgment.

Assessment Note: Use this prompt for formative assessments where feedback influences next steps, not for high-stakes summative marking that requires your professional judgment.

4. EAL and SEND Modifications for Mainstream Activities

The Problem: Your lesson plan works well for most students, but you need rapid modifications for learners with English as an additional language or special educational needs.

The Prompt:


You are a teacher in England with students who have [SPECIFIC NEEDS: e.g., dyslexia, limited English proficiency, working memory difficulties].

Original classroom task: [DESCRIBE ACTIVITY]

Create three scaffolded versions:
1. Visual scaffold: Add images, diagrams, or graphic organisers
2. Language scaffold: Simplified instructions, sentence stems, word banks
3. Cognitive scaffold: Break task into smaller steps, reduce cognitive load

Each version should achieve the same learning objective but make success more accessible.

Why It Works: This addresses learning needs through multiple access points while maintaining academic expectations. Research from the Education Endowment Foundation (2023) shows that scaffolding is most effective when it's temporary and gradually removed as competence develops.

The visual scaffold suggestion pairs naturally with Map It graphic organisers like the Fishbone or Flow-chart, depending on the task type. For learners with special educational needs, offering several access points to the same objective addresses diverse processing profiles simultaneously.

Integration with Tools: For students requiring significant support, combine AI-generated scaffolds with adaptive teaching approaches that adjust challenge level based on real-time assessment.

5. Lesson Planning with Thinking Framework Integration

The Problem: You want to incorporate metacognitive strategies but lack time to plan which thinking skills fit specific lesson objectives.

The Prompt:


You are a teacher in England using the Thinking Framework (30 metacognitive strategies grouped by colour: green-Extract, blue-Categorise, yellow-Explain, orange-Target Vocabulary, red-Combine).

Plan a 50-minute lesson for Year [X] [SUBJECT] on [TOPIC].

Include:
- Learning objective and success criteria
- Starter (5 min): Retrieval activity
- Main teaching (15 min): Input with think-aloud modelling
- Guided practice (20 min): Specify which Thinking Framework card to use and why
- Independent application (8 min)
- Exit check (2 min): Assessment question

Also suggest one Map It graphic organiser (Fishbone, Cycle, Flow-chart, or Diamond 9) that supports the thinking required.

Why It Works: This prompt structures lesson outlines using the gradual release model (Fisher & Frey, 2013) while explicitly embedding metacognitive tools. The timing constraints force realistic planning rather than idealised sequences.

Example Output: For a Year 5 lesson on causes of coastal erosion, the AI might suggest the blue Categorise card to sort physical versus human factors, followed by a Fishbone organiser to map causes and effects. This combination makes thinking processes visible while organising content logically.

6. Discussion Questions That Generate Student Engagement

The Problem: Whole-class discussions stall because your prepared questions are too closed or too abstract. You need questions that spark genuine thinking.

The Prompt:


You are a [SUBJECT] teacher for Year [X]. Generate 5 discussion questions about [TOPIC] that promote deep thinking.

Requirements:
- No questions that can be answered with "yes/no" or single words
- Include one question that asks students to compare or contrast
- Include one question that asks "What if...?" or "How might...?"
- Include one question where students must provide evidence or examples
- One question should connect to students' own experiences

Specify which Thinking Framework card each question aligns with (Extract, Categorise, Explain, Target Vocabulary, or Combine).

Why It Works: High-quality questions drive student engagement more effectively than activities or technology (Quality Assurance Agency, 2023). This prompt ensures questions require genuine cognitive work rather than simple recall.

Effective metacognitive questioning pushes students beyond surface answers to articulate their reasoning processes, which strengthens both understanding and retention.

Classroom Task Extension: Pair these questions with Say It prompt cards (Starter, Tell-me-more, Challenger) to scaffold student responses during discussion. The oracy tools help students elaborate, challenge assumptions, and build on peers' contributions.

7. Success Criteria That Students Can Actually Use

The Problem: Your success criteria are either too vague ("write neatly") or too complex for students to self-assess against.

The Prompt:


You are a Year [X] teacher. Create success criteria for this learning objective: [STATE OBJECTIVE]

Format the criteria as:
- 3-5 "I can..." statements
- Each statement must be observable and checkable by the student
- Use specific verbs (identify, explain, compare, calculate) not vague ones (understand, know)
- Include one statement about the process or method, not just the outcome

Provide two examples: one showing work that meets all criteria, one showing work that meets some criteria.

Why It Works: Success criteria support self-regulation when they're specific and observable (Andrade, 2019). The "I can" format transfers ownership to students. Including examples makes abstract criteria concrete, addressing the gap between teacher expectations and student interpretation.

Link to Assessment: These criteria work for both formative assessments during lessons and for setting targets in marking feedback. When students understand exactly what success looks like, they can monitor their own progress and identify specific areas needing improvement.

8. Assessment Rubrics for Complex Tasks

The Problem: Creating a fair, detailed rubric for project work or extended writing takes hours. You need consistency across student work but want to avoid overly mechanical marking.

The Prompt:


You are a teacher creating a rubric for [TASK TYPE] in Year [X] [SUBJECT].

Task description: [BRIEF OVERVIEW]

Create a 4-level rubric (Emerging, Developing, Secure, Mastery) with 4 criteria:
1. [SPECIFIC CRITERION 1, e.g., use of evidence]
2. [SPECIFIC CRITERION 2, e.g., organisation]
3. [SPECIFIC CRITERION 3, e.g., subject terminology]
4. [SPECIFIC CRITERION 4, e.g., analysis depth]

For each level, describe what the work looks like. Use specific examples rather than general descriptors. Avoid comparative language like "better than" or "more than."

Why It Works: Well-designed rubrics improve both marking reliability and student understanding of expectations (Brookhart, 2018). The prompt's requirement for specific examples prevents vague descriptors that lead to inconsistent judgment.

When used alongside AI-supported assessment tools, rubrics provide the structure needed for meaningful evaluation whilst maintaining professional oversight of complex judgments.

Time Saved: Approximately 60 minutes per rubric, with the added benefit of consistency across parallel classes.

9. Parent Communication Templates for Common Scenarios

The Problem: Drafting emails to parents about concerns, celebrations, or administrative matters consumes time and emotional energy. You want communication that's professional, clear, and constructive.

The Prompt:


You are a Year [X] teacher in England. Draft a brief email (150-200 words) to a parent about: [SITUATION]

Student context: [RELEVANT DETAILS]

Tone: Professional, solution-focused, partnership-oriented
Structure:
- Acknowledge the situation directly (no generic opening)
- Provide specific examples or evidence
- Explain what you're doing in school
- Suggest one clear action for home support
- End with an invitation for dialogue

Avoid educational jargon. Use everyday language a non-teacher would understand.

Why It Works: Effective parent communication builds partnerships that improve student outcomes (Goodall & Montgomery, 2014). This prompt ensures emails are specific rather than generic, action-oriented rather than complaint-focused.

The partnership framing is critical. Parents need to understand what you're doing in school before they can support at home effectively. This collaborative approach strengthens home-school relationships and creates consistency for students.

10. Professional Reflection Prompts for Teacher Growth

The Problem: End-of-term reflection feels like box-ticking rather than genuine professional learning. You want prompts that generate actionable insights.

The Prompt:


You are an experienced teacher mentor. Generate 5 reflection questions for a teacher who has just completed a half-term focusing on [SPECIFIC GOAL, e.g., improving questioning, managing transitions, supporting reluctant writers].

Requirements:
- Questions should prompt specific examples from lessons, not general feelings
- Include one question about a moment that surprised the teacher
- Include one question about what students' responses revealed about their thinking
- Include one question that asks "What would you do differently next time?"
- Final question should identify one precise next step for continued development

Avoid vague questions like "How did it go?" or "What went well?"

Why It Works: Structured reflection supports professional development more effectively than unguided self-assessment (Schön, 1983). This prompt generates questions that uncover evidence of practice rather than opinions about practice.

The surprise question is particularly valuable because unexpected moments often reveal assumptions or blind spots in our teaching. Similarly, asking what students' responses revealed shifts focus from teacher performance to student learning, which is the ultimate measure of teaching effectiveness.

Integration: These reflection questions support professional portfolios for career development or programmes like the NPQ Leadership qualification. They also pair well with teacher coaching conversations and contribute to overall teacher wellbeing by making professional growth feel purposeful rather than performative.

Making These Prompts Work in Your Context

Each prompt template requires customisation. Add your specific curriculum standards, student names, prior learning, or contextual details. The more specific your input, the more usable the output.

Three practical strategies to improve results:

1. Build a prompt library: Save your most-used prompts in a document with placeholders for variable information. This transforms one-off requests into reusable templates. Over time, you'll develop a personalised collection tailored to your teaching context.
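If you (or a technically minded colleague) want to go a step further than a document of templates, the placeholder idea can be sketched in a few lines of Python. Everything here is illustrative: the field names and example details are assumptions, not a prescribed format.

```python
# A minimal sketch of a reusable prompt template with placeholders.
# Field names (subject, year, work_type, etc.) are illustrative only.

FEEDBACK_PROMPT = (
    "You are a {subject} teacher marking Year {year} work. "
    "A student has written this {work_type}:\n\n"
    "{student_work}\n\n"
    "Their current target: {target}\n\n"
    "Provide feedback using this structure:\n"
    "1. One specific success\n"
    "2. One clear next step linked to their target\n"
    "3. One question to extend their thinking"
)

def fill_prompt(template: str, **details: str) -> str:
    """Substitute lesson-specific details into a saved template."""
    return template.format(**details)

prompt = fill_prompt(
    FEEDBACK_PROMPT,
    subject="science",
    year="8",
    work_type="explanation of photosynthesis",
    student_work="[anonymised student work pasted here]",
    target="use scientific vocabulary precisely",
)
print(prompt)
```

Filling the same template with different details each time is exactly what a prompt library does manually; the script just makes the placeholders explicit and impossible to forget.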

2. Iterate on outputs: If the first result isn't quite right, ask follow-up questions: "Simplify the language in version 2" or "Add more challenge to questions 8-10." Generative AI improves through conversation. Think of it as refining drafts rather than expecting perfection immediately.
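Under the hood, this back-and-forth is just a growing list of turns. The sketch below uses the role/content message format common to most chat-based LLM APIs, purely to show the structure of an iterative conversation; no real API call is made, and the example content is invented.

```python
# A sketch of multi-turn refinement using the role/content message
# format common to chat-based LLM APIs (structure only, no API call).

def add_follow_up(conversation: list, instruction: str) -> list:
    """Append a refinement request as a new user turn."""
    return conversation + [{"role": "user", "content": instruction}]

conversation = [
    {"role": "user",
     "content": "Create five retrieval questions on cell structure for Year 10."},
    {"role": "assistant",
     "content": "[first draft of five questions]"},
]

# Rather than starting over, refine the existing draft:
conversation = add_follow_up(
    conversation, "Simplify the language in questions 4 and 5."
)
```

Because each follow-up is sent alongside the full history, the model refines its earlier draft instead of generating from scratch, which is why conversational iteration usually beats rewriting the original prompt.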

3. Combine with your pedagogy: AI tools generate content. You provide pedagogical judgment, knowledge of your students, and curricular expertise. The prompts here are starting points, not finished resources. Your professional insight remains essential at every stage.

Ethical Considerations and Digital Tools

When using AI tools for classroom tasks, three principles remain non-negotiable:

Never input personal student data into public AI models. Avoid names, assessment scores, or sensitive information. Use anonymous descriptors: "a Year 5 student working below age-related expectations in writing."

Verify factual claims, particularly in subjects like science or history where AI-generated content may contain errors. Large language models generate plausible text; they don't fact-check. Your subject knowledge is the quality control mechanism.

Maintain your professional judgment. AI assists with content generation and admin tasks. It cannot replace your understanding of individual students, classroom dynamics, or the contextual decisions that define effective teaching.

The Department for Education (2024) emphasises that artificial intelligence in modern education should complement, not direct, professional expertise. Teachers remain accountable for the resources they use and the feedback they provide.

What to Try Next

Start with one prompt from this list. Test it in your next planning session. Refine it based on the output quality. Add your own constraints or requirements that reflect your classroom context.

AI literacy begins with practical experimentation, not theoretical understanding. The more you use these tools, the more intuitive prompt crafting becomes. Each iteration teaches you something about what produces useful results versus generic filler.

Consider also how these prompts might support colleagues. Sharing effective templates within your department or school creates collective efficiency gains whilst maintaining teaching quality across classes.

Further Reading: Research on Prompt Engineering in Education

As teachers experiment with generative AI tools, one skill consistently shapes the quality of outcomes: prompt engineering. It's no longer about simply asking better questions. It's about structuring inputs to guide AI toward meaningful, accurate, and pedagogically sound responses. The following studies explore how prompt engineering is being taught, studied, and applied across schools and universities. Together, they show that mastering this emerging skill can enhance creativity, critical thinking, and efficiency in educational contexts.

1. Prompt Engineering in Higher Education

Prompt engineering in higher education: a systematic review to help inform curricula – Lee, D. & Palmer, E. (2025). International Journal of Educational Technology in Higher Education.

This systematic review synthesises current research on how prompt engineering is used in higher education. The authors find that specific, structured prompts can significantly improve the quality and reliability of AI-generated academic work. They call for prompt engineering to be formally taught as part of digital-skills and research-literacy programmes, helping students and staff use AI more critically and effectively.

2. Prompt Engineering as a New 21st-Century Skill

Prompt engineering as a new 21st-century skill – Federiakin, D. (2024). Frontiers in Education.

Federiakin argues that prompt engineering should be treated as a fundamental digital competency, similar to information literacy or coding. The study explores how crafting prompts involves cognitive framing, problem definition, and iterative refinement: skills essential for both AI collaboration and independent thinking. It suggests integrating prompt-engineering exercises into project-based and inquiry-led learning.

3. Prompt Engineering Intervention and AI Self-Efficacy

Effects of a prompt engineering intervention on undergraduate students' AI self-efficacy, AI knowledge, and prompt engineering ability – Woo, D. J., Wang, D., Yung, T., & Guo, K. (2024). arXiv (Preprint).

In this mixed-methods study, undergraduate students participated in a short course teaching structured prompting techniques. Post-course evaluations revealed marked increases in students' AI self-efficacy, technical knowledge, and ability to refine prompts to achieve academic goals. The authors argue that even brief, focused training can empower learners to collaborate with AI confidently and critically.

4. AI Literacy and Its Implications for Prompt Engineering

AI literacy and its implications for prompt engineering strategies – Knoth, N. (2024). Computers & Education: Artificial Intelligence.

This paper connects prompt engineering with AI literacy, showing that understanding AI's underlying mechanisms directly improves prompt quality. Knoth finds that educators who grasp AI's strengths, limitations, and biases are better able to design prompts that yield accurate and relevant outcomes. The study recommends embedding prompt engineering within broader AI-literacy and ethics training for teachers.

5. Prompt Engineering in K–12 STEM Education

A systematic review on prompt engineering in large language models for K–12 STEM education – Chen, E., Wang, D., Xu, L., Cao, C., Fang, X., & Lin, J. (2024). arXiv (Preprint).

This systematic review examines 30 studies applying prompt engineering to STEM subjects in K–12 contexts. It categorises common strategies such as zero-shot, few-shot, and chain-of-thought prompting, and finds that well-scaffolded prompts help students reason more effectively and avoid AI hallucinations. The authors identify a need for classroom-based studies to validate these methods in real teaching settings.