AI Prompts for Every Level of Bloom's Taxonomy: A Teacher's Guide

Updated on March 24, 2026

Copy-paste AI prompts for every level of Bloom's Taxonomy. Practical templates for Remember, Understand, Apply, Analyse, Evaluate and Create, with worked examples across subjects.

Key Takeaways

  1. AI defaults to the lowest cognitive levels. Without deliberate prompt engineering, AI tools produce Remember-level outputs: lists, definitions, and summaries. You need to explicitly name the cognitive operation you want.
  2. Bloom's taxonomy gives you six prompt registers. Each level requires a different verb structure and context. "List the causes" produces different thinking to "Evaluate the relative significance of each cause."
  3. Copy-paste templates save time. The prompt templates in this guide work across subjects. Swap the topic, keep the structure, and the cognitive demand stays consistent.
  4. Higher-order prompts need more scaffolding in the brief. Evaluate and Create prompts should specify the criteria, audience, and constraints. Vague higher-order prompts produce vague AI outputs.
  5. The Thinking Framework maps directly onto Bloom's levels. Compare sits at Analyse; Classify at Analyse; Sequence at Apply; Cause and Effect at Analyse; Part-Whole at Understand; Systems Thinking at Evaluate. This gives you a practical classroom bridge between taxonomy theory and daily lesson design.

Most teachers using AI in the classroom are accidentally stuck at the bottom of Bloom's taxonomy. They ask the AI to "summarise the key points" or "list the main causes" and then wonder why the output feels thin. The problem is not the tool. It is the prompt.

Bloom's taxonomy, revised by Anderson and Krathwohl (2001) from Bloom's original 1956 framework, describes six levels of cognitive demand: Remember, Understand, Apply, Analyse, Evaluate, and Create. Each level requires a qualitatively different type of thinking. When you write your AI prompt without specifying the level, the model defaults to the easiest interpretation, which is almost always Remember or Understand.

This guide gives you copy-paste prompt templates for all six levels, worked examples showing the difference in AI output, and teacher tips on when each level is appropriate. Every template is tested across English, maths, science, and humanities so you can see how the structure transfers across subjects.

Why Bloom's Matters for AI Prompting

Sweller (1988) showed that the complexity of a task determines how much working memory a learner needs to engage. The same principle applies to AI prompts. A vague instruction produces a low-demand response because the model finds the path of least resistance through its training data. A structured, level-specific prompt forces the model to generate content that matches a higher cognitive register.

The mechanism behind this matters for how you frame every AI prompt. Working memory — the cognitive system that holds and manipulates information in the moment — has a capacity of roughly four chunks at any one time (Cowan, 2001). When a prompt is vague, pupils must simultaneously decode what the task requires, hold the topic in mind, generate content, and organise it for output. Each of those operations draws on the same limited pool of working memory resources, and when the pool empties, pupils default to the simplest possible response. Sweller (1988) called this extraneous cognitive load: mental effort caused by poor task design rather than genuine learning. A well-constructed AI prompt acts as a scaffold that offloads the design work, leaving working memory free for the cognitive operation you actually want pupils to perform. A prompt specifying "compare two causes using evidence from our source pack, structured as claim-evidence-analysis" gives the task structure directly to the pupil, so their working memory can focus on the comparison itself rather than on figuring out what to do.

EdTechTeacher and similar sites offer generic "AI prompt templates" without cognitive scaffolding. They give teachers starter phrases but no framework for knowing whether those phrases produce the right level of thinking. The result is that teachers get AI outputs calibrated to Year 7 recall tasks even when they are preparing Year 11 evaluation questions.

Understanding Bloom's taxonomy is the first step. Using it to write AI prompts is the practical application that actually saves you time in the classroom.

The key rule is simple: name the cognitive operation in your prompt. Do not say "write about photosynthesis." Say "Analyse the relationship between light intensity and the rate of photosynthesis, identifying the limiting factors at each stage."
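For teachers who script their own prompt generation, the rule can be sketched as a simple template filler: store one template per Bloom's level and name the cognitive operation inside it. This is an illustrative sketch only; the `TEMPLATES` dictionary and `build_prompt` helper are our own names, not part of any AI tool's API.

```python
# Sketch: one prompt template per Bloom's level, each naming the
# cognitive operation explicitly. Illustrative, not a real tool's API.
TEMPLATES = {
    "remember": "List the key facts about {topic} in a logical order.",
    "analyse": (
        "Analyse the relationship between {factor_a} and {factor_b}, "
        "identifying the limiting factors at each stage."
    ),
    "evaluate": (
        "To what extent do you agree that {claim}? "
        "Argue both sides, then reach a justified conclusion."
    ),
}

def build_prompt(level: str, **context: str) -> str:
    """Fill in the template for the named Bloom's level."""
    return TEMPLATES[level].format(**context)

print(build_prompt(
    "analyse",
    factor_a="light intensity",
    factor_b="the rate of photosynthesis",
))
```

Swapping the topic while keeping the template is what holds the cognitive demand constant across subjects.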

Remember Level Prompts

Remember is the foundation of all learning. It involves retrieving facts, definitions, and basic information from memory. Bloom (1956) placed this at the base of the taxonomy not because it is unimportant, but because everything else depends on it. Retrieval practice research confirms that deliberately retrieving information strengthens long-term memory storage (Karpicke & Blunt, 2011).

AI is very good at Remember-level tasks. The challenge is that many teachers stop here without realising they have done so.

When to use: Lesson starters, vocabulary checks, knowledge recall before a new topic, and low-stakes quizzes. Remember prompts are also useful for generating revision resources.

Prompt templates:

Template: List the key events of [topic] in chronological order.
Example (Year 7 History): List the key events of the Norman Conquest in chronological order.

Template: Define the term [concept] as it is used in [subject].
Example (Year 9 Science): Define the term 'osmosis' as it is used in biology.

Template: Name the [number] key [facts/terms/dates] associated with [topic].
Example (Year 8 Geography): Name the five key physical features of a meander.

Template: Recall what [person/group] did during [event/period].
Example (Year 10 History): Recall what the suffragettes did during the campaign for women's votes 1905–1914.

Worked example: Prompt: "List the key events of the Norman Conquest in chronological order." AI output: A numbered timeline from the death of Edward the Confessor in January 1066 through Harold's coronation, the battles of Gate Fulford and Stamford Bridge, Hastings on 14 October 1066, and William's coronation on Christmas Day. Clean, factual, appropriately pitched for Year 7.

Teacher tip: Use Remember prompts to generate question banks for formative assessment starters. Ask the AI to produce 10 retrieval questions on the topic, then pick the five that match your lesson objectives.

Understand Level Prompts

Understanding requires constructing meaning from information, not just recalling it. A pupil who can list the causes of World War One but cannot explain the relationship between them is operating at Remember, not Understand. Anderson and Krathwohl (2001) describe Understand-level tasks as those involving interpretation, exemplification, classification, summarisation, inference, comparison, and explanation.

The cognitive jump from Remember to Understand is where many AI prompts stall. Teachers ask for a "summary" but do not specify that the summary should show how ideas connect.

When to use: After introducing a new concept, when checking comprehension before a more complex task, and when pupils need to process information in their own words. Understand prompts are effective for generating model texts that show pupils how to explain ideas.

Prompt templates:

Template: Explain in your own words how [concept/process] works.
Example (Year 5 Science): Explain in your own words how the water cycle works.

Template: Summarise the relationship between [A] and [B].
Example (Year 9 English): Summarise the relationship between Macbeth and Lady Macbeth at the start of Act 1.

Template: Paraphrase [concept] so that a [year group] pupil could understand it.
Example (Year 6 Maths): Paraphrase the concept of equivalent fractions so that a Year 6 pupil could understand it.

Template: Give three examples that illustrate [concept/principle].
Example (Year 8 RE): Give three examples that illustrate the Buddhist concept of impermanence.

Worked example: Prompt: "Explain in your own words how the water cycle works for a Year 5 class." AI output: A clear three-paragraph explanation covering evaporation from oceans and lakes, condensation as water vapour rises and cools to form clouds, and precipitation as rain or snow returning to the surface. The language is appropriate for 9–10 year olds without being patronising. This is a strong model text for pupils to annotate or adapt.

Teacher tip: Pair Understand prompts with metacognitive questioning. After the AI generates an explanation, ask pupils to identify which parts they found surprising and which parts they already knew. This activates prior knowledge and shows you where genuine understanding gaps exist.

Apply Level Prompts

Application involves using knowledge and procedures to carry out tasks in new situations. This is where abstract concepts become concrete tools. Maths word problems are the classic Apply-level task: the pupil knows the formula but must recognise when and how to deploy it in an unfamiliar context.

Apply-level AI prompts generate worked examples, practice problems, and scenarios that require pupils to use what they know. The key prompt move is to specify the context that makes the knowledge application non-trivial.

When to use: After teaching a concept and before assessing it independently. Apply prompts are particularly effective for generating differentiated practice problems and real-world scenarios. They sit naturally within scaffolding sequences where pupils move from teacher demonstration to guided practice.

Prompt templates:

Template: Use [concept/formula/rule] to solve this problem: [problem context].
Example (Year 9 Maths): Use Pythagoras' theorem to solve this problem: a builder needs to cut a diagonal support beam across a rectangular doorframe that is 2.1m tall and 0.9m wide. How long should the beam be?

Template: Demonstrate how [concept] would be applied in [real-world scenario].
Example (Year 10 Business): Demonstrate how the concept of supply and demand would apply to a bakery that introduces a new sourdough loaf during a local food festival.

Template: Write a worked example that shows how to [process] step by step.
Example (Year 7 Maths): Write a worked example showing how to find the area of a compound shape step by step, using a real-life L-shaped floor plan.

Template: Create three practice problems that require pupils to apply [concept] in different contexts.
Example (Year 8 Science): Create three practice problems that require pupils to apply their knowledge of density (mass ÷ volume) in different real-world contexts.

Worked example: Prompt: "Use Pythagoras' theorem to solve this: a builder needs to cut a diagonal beam across a doorframe that is 2.1m tall and 0.9m wide. How long should the beam be?" AI output: A fully worked solution with the formula stated, values substituted (2.1² + 0.9² = 4.41 + 0.81 = 5.22), square root calculated (approximately 2.28m), and a sentence contextualising the answer. This is a ready-to-use modelling resource.
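The arithmetic in that worked example is easy to verify before you hand the resource to a class. A quick sanity check (a throwaway sketch, not part of the prompt itself):

```python
import math

# Doorframe dimensions from the worked example, in metres.
height, width = 2.1, 0.9

# Pythagoras: the beam is the hypotenuse, c = sqrt(a^2 + b^2).
beam = math.sqrt(height**2 + width**2)  # sqrt(4.41 + 0.81) = sqrt(5.22)

print(round(beam, 2))  # prints 2.28
```

Checking AI-generated worked solutions this way takes seconds and catches the occasional arithmetic slip before pupils see it.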

Teacher tip: Apply prompts pair well with Rosenshine's (2012) principle of modelling. Use the AI output as your 'I do' demonstration, then give pupils a similar problem to try as their 'we do.' The AI has done the heavy lifting of creating a well-structured worked example; you focus on the live explanation.

Analyse Level Prompts

Analysis involves breaking material into its component parts, determining how the parts relate to each other, and identifying how they relate to an overall structure or purpose. This is where higher-order thinking begins in earnest. Anderson and Krathwohl (2001) identify three cognitive processes at this level: differentiating, organising, and attributing.

Analyse-level AI prompts require you to specify both the object of analysis and the analytical framework. "Analyse this poem" is too vague. "Analyse how Wilfred Owen uses imagery in 'Dulce et Decorum Est' to challenge the idea that war is glorious" is precise enough to produce a substantive response.

When to use: When pupils need to go beyond surface-level understanding, when preparing for extended writing tasks, and when teaching pupils to think in disciplinary ways. This is the natural level for higher-order thinking activities in Years 9–13.

Analyse vs Understand: Knowing the Difference

Understand: Summarise the causes of World War One.
Analyse: Explain how the alliance system transformed a regional dispute into a world war.

Understand: Explain what a food web shows.
Analyse: Analyse what happens to the food web if the population of a top predator collapses.

Understand: Describe the features of a Shakespearean soliloquy.
Analyse: Identify how Shakespeare uses Hamlet's soliloquy to reveal the contrast between thought and action.

Understand: Explain how a market economy works.
Analyse: Break down how the 2008 financial crisis exposed structural weaknesses in deregulated markets.

Prompt templates:

Template: Compare and contrast [A] and [B], focusing on [specific criteria].
Example (Year 10 English): Compare and contrast the way power is presented in 'My Last Duchess' and 'Ozymandias', focusing on the relationship between the speaker and their subject.

Template: What are the causes and effects of [event/phenomenon]? Organise your answer by [short-term/long-term or direct/indirect].
Example (Year 9 History): What are the causes and effects of the Industrial Revolution? Organise by short-term and long-term effects on working-class life.

Template: Break down how [process/system] works by identifying its component parts and the function of each.
Example (Year 10 Science): Break down how the human immune system works by identifying its component parts and the function of each in fighting bacterial infection.

Template: Identify the assumptions underlying [argument/policy/text]. Which assumptions are most open to challenge?
Example (Year 11 Economics): Identify the assumptions underlying the case for free trade. Which assumptions are most open to challenge in the context of developing economies?

Worked example: Prompt: "Compare and contrast how power is presented in 'My Last Duchess' and 'Ozymandias', focusing on the relationship between the speaker and their subject." AI output: A structured comparative response identifying the dramatic monologue form in Browning as presenting power through control and possessiveness, while Shelley uses the external narrator and ironic juxtaposition to critique the illusion of permanence. The response organises points by criterion rather than poem-by-poem, which models the analytical approach required by GCSE mark schemes.

Teacher tip: Webb's (1997) Depth of Knowledge framework suggests that Analyse-level tasks (DOK level 3) require extended thinking with multiple steps. When using AI to generate analysis tasks for pupils, specify the number of steps in your prompt to get appropriately demanding content. See Webb's Depth of Knowledge for a fuller guide to task calibration.

Evaluate Level Prompts

Evaluation requires making judgements based on criteria and standards. This is where pupils must weigh evidence, consider multiple perspectives, and defend a position. Anderson and Krathwohl (2001) distinguish two cognitive processes: checking (testing for internal consistency) and critiquing (judging against external criteria).

Evaluate prompts are where most teachers find AI genuinely useful for lesson preparation. Generating model arguments and counter-arguments, constructing mark-scheme-aligned responses, and producing "to what extent" essay frames all sit at this level.

When to use: GCSE and A-level preparation, essay writing workshops, debate preparation, and teaching pupils to construct arguments. Evaluate prompts are also effective for staff development: generating balanced perspectives on teaching approaches before a department discussion.

Prompt templates:

Template: To what extent do you agree that [claim]? Argue both sides, then reach a justified conclusion.
Example (Year 11 Geography): To what extent do you agree that economic development is the most important factor in reducing a country's birth rate? Argue both sides, then reach a justified conclusion.

Template: What are the strengths and limitations of [approach/theory/policy] when applied to [context]?
Example (Year 12 Psychology): What are the strengths and limitations of the behaviourist approach when applied to explaining phobias in adults?

Template: Judge whether [decision/action/policy] was justified, using [criteria] as your evaluative framework.
Example (Year 10 History): Judge whether Chamberlain's policy of appeasement was justified, using the evidence available to British policymakers in 1938 as your evaluative framework.

Template: Critique [text/argument/model] by identifying its strongest claim and its most significant weakness.
Example (Year 11 English Language): Critique this opinion article by identifying its strongest rhetorical technique and its most significant logical weakness.

Worked example: Prompt: "To what extent do you agree that economic development is the most important factor in reducing birth rate? Argue both sides, then reach a justified conclusion." AI output: A structured four-paragraph response with a thesis, two developed counter-arguments (education of women, government population policy), a concession that economic development often drives these factors indirectly, and a conclusion qualifying the extent of agreement. This matches the structure required for a high-band GCSE geography response.

Teacher tip: Generate two versions of the same Evaluate prompt: one arguing strongly for the position and one arguing against. Use these as paired texts in a questioning sequence to help pupils identify the analytical moves that distinguish a well-supported argument from a weak one.

Create Level Prompts

Create is the highest level of Bloom's revised taxonomy. It requires putting elements together to form a coherent or functional whole, or reorganising existing elements into a new pattern or structure. This is not simply producing something; it involves generating, planning, and producing something that represents novel thinking.

AI at the Create level is most useful as a collaborator, not a producer. The best Create prompts use AI to generate constraints, criteria, or starting points that pupils then work with. Giving pupils the AI's first attempt and asking them to improve it also sits firmly at Create level.

When to use: Extended projects, cross-curricular work, design tasks, and any assessment where synthesis is required. Create-level prompts also support differentiation: higher-attaining pupils can be given more open Create briefs while others work with more constraints.

Prompt templates:

Template: Design a [product/system/solution] that addresses [problem], specifying [constraints].
Example (Year 8 DT): Design a packaging solution for a fragile product that uses only recycled materials, must protect the item during postal delivery, and must be assembled without tools or adhesives.

Template: Write a [genre] piece that demonstrates [concept/technique], aimed at [audience].
Example (Year 9 English): Write the opening of a Gothic short story that demonstrates the use of pathetic fallacy, foreshadowing, and an unreliable narrator. Aimed at a Year 9 reading level.

Template: Propose a solution to [problem] that integrates knowledge from [subject area 1] and [subject area 2].
Example (Year 10 cross-curricular): Propose a solution to food insecurity in sub-Saharan Africa that integrates knowledge from geography (climate, water access) and science (crop modification, soil chemistry).

Template: Construct a [argument/model/experiment/plan] that [achieves goal], explaining the reasoning behind each decision.
Example (Year 12 Biology): Construct an experimental design to test whether increasing CO2 concentration affects the rate of photosynthesis in pondweed, explaining the reasoning behind each methodological decision.

Worked example: Prompt: "Write the opening of a Gothic short story that demonstrates pathetic fallacy, foreshadowing, and an unreliable narrator, aimed at Year 9 reading level." AI output: A 200-word opening paragraph with a storm-lashed Victorian house (pathetic fallacy), a narrator who repeatedly reassures themselves that 'everything is perfectly ordinary' (unreliable narrator), and a detail about a locked door that rattles for 'no reason that could be explained' (foreshadowing). Teachers can use this as a model text, a text to annotate, or a starting point pupils improve and extend.

Teacher tip: The most effective use of AI at Create level is generating the brief, not the final product. Ask AI to produce three different design briefs for a product, three different essay titles at different levels of difficulty, or three alternative starting points for a creative piece. Then pupils choose and execute one. This keeps the creative decision-making with the learner.

Common Mistakes When Prompting AI

Most AI prompting mistakes in classrooms come down to a mismatch between what the teacher wants and the cognitive level implied by the prompt wording.

Mistake 1: Asking for lists when you need analysis. "List the effects of deforestation" produces a bulleted inventory. "Analyse how deforestation creates feedback loops that accelerate climate change" produces a reasoned chain of causation. The subject matter is the same. The cognitive demand is completely different.

Mistake 2: Using 'discuss' without specifying the framework. 'Discuss the causes of the French Revolution' could mean anything from a Remember-level list to an Evaluate-level argument. Add the framework: "Discuss the relative importance of economic, social, and political causes of the French Revolution, reaching a supported conclusion."

Mistake 3: Forgetting to specify audience and constraints. A Create prompt without constraints produces generic output. The constraint is what forces specificity. "Write a lesson plan" produces a mediocre template. "Write a 50-minute lesson plan on fractions for a Year 6 class where six pupils have dyscalculia, using concrete resources before abstract notation" produces something genuinely useful.

Mistake 4: Accepting the first output. AI first drafts at higher Bloom's levels often start strongly but collapse into lists in the later paragraphs. Read the full output. If it reverts to Remember-level content partway through, add to your prompt: "Do not use bullet points or numbered lists. Maintain analytical prose throughout."

How to 'Level Up' Any Prompt

Take any prompt and add a Bloom's verb to shift the cognitive demand upwards. Here is the same topic at three levels:

Remember prompt: List the features of a river's upper course.
Pupils produce: A bulleted list of geographical terms.

Analyse prompt: Explain how the processes of erosion and deposition change as a river moves from its upper to lower course.
Pupils produce: A causal chain connecting gradient, velocity, energy, and landform change.

Evaluate prompt: To what extent is human intervention the primary cause of flooding in a river's lower course? Use geographical evidence to justify your answer.
Pupils produce: A balanced argument weighing human versus physical factors, with a supported conclusion.

The Structural Learning Approach

The Thinking Framework maps directly onto Bloom's revised taxonomy, giving you a practical classroom bridge between taxonomy theory and daily lesson design. The framework identifies eight cognitive operations, each of which corresponds to a specific Bloom's level:

Part-Whole (Understand): Identify the components of...
Sequence (Apply): Order the stages of... explaining how each leads to the next
Compare (Analyse): Compare and contrast... focusing on [criteria]
Classify (Analyse): Categorise these examples into groups, explaining your criteria
Cause and Effect (Analyse): Trace the chain of causes leading to...
Analogy (Understand / Analyse): Explain [concept] by analogy with something familiar to [year group]
Perspective (Evaluate): From the perspective of [stakeholder], evaluate the decision to...
Systems Thinking (Evaluate): Explain how [system] would respond to a change in [variable], including feedback effects

This mapping means that when you plan a lesson using the Thinking Framework, you already know which Bloom's level the cognitive work sits at, and you can write your AI prompts to match. The AI for Teachers article includes a live AI Prompt Builder widget that generates prompts aligned to each of these operations across ten subjects and five year groups.
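If you script your own prompt generation, the same mapping can be captured as data, so a lookup gives you the Bloom's level and a verb stem for any operation. The dictionary below simply restates the mapping above; it is an illustrative sketch, not the Prompt Builder widget's actual implementation.

```python
# Thinking Framework operation -> (Bloom's level, example prompt verb stem).
# Restates the mapping in the article; illustrative, not the live widget.
FRAMEWORK = {
    "Part-Whole": ("Understand", "Identify the components of..."),
    "Sequence": ("Apply", "Order the stages of... explaining how each leads to the next"),
    "Compare": ("Analyse", "Compare and contrast... focusing on [criteria]"),
    "Classify": ("Analyse", "Categorise these examples into groups, explaining your criteria"),
    "Cause and Effect": ("Analyse", "Trace the chain of causes leading to..."),
    "Analogy": ("Understand / Analyse", "Explain [concept] by analogy with something familiar to [year group]"),
    "Perspective": ("Evaluate", "From the perspective of [stakeholder], evaluate the decision to..."),
    "Systems Thinking": ("Evaluate", "Explain how [system] would respond to a change in [variable], including feedback effects"),
}

level, verb_stem = FRAMEWORK["Compare"]
print(f"{level}: {verb_stem}")
```

A lookup like this is enough to label every activity in a scheme of work with its Bloom's level before you write a single prompt.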

The significance of this alignment extends beyond lesson planning. Cognitive load theory tells us that learners can only process a limited amount of new information at once (Sweller, 1988). When your AI prompts are calibrated to the right Bloom's level for where your pupils are in their learning, you reduce the risk of generating material that either under-challenges or overwhelms them.

Using AI Prompts With Rosenshine's Principles

Rosenshine (2012) identified ten principles of instruction derived from research on effective teachers. Several of these map directly onto the Bloom's-based prompting approach described in this guide.

Principle 1 is daily review of previous learning. Remember-level AI prompts are excellent for generating the ten-question retrieval starters Rosenshine recommends. You can produce a week's worth of varied recall questions in under two minutes.

Principle 2 is presenting new material in small steps. Understand-level prompts help you generate clear, well-sequenced explanations of new concepts that you can annotate, adapt, or use as models. See Rosenshine's Principles for a fuller account of how each principle applies to lesson design.

Principle 6 is checking for student understanding regularly. Apply and Analyse prompts generate the kind of practice problems and discussion questions that give you real-time evidence of whether pupils have moved beyond surface knowledge. Combine these with the AI in lesson planning strategies to build a coherent sequence.

What to Try Next Lesson

Pick one topic you are teaching this week. Write prompts at three different Bloom's levels using the templates above. Run all three prompts and compare the outputs side by side.

You will notice three things. First, the AI outputs differ substantially in depth and complexity. Second, the higher-level prompts produce content that is harder to generate yourself from scratch but that your pupils genuinely need. Third, the outputs give you an immediate sense of which level your current lesson activities are actually sitting at.

If most of your current activities generate Remember-level AI outputs, your lessons may be spending too much time on recall and not enough on the thinking skills that build long-term understanding. That is not a failing; it is useful diagnostic information.

The next step is to take one Remember-level activity from your existing scheme of work and redesign it at Analyse level using the templates in this guide. Use AI tools for teachers to generate the new version and compare it against the original. The difference in cognitive demand will be visible immediately.

Further Reading: Key Research Papers

These five studies provide the evidence base for using Bloom's taxonomy to design higher-order AI prompts in classroom contexts.

A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

The definitive revision of Bloom's original taxonomy. Anderson and Krathwohl restructured the six levels, changed nouns to verbs, and repositioned Create above Evaluate. This revised framework is the basis for all six levels described in this guide and remains the standard reference for cognitive task classification in educational research.

Benjamin Bloom did not work alone. The original 1956 taxonomy was the product of a committee of educational psychologists convened at a University of Chicago conference in 1948, with Bloom as chair. The group's ambition was practical: they wanted a shared classification system that would allow examiners at different universities to compare the cognitive demands of their assessments. The taxonomy was never intended as a hierarchy of value, with Create superior to Remember. Bloom's committee understood that recall is the prerequisite for higher-order thinking, not its enemy. Anderson and Krathwohl's 2001 revision, which changed the category names from nouns to verbs (Knowledge became Remember, Comprehension became Understand), also added a second dimension: the Knowledge Dimension. This distinguishes between Factual knowledge (isolated facts and terminology), Conceptual knowledge (relationships between facts, categories and principles), Procedural knowledge (how to do something), and Metacognitive knowledge (awareness of one's own learning processes). AI prompts can and should target specific cells in this two-dimensional matrix, not just rows on the single-axis ladder most teachers picture.

Cognitive Load Theory

Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.

Sweller's foundational paper established that instructional design must account for working memory limits. Directly relevant to AI prompting: prompts that produce content pitched above pupils' current schema level create extraneous cognitive load without supporting learning. Calibrating prompts to the appropriate Bloom's level reduces this risk.

Principles of Instruction: Research-Based Strategies That All Teachers Should Know

Rosenshine, B. (2012). Principles of instruction. American Educator, 36(1), 12–19.

Rosenshine synthesised decades of observational classroom research into ten evidence-based principles. His emphasis on daily review, small-step presentation, and frequent checking for understanding maps directly onto the Remember, Understand, and Apply levels of Bloom's taxonomy, providing a practical bridge between cognitive taxonomy and instructional design.

Depth of Knowledge

Webb, N. L. (1997). Research monograph number 6: Criteria for alignment of expectations and assessments in mathematics and science education. Washington, DC: CCSSO.

Webb's Depth of Knowledge framework offers a complementary lens to Bloom's taxonomy, particularly useful for identifying whether an AI-generated task genuinely requires extended thinking (DOK 3–4) or merely gives the appearance of complexity. Using both frameworks together strengthens task calibration for higher-attaining pupils.

Retrieval Practice Produces More Learning than Elaborative Studying with Concept Mapping

Karpicke, J. D., & Blunt, J. R. (2011). Retrieval practice produces more learning than elaborative studying with concept mapping. Science, 331(6018), 772–775.

Karpicke and Blunt's study demonstrated that retrieval practice outperforms elaborative study strategies for long-term retention. This supports the use of Remember-level AI prompts for generating retrieval starters, while also explaining why teachers must then move pupils to higher Bloom's levels to build the schema depth that makes retrieval meaningful over time.



EdTechTeacher and similar sites offer generic "AI prompt templates" without cognitive scaffolding. They give teachers starter phrases but no framework for knowing whether those phrases produce the right level of thinking. The result is that teachers get AI outputs calibrated to Year 7 recall tasks even when they are preparing Year 11 evaluation questions.

Understanding Bloom's taxonomy is the first step. Using it to write AI prompts is the practical application that actually saves you time in the classroom.

The key rule is simple: name the cognitive operation in your prompt. Do not say "write about photosynthesis." Say "Analyse the relationship between light intensity and the rate of photosynthesis, identifying the limiting factors at each stage."

Remember Level Prompts

Remember is the foundation of all learning. It involves retrieving facts, definitions, and basic information from memory. Bloom (1956) placed this at the base of the taxonomy not because it is unimportant, but because everything else depends on it. Retrieval practice research confirms that deliberately retrieving information strengthens long-term memory storage.

AI is very good at Remember-level tasks. The challenge is that many teachers stop here without realising they have done so.

When to use: Lesson starters, vocabulary checks, knowledge recall before a new topic, and low-stakes quizzes. Remember prompts are also useful for generating revision resources.

Prompt templates:

Template: "List the key events of [topic] in chronological order." Example (Year 7 History): "List the key events of the Norman Conquest in chronological order."
Template: "Define the term [concept] as it is used in [subject]." Example (Year 9 Science): "Define the term 'osmosis' as it is used in biology."
Template: "Name the [number] key [facts/terms/dates] associated with [topic]." Example (Year 8 Geography): "Name the five key physical features of a meander."
Template: "Recall what [person/group] did during [event/period]." Example (Year 10 History): "Recall what the suffragettes did during the campaign for votes for women, 1905–1914."

Worked example: Prompt: "List the key events of the Norman Conquest in chronological order." AI output: A numbered timeline from the death of Edward the Confessor in January 1066 through Harold's coronation, the battles of Gate Fulford and Stamford Bridge, Hastings on 14 October 1066, and William's coronation on Christmas Day. Clean, factual, appropriately pitched for Year 7.

Teacher tip: Use Remember prompts to generate question banks for formative assessment starters. Ask the AI to produce 10 retrieval questions on the topic, then pick the five that match your lesson objectives.

Understand Level Prompts

Understanding requires constructing meaning from information, not just recalling it. A pupil who can list the causes of World War One but cannot explain the relationship between them is operating at Remember, not Understand. Anderson and Krathwohl (2001) describe Understand-level tasks as those involving interpretation, exemplification, classification, summarisation, inference, comparison, and explanation.

The cognitive jump from Remember to Understand is where many AI prompts stall. Teachers ask for a "summary" but do not specify that the summary should show how ideas connect.

When to use: After introducing a new concept, when checking comprehension before a more complex task, and when pupils need to process information in their own words. Understand prompts are effective for generating model texts that show pupils how to explain ideas.

Prompt templates:

Template: "Explain in your own words how [concept/process] works." Example (Year 5 Science): "Explain in your own words how the water cycle works."
Template: "Summarise the relationship between [A] and [B]." Example (Year 9 English): "Summarise the relationship between Macbeth and Lady Macbeth at the start of Act 1."
Template: "Paraphrase [concept] so that a [year group] pupil could understand it." Example (Year 6 Maths): "Paraphrase the concept of equivalent fractions so that a Year 6 pupil could understand it."
Template: "Give three examples that illustrate [concept/principle]." Example (Year 8 RE): "Give three examples that illustrate the Buddhist concept of impermanence."

Worked example: Prompt: "Explain in your own words how the water cycle works for a Year 5 class." AI output: A clear three-paragraph explanation covering evaporation from oceans and lakes, condensation as water vapour rises and cools to form clouds, and precipitation as rain or snow returning to the surface. The language is appropriate for 9–10 year olds without being patronising. This is a strong model text for pupils to annotate or adapt.

Teacher tip: Pair Understand prompts with metacognitive questioning. After the AI generates an explanation, ask pupils to identify which parts they found surprising and which parts they already knew. This activates prior knowledge and shows you where genuine understanding gaps exist.

Apply Level Prompts

Application involves using knowledge and procedures to carry out tasks in new situations. This is where abstract concepts become concrete tools. Maths word problems are the classic Apply-level task: the pupil knows the formula but must recognise when and how to deploy it in an unfamiliar context.

Apply-level AI prompts generate worked examples, practice problems, and scenarios that require pupils to use what they know. The key prompt move is to specify the context that makes the knowledge application non-trivial.

When to use: After teaching a concept and before assessing it independently. Apply prompts are particularly effective for generating differentiated practice problems and real-world scenarios. They sit naturally within scaffolding sequences where pupils move from teacher demonstration to guided practice.

Prompt templates:

Template: "Use [concept/formula/rule] to solve this problem: [problem context]." Example (Year 9 Maths): "Use Pythagoras' theorem to solve this problem: a builder needs to cut a diagonal support beam across a rectangular doorframe that is 2.1m tall and 0.9m wide. How long should the beam be?"
Template: "Demonstrate how [concept] would be applied in [real-world scenario]." Example (Year 10 Business): "Demonstrate how the concept of supply and demand would apply to a bakery that introduces a new sourdough loaf during a local food festival."
Template: "Write a worked example that shows how to [process] step by step." Example (Year 7 Maths): "Write a worked example showing how to find the area of a compound shape step by step, using a real-life L-shaped floor plan."
Template: "Create three practice problems that require pupils to apply [concept] in different contexts." Example (Year 8 Science): "Create three practice problems that require pupils to apply their knowledge of density (mass ÷ volume) in different real-world contexts."

Worked example: Prompt: "Use Pythagoras' theorem to solve this: a builder needs to cut a diagonal beam across a doorframe that is 2.1m tall and 0.9m wide. How long should the beam be?" AI output: A fully worked solution with the formula stated, values substituted (2.1² + 0.9² = 4.41 + 0.81 = 5.22), square root calculated (approximately 2.28m), and a sentence contextualising the answer. This is a ready-to-use modelling resource.
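Before using the worked example as a modelling resource, the arithmetic is quick to verify. Here is a short Python check (illustrative only, not part of the AI output):

```python
import math

# Verify the Pythagoras worked example:
# beam^2 = height^2 + width^2 for a 2.1 m by 0.9 m doorframe.
height, width = 2.1, 0.9
sum_of_squares = height**2 + width**2   # 4.41 + 0.81 = 5.22
beam = math.sqrt(sum_of_squares)        # length of the diagonal beam

print(round(sum_of_squares, 2))  # 5.22
print(round(beam, 2))            # 2.28
```

The same three lines, with different values substituted, will sanity-check any AI-generated Pythagoras problem before it reaches the board.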

Teacher tip: Apply prompts pair well with Rosenshine's (2012) principle of modelling. Use the AI output as your 'I do' demonstration, then give pupils a similar problem to try as their 'we do.' The AI has done the heavy lifting of creating a well-structured worked example; you focus on the live explanation.

Analyse Level Prompts

Analysis involves breaking material into its component parts, determining how the parts relate to each other, and identifying how they relate to an overall structure or purpose. This is where higher-order thinking begins in earnest. Anderson and Krathwohl (2001) identify three cognitive processes at this level: differentiating, organising, and attributing.

Analyse-level AI prompts require you to specify both the object of analysis and the analytical framework. "Analyse this poem" is too vague. "Analyse how Wilfred Owen uses imagery in 'Dulce et Decorum Est' to challenge the idea that war is glorious" is precise enough to produce a substantive response.

When to use: When pupils need to go beyond surface-level understanding, when preparing for extended writing tasks, and when teaching pupils to think in disciplinary ways. This is the natural level for higher-order thinking activities in Years 9–13.

Analyse vs Understand: Knowing the Difference

Understand: "Summarise the causes of World War One." Analyse: "Explain how the alliance system transformed a regional dispute into a world war."
Understand: "Explain what a food web shows." Analyse: "Analyse what happens to the food web if the population of a top predator collapses."
Understand: "Describe the features of a Shakespearean soliloquy." Analyse: "Identify how Shakespeare uses Hamlet's soliloquy to reveal the contrast between thought and action."
Understand: "Explain how a market economy works." Analyse: "Break down how the 2008 financial crisis exposed structural weaknesses in deregulated markets."

Prompt templates:

Template: "Compare and contrast [A] and [B], focusing on [specific criteria]." Example (Year 10 English): "Compare and contrast the way power is presented in 'My Last Duchess' and 'Ozymandias', focusing on the relationship between the speaker and their subject."
Template: "What are the causes and effects of [event/phenomenon]? Organise your answer by [short-term/long-term or direct/indirect]." Example (Year 9 History): "What are the causes and effects of the Industrial Revolution? Organise by short-term and long-term effects on working-class life."
Template: "Break down how [process/system] works by identifying its component parts and the function of each." Example (Year 10 Science): "Break down how the human immune system works by identifying its component parts and the function of each in fighting bacterial infection."
Template: "Identify the assumptions underlying [argument/policy/text]. Which assumptions are most open to challenge?" Example (Year 11 Economics): "Identify the assumptions underlying the case for free trade. Which assumptions are most open to challenge in the context of developing economies?"

Worked example: Prompt: "Compare and contrast how power is presented in 'My Last Duchess' and 'Ozymandias', focusing on the relationship between the speaker and their subject." AI output: A structured comparative response identifying the dramatic monologue form in Browning as presenting power through control and possessiveness, while Shelley uses the external narrator and ironic juxtaposition to critique the illusion of permanence. The response organises points by criterion rather than poem-by-poem, which models the analytical approach required by GCSE mark schemes.

Teacher tip: Webb's (1997) Depth of Knowledge framework suggests that Analyse-level tasks (DOK level 3) require extended thinking with multiple steps. When using AI to generate analysis tasks for pupils, specify the number of steps in your prompt to get appropriately demanding content. See Webb's Depth of Knowledge for a fuller guide to task calibration.

Evaluate Level Prompts

Evaluation requires making judgements based on criteria and standards. This is where pupils must weigh evidence, consider multiple perspectives, and defend a position. Anderson and Krathwohl (2001) distinguish two cognitive processes: checking (testing for internal consistency) and critiquing (judging against external criteria).

Evaluate prompts are where most teachers find AI genuinely useful for lesson preparation. Generating model arguments and counter-arguments, constructing mark-scheme-aligned responses, and producing "to what extent" essay frames all sit at this level.

When to use: GCSE and A-level preparation, essay writing workshops, debate preparation, and teaching pupils to construct arguments. Evaluate prompts are also effective for staff development: generating balanced perspectives on teaching approaches before a department discussion.

Prompt templates:

Template: "To what extent do you agree that [claim]? Argue both sides, then reach a justified conclusion." Example (Year 11 Geography): "To what extent do you agree that economic development is the most important factor in reducing a country's birth rate? Argue both sides, then reach a justified conclusion."
Template: "What are the strengths and limitations of [approach/theory/policy] when applied to [context]?" Example (Year 12 Psychology): "What are the strengths and limitations of the behaviourist approach when applied to explaining phobias in adults?"
Template: "Judge whether [decision/action/policy] was justified, using [criteria] as your evaluative framework." Example (Year 10 History): "Judge whether Chamberlain's policy of appeasement was justified, using the evidence available to British policymakers in 1938 as your evaluative framework."
Template: "Critique [text/argument/model] by identifying its strongest claim and its most significant weakness." Example (Year 11 English Language): "Critique this opinion article by identifying its strongest rhetorical technique and its most significant logical weakness."

Worked example: Prompt: "To what extent do you agree that economic development is the most important factor in reducing birth rate? Argue both sides, then reach a justified conclusion." AI output: A structured four-paragraph response with a thesis, two developed counter-arguments (education of women, government population policy), a concession that economic development often drives these factors indirectly, and a conclusion qualifying the extent of agreement. This matches the structure required for a high-band GCSE geography response.

Teacher tip: Generate two versions of the same Evaluate prompt: one arguing strongly for the position and one arguing against. Use these as paired texts in a questioning sequence to help pupils identify the analytical moves that distinguish a well-supported argument from a weak one.

Create Level Prompts

Create is the highest level of Bloom's revised taxonomy. It requires putting elements together to form a coherent or functional whole, or reorganising existing elements into a new pattern or structure. This is not simply making something: it involves generating ideas, planning an approach, and producing work that represents novel thinking.

AI at the Create level is most useful as a collaborator, not a producer. The best Create prompts use AI to generate constraints, criteria, or starting points that pupils then work with. Giving pupils the AI's first attempt and asking them to improve it also sits firmly at Create level.

When to use: Extended projects, cross-curricular work, design tasks, and any assessment where synthesis is required. Create-level prompts also support differentiation: higher-attaining pupils can be given more open Create briefs while others work with more constraints.

Prompt templates:

Template: "Design a [product/system/solution] that addresses [problem], specifying [constraints]." Example (Year 8 DT): "Design a packaging solution for a fragile product that uses only recycled materials, must protect the item during postal delivery, and must be assembled without tools or adhesives."
Template: "Write a [genre] piece that demonstrates [concept/technique], aimed at [audience]." Example (Year 9 English): "Write the opening of a Gothic short story that demonstrates the use of pathetic fallacy, foreshadowing, and an unreliable narrator, aimed at a Year 9 reading level."
Template: "Propose a solution to [problem] that integrates knowledge from [subject area 1] and [subject area 2]." Example (Year 10 cross-curricular): "Propose a solution to food insecurity in sub-Saharan Africa that integrates knowledge from geography (climate, water access) and science (crop modification, soil chemistry)."
Template: "Construct a [argument/model/experiment/plan] that [achieves goal], explaining the reasoning behind each decision." Example (Year 12 Biology): "Construct an experimental design to test whether increasing CO2 concentration affects the rate of photosynthesis in pondweed, explaining the reasoning behind each methodological decision."

Worked example: Prompt: "Write the opening of a Gothic short story that demonstrates pathetic fallacy, foreshadowing, and an unreliable narrator, aimed at Year 9 reading level." AI output: A 200-word opening paragraph with a storm-lashed Victorian house (pathetic fallacy), a narrator who repeatedly reassures themselves that 'everything is perfectly ordinary' (unreliable narrator), and a detail about a locked door that rattles for 'no reason that could be explained' (foreshadowing). Teachers can use this as a model text, a text to annotate, or a starting point pupils improve and extend.

Teacher tip: The most effective use of AI at Create level is generating the brief, not the final product. Ask AI to produce three different design briefs for a product, three different essay titles at different levels of difficulty, or three alternative starting points for a creative piece. Then pupils choose and execute one. This keeps the creative decision-making with the learner.

Common Mistakes When Prompting AI

Most AI prompting mistakes in classrooms come down to a mismatch between what the teacher wants and the cognitive level implied by the prompt wording.

Mistake 1: Asking for lists when you need analysis. "List the effects of deforestation" produces a bulleted inventory. "Analyse how deforestation creates feedback loops that accelerate climate change" produces a reasoned chain of causation. The subject matter is the same. The cognitive demand is completely different.

Mistake 2: Using 'discuss' without specifying the framework. 'Discuss the causes of the French Revolution' could mean anything from a Remember-level list to an Evaluate-level argument. Add the framework: "Discuss the relative importance of economic, social, and political causes of the French Revolution, reaching a supported conclusion."

Mistake 3: Forgetting to specify audience and constraints. A Create prompt without constraints produces generic output. The constraint is what forces specificity. "Write a lesson plan" produces a mediocre template. "Write a 50-minute lesson plan on fractions for a Year 6 class where six pupils have dyscalculia, using concrete resources before abstract notation" produces something genuinely useful.

Mistake 4: Accepting the first output. AI first drafts at higher Bloom's levels often start strongly but collapse into lists in the later paragraphs. Read the full output. If it reverts to Remember-level content partway through, add to your prompt: "Do not use bullet points or numbered lists. Maintain analytical prose throughout."

How to 'Level Up' Any Prompt

Take any prompt and add a Bloom's verb to shift the cognitive demand upwards. Here is the same topic at three levels:

Remember: "List the features of a river's upper course." Pupils produce a bulleted list of geographical terms.
Analyse: "Explain how the processes of erosion and deposition change as a river moves from its upper to lower course." Pupils produce a causal chain connecting gradient, velocity, energy, and landform change.
Evaluate: "To what extent is human intervention the primary cause of flooding in a river's lower course? Use geographical evidence to justify your answer." Pupils produce a balanced argument weighing human versus physical factors, with a supported conclusion.
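If you build levelled prompts like this often, the pattern is mechanical enough to script. The sketch below assembles a prompt from a Bloom's level and topic details; the function name and verb stems are illustrative, drawn loosely from the templates in this guide, not a fixed API:

```python
# Illustrative prompt-builder: maps a Bloom's level to a verb stem,
# then fills in the topic details. Stems loosely follow this guide.
BLOOM_STEMS = {
    "remember": "List the key features of {topic}.",
    "understand": "Explain in your own words how {topic} works.",
    "apply": "Write a worked example showing how to use {topic} step by step.",
    "analyse": "Analyse how the component parts of {topic} relate to each other.",
    "evaluate": (
        "To what extent do you agree that {claim}? "
        "Argue both sides, then reach a justified conclusion."
    ),
    "create": "Design a solution that addresses {problem}, specifying your constraints.",
}

def build_prompt(level: str, **details: str) -> str:
    """Return a prompt pitched at the named Bloom's level."""
    return BLOOM_STEMS[level.lower()].format(**details)

print(build_prompt("Analyse", topic="a river's upper course"))
# Analyse how the component parts of a river's upper course relate to each other.
```

Swapping only the first argument shifts the cognitive demand while the topic stays constant, which is exactly the "level up" move shown above.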

The Structural Learning Approach

The Thinking Framework maps directly onto Bloom's revised taxonomy, giving you a practical classroom bridge between taxonomy theory and daily lesson design. The framework identifies eight cognitive operations, each corresponding to one or more Bloom's levels:

Part-Whole (Understand): "Identify the components of..."
Sequence (Apply): "Order the stages of..., explaining how each leads to the next."
Compare (Analyse): "Compare and contrast..., focusing on [criteria]."
Classify (Analyse): "Categorise these examples into groups, explaining your criteria."
Cause and Effect (Analyse): "Trace the chain of causes leading to..."
Analogy (Understand / Analyse): "Explain [concept] by analogy with something familiar to [year group]."
Perspective (Evaluate): "From the perspective of [stakeholder], evaluate the decision to..."
Systems Thinking (Evaluate): "Explain how [system] would respond to a change in [variable], including feedback effects."
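For teachers who track this mapping digitally, for example in a planning spreadsheet or script, the table above reduces to a simple lookup. The names below are illustrative, not an official Structural Learning API:

```python
# The Thinking Framework -> Bloom's level mapping from the table above.
# Dictionary and function names are illustrative only.
OPERATION_TO_BLOOM = {
    "Part-Whole": ("Understand",),
    "Sequence": ("Apply",),
    "Compare": ("Analyse",),
    "Classify": ("Analyse",),
    "Cause and Effect": ("Analyse",),
    "Analogy": ("Understand", "Analyse"),
    "Perspective": ("Evaluate",),
    "Systems Thinking": ("Evaluate",),
}

def bloom_levels(operation: str) -> tuple[str, ...]:
    """Return the Bloom's level(s) a Thinking Framework operation sits at."""
    return OPERATION_TO_BLOOM[operation]

print(bloom_levels("Systems Thinking"))  # ('Evaluate',)
print(bloom_levels("Analogy"))           # ('Understand', 'Analyse')
```

Tagging each planned activity with its operation then tells you at a glance which Bloom's levels a lesson sequence actually covers.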

This mapping means that when you plan a lesson using the Thinking Framework, you already know which Bloom's level the cognitive work sits at, and you can write your AI prompts to match. The AI for Teachers article includes a live AI Prompt Builder widget that generates prompts aligned to each of these operations across ten subjects and five year groups.

The significance of this alignment extends beyond lesson planning. Cognitive load theory tells us that learners can only process a limited amount of new information at once (Sweller, 1988). When your AI prompts are calibrated to the right Bloom's level for where your pupils are in their learning, you reduce the risk of generating material that either under-challenges or overwhelms them.

Using AI Prompts With Rosenshine's Principles

Rosenshine (2012) identified ten principles of instruction derived from research on effective teachers. Several of these map directly onto the Bloom's-based prompting approach described in this guide.

Principle 1 is daily review of previous learning. Remember-level AI prompts are excellent for generating the ten-question retrieval starters Rosenshine recommends. You can produce a week's worth of varied recall questions in under two minutes.

Principle 2 is presenting new material in small steps. Understand-level prompts help you generate clear, well-sequenced explanations of new concepts that you can annotate, adapt, or use as models. See Rosenshine's Principles for a fuller account of how each principle applies to lesson design.

Principle 6 is checking for student understanding regularly. Apply and Analyse prompts generate the kind of practice problems and discussion questions that give you real-time evidence of whether pupils have moved beyond surface knowledge. Combine these with the AI in lesson planning strategies to build a coherent sequence.

What to Try Next Lesson

Pick one topic you are teaching this week. Write prompts at three different Bloom's levels using the templates above. Run all three prompts and compare the outputs side by side.

You will notice three things. First, the AI outputs differ substantially in depth and complexity. Second, the higher-level prompts produce content that is harder to generate yourself from scratch but that your pupils genuinely need. Third, the outputs give you an immediate sense of which level your current lesson activities are actually sitting at.

If most of your current activities generate Remember-level AI outputs, your lessons may be spending too much time on recall and not enough on the thinking skills that build long-term understanding. That is not a failing; it is useful diagnostic information.

The next step is to take one Remember-level activity from your existing scheme of work and redesign it at Analyse level using the templates in this guide. Use AI tools for teachers to generate the new version and compare it against the original. The difference in cognitive demand will be visible immediately.

Further Reading: Key Research Papers

These five studies provide the evidence base for using Bloom's taxonomy to design higher-order AI prompts in classroom contexts.

A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives
Anderson & Krathwohl, 2001

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

The definitive revision of Bloom's original taxonomy. Anderson and Krathwohl restructured the six levels, changed nouns to verbs, and repositioned Create above Evaluate. This revised framework is the basis for all six levels described in this guide and remains the standard reference for cognitive task classification in educational research.

Benjamin Bloom did not work alone. The original 1956 taxonomy was the product of a committee of educational psychologists convened at a University of Chicago conference in 1948, with Bloom as chair. The group's ambition was practical: they wanted a shared classification system that would allow examiners at different universities to compare the cognitive demands of their assessments. The taxonomy was never intended as a hierarchy of value, with Create superior to Remember. Bloom's committee understood that recall is the prerequisite for higher-order thinking, not its enemy.

Anderson and Krathwohl's 2001 revision, which changed the category names from nouns to verbs (Knowledge became Remember, Comprehension became Understand), also added a second dimension: the Knowledge Dimension. This distinguishes between Factual knowledge (isolated facts and terminology), Conceptual knowledge (relationships between facts, categories and principles), Procedural knowledge (how to do something), and Metacognitive knowledge (awareness of one's own learning processes). AI prompts can and should target specific cells in this two-dimensional matrix, not just rows on the single-axis ladder most teachers picture.

Cognitive Load Theory
Sweller, 1988

Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2), 257–285.

Sweller's foundational paper established that instructional design must account for working memory limits. Directly relevant to AI prompting: prompts that produce content pitched above pupils' current schema level create extraneous cognitive load without supporting learning. Calibrating prompts to the appropriate Bloom's level reduces this risk.

Principles of Instruction: Research-Based Strategies That All Teachers Should Know
Rosenshine, 2012

Rosenshine, B. (2012). Principles of instruction. American Educator, 36(1), 12–19.

Rosenshine synthesised decades of observational classroom research into ten evidence-based principles. His emphasis on daily review, small-step presentation, and frequent checking for understanding maps directly onto the Remember, Understand, and Apply levels of Bloom's taxonomy, providing a practical bridge between cognitive taxonomy and instructional design.

Depth of Knowledge
Webb, 1997

Webb, N. L. (1997). Research monograph number 6: Criteria for alignment of expectations and assessments in mathematics and science education. Washington, DC: CCSSO.

Webb's Depth of Knowledge framework offers a complementary lens to Bloom's taxonomy, particularly useful for identifying whether an AI-generated task genuinely requires extended thinking (DOK 3–4) or merely gives the appearance of complexity. Using both frameworks together strengthens task calibration for higher-attaining pupils.

Retrieval Practice Produces More Learning than Elaborative Studying with Concept Mapping
Karpicke & Blunt, 2011

Karpicke, J. D., & Blunt, J. R. (2011). Retrieval practice produces more learning than elaborative studying with concept mapping. Science, 331(6018), 772–775.

Karpicke and Blunt's study demonstrated that retrieval practice outperforms elaborative study strategies for long-term retention. This supports the use of Remember-level AI prompts for generating retrieval starters, while also explaining why teachers must then move pupils to higher Bloom's levels to build the schema depth that makes retrieval meaningful over time.
