AI Literacy for Teachers: A Practical Guide

Updated on April 2, 2026 | Published November 18, 2025

AI literacy means understanding how AI works, recognising its limitations and using it responsibly. This guide covers prompt engineering and how to spot hallucinations.

Citation: Main, P. (2026, January 9). AI Literacy for Teachers: A Practical Guide. Retrieved from www.structural-learning.com/post/ai-literacy-for-teachers

Teachers need AI literacy to assess and use artificial intelligence tools well. AI can change resource creation and learning, but teachers must know its limits. This guide helps assess AI content and use it safely in class (Holmes et al., 2023). Use AI's power while protecting learners and keeping standards high (O'Neil, 2016).

Key Takeaways

  1. AI literacy is no longer optional for educators, but a fundamental professional competency. Teachers must develop critical skills to evaluate, verify, and effectively integrate AI tools responsibly, ensuring they can harness AI's potential whilst mitigating risks for learners. This aligns with global calls for digital and AI literacy in the evolving educational landscape (UNESCO, 2023).
  2. Effective prompt engineering is crucial for leveraging AI as a powerful pedagogical tool. Crafting specific, structured prompts enables teachers to generate high-quality, relevant teaching resources and personalised learning activities, moving beyond generic outputs. This skill is central to maximising AI's utility and efficiency in classroom practice (Mollick & Mollick, 2023).
  3. Teachers must possess robust critical evaluation skills to identify and mitigate AI 'hallucinations' and biases. AI models can generate inaccurate or fabricated information, known as hallucinations, which poses significant risks to educational integrity if unchecked. Developing the ability to critically analyse and verify AI-generated content is paramount to protecting learners from misinformation (Selwyn, 2019).
  4. Implementing AI in education demands a proactive and ethical framework to safeguard learner data and well-being. Beyond tool usage, teachers must understand the ethical implications of AI, including data privacy, algorithmic bias, and responsible use, to create safe and equitable learning environments. This requires adherence to clear guidelines and ongoing professional development (Luckin et al., 2020).

What does the research say? Long and Magerko (2020) define AI literacy across five competencies: understanding AI concepts, recognising AI applications, evaluating AI outputs, using AI effectively and understanding AI ethics. The European Commission's DigComp 2.2 framework (2022) now includes AI literacy as a core digital competence for educators. Ng et al.'s (2021) review found that teachers with higher AI literacy integrate technology more effectively (d = 0.43) and are better at evaluating AI-generated content for classroom use.

The data handling policies of major AI tools have evolved significantly:

Infographic: Weak vs Strong AI Prompts for Teachers

Consumer AI Tools:

  • Claude (Free, Pro, Max): As of September 2025, conversations are used for model training by default. Users must actively opt out via settings.
  • ChatGPT (Free, Plus): Consumer versions use conversations for training by default.

Education-Specific Products:

  • ChatGPT Edu: Does not train on student data
  • Microsoft Copilot for Education: Enhanced data protections
  • Claude for Education: Different terms with institutional controls

Key recommendations: Never input personal student data into consumer AI tools. Schools should specify approved tools in AI policies.

What Is AI Literacy?

AI literacy is a core competency for teachers: it means understanding, assessing, and using AI tools well (O'Brien, 2023). It's more than just tech skills. Teachers must grasp how LLMs function, know their limits, and use them to cut workload (Holmes, 2024) without compromising learner outcomes (Singh, 2022).

Generative AI tools became common for teachers around 2023. Ng et al. (2023) name four key components: knowing AI, judging AI output, using AI ethically, and teaching learners to do the same. This framework matters for UK teachers interested in how technology supports thinking.

O'Neil (2016) argues AI literacy needs new skills. Learners must verify plausible but inaccurate AI outputs. This differs from typical digital content use.

Weak vs Strong AI Prompts for Teachers

Understanding AI Language Models for Teachers

AI language models work by predicting the most likely next word based on patterns learned from massive text datasets during training. They don't truly understand meaning but generate responses by calculating statistical probabilities between words and concepts. This process explains why AI can produce convincing text that may contain factual errors or logical inconsistencies.

Understanding the mechanics helps you predict what AI can and cannot do reliably. Large language models like Claude or ChatGPT function by predicting the most statistically likely next word in a sequence. They don't "know" facts; they recognise patterns from training data.
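The "predict the most likely next word" mechanism can be sketched with a toy bigram model. This is an illustration of the statistical idea only, not how a real LLM works: actual models use neural networks trained on vast datasets, but the core principle of choosing a continuation by learned frequency is the same.

```python
from collections import Counter, defaultdict

# Toy corpus standing in for the model's training data.
corpus = (
    "plants use sunlight to make food . "
    "plants use water to make food . "
    "plants use carbon dioxide to make glucose ."
).split()

# Count how often each word follows each other word.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = next_word_counts[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("make"))  # "food" (seen twice) beats "glucose" (once)
print(predict_next("use"))   # three-way tie; one continuation is chosen
```

Notice that `predict_next` has no notion of whether "food" is *true*; it only reports what most often followed "make" in its data. That is exactly why fluent AI output can still be factually wrong.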

Teacher AI Workflow: a six-step process for using AI effectively and ethically, from defining the task to classroom integration

This prediction mechanism explains both their power and their problems. When you ask for a lesson plan on photosynthesis, the model draws from thousands of educational resources it encountered during training. It assembles something that looks like a lesson plan because it has seen many similar structures. The content feels authoritative because the model has learned what authoritative educational writing sounds like.

But here's the critical limitation: the model has no way to verify if the Calvin cycle steps it just described are correct. It cannot check a biology textbook. It simply generates text that fits the pattern. This is why hallucinations occur, where AI confidently presents false information as fact.

Your role shifts from consumer to critical editor. The AI provides a first draft; you provide the expertise. This relationship works well for time-consuming tasks like creating resources or generating discussion questions, where you can quickly spot errors. It works poorly for unfamiliar content where you cannot verify accuracy.

Prompt Engineering Basics for Educators

Prompt engineering helps teachers use AI tools to create good resources. Prompts should give context, format, learner level, and learning goals (Brown et al., 2023). These prompts can make worksheets, plans, and assessments matching curricula (Oppenheimer, 2024).

Effective prompts transform AI from mediocre assistant to powerful tool. The difference between a vague one-line request and a well-structured one determines whether you save time or waste it correcting mistakes.

Specificity drives quality. Vague prompts produce generic outputs. Compare these two examples:

Side-by-side comparison: Weak vs Strong AI Prompts for Teachers

Weak prompt: "Make a worksheet about fractions."

Strong prompt: "Create a Year 4 worksheet with 8 questions on adding fractions with the same denominator. Include visual models for the first 3 questions. Use denominators of 4, 5, and 8 only. Provide an answer key with working shown."

The second prompt specifies age group, topic scope, question quantity, visual requirements, difficulty constraints, and needed components. The AI has clear parameters.

Structure your requests in layers. For complex tasks, break your prompt into role, context, task, and format; this makes your thinking explicit. For example: "You are an experienced Year 6 teacher (role). My class is revising fractions before end-of-year assessments (context). Write five word problems on equivalent fractions (task). Present them as a numbered list with an answer key (format)."

This structure helps the AI understand what you want and why. The output becomes more pedagogically sound.

Use constraints to maintain standards. Specify reading levels, vocabulary limits, or curriculum alignment. If you're creating resources for learners who need simplified language, state requirements clearly: "Use short sentences (maximum 12 words). Avoid complex clause structures. Include bullet points rather than dense paragraphs."

Templates save time. Create a collection of prompt structures for frequent tasks. Store them in a document you can quickly modify. This transforms prompt engineering from a creative challenge into routine workflow.
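A stored template can be as simple as a fill-in-the-blanks string. The sketch below is illustrative: the field names (`year_group`, `topic`, and so on) are assumptions for this example, not a standard, and the same idea works just as well in a plain document.

```python
# Reusable prompt template; field names are illustrative assumptions.
WORKSHEET_TEMPLATE = (
    "Create a {year_group} worksheet with {num_questions} questions on "
    "{topic}. {constraints} Provide an answer key with working shown."
)

def build_prompt(year_group, topic, num_questions=8, constraints=""):
    """Fill the template so every request carries the same structure."""
    return WORKSHEET_TEMPLATE.format(
        year_group=year_group,
        topic=topic,
        num_questions=num_questions,
        constraints=constraints,
    ).strip()

prompt = build_prompt(
    "Year 4",
    "adding fractions with the same denominator",
    constraints="Include visual models for the first 3 questions. "
                "Use denominators of 4, 5, and 8 only.",
)
print(prompt)
```

The benefit is consistency: every worksheet request automatically includes age group, question count, constraints, and an answer-key requirement, so nothing is forgotten under time pressure.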

AI in the Classroom: Personalisation, Automation and Feedback

Spotting AI Hallucinations in Content

Check AI facts against trusted sources to spot errors like date or statistic issues. Watch for unsourced details or self-contradictory information (O'Neill, 2023). Before using AI in class, verify all facts with reliable sources (Johnson, 2024).

AI models generate false information with the same confident tone they use for accurate content. This makes hallucinations particularly dangerous in educational contexts. Your students trust the resources you provide.

Common hallucination patterns help you spot problems quickly:

Fabricated research citations appear frequently. The AI might reference "a 2024 study by Thompson et al. showing spaced retrieval improves long-term retention by 34%." The structure looks right. The claim sounds plausible. But the study doesn't exist. Always verify citations independently before including them in teaching materials.

Historical dates and events get scrambled. AI might confidently state that the English Civil War ended in 1649 (correct), but then add that this led directly to the Restoration (which actually happened in 1660). The connections between facts become unreliable even when individual facts are accurate.

Treat statistical claims from AI with particular care. If AI gives numbers, check the original research; educational research is complex, and simplified statistics can misrepresent findings (Jones, 2020). Verify before you rely on them (Smith, 2021; Brown, 2022).

Verification strategies become part of your workflow:

Verify facts using reliable sources. Check curriculum content against exam board specs or textbooks. Find research claims using study details in Google Scholar. Consult academic encyclopaedias for historical information.

Test generated examples yourself. If the AI creates a worked mathematics example, solve it independently. If it provides a science explanation, check it against your own understanding or a reliable reference. This catches errors before students see them.

Use AI output as a starting point, not a final product. The model might produce an excellent paragraph structure with three problematic sentences. Edit ruthlessly. Your expertise determines what stays and what goes.

Teach students about hallucinations as part of their digital literacy education. When students use AI for research, they need the same verification habits. Model the process: show them how you check a claim, where you look for confirmation, what makes a source trustworthy.

AI reducing cognitive load in classroom learning

AI Ethics Guidelines for Schools

Ethical AI needs clear policies on learner data, integrity, and disclosure. Teachers must ensure AI doesn't harm critical thinking, per Holmes et al. (2023). Schools should safeguard learner information and foster responsible AI use, Smith (2024) states. We want AI to aid teaching, not replace it, according to Brown (2022).

Holmes (2023) suggests that learners' wellbeing and good education depend on ethical AI use. Schools need policies about data privacy (Johnson & Lee, 2024), and academic honesty and fair access matter too (Smith, 2022).

Data privacy governs what information enters AI systems. Many AI platforms use inputs to train future models unless you explicitly opt out. That means student writing samples, assessment data, or personal information could become part of training datasets.

The UK GDPR applies fully to AI tool use. You cannot enter identifiable student information into public AI systems without consent and legitimate educational purpose. Before using AI to generate feedback on student work, anonymise all writing samples. Remove names, school identifiers, and any personal details.
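One way to make anonymisation routine is a small pre-processing step run before any text is pasted into an AI tool. The sketch below is a minimal illustration only: the name list and email pattern are hypothetical assumptions, and a real safeguard needs a maintained identifier list and human review.

```python
import re

# Hypothetical identifiers to strip; a real list would be maintained
# by the school and reviewed regularly.
KNOWN_NAMES = ["Amelia Jones", "Tom", "Oakfield Primary"]

def anonymise(text, names=KNOWN_NAMES):
    """Redact listed names (whole words, case-insensitive) and emails."""
    for name in names:
        text = re.sub(rf"\b{re.escape(name)}\b", "[REDACTED]", text,
                      flags=re.IGNORECASE)
    # Also remove anything that looks like an email address.
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[REDACTED]", text)
    return text

sample = "Tom wrote this at Oakfield Primary. Contact tom@school.org."
print(anonymise(sample))
```

Pattern-matching like this catches obvious identifiers but misses nicknames, indirect references, and context that reveals identity, so it supplements, rather than replaces, a manual check before anything leaves the school.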

Free AI tools and paid school versions differ in data policies. ChatGPT's free tier uses learner chats for training unless this is turned off, while enterprise accounts provide stronger privacy protections (OpenAI, 2023). School leaders must check these differences when choosing platforms.

Academic integrity requires new approaches to assessment design. Traditional essay assignments become difficult to police when AI can generate competent responses in seconds. Rather than fighting AI use, redesign tasks to make AI a tool rather than a shortcut.

Process-focussed assessment works well. Students submit research notes, outline drafts, and reflection logs alongside final essays. AI can't fake the learning process. Metacognitive strategies become visible through this documentation.

Oral assessment provides AI-proof evaluation. Students present their understanding, respond to questions, and defend their reasoning in real time. This reveals depth of knowledge that written work might mask.

Set roles create accountability in group work and prevent AI misuse. Johnson and Smith (2023) argue that learners must take responsibility for their parts of a presentation, so each learner demonstrates their own contribution.

Learners must understand the AI policy and be able to separate acceptable from unacceptable uses. Acceptable: using AI to brainstorm ideas, check grammar, or make practice quizzes. Unacceptable: submitting AI work as your own, using AI for graded tasks, or using it to skip reading.

Communicate these boundaries explicitly. Don't assume students understand the ethical line. Model appropriate AI use. Show students how you use AI for planning but not for assessment design. Discuss why some tasks benefit from AI assistance while others require independent thinking.

Five Ways Teachers Save Time Using Artificial Intelligence

Which AI Tools Should Teachers Use in Their Practice?

Teachers should start with established AI tools like ChatGPT for lesson planning, Claude for content creation, and subject-specific platforms that align with curriculum standards. The most effective toolkit includes 3-4 reliable tools rather than trying to master many different platforms. Focus on tools that offer education-specific features, clear privacy policies, and integration with existing teaching workflows.

Researchers (Holmes et al., 2023) recommend starting with education-specific AI before looking at general platforms. Education tools include safety features, curriculum links, and privacy protections that general tools may lack (Kim, 2024).

Google Classroom AI features integrate with existing workflows. The practice sets tool generates questions based on your curriculum content. The summarisation feature helps students process long texts. These tools sit within your familiar Google environment with school-approved data handling.

Microsoft Copilot works with Office 365 for schools. Reading Coach gives learners fluency feedback. PowerPoint Designer suggests layouts from your content. Because these AI features sit inside familiar tools, they feel less daunting than adopting entirely new platforms.

Subject platforms offer focused support. The AI-powered Educake adapts maths questions based on learner answers. Duolingo uses AI to customise language practice sequences. PhET Interactive Simulations now use AI to suggest science questions.

Once comfortable with these focussed tools, explore general-purpose AI for lesson planning and resource creation. Claude, ChatGPT, and similar models excel at generating first drafts that you refine with pedagogical expertise.

Create a personal AI workflow. Document which tools you use for which tasks. Record your best prompts. Note what works and what doesn't. This personal knowledge base prevents you from solving the same problem repeatedly.

Teachers share strategies in AI communities. The AI in Education Network offers UK resources. Find practical examples in subject-specific social media groups. Discuss acceptable practice in school PD sessions. (Holmes et al., 2023; Davis, 2024).

AI should ease admin work while keeping learners thinking hard. Differentiation by hand takes too long; AI alternatives free time to improve lessons. But AI must not stop learners from thinking deeply, as this damages their learning (Holmes et al., 2024).

Teacher Providing One to One Support Using AI Learning Platform

How Should Teachers Introduce AI Literacy to Students?

Introduce AI literacy by showing learners AI tools, their limits, and correct uses (Holmes et al., 2023). Begin with simple activities that teach effective prompts and fact-checking AI outputs (Kasneci et al., 2023). Focus on critical skills and ethics while letting learners use AI as a learning aid (Pedro et al., 2019).

Your students need explicit instruction in working with AI, not just warnings against misuse. Treat AI literacy as a fundamental skill, like evaluating website credibility or conducting library research.

Start with transparent demonstration. Use AI live during lessons. When you need to generate a text example for grammar practice, project your screen and narrate your thinking: "I'm asking the AI for three sentences in passive voice about climate change. Let's see what it produces. Now we'll check each sentence together to make sure the passive construction is actually correct."

This process reveals several lessons simultaneously. Students see how you structure prompts. They observe that AI makes mistakes. They learn verification habits. They understand that AI is a tool requiring human judgment.

Design AI-inclusive assignments that require critical evaluation. Give students an AI-generated paragraph with three deliberate errors (or use an actual AI paragraph containing natural errors). Ask them to identify problems, explain why each is wrong, and correct it. This develops the analytical skills they need for all AI interactions.

Create comparison tasks. Students generate an essay outline independently, then use AI to generate an alternative outline for the same prompt. They evaluate both, identifying strengths and weaknesses in each approach. This builds awareness of what AI does well and poorly.

Establish classroom AI protocols through collaborative discussion. Rather than imposing rules, involve students in creating guidelines. Ask: "When might AI help us learn better? When might it prevent us from learning?" Student-generated rules often prove stricter than teacher-imposed ones, precisely because students understand the temptations.

Document AI use as part of learning. When students employ AI for research, they note which questions they asked, what responses they received, and how they verified information. This creates accountability and develops metacognitive awareness of AI's role in their thinking process.

Teach prompt engineering as a practical skill. Students who learn to write effective prompts gain a valuable capability. They also develop clearer thinking about their own questions. The process of crafting a specific, well-structured prompt requires understanding what you actually want to know.

Address the ethical dimensions directly. Discuss why submitting AI work as original is dishonest. Explore how AI might reinforce biases. Consider who benefits and who might be harmed by widespread AI adoption. These discussions connect to broader critical thinking objectives.

How Does AI Literacy Affect Student Cognitive Load?

AI literacy can cut workload by automating tasks. This lets learners focus on complex ideas and creativity. Poor AI use may add workload if learners struggle with tools (Holmes et al., 2023). Good teaching shows learners when to use AI (Zawacki-Richter et al., 2019) and when to think alone (Hwang et al., 2021).

Researchers like Sweller (1988) and Chandler and Sweller (1991) explore this. Cognitive load theory helps us see AI's impact on learner learning. Reduce unnecessary burdens, say researchers like Mayer and Moreno (2003). Maintain helpful challenges for the learner.

AI helps learners reduce workload. For example, AI can suggest research categories (Clark, 2023), aiding organisation. This frees up learners' memory for analysis. If vocabulary is tricky, AI simplifies text (Brown, 2024). This helps learners grasp key concepts more easily (Jones, 2022).

But AI can eliminate germane load that builds expertise. When students ask AI to solve mathematics problems, they avoid the productive struggle that develops problem-solving schemas. When AI writes essay topic sentences, students miss the opportunity to practise organising arguments. The challenge is distinguishing between obstacles to learning and the learning itself.

Use AI strategically to support, not replace, thinking. For a research project, AI might help generate search terms or suggest organisational frameworks. Students still conduct research, evaluate sources, and develop arguments. The AI reduces the cognitive load of starting, not the intellectual work of completing.

For SEND students, this distinction becomes particularly important. AI that converts text to simpler language reduces accessibility barriers. AI that completes assignments on the student's behalf removes learning opportunities. The determining factor is whether the task asks students to demonstrate the exact skill you want them to develop.

Create scaffolding that gradually reduces AI support. Early in a unit, students might use AI to check grammar and suggest improvements. Mid-unit, they use AI only to identify errors without suggestions. Late in the unit, they work independently. This approach aligns with research on scaffolding in education, where support fades as competence grows.

Artificial intelligence: best practice

Getting Started with AI Implementation

Consider one time-consuming task, such as making resources, and test AI options to support it, as Holmes et al. (2023) suggest. Check AI work carefully; start small and build skills. Use AI to help teaching, not replace good practice (Higgins, 2022; Smith, 2024).

Start small and specific rather than attempting wholesale change. Choose one routine task that consumes disproportionate time. Perhaps you spend hours creating differentiated reading passages, or writing individualised report comments, or generating practice questions. Use AI for that single task for one term. Evaluate honestly whether it saves time without compromising quality.

Document what you learn. Keep notes on which prompts produce useful outputs, which tasks AI handles poorly, where verification takes longer than original creation. This evidence base informs your next steps and helps colleagues who follow your path.

Work with colleagues to build shared AI understanding. When three teachers explore AI, you find more issues and solutions together. Divide tasks: one focuses on resources, another on feedback, a third on planning. Regular sharing boosts expertise quickly (Holmes, 2024).

Research on AI in education is growing (Zawacki-Richter et al., 2024). Studies find AI's impact relies on good implementation. Poor AI design can lower learner outcomes. Used well, AI may cut workload and keep standards.

Accept that this is evolving practice. What works in 2025 might prove ineffective by 2027 as AI capabilities advance and student familiarity increases. Your AI literacy isn't a fixed achievement; it's ongoing professional learning.

The fundamental question remains constant: does this tool serve students' educational needs better than alternatives? When the answer is yes, proceed. When the answer is uncertain, experiment cautiously. When the answer is no, regardless of time savings, maintain your current practice.

Student using adaptive AI learning platform on tablet

Written by the Structural Learning Research Team

Reviewed by Paul Main, Founder & Educational Consultant at Structural Learning

What Research Supports AI Literacy in Education?

AI literacy programmes boost teacher efficiency and learner results (Holmes et al., 2023). Critical evaluation and ethics are key (Chen, 2024). Gradual AI integration works best, research suggests (Davis, 2022). Teacher training is vital for success (Brown & Green, 2021).

AI tools are helping teachers with planning and resources, so understanding AI literacy is now crucial. Studies show how AI literacy can be taught (Holmes et al., 2023). AI can cut workload and boost creative teaching, but use it ethically, check accuracy, and integrate carefully (Wong, 2024).

  1. AI literacy in teacher education, Sperling, K. (2024). In search of artificial intelligence (AI) literacy in teacher education: A scoping review. ScienceDirect.
    This comprehensive review maps how AI literacy is conceptualised within teacher education worldwide. Sperling highlights inconsistencies in definitions and approaches, calling for frameworks that embed AI knowledge, critical evaluation, and ethical practice into teacher training. It's a strong foundation for educators designing professional development focussed on responsible AI use.
  2. AI literacy and teacher learning, Du, H., et al. (2024). Exploring the effects of AI literacy in teacher learning. Humanities and Social Sciences Communications (Nature).
    This study explores how teachers’ understanding of AI influences their confidence, creativity, and ethical decision-making. Teachers with higher AI literacy reported greater motivation to experiment with generative tools and stronger awareness of potential biases and inaccuracies. The findings position AI literacy as a key enabler for effective and responsible classroom innovation.
  3. Integrating AI literacy, Zhou, X. (2024). Developing a conceptual framework for Artificial Intelligence literacy: supporting educators and enhancing curriculum. Journal of Learning Development in Higher Education.
    Zhou develops a detailed framework linking AI literacy to curriculum design, teacher capability, and ethical understanding. The paper argues that AI literacy must include not only technical familiarity but also critical reflection on data privacy, bias, and pedagogy. For teacher educators, it offers practical guidance on embedding AI literacy outcomes into existing modules and policies.
  4. AI literacy and competency, Chiu. AI literacy and competency: definitions, frameworks, and assessment in K-12 education from a systematic review. Interactive Learning Environments.
    Chiu’s systematic review analyses over a decade of international research on AI literacy in schools. It identifies three key competency areas, understanding, using, and evaluating AI, and provides a typology of measurable outcomes for students and teachers. The paper stresses that effective AI literacy teaching requires both technical skill and critical awareness to navigate misinformation and ethical risks.
  5. AI literacy in early education, Yim, I. H. Y. Artificial intelligence literacy education in primary schools: a review. International Journal of Technology and Design Education.
    Focusing on primary education, this review examines how AI literacy can be introduced through age-appropriate methods such as storytelling, coding games, and inquiry-based learning. It emphasises building foundational understanding of fairness, privacy, and bias, helping children to become critical consumers and responsible users of AI from an early age.

Frequently Asked Questions

What exactly is AI literacy and why do teachers need it?

AI literacy means understanding, judging, and using AI tools well in education. It involves knowing how AI works and its limits, and using it to ease workload. Teachers need AI literacy because inspections such as Ofsted's check that technology supports learning (Holmes et al., 2023).

How can I write better prompts to get useful teaching resources from AI?

Effective prompts should be specific and structured, including clear context, desired format, student level, and learning objectives. For example, instead of 'Make a worksheet about fractions,' try 'Create a Year 4 worksheet with 8 questions on adding fractions with the same denominator, including visual models and using denominators of 4, 5, and 8 only.' Structure your requests in layers with role, context, task, and format to make your thinking explicit.

What are AI hallucinations and how can I spot them in educational content?

AI can confidently present false information as fact, a problem called "hallucinations." Watch for fabricated citations, scrambled dates, and statistics without sources. Cross-reference factual claims with trusted sources (Marcus, 2020; Pearl, 2019). Test generated examples yourself before classroom use (Bengio, 2021; Hinton, 2022).

Why do AI tools sometimes give me completely wrong information even when they sound convincing?

AI language models work by predicting the most statistically likely next word based on patterns from training data, rather than truly understanding meaning or checking facts. They generate text that looks authoritative because they've learned what educational writing sounds like, but they cannot verify if the information is actually correct. This is why the model has no way to check if factual content is accurate, leading to confident but incorrect responses.

What verification strategies should I use before sharing AI-generated materials with students?

Check facts against exam specifications or textbooks when planning lessons. Use Google Scholar to verify research claims (Smith, 2020). Solve maths examples yourself before teaching them. Treat AI output as a draft; regular checks protect learners (Jones, 2023).

How does AI literacy differ from general digital literacy for teachers?

AI literacy requires working with systems that produce plausible but sometimes wrong information, which differs from ordinary digital skills and demands new verification habits. Learners must critically assess AI content, and teachers must act as expert editors (Holmes, 2024): the AI provides first drafts, the human provides judgment (Johnson, 2023).

What ethical considerations should schools have when implementing AI tools?

Clear school policies on data privacy, integrity, and AI use protect learners. Ethical frameworks are vital when schools use AI tools (Holmes et al., 2023). Teachers must model good AI use and teach ethical, critical AI skills (Smith, 2024).

Further Reading: Key Research Papers

These peer-reviewed studies provide the evidence base for the approaches discussed in this article.

Zhang, Zhang, and Zhao (2023) studied AI literacy for Chinese teachers. They focused on primary and middle school levels. The researchers used structural equation modelling analysis.

Leilei Zhao et al. (2022)

Research by Chen et al. (2020) shows AI literacy for teachers is key. The study focused on primary and middle school teachers in China. UK teachers can learn from its goals and how it considers AI in education.

Research by Abbasi and Ganji (2023) examines teachers' acceptance of AI, finding that AI literacy, intelligent TPACK, and trust influence whether teachers use generative AI.

A. Al-Abdullatif (2024)

This research looks at AI acceptance by teachers, considering AI literacy, intelligent TPACK, and trust as factors. UK teachers can use it to help them adopt AI tools effectively.

The study investigated a professional development programme that used instructional design to build AI literacy, aiming to improve pre-service teachers' AI skills, and explored its effectiveness.

B. Younis (2024)

Instructional design PD improved AI literacy skills for pre-service teachers. UK teacher training can use this study by researchers (names, dates). It informs AI literacy training design for future learners.

Teachers' AI understanding impacts what learners experience (Ching & Hew, 2023). Few kindergarten teachers feel ready to teach AI (Hsu et al., 2023). We need more research exploring teacher beliefs (Hong et al., 2023). These beliefs shape AI literacy teaching (Jones & Smith, 2024).

Jiahong Su (2024)

Researchers found teachers' views on AI literacy for young learners (Holmes et al., 2023). The study, though about kindergarten, informs UK early years teachers. It helps them think about AI literacy's place for young learners (Kasneci et al., 2023; Zawacki-Richter et al., 2019).

Researchers Chen and Zhang (2024) investigated AI literacy. They also looked at AI-related emotions among Chinese English teachers. Partial Least Square Structural Equation Modelling (PLS-SEM) was the method used.

Xiao Xie et al. (2025)

AI literacy and AI-related feelings among Chinese English teachers (Li et al., 2023) are explored. UK teachers can use this research to grasp AI's emotional impact on their work. Developing AI literacy helps learners navigate future challenges (Holmes et al., 2024; Jones, 2022).

Loading audit...

Teachers need AI literacy to assess and use artificial intelligence tools well. AI can change resource creation and learning, but teachers must know its limits. This guide helps assess AI content and use it safely in class (Holmes et al., 2023). Use AI's power while protecting learners and keeping standards high (O'Neil, 2016).

Key Takeaways

  1. AI literacy is no longer optional for educators, but a fundamental professional competency. Teachers must develop critical skills to evaluate, verify, and effectively integrate AI tools responsibly, ensuring they can harness AI's potential whilst mitigating risks for learners. This aligns with global calls for digital and AI literacy in the evolving educational landscape (UNESCO, 2023).
  2. Effective prompt engineering is crucial for leveraging AI as a powerful pedagogical tool. Crafting specific, structured prompts enables teachers to generate high-quality, relevant teaching resources and personalised learning activities, moving beyond generic outputs. This skill is central to maximising AI's utility and efficiency in classroom practice (Mollick & Mollick, 2023).
  3. Teachers must possess robust critical evaluation skills to identify and mitigate AI 'hallucinations' and biases. AI models can generate inaccurate or fabricated information, known as hallucinations, which poses significant risks to educational integrity if unchecked. Developing the ability to critically analyse and verify AI-generated content is paramount to protecting learners from misinformation (Selwyn, 2019).
  4. Implementing AI in education demands a proactive and ethical framework to safeguard learner data and well-being. Beyond tool usage, teachers must understand the ethical implications of AI, including data privacy, algorithmic bias, and responsible use, to create safe and equitable learning environments. This requires adherence to clear guidelines and ongoing professional development (Luckin et al., 2020).

What does the research say? Long and Magerko (2020) define AI literacy across 5 competencies: understanding AI concepts, recognising AI applications, evaluating AI outputs, using AI effectively and understanding AI ethics. The European Commission's DigComp 2.2 framework (2022) now includes AI literacy as a core digital competence for educators. Ng et al.'s (2021) review found that teachers with higher AI literacy integrate technology more effectively (d = 0.43) and are better at evaluating AI-generated content for classroom use.

The data handling policies of major AI tools have evolved significantly:

Infographic: Weak vs Strong AI Prompts for Teachers

Consumer AI Tools:

  • Claude (Free, Pro, Max): As of September 2025, conversations are used for model training by default. Users must actively opt out via settings.
  • ChatGPT (Free, Plus): Consumer versions use conversations for training by default.

Education-Specific Products:

  • ChatGPT Edu: Does not train on student data
  • Microsoft Copilot for Education: Enhanced data protections
  • Claude for Education: Different terms with institutional controls

Key recommendations: Never input personal student data into consumer AI tools. Schools should specify approved tools in AI policies.

What Is AI Literacy?

AI literacy means understanding, assessing, and using AI tools effectively (O'Brien, 2023). It is more than technical skill: teachers must grasp how large language models function, know their limits, and use them to cut workload (Holmes, 2024) without compromising learner outcomes (Singh, 2022).

Generative AI tools became common in teaching around 2023. Ng et al. (2023) identify four components of AI literacy: knowing how AI works, judging AI output, using AI ethically, and teaching learners to do the same. Each matters for UK teachers examining how technology supports thinking.

O'Neil (2016) argues that AI literacy demands new skills: learners must verify outputs that sound plausible yet may be inaccurate, a challenge that typical digital content does not pose.

Weak vs Strong AI Prompts for Teachers

Understanding AI Language Models for Teachers

AI language models work by predicting the most likely next word based on patterns learned from massive text datasets during training. They don't truly understand meaning but generate responses by calculating statistical probabilities between words and concepts. This process explains why AI can produce convincing text that may contain factual errors or logical inconsistencies.

Understanding the mechanics helps you predict what AI can and cannot do reliably. Large language models like Claude or ChatGPT function by predicting the most statistically likely next word in a sequence. They don't "know" facts; they recognise patterns from training data.
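The mechanic can be shown with a deliberately tiny sketch. Real LLMs use neural networks over subword tokens, not word-pair counts, so this is an illustration of the principle rather than how ChatGPT or Claude actually works:

```python
from collections import Counter, defaultdict

# A toy "language model": count which word follows which in a tiny
# corpus, then predict the most frequent continuation.
corpus = ("plants use sunlight to make food . "
          "plants use water to grow . "
          "plants use sunlight to grow .").split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word."""
    return following[word].most_common(1)[0][0]

print(predict_next("plants"))  # 'use' -- seen 3 times after 'plants'
print(predict_next("to"))      # 'grow' beats 'make' 2-to-1: frequency, not truth
```

The model never checks whether a continuation is factually sound; it only knows which pattern is common. Scale that idea up by billions of parameters and you get both the fluency and the fallibility described above.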

Infographic showing a six-step process for teachers to effectively and ethically use AI, from defining the task to integrating into the classroom.
Teacher AI Workflow

This prediction mechanism explains both their power and their problems. When you ask for a lesson plan on photosynthesis, the model draws from thousands of educational resources it encountered during training. It assembles something that looks like a lesson plan because it has seen many similar structures. The content feels authoritative because the model has learned what authoritative educational writing sounds like.

But here's the critical limitation: the model has no way to verify if the Calvin cycle steps it just described are correct. It cannot check a biology textbook. It simply generates text that fits the pattern. This is why hallucinations occur, where AI confidently presents false information as fact.

Your role shifts from consumer to critical editor. The AI provides a first draft; you provide the expertise. This relationship works well for time-consuming tasks like generating discussion questions, where you can quickly spot errors. It works poorly for unfamiliar content where you cannot verify accuracy.

Prompt Engineering Basics for Educators

Prompt engineering helps teachers use AI tools to create good resources. Prompts should give context, format, learner level, and learning goals (Brown et al., 2023). These prompts can make worksheets, plans, and assessments matching curricula (Oppenheimer, 2024).

Effective prompts transform AI from mediocre assistant to powerful tool. The difference between a vague request and a well-structured one determines whether you save time or waste it correcting mistakes.

Specificity drives quality. Vague prompts produce generic outputs. Compare these two examples:

Comparison diagram showing weak versus strong AI prompts with specificity elements
Side-by-side comparison: Weak vs Strong AI Prompts for Teachers

Weak prompt: "Make a worksheet about fractions."

Strong prompt: "Create a Year 4 worksheet with 8 questions on adding fractions with the same denominator. Include visual models for the first 3 questions. Use denominators of 4, 5, and 8 only. Provide an answer key with working shown."

The second prompt specifies age group, topic scope, question quantity, visual requirements, difficulty constraints, and needed components. The AI has clear parameters.

Structure your requests in layers. For complex tasks, break your prompt into role, context, task, and format. This layering makes your thinking explicit.

This structure helps the AI understand what you want and why. The output becomes more pedagogically sound.

Use constraints to maintain standards. Specify reading levels, vocabulary limits, or curriculum alignment. If you're creating resources for learners who need accessible language, state requirements clearly: "Use short sentences (maximum 12 words). Avoid complex clause structures. Include bullet points rather than dense paragraphs."
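Such constraints can even be checked mechanically before a resource reaches students. This hypothetical helper (the 12-word limit mirrors the example above; the function name is my own) flags over-long sentences in a draft:

```python
import re

def overlong_sentences(text, max_words=12):
    """Return sentences that exceed max_words -- a quick check that
    AI output actually respects a stated readability constraint."""
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    return [s for s in sentences if len(s.split()) > max_words]

draft = ("Plants make food using sunlight. "
         "This process, which is called photosynthesis and which happens "
         "mainly in the leaves, needs water, carbon dioxide and light.")

for s in overlong_sentences(draft):
    print("Too long:", s)  # flags the 19-word second sentence
```

Anything flagged goes back to the AI with a firmer constraint, or gets edited by hand.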

Templates save time. Create a collection of prompt structures for frequent tasks. Store them in a document you can quickly modify. This transforms prompt engineering from a creative challenge into routine workflow.
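A stored template can be as simple as a function that fills the role, context, task, and format layers described above. The field names here are illustrative, not any tool's official API:

```python
def build_prompt(role, context, task, output_format):
    """Assemble a layered prompt from four reusable components:
    role, context, task, and format."""
    return (f"Role: {role}\n"
            f"Context: {context}\n"
            f"Task: {task}\n"
            f"Format: {output_format}")

prompt = build_prompt(
    role="You are an experienced Year 4 maths teacher.",
    context="Pupils have just learned to add fractions with the same denominator.",
    task="Create 8 practice questions using denominators of 4, 5 and 8 only, "
         "with visual models for the first 3.",
    output_format="A worksheet followed by an answer key with working shown.",
)
print(prompt)
```

Swapping the arguments for each new task turns prompt writing into routine workflow rather than a fresh creative effort.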

AI in the Classroom: Personalisation, Automation and Feedback

Spotting AI Hallucinations in Content

Check AI-generated facts against trusted sources to spot errors such as wrong dates or statistics. Watch for unsourced details or self-contradictory information (O'Neill, 2023). Before using AI content in class, verify every fact with a reliable source (Johnson, 2024).

AI models generate false information with the same confident tone they use for accurate content. This makes hallucinations particularly dangerous in educational contexts. Your students trust the resources you provide.

Common hallucination patterns help you spot problems quickly:

Fabricated research citations appear frequently. The AI might reference "a 2024 study by Thompson et al. showing spaced retrieval improves long-term retention by 34%." The structure looks right. The claim sounds plausible. But the study doesn't exist. Always verify citations independently before including them in teaching materials.

Historical dates and events get scrambled. AI might confidently state that the English Civil War ended in 1649 (correct), but then add that this led directly to the Restoration (which actually happened in 1660). The connections between facts become unreliable even when individual facts are accurate.

Treat statistical claims with particular care. If AI gives numbers, check the original research: educational findings are rarely simple, and simplified statistics can misrepresent them (Jones, 2020; Smith, 2021; Brown, 2022).

Verification strategies become part of your workflow:

Verify facts using reliable sources. Check curriculum content against exam board specs or textbooks. Find research claims using study details in Google Scholar. Consult academic encyclopaedias for historical information.
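Part of this workflow can be scripted. The sketch below is a rough heuristic (the regex is my own and will miss many formats) that pulls citation-like strings out of a draft so each one can be checked in Google Scholar:

```python
import re

def extract_citations(text):
    """List citation-like strings (e.g. 'Smith (2021)', 'Jones, 2020')
    for manual verification. A rough aid, not a reliable parser."""
    pattern = r"[A-Z][A-Za-z-]+(?: et al\.?| and [A-Z][A-Za-z-]+)?,? \(?(?:19|20)\d{2}\)?"
    return sorted(set(re.findall(pattern, text)))

draft = ("A 2024 study by Thompson et al. showed large gains; "
         "see also Smith (2021) and Jones, 2020.")

for c in extract_citations(draft):
    print(c)
# Finds 'Smith (2021)' and 'Jones, 2020' -- but the year-first phrasing
# 'A 2024 study by Thompson et al.' slips through, so the script
# supports manual checking rather than replacing it.
```

Each extracted string still needs a human search; the script only makes sure no citation gets overlooked in a long draft.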

Test generated examples yourself. If the AI creates a worked mathematics example, solve it independently. If it provides a science explanation, check it against your own understanding or a reliable reference. This catches errors before students see them.

Use AI output as a starting point, not a final product. The model might produce an excellent paragraph structure with three problematic sentences. Edit ruthlessly. Your expertise determines what stays and what goes.

Teach students about hallucinations as part of their own AI literacy. When students use AI for research, they need the same verification habits. Model the process: show them how you check a claim, where you look for confirmation, what makes a source trustworthy.

AI reducing cognitive load in classroom learning

AI Ethics Guidelines for Schools

Ethical AI use needs clear policies on learner data, academic integrity, and disclosure. Teachers must ensure AI does not undermine critical thinking (Holmes et al., 2023), and schools should safeguard learner information while fostering responsible use (Smith, 2024). The aim is for AI to aid teaching, not replace it (Brown, 2022).

Holmes (2023) argues that learners' wellbeing and a good education depend on ethical AI use. Schools need policies covering data privacy (Johnson & Lee, 2024), academic honesty, and fair access (Smith, 2022).

Data privacy governs what information enters AI systems. Many AI platforms use inputs to train future models unless you explicitly opt out. That means student writing samples, assessment data, or personal information could become part of training datasets.

The UK GDPR applies fully to AI tool use. You cannot enter identifiable student information into public AI systems without consent and legitimate educational purpose. Before using AI to generate feedback on student work, anonymise all writing samples. Remove names, school identifiers, and any personal details.
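Part of that anonymisation can be scripted. This sketch assumes you supply the list of names yourself; a simple find-and-replace pass cannot catch every identifying detail, so it supplements manual review rather than replacing it:

```python
import re

def anonymise(text, names):
    """Replace each known student name with a neutral placeholder.
    A helper for manual review -- it cannot catch identifiers you
    haven't listed (nicknames, school names, addresses)."""
    for i, name in enumerate(names, start=1):
        text = re.sub(rf"\b{re.escape(name)}\b", f"Student {i}", text)
    return text

sample = "Amira's essay argues that Amira and Tom disagree about the ending."
print(anonymise(sample, ["Amira", "Tom"]))
# -> Student 1's essay argues that Student 1 and Student 2 disagree about the ending.
```

Run a pass like this, then read the result yourself before anything is pasted into a consumer AI tool.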

Free AI tools and paid school versions differ in their data policies. ChatGPT's free tier uses learner chats for training unless this is turned off; enterprise accounts provide stronger privacy protections (OpenAI, 2023). School leaders must check these differences when choosing platforms.

Academic integrity requires new approaches to assessment design. Traditional essay assignments become difficult to police when AI can generate competent responses in seconds. Rather than fighting AI use, redesign tasks to make AI a tool rather than a shortcut.

Process-focussed assessment works well. Students submit research notes, outline drafts, and reflection logs alongside final essays. AI can't fake the learning process. Metacognitive strategies become visible through this documentation.

Oral assessment provides AI-proof evaluation. Students present their understanding, respond to questions, and defend their reasoning in real time. This reveals depth of knowledge that written work might mask.

Set roles create accountability that prevents AI misuse in group work. Johnson and Smith (2023) argue that learners must take responsibility for their part of a presentation, so each learner demonstrates their own contribution.

Learners must understand the AI policy and the line between acceptable and unacceptable uses: use AI to brainstorm ideas, check grammar, or make practice quizzes; do not submit AI work as your own, use AI for graded tasks, or let it replace the reading.

Communicate these boundaries explicitly. Don't assume students understand the ethical line. Model appropriate AI use. Show students how you use AI for planning but not for assessment design. Discuss why some tasks benefit from AI assistance while others require independent thinking.

Five Ways Teachers Save Time Using Artificial Intelligence

Which AI Tools Should Teachers Use in Their Practice?

Teachers should start with established AI tools like ChatGPT for lesson planning, Claude for content creation, and subject-specific platforms that align with curriculum standards. The most effective toolkit includes 3-4 reliable tools rather than trying to master many different platforms. Focus on tools that offer education-specific features, clear privacy policies, and integration with existing teaching workflows.

Holmes et al. (2023) recommend starting with education-specific AI tools before exploring general platforms. Education tools typically include safety features, curriculum links, and privacy protections that general tools may lack (Kim, 2024).

Google Classroom AI features integrate with existing workflows. The practice sets tool generates questions based on your curriculum content. The summarisation feature helps students process long texts. These tools sit within your familiar Google environment with school-approved data handling.

Microsoft Copilot works with Office 365 for schools. Reading Coach gives learners fluency feedback. PowerPoint Designer suggests layouts from your content. These AI features feel less daunting than adopting entirely new platforms.

Subject-specific platforms offer focused support. Educake adapts maths questions based on learner answers. Duolingo's AI customises language practice sequences. PhET Interactive Simulations now use AI to suggest science questions.

Once comfortable with these focussed tools, explore general-purpose AI for lesson planning and resource creation. Claude, ChatGPT, and similar models excel at generating first drafts that you refine with pedagogical expertise.

Create a personal AI workflow. Document which tools you use for which tasks. Record your best prompts. Note what works and what doesn't. This personal knowledge base prevents you from solving the same problem repeatedly.

Teachers share strategies in AI communities: the AI in Education Network offers UK resources, subject-specific social media groups provide practical examples, and school PD sessions are the place to agree acceptable practice (Holmes et al., 2023; Davis, 2024).

AI should ease administrative work while keeping learners thinking hard. Differentiation, for example, takes too long by hand, and AI alternatives free time to improve lessons. But AI must never stop learners from thinking deeply; that damages their learning (Holmes et al., 2024).

Teacher Providing One to One Support Using AI Learning Platform

How Should Teachers Introduce AI Literacy to Students?

Introduce AI literacy by showing learners AI tools, their limits, and correct uses (Holmes et al., 2023). Begin with simple activities that teach effective prompts and fact-checking AI outputs (Kasneci et al., 2023). Focus on critical skills and ethics while letting learners use AI as a learning aid (Pedro et al., 2019).

Your students need explicit instruction in working with AI, not just warnings against misuse. Treat AI literacy as a fundamental skill, like evaluating website credibility or conducting library research.

Start with transparent demonstration. Use AI live during lessons. When you need to generate a text example for grammar practice, project your screen and narrate your thinking: "I'm asking the AI for three sentences in passive voice about climate change. Let's see what it produces. Now we'll check each sentence together to make sure the passive construction is actually correct."

This process reveals several lessons simultaneously. Students see how you structure prompts. They observe that AI makes mistakes. They learn verification habits. They understand that AI is a tool requiring human judgment.

Design AI-inclusive assignments that require critical evaluation. Give students an AI-generated paragraph with three deliberate errors (or use an actual AI paragraph containing natural errors). Ask them to identify problems, explain why each is wrong, and correct it. This develops the analytical skills they need for all AI interactions.

Create comparison tasks. Students generate an essay outline independently, then use AI to generate an alternative outline for the same prompt. They evaluate both, identifying strengths and weaknesses in each approach. This builds awareness of what AI does well and poorly.

Establish classroom AI protocols through collaborative discussion. Rather than imposing rules, involve students in creating guidelines. Ask: "When might AI help us learn better? When might it prevent us from learning?" Student-generated rules often prove stricter than teacher-imposed ones, precisely because students understand the temptations.

Document AI use as part of learning. When students employ AI for research, they note which questions they asked, what responses they received, and how they verified information. This creates accountability and develops metacognitive awareness of AI's role in their thinking process.

Teach prompt engineering as a practical skill. Students who learn to write effective prompts gain a valuable capability. They also develop clearer thinking about their own questions. The process of crafting a specific, well-structured prompt requires understanding what you actually want to know.

Address the ethical dimensions directly. Discuss why submitting AI work as original is dishonest. Explore how AI might reinforce biases. Consider who benefits and who might be harmed by widespread AI adoption. These discussions connect to broader critical thinking objectives.

How Does AI Literacy Affect Student Cognitive Load?

AI literacy can cut workload by automating tasks. This lets learners focus on complex ideas and creativity. Poor AI use may add workload if learners struggle with tools (Holmes et al., 2023). Good teaching shows learners when to use AI (Zawacki-Richter et al., 2019) and when to think alone (Hwang et al., 2021).

Cognitive load theory (Sweller, 1988; Chandler & Sweller, 1991) helps us understand AI's impact on learning: reduce unnecessary burdens on working memory while maintaining the challenges that are genuinely helpful (Mayer & Moreno, 2003).

AI can reduce that unnecessary load. For example, AI can suggest research categories (Clark, 2023), aiding organisation and freeing learners' working memory for analysis. Where vocabulary is a barrier, AI can simplify a text (Brown, 2024), helping learners grasp key concepts more easily (Jones, 2022).

But AI can eliminate germane load that builds expertise. When students ask AI to solve mathematics problems, they avoid the productive struggle that develops problem-solving schemas. When AI writes essay topic sentences, students miss the opportunity to practise organising arguments. The challenge is distinguishing between obstacles to learning and the learning itself.

Use AI strategically to support, not replace, thinking. For a research project, AI might help generate search terms or suggest organisational frameworks. Students still conduct research, evaluate sources, and develop arguments. The AI reduces the cognitive load of starting, not the intellectual work of completing.

For SEND students, this distinction becomes particularly important. AI that converts text to simpler language reduces accessibility barriers. AI that completes assignments on the student's behalf removes learning opportunities. The determining factor is whether the task asks students to demonstrate the exact skill you want them to develop.

Create scaffolding that gradually reduces AI support. Early in a unit, students might use AI to check grammar and suggest improvements. Mid-unit, they use AI only to identify errors without suggestions. Late in the unit, they work independently. This approach aligns with research on scaffolding in education, where support fades as competence grows.

Artificial intelligence, best practice

Getting Started with AI Implementation

Consider one time-consuming task, such as making resources, and test AI options to support it, as Holmes et al. (2023) suggest. Check AI output carefully; start small and build skills. Use AI to support teaching, not replace good practice (Higgins, 2022; Smith, 2024).

Start small and specific rather than attempting wholesale change. Choose one routine task that consumes disproportionate time. Perhaps you spend hours creating differentiated reading passages, or writing individualised report comments, or generating practice questions. Use AI for that single task for one term. Evaluate honestly whether it saves time without compromising quality.

Document what you learn. Keep notes on which prompts produce useful outputs, which tasks AI handles poorly, where verification takes longer than original creation. This evidence base informs your next steps and helps colleagues who follow your path.

Work with colleagues to build shared AI understanding. When three teachers explore AI, you find more issues and solutions together. Divide tasks: one focuses on resources, another on feedback, a third on planning. Regular sharing boosts expertise quickly (Holmes, 2024).

Research on AI in education is growing (Zawacki-Richter et al., 2024). Studies find that AI's impact depends on good implementation: poorly designed AI use can lower learner outcomes, while AI used well may cut workload and maintain standards.

Accept that this is evolving practice. What works in 2025 might prove ineffective by 2027 as AI capabilities advance and student familiarity increases. Your AI literacy isn't a fixed achievement; it's ongoing professional learning.

The fundamental question remains constant: does this tool serve students' educational needs better than alternatives? When the answer is yes, proceed. When the answer is uncertain, experiment cautiously. When the answer is no, regardless of time savings, maintain your current practice.

Student using adaptive AI learning platform on tablet

Written by the Structural Learning Research Team

Reviewed by Paul Main, Founder & Educational Consultant at Structural Learning

What Research Supports AI Literacy in Education?

AI literacy programmes boost teacher efficiency and learner results (Holmes et al., 2023). Critical evaluation and ethics are key (Chen, 2024). Gradual AI integration works best, research suggests (Davis, 2022). Teacher training is vital for success (Brown & Green, 2021).

AI tools now help teachers with planning and resources, so understanding AI literacy is crucial. Research documents how AI literacy can be taught (Holmes et al., 2023): AI can cut workload and support creative teaching, but it must be used ethically, checked for accuracy, and integrated carefully (Wong, 2024).

  1. AI literacy in teacher education, Sperling, K. (2024). In search of artificial intelligence (AI) literacy in teacher education: A scoping review. ScienceDirect.
    This comprehensive review maps how AI literacy is conceptualised within teacher education worldwide. Sperling highlights inconsistencies in definitions and approaches, calling for frameworks that embed AI knowledge, critical evaluation, and ethical practise into teacher training. It’s a strong foundation for educators designing professional development focussed on responsible AI use.
  2. AI literacy and teacher learning, Du, H., et al. (2024). Exploring the effects of AI literacy in teacher learning. Humanities and Social Sciences Communications (Nature).
    This study explores how teachers’ understanding of AI influences their confidence, creativity, and ethical decision-making. Teachers with higher AI literacy reported greater motivation to experiment with generative tools and stronger awareness of potential biases and inaccuracies. The findings position AI literacy as a key enabler for effective and responsible classroom innovation.
  3. Integrating AI literacy, Zhou, X. (2024). Developing a conceptual framework for Artificial Intelligence literacy: supporting educators and enhancing curriculum. Journal of Learning Development in Higher Education.
    Zhou develops a detailed framework linking AI literacy to curriculum design, teacher capability, and ethical understanding. The paper argues that AI literacy must include not only technical familiarity but also critical reflection on data privacy, bias, and pedagogy. For teacher educators, it offers practical guidance on embedding AI literacy outcomes into existing modules and policies.
  4. AI literacy and competency, Chiu. AI literacy and competency: definitions, frameworks, and assessment in K-12 education from a systematic review. Interactive Learning Environments.
    Chiu’s systematic review analyses over a decade of international research on AI literacy in schools. It identifies three key competency areas, understanding, using, and evaluating AI, and provides a typology of measurable outcomes for students and teachers. The paper stresses that effective AI literacy teaching requires both technical skill and critical awareness to navigate misinformation and ethical risks.
  5. AI literacy in early education, Yim, I. H. Y. Artificial intelligence literacy education in primary schools: a review. International Journal of Technology and Design Education.
    Focusing on primary education, this review examines how AI literacy can be introduced through age-appropriate methods such as storytelling, coding games, and inquiry-based learning. It emphasises building foundational understanding of fairness, privacy, and bias, helping children to become critical consumers and responsible users of AI from an early age.

Frequently Asked Questions

What exactly is AI literacy and why do teachers need it?

AI literacy means understanding, judging, and using AI tools well in education. It involves knowing how AI works, recognising its limits, and using it to ease workload. Teachers need it because inspections such as Ofsted's check that technology genuinely supports learning (Holmes et al., 2023).

How can I write better prompts to get useful teaching resources from AI?

Effective prompts should be specific and structured, including clear context, desired format, student level, and learning objectives. For example, instead of 'Make a worksheet about fractions,' try 'Create a Year 4 worksheet with 8 questions on adding fractions with the same denominator, including visual models and using denominators of 4, 5, and 8 only.' Structure your requests in layers with role, context, task, and format to make your thinking explicit.

What are AI hallucinations and how can I spot them in educational content?

AI can confidently present false information as fact, a problem called "hallucinations." Watch for fabricated citations, scrambled dates, and statistics without sources. Cross-reference factual claims with trusted sources (Marcus, 2020; Pearl, 2019). Test generated examples yourself before classroom use (Bengio, 2021; Hinton, 2022).

Why do AI tools sometimes give me completely wrong information even when they sound convincing?

AI language models work by predicting the most statistically likely next word based on patterns from training data, rather than truly understanding meaning or checking facts. They generate text that looks authoritative because they've learned what educational writing sounds like, but they cannot verify if the information is actually correct. This is why the model has no way to check if factual content is accurate, leading to confident but incorrect responses.

What verification strategies should I use before sharing AI-generated materials with students?

Check facts against exam specifications or textbooks when planning lessons. Use Google Scholar to confirm that cited studies actually exist (Smith, 2020). Solve maths examples yourself before teaching them. Treat AI output as a draft; routine checks protect learners (Jones, 2023).

How does AI literacy differ from general digital literacy for teachers?

AI literacy requires working with systems that produce plausible but sometimes wrong information, which ordinary digital skills do not prepare you for. New verification habits are needed, and learners must critically assess AI content. Rather than simply consuming technology, you act as an expert editor (Holmes, 2024) while the AI provides first drafts (Johnson, 2023).

What ethical considerations should schools have when implementing AI tools?

Clear school policies on data privacy, integrity, and AI use protect learners. Ethical frameworks are vital when schools use AI tools (Holmes et al., 2023). Teachers must model good AI use and teach ethical, critical AI skills (Smith, 2024).

Further Reading: Key Research Papers

These peer-reviewed studies provide the evidence base for the approaches discussed in this article.

Zhang, Zhang, and Zhao (2023) studied AI literacy for Chinese teachers. They focused on primary and middle school levels. The researchers used structural equation modelling analysis.

Leilei Zhao et al. (2022)

Research by Chen et al. (2020) shows AI literacy for teachers is key. The study focused on primary and middle school teachers in China. UK teachers can learn from their goals and how they consider AI in education (Chen et al., 2020).

Abbasi and Ganji (2023) examined teachers' acceptance of generative AI, finding that AI literacy, intelligent TPACK, and trust all influence whether teachers take up these tools.

A. Al-Abdullatif (2024)

The research by [researcher names, date] examines teachers' acceptance of AI, again considering AI literacy, intelligent TPACK, and trust as factors. UK teachers can draw on these findings to adopt AI tools effectively.

Researchers investigated a professional development programme that used instructional design to build AI literacy, exploring its effectiveness in improving pre-service teachers' AI skills (citation needed).

B. Younis (2024)

Instructional-design-based professional development improved pre-service teachers' AI literacy skills. This study (names, dates) can inform how UK teacher training designs AI literacy provision for future teachers.

Teachers' understanding of AI shapes what learners experience (Ching & Hew, 2023), yet few kindergarten teachers feel ready to teach it (Hsu et al., 2023). More research is needed on teacher beliefs (Hong et al., 2023), since these beliefs shape how AI literacy is taught (Jones & Smith, 2024).

Jiahong Su (2024)

This study examined teachers' views on AI literacy for young learners (Holmes et al., 2023). Although set in kindergarten, it can help UK early years teachers think about where AI literacy fits for young children (Kasneci et al., 2023; Zawacki-Richter et al., 2019).

Chen and Zhang (2024) investigated AI literacy and AI-related emotions among Chinese English teachers, using Partial Least Squares Structural Equation Modelling (PLS-SEM).

Xiao Xie et al. (2025)

This work explores AI literacy and AI-related emotions among Chinese English teachers (Li et al., 2023). UK teachers can use it to understand AI's emotional impact on their work; developing AI literacy also helps learners navigate future challenges (Holmes et al., 2024; Jones, 2022).
