AI for SEND Administration: A SENCO's Guide

Updated on February 26, 2026
A balanced, practical guide for SENCOs on AI tools for SEND administration. What works, what the DfE says, GDPR implications, and a 10-point procurement checklist.

Three-quarters of SENCOs in England are being pulled away from direct pupil support by administrative demands they cannot escape (nasen National SENCO Workforce Survey). AI tools are now appearing in this space, promising to reduce that burden. Some are genuinely useful. Others carry significant risks that most SENCOs have not had time to evaluate. This guide gives you the balanced picture: what AI can do for SEND administration right now, where the risks sit, what the DfE says, and a decision framework you can use before committing to any tool.

Key Takeaways

  1. The workload case is real: 74% of SENCOs are pulled from pupils by administration; AI offers partial, not total, relief, mainly for lower-risk tasks such as summarising reports and tracking provision.
  2. SEND data is special category data: Under UK GDPR Article 9, processing a child's SEND information with AI tools requires a lawful basis, a Data Protection Impact Assessment, and transparent parent communication.
  3. Human oversight is non-negotiable: The DfE's June 2025 guidance requires all AI-generated content in education to be reviewed by a qualified professional before use.
  4. The 2026 White Paper changes everything: Mandatory digital Individual Support Plans will create a standardised AI on-ramp; schools that develop AI literacy now will be better positioned for statutory compliance later.

The SENCO Administration Crisis

The data on SENCO workload is not ambiguous. The National SENCO Workforce Survey, conducted by Bath Spa University and nasen, found that 74% of SENCOs are regularly pulled away from supporting pupils with special educational needs to complete administrative tasks. The same survey found that 55% of primary SENCOs and 70% of secondary SENCOs are not allocated enough time to carry out their role effectively. EHCP applications, annual reviews, and provision mapping consume the majority of that allocated time.

EHCP numbers have risen every year since the Children and Families Act 2014 came into force. As of January 2025, over 576,000 children and young people in England held an EHCP, an increase of 11% on the previous year. Each plan requires detailed documentation, multi-agency coordination, and ongoing review that follows the graduated approach's assess-plan-do-review cycle. The system was not designed to scale at this rate, and SENCOs are bearing the administrative consequence.

The NEU's SENCO Workload report called for legally protected SENCO time. A Twinkl survey found that a significant proportion of SENCOs were considering leaving the role because of workload. The human cost of the administration burden is not abstract. When a SENCO spends three hours preparing an annual review pack, that is three hours not spent observing a pupil's progress, meeting with a family, or coaching a classroom teacher. This is the context in which AI tools have entered the conversation. The promise is straightforward: if AI can handle first-draft documentation, provision summaries, or data pattern analysis, SENCOs recover time to do the relational and professional work that actually benefits children.

What AI Tools Are Available Right Now

The current landscape splits into two categories: dedicated SEND AI platforms designed for EHCP and provision management, and general AI tools that SENCOs are adapting for administrative tasks. The following table maps the main options, what they do, and the associated data risk.

| Tool | Primary Function | Who Uses It | Data Risk Level |
| --- | --- | --- | --- |
| Agilisys EHCP Tool | AI-powered EHCP first-draft generation from uploaded professional reports | Local authority SEND caseworkers | Medium (enterprise data processing agreements in place) |
| Invision360 VITA | AI-driven EHCP drafting and quality assurance, Innovate UK funded | 50+ UK local authorities | Medium (lawful basis required before processing) |
| Provision Map (Tes/Edukey) | Provision mapping, intervention tracking, learning plans | Schools, SENCOs | Lower (established MIS integration, school data agreements) |
| ChatGPT / Claude / Gemini (personal accounts) | Drafting EHCP sections, parent letters, IEP targets, social stories | Individual SENCOs | High (not UK GDPR compliant by default) |
| Microsoft Copilot (M365 for Education) | Drafting, summarising, data analysis within school M365 tenant | Schools with M365 licences | Lower (processed within school's secure M365 environment) |
| NotebookLM (Google) | Summarising professional reports, creating accessible digests | SENCOs in Google Workspace for Education schools | Medium (depends on Google Workspace for Education agreement) |

The risk level column reflects data processing risk, not tool quality. A lower-risk tool may still produce poor output. A higher-risk tool may produce excellent drafts that you cannot legally use without additional safeguards. Both dimensions matter in the evaluation process.

Where AI Can Help Right Now

The most useful AI applications in SEND administration are those that assist rather than replace professional judgement. This distinction matters in practice, not just in principle. The following tasks represent the clearest opportunities based on current SENCO experience.

Summarising lengthy professional reports is one of the clearest use cases. Educational psychologists, speech and language therapists, and occupational therapists produce reports that can run to 20 or more pages. A SENCO preparing for an EHCP annual review must synthesise multiple such reports, often under significant time pressure. AI tools such as NotebookLM can produce structured summaries of uploaded documents within seconds, reducing preparation time considerably. The risk here is relatively low provided the documents are uploaded within a secure environment covered by a data processing agreement, not into a public-facing AI interface.

Provision mapping and tracking is a second area with real potential. Tools that connect to your school's existing management information system can flag gaps between identified needs and current provision. When this works well, it functions as an early warning system: a pupil with a documented need for small-group reading support is not receiving it because the timetable changed two weeks ago, and the AI flags the gap before the next review. The key safeguard is that the tool reads data that already exists within your school's secure environment, rather than requiring you to input identifying information about individual pupils into an external system.
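The gap-flagging logic described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: the record fields ("pupil_id", "need", "provision") and the exact-match rule are assumptions made for the example.

```python
# Hypothetical sketch: flagging gaps between documented needs and
# current provision, assuming records exported from the school MIS.
# Field names and values are illustrative, not a real export format.

documented_needs = [
    {"pupil_id": "P001", "need": "small-group reading support"},
    {"pupil_id": "P001", "need": "speech and language sessions"},
    {"pupil_id": "P002", "need": "small-group reading support"},
]

current_provision = [
    {"pupil_id": "P001", "provision": "speech and language sessions"},
    {"pupil_id": "P002", "provision": "small-group reading support"},
]

def flag_gaps(needs, provision):
    """Return (pupil_id, need) pairs with no matching provision entry."""
    delivered = {(p["pupil_id"], p["provision"]) for p in provision}
    return [
        (n["pupil_id"], n["need"])
        for n in needs
        if (n["pupil_id"], n["need"]) not in delivered
    ]

for pupil_id, need in flag_gaps(documented_needs, current_provision):
    print(f"GAP: {pupil_id} has a documented need with no provision: {need}")
```

A real tool would match on coded provision types rather than free-text strings, but the principle is the same: the comparison runs entirely over data already held in the school's secure environment.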

Generating template documents for differentiation strategies and provision planning is a lower-risk starting point for SENCOs who want to experiment with AI. Asking a general AI tool to produce a template SEN support plan structure, or a list of evidence-based strategies for a pupil with working memory difficulties, does not require inputting any pupil data. The output is then edited by a professional who knows the specific child. This approach keeps the data risk low while building staff confidence in evaluating AI output critically.

Report drafting for parents and teachers is the most commonly reported use case in SENCO networks. SENCOs use AI to generate first drafts of progress reports, communication letters, and target summaries. These drafts require significant editing to ensure they reflect the specific child rather than a generic profile. The SEND Network reported in late 2025 that template-based AI output frequently defaults to standard language rather than the specified, quantified provision that EHCPs and SEN support plans legally require. A first draft that sounds plausible but is generic is worse than a blank page: it takes longer to correct than to write from scratch.

Data analysis within your school's existing systems is a fourth application that is underused. If your school uses an MIS that includes attendance data, intervention records, and teacher assessments, AI-assisted analysis can identify patterns that manual review misses. A pupil whose working memory difficulties are most pronounced on Monday mornings after a specific set of weekend circumstances, for example, is a pattern a SENCO might notice intuitively but an AI can confirm statistically across a larger cohort. This supports the formative assessment cycle by making the "assess" stage more systematic.
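A minimal version of this kind of pattern check can be run with standard tools before any AI is involved. The sketch below counts logged incidents by weekday for one pupil; the log format, dates, and pupil identifier are invented for illustration.

```python
# Hypothetical sketch: aggregating logged incidents by weekday to
# surface patterns a manual review might miss. The record shape and
# dates are illustrative, not a real MIS export.
from collections import Counter
from datetime import date

incident_log = [
    {"pupil_id": "P001", "date": date(2025, 9, 1)},   # Monday
    {"pupil_id": "P001", "date": date(2025, 9, 8)},   # Monday
    {"pupil_id": "P001", "date": date(2025, 9, 10)},  # Wednesday
    {"pupil_id": "P001", "date": date(2025, 9, 15)},  # Monday
]

def incidents_by_weekday(log, pupil_id):
    """Count one pupil's logged incidents per weekday name."""
    return Counter(
        r["date"].strftime("%A") for r in log if r["pupil_id"] == pupil_id
    )

counts = incidents_by_weekday(incident_log, "P001")
print(counts.most_common(1))  # → [('Monday', 3)]
```

At scale, the same aggregation across a cohort is what makes the "assess" stage systematic rather than intuitive.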

Where AI Falls Short

There are aspects of SENCO work that AI cannot replicate, and it is worth being precise about what those are. Vague statements that "AI can't do everything" are not useful. Clear statements about specific limitations help you decide where to use tools and where not to.

AI cannot observe a child. The observations that feed an effective assess-plan-do-review cycle under the graduated approach require a human in the room, reading physical and social cues that no current AI system can interpret. An EHCP that is built entirely on AI-generated text, without grounding in direct observation of the child, risks being legally challenged at SENDIST tribunal. The SEND Code of Practice requires plans to describe the child's needs "in detail" and to be based on the assessments of professionals who have worked with the child. An AI tool has not worked with the child.

AI cannot build the relationships that make SEND provision work. The trust between a SENCO, a family, and a child is central to the co-production principle that runs through the SEND Code of Practice. Families of children with ADHD, autism, or PDA profiles in particular report that the quality of the SENCO relationship directly affects their engagement with the plan and with the school. No AI system can substitute for that relationship, and parents who discover that their child's plan was drafted primarily by an AI may reasonably feel that the process has become impersonal.

AI hallucination is a concrete risk in SEND administration specifically. Large language models generate plausible-sounding text that can be factually incorrect. In a general context, a hallucinated fact is an inconvenience. In an EHCP, a hallucinated provision commitment or a misquoted statutory requirement is an error in a legal document. The Special Needs Jungle investigation (September 2025) found that AI-generated EHCP sections routinely used generic language that failed the specificity test required by the SEND Code of Practice. Jadu, whose technology underpins some local authority EHCP tools, acknowledged directly: "You can speed up the process of creating an EHCP using AI, but if the inputs to the plan don't meet the needs, the benefits and the quality of that plan will be diminished."

AI cannot exercise professional accountability. When an EHCP is inadequate and a family appeals to SENDIST, the responsible professional is the one who signed the document. If that document was drafted by an AI tool and not sufficiently reviewed by the SENCO or LA officer, the professional accountability still rests with the human signatory. Neither the tool vendor nor the AI is named on the plan.

The DfE Position on AI in Schools

The Department for Education published non-statutory guidance, "Generative AI in Education Settings," in June 2025. This is the most current official position and it applies directly to SEND administration, though the guidance does not address SEND specifically.

The DfE guidance sets out four core requirements for AI use in schools. Schools must be open and transparent with all stakeholders, including parents, pupils, governors, and staff, about how they use AI and how personal data is processed. Data protection compliance is described as non-negotiable. Human oversight is required for all AI-generated content before it is used. Staff need adequate training both in producing effective prompts and in evaluating AI output critically.

For SEND administration, these requirements translate into specific practice. An AI-drafted section of an EHCP must be reviewed by a qualified professional before it is included in the final plan. This is not a formality. The reviewer must have sufficient knowledge of the child to identify errors, omissions, or generic language that does not accurately describe the individual's needs and provision. Parents should be informed if AI tools are used to process their child's information, and that information must be reflected in the school's privacy notice. Any AI tool that processes pupil data must be covered by the school's data protection policy and reviewed by the data protection officer before adoption.

The Hansard record of the July 2025 parliamentary debate on "Generative AI: Schools" confirmed the government's position: the duty to protect children's data and ensure human oversight cannot be delegated to a technology provider. Schools remain responsible for the quality and legality of everything in a child's official record. AI tools for teachers more broadly are subject to the same principle: the professional is accountable, not the tool.

GDPR and SEND Data: What You Need to Know

SEND data is special category data under UK GDPR Article 9. This is the highest level of data protection, the same category as health data, racial or ethnic origin, and religious beliefs. Processing special category data requires an Article 9 condition in addition to the Article 6 lawful basis that applies to ordinary personal data.

The conditions most likely to apply to AI processing of SEND data in schools are substantial public interest under Article 9(2)(g), read with Schedule 1, Part 2 of the DPA 2018, alongside the Article 6 basis of performance of a task in the public interest. However, relying on these bases does not mean processing is unrestricted. The ICO's guidance on AI and data protection requires that AI processing of special category data includes appropriate technical and organisational safeguards, data minimisation (only the data strictly necessary for the specific task), and clear documentation.

A Data Protection Impact Assessment is required before introducing any new AI tool that processes special category data. This is not optional guidance; it is a legal requirement under UK GDPR Article 35. The DPIA must be completed before the tool is deployed, not after an incident has occurred. Your data protection officer must be involved in this process.

| Data Type | AI Processing Acceptable? | Safeguards Required |
| --- | --- | --- |
| Anonymised provision data (aggregated, no names) | Yes, with caution | Ensure genuinely anonymised; small-cohort risk of re-identification |
| Named pupil SEN support data | Only with lawful basis and DPIA | DPIA completed, privacy notice updated, DPO sign-off, secure processing environment |
| EHCP content (Sections B to I) | Only with lawful basis, DPIA, and parental awareness | As above, plus data residency confirmed, data minimisation applied, human review before any use |
| Professional reports (EP, SaLT, OT) | Only in a secure, school-controlled environment | Not to be uploaded to public AI tools; M365 Copilot or equivalent secure system only |
| Generic provision template (no pupil identifiers) | Yes | Output reviewed before applying to any named pupil's plan |
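The small-cohort re-identification risk attached to "anonymised" aggregate data is worth making concrete. A common safeguard is small-number suppression: counts below a minimum cohort size are withheld before data leaves the school's control. The sketch below illustrates the idea; the threshold of 5 is an assumed example value, and your data protection officer should set the actual figure.

```python
# Hypothetical sketch: suppressing aggregate counts below a minimum
# cohort size before sharing "anonymised" provision data. The
# threshold of 5 is an assumption for illustration; your DPO sets
# the real value for your context.

MIN_COHORT = 5

provision_counts = {
    "small-group reading support": 12,
    "speech and language sessions": 3,   # small cohort: suppress
    "sensory circuit sessions": 7,
}

def suppress_small_cohorts(counts, minimum=MIN_COHORT):
    """Replace counts below the minimum with a suppression marker."""
    return {
        k: (v if v >= minimum else "<suppressed>")
        for k, v in counts.items()
    }

print(suppress_small_cohorts(provision_counts))
```

Suppression does not make aggregation safe on its own (combinations of categories can still single out a pupil), but it is a sensible minimum check before any aggregate is treated as anonymised.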

The practical implication of the table above is that using a free ChatGPT account to draft a section of a named pupil's EHCP, based on content pasted from their educational psychology report, is a UK GDPR breach. It does not matter that the information may not be stored permanently by the tool or that the intent was to save time. The processing itself requires a lawful basis, a DPIA, and a secure environment that free consumer AI tools do not provide.

If a parent submitted a Subject Access Request and discovered that their child's EHCP information had been processed by an external AI tool they were not told about, the school would have significant difficulty justifying that decision to the ICO. Invision360, one of the more established EHCP AI vendors, states explicitly: "Before any data is processed by VITA, LAs must confirm they have obtained appropriate consent or have a lawful basis for processing under GDPR and the Data Protection Act 2018." The burden of confirming that lawful basis sits with the school or LA, not with the tool.

Evaluating AI Tools: A SENCO's Procurement Checklist

Before adopting any AI tool for SEND administration, work through these ten questions with your data protection officer. If a vendor cannot answer a question clearly and in writing, treat that as a significant concern.

  1. Data storage and residency: Where is the data stored? Is it processed within the UK or EEA? Who has access to it beyond your school, and under what contractual terms?
  2. Lawful basis: What lawful basis does the vendor rely on for processing special category data? Can they provide documentation and a Data Processing Agreement that meets UK GDPR requirements?
  3. DPIA support: Does the vendor provide a Data Protection Impact Assessment template or completed DPIA for their product? Are they willing to be named in your school's DPIA as a processor?
  4. Privacy notice: What information do you need to add to your school's privacy notice to cover use of this tool? Does the vendor provide template wording for schools?
  5. Human oversight design: Is the product designed to require human review of all AI output before use? Or does it allow AI-generated content to flow into records without a review step?
  6. UK legislation alignment: Has the AI been trained on or validated against the SEND Code of Practice (2015), the Children and Families Act 2014, and current DfE guidance? Can the vendor confirm this specifically, not just "UK education data"?
  7. Hallucination safeguards: How does the tool signal uncertainty or low confidence in its output? What happens when input data is incomplete, contradictory, or missing key professional assessments?
  8. MIS integration: Does the tool integrate with your existing management information system (SIMS, Arbor, ScholarPack, Bromcom)? If not, what data re-entry does that require, and who is responsible for accuracy?
  9. Data portability and exit: If you stop using the tool, can you export all your school's data in a standard, reusable format? What happens to data held by the vendor after the contract ends?
  10. Staff training and accountability: What training does the vendor provide and at what cost? Who in your school is named as accountable for reviewing AI output before it enters a child's official record?

This checklist is not exhaustive, but any tool that cannot satisfy these ten points should not be handling SEND data. The procurement conversation should happen before the SENCO trial period, not after a school has committed to a subscription. Involve your data protection officer from the first conversation with a vendor, not at the point of contract review.
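For schools that want an auditable record of the procurement conversation, the checklist can be captured as a simple structured record in which any unanswered or unsatisfactory item blocks sign-off. The sketch below is illustrative only; the item names and the all-or-nothing decision rule are assumptions, not statutory requirements.

```python
# Hypothetical sketch: recording vendor answers to the procurement
# checklist so an item without a satisfactory written answer blocks
# approval. Structure and rule are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    question: str
    answered_in_writing: bool = False
    satisfactory: bool = False

@dataclass
class VendorEvaluation:
    vendor: str
    items: list = field(default_factory=list)

    def approved(self):
        """Approve only if every item is answered in writing and satisfactory."""
        return bool(self.items) and all(
            i.answered_in_writing and i.satisfactory for i in self.items
        )

evaluation = VendorEvaluation("ExampleVendor", [
    ChecklistItem("Data storage and residency", True, True),
    ChecklistItem("Lawful basis documentation", True, False),  # concern
])
print(evaluation.approved())  # → False: one item unsatisfactory
```

Keeping the record in this form means the DPO and SENCO sign off against the same named items, and the evidence survives staff turnover.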

The 2026 White Paper and Digital ISPs

The Schools White Paper "Every Child Achieving and Thriving," published in February 2026, introduces changes to SEND administration that every SENCO needs to understand now, regardless of their current position on AI tools.

The White Paper introduces mandatory digital Individual Support Plans for pupils identified as having SEND. These replace the current informal SEN support records, which vary significantly between schools in format, content, and quality. A standardised digital format, specified by the DfE, will be required from 2027 onwards. This is a direct AI on-ramp: a structured digital format with defined fields is exactly the environment in which AI drafting assistance works most reliably, because the output structure is specified and the human reviewer knows precisely what to check.

The White Paper also introduces Specialist Provision Packages, which are designed by multi-agency expert teams and potentially AI-assisted in their construction. An "Experts at Hand" service, funded at GBP 1.8 billion and commissioned by local authorities and integrated care boards, will be tasked with delivering these packages to schools without requiring a formal EHCP assessment. The design of that service has not yet been specified in detail, but AI-assisted needs analysis and provision matching is widely anticipated in the SEND sector.

The Inclusive Mainstream Fund (GBP 1.6 billion over three years) will be paid directly to schools from 2026/27 for early targeted interventions with no formal assessment required. This is significant for SENCOs because it shifts the emphasis towards rapid, evidence-based identification and response at the SEN support stage, precisely the stage where AI-assisted provision mapping could add most value. Schools that have already developed clear AI data policies and staff competency in evaluating AI output will find the 2027 transition to mandatory digital ISPs significantly less disruptive than those that have not.

A Phased Implementation Plan for Schools

The evidence base for AI in SEND administration is still developing. The tools are new, the regulatory environment is evolving, and the long-term effects on SENCO professional expertise and family trust are not yet known. A phased approach is appropriate and professionally defensible.

Phase one covers the lowest-risk tasks: using AI within a secure school environment to summarise documents that contain no individual pupil identifiers, to generate generic provision mapping templates, and to draft parent communication templates that are then edited to reflect the individual child. This requires no DPIA, no vendor contract review, and no change to privacy notices, provided the AI tool is part of the school's existing secure software environment (for example, Microsoft Copilot within M365 for Education). Start here. The SENCO annual calendar provides a useful framework for identifying which administrative tasks fall at each point in the year and where time-saving tools would have the greatest impact.

Phase two involves selecting and evaluating a dedicated SEND administration tool. Use the procurement checklist above. Complete the DPIA before trialling the tool with any named pupil data. Update your privacy notice to reflect the new processing. Inform parents. Run a controlled trial with a small number of cases where the professional knows the child well and can evaluate AI output accurately. Evaluate whether the time saving is real and whether the output quality meets the specificity standards required by the SEND Code of Practice.

Phase three is adoption of AI-assisted processes in areas where phase two has demonstrated clear benefit and safe practice. At this stage, schools should document their AI use explicitly, maintain the human oversight requirement on all AI-generated content, and review the tool annually against updated DfE guidance and ICO decisions on AI and data protection.

AI and the SENCO's Professional Expertise

A question that SENCO networks are beginning to raise is whether routine use of AI drafting tools will, over time, erode the professional skills that make a SENCO effective. This is not a hypothetical concern. It is analogous to the documented effects of GPS on spatial navigation: when a tool reliably performs a cognitive task, the underlying skill may not be maintained.

The SENCO role requires the ability to construct provision records that are specific, quantified, and individualised. That skill is developed through practice: writing the same type of document repeatedly, receiving feedback from reviews and tribunal decisions, and refining judgement over time. If a SENCO routinely edits AI output rather than drafting from professional knowledge of the child, the analytical process that makes plans legally adequate and practically useful is shortened. This matters most for newly qualified SENCOs. The National Award for SEN Coordination, the mandatory qualification for SENCOs in maintained schools in England, includes needs analysis, provision design, and plan writing as core competencies. Schools should consider whether their AI adoption decisions support or undermine the development of those competencies in newly appointed staff.

The executive function demands of SENCO work, particularly the planning, monitoring, and self-evaluation components, are precisely the cognitive functions that AI risks replacing rather than supporting. The SEND Network's 2025 position is worth quoting directly: "Rather than jumping on the AI bandwagon, we doubled down on the importance of a human-centred and expert-led approach to developing individualised plans." That is not a rejection of AI in principle. It is a commitment to maintaining the professional quality that makes SEND plans legally and practically effective.

What the Research Evidence Shows

The OECD working paper "Leveraging AI to support students with SEN" (2024) provides the most comprehensive international evidence review currently available. The paper finds that AI has demonstrated potential in three areas: personalised learning pathways for pupils with autism spectrum conditions, assistive technology integration including text-to-speech and AAC support, and early identification of learning difficulties through data pattern analysis. The evidence for AI in SEND administration specifically, as distinct from direct learning support, is much thinner.

The paper cautions that most AI applications in SEN have been developed in the United States, where the legislative framework differs significantly from England's SEND Code of Practice. The Individuals with Disabilities Education Act (IDEA), which governs US special education, has different requirements for plan content, timelines, and family rights than the Children and Families Act 2014. Direct transfer of US-developed tools to UK school contexts carries risk. The legal requirements for EHCP content, the graduated approach structure, and the co-production principle with families are specific to English legislation and are not reflected in tools trained primarily on US special education data.

Leicestershire County Council's published approach to automated processing in EHCNA decisions represents one of the few documented UK local authority implementations. Their experience highlights both the time efficiencies available and the governance challenges of maintaining professional accountability when AI is involved in decisions about children's legal entitlements. For SENCOs evaluating tools, asking vendors whether their product has been validated specifically against English SEND legislation, not just international SEN principles, is an essential due diligence step.

Connecting AI Tools to Wider School SEND Practice

AI tools in SEND administration do not exist in isolation. They sit within a broader framework of differentiation strategies and formative assessment practice that shapes how well any individual pupil's needs are identified and met. The most effective use of AI in SEND administration is one that connects to, rather than replaces, strong classroom practice.

A SENCO who uses AI to generate a provision mapping template, for example, still needs classroom teachers who can accurately identify which strategies are being implemented and with what effect. The working memory demands on a pupil with SEND are unlikely to be captured well in any AI-generated plan if the classroom teacher has not developed the observational skills to identify when those demands are causing difficulty. AI tools can speed up the documentation process, but they cannot substitute for the professional development of teachers in identifying and responding to SEND needs in real time.

This means that a school's investment in AI tools for SEND administration should be accompanied by investment in staff understanding of SEND. CPD on identifying and supporting special educational needs, understanding the profile of pupils with different conditions, and implementing effective provision in the classroom is what makes any SEND plan useful. Without that, even a well-drafted AI-assisted plan describes provision that does not happen effectively in practice.

Further Reading: Key Sources on AI and SEND Administration

The following policy documents and research papers provide the evidence base for this guide.

Generative AI in Education Settings
DfE, June 2025

The Department for Education's non-statutory guidance on AI use in schools. Covers transparency requirements, data protection obligations, and the human oversight mandate that applies to all AI-generated content in educational settings.

National SENCO Workforce Survey
Bath Spa University / nasen

The primary source for SENCO workload data cited throughout this guide. Documents the 74% administrative burden figure, time allocation problems across primary and secondary phases, and the impact on direct pupil support.

The risks and benefits of using AI to power EHCPs
Special Needs Jungle, September 2025

Investigative journalism examining Agilisys EHCP Tool, Invision360 VITA, and Jadu. Raises substantive concerns about data handling, consent, and the quality gap between AI output and SEND Code of Practice specificity requirements.

Leveraging AI to support students with SEN
OECD CERI Working Paper, 2024

The most comprehensive international evidence review on AI and SEN. Covers personalised learning, assistive technology, and early identification. Cautions against uncritical transfer of US-developed tools to UK legislative contexts where EHCP requirements differ substantially.

Guidance on AI and Data Protection
Information Commissioner's Office

The ICO's regulatory guidance on AI and UK GDPR compliance. Essential reading for DPOs and SENCOs evaluating AI tools. Covers Article 22 on automated decision-making, Article 35 DPIA requirements, and data minimisation principles as they apply to school settings.

Before adopting any AI tool for SEND administration, complete the procurement checklist in this guide with your data protection officer and share it with any vendor you are evaluating. Start with the lowest-risk tasks, within your school's existing secure software environment, and build staff competency in evaluating AI output before moving to tools that process named pupil EHCP data.

Loading audit...

Three-quarters of SENCOs in England are being pulled away from direct pupil support by administrative demands they cannot escape (nasen National SENCO Workforce Survey). AI tools are now appearing in this space, promising to reduce that burden. Some are genuinely useful. Others carry significant risks that most SENCOs have not had time to evaluate. This guide gives you the balanced picture: what AI can do for SEND administration right now, where the risks sit, what the DfE says, and a decision framework you can use before committing to any tool.

Key Takeaways

  1. The workload case is real: 74% of SENCOs are pulled from pupils by administration; AI offers partial, not total, relief at lower-risk tasks like summarising reports and tracking provision.
  2. SEND data is special category data: Under UK GDPR Article 9, processing a child's SEND information with AI tools requires a lawful basis, a Data Protection Impact Assessment, and transparent parent communication.
  3. Human oversight is non-negotiable: The DfE's June 2025 guidance requires all AI-generated content in education to be reviewed by a qualified professional before use.
  4. The 2026 White Paper changes everything: Mandatory digital Individual Support Plans will create a standardised AI on-ramp; schools that develop AI literacy now will be better positioned for statutory compliance later.

The SENCO Administration Crisis

The data on SENCO workload is not ambiguous. The National SENCO Workforce Survey, conducted by Bath Spa University and nasen, found that 74% of SENCOs are regularly pulled away from supporting pupils with special educational needs to complete administrative tasks. The same survey found that 55% of primary SENCOs and 70% of secondary SENCOs are not allocated enough time to carry out their role effectively. EHCP applications, annual reviews, and provision mapping consume the majority of that allocated time.

EHCP numbers have risen every year since the Children and Families Act 2014 came into force. As of January 2025, over 576,000 children and young people in England held an EHCP, an increase of 11% on the previous year. Each plan requires detailed documentation, multi-agency coordination, and ongoing reviews that follow the assess-plan-do-review cycle of the graduated approach. The system was not designed to scale at this rate, and SENCOs are bearing the administrative consequence.

The NEU's SENCO Workload report called for legally protected SENCO time. A Twinkl survey found that a significant proportion of SENCOs were considering leaving the role because of workload. The human cost of the administration burden is not abstract. When a SENCO spends three hours preparing an annual review pack, that is three hours not spent observing a pupil's progress, meeting with a family, or coaching a classroom teacher. This is the context in which AI tools have entered the conversation. The promise is straightforward: if AI can handle first-draft documentation, provision summaries, or data pattern analysis, SENCOs recover time to do the relational and professional work that actually benefits children.

What AI Tools Are Available Right Now

The current landscape splits into two categories: dedicated SEND AI platforms designed for EHCP and provision management, and general AI tools that SENCOs are adapting for administrative tasks. The following table maps the main options, what they do, and the associated data risk.

| Tool | Primary Function | Who Uses It | Data Risk Level |
| --- | --- | --- | --- |
| Agilisys EHCP Tool | AI-powered EHCP first-draft generation from uploaded professional reports | Local authority SEND caseworkers | Medium (enterprise data processing agreements in place) |
| Invision360 VITA | AI-driven EHCP drafting and quality assurance, Innovate UK funded | 50+ UK local authorities | Medium (lawful basis required before processing) |
| Provision Map (Tes/Edukey) | Provision mapping, intervention tracking, learning plans | Schools, SENCOs | Lower (established MIS integration, school data agreements) |
| ChatGPT / Claude / Gemini (personal accounts) | Drafting EHCP sections, parent letters, IEP targets, social stories | Individual SENCOs using personal accounts | High (not UK GDPR compliant by default) |
| Microsoft Copilot (M365 for Education) | Drafting, summarising, data analysis within school M365 tenant | Schools with M365 licences | Lower (processed within school's secure M365 environment) |
| NotebookLM (Google) | Summarising professional reports, creating accessible digests | SENCOs in Google Workspace for Education schools | Medium (depends on Google Workspace for Education agreement) |

The risk level column reflects data processing risk, not tool quality. A lower-risk tool may still produce poor output. A higher-risk tool may produce excellent drafts that you cannot legally use without additional safeguards. Both dimensions matter in the evaluation process.

Where AI Can Help Right Now

The most useful AI applications in SEND administration are those that assist rather than replace professional judgement. This distinction matters in practice, not just in principle. The following tasks represent the clearest opportunities based on current SENCO experience.

Summarising lengthy professional reports is one of the clearest use cases. Educational psychologists, speech and language therapists, and occupational therapists produce reports that can run to 20 or more pages. A SENCO preparing for an EHCP annual review must synthesise multiple such reports, often under significant time pressure. AI tools such as NotebookLM can produce structured summaries of uploaded documents within seconds, reducing preparation time considerably. The risk here is relatively low provided the documents are uploaded within a secure environment covered by a data processing agreement, not into a public-facing AI interface.

Provision mapping and tracking is a second area with real potential. Tools that connect to your school's existing management information system can flag gaps between identified needs and current provision. When this works well, it functions as an early warning system: a pupil with a documented need for small-group reading support is not receiving it because the timetable changed two weeks ago, and the AI flags the gap before the next review. The key safeguard is that the tool reads data that already exists within your school's secure environment, rather than requiring you to input identifying information about individual pupils into an external system.
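A gap-flagging step of the kind described above can be sketched in a few lines. This is an illustration only: the field names and record shape are assumptions, and a real tool would read this data from the school's MIS rather than from hand-built objects.

```python
from dataclasses import dataclass, field


@dataclass
class PupilRecord:
    pupil_ref: str  # anonymised reference, never a pupil's name
    identified_needs: set[str] = field(default_factory=set)
    current_provision: set[str] = field(default_factory=set)


def flag_provision_gaps(records: list[PupilRecord]) -> dict[str, set[str]]:
    """Return, per pupil reference, documented needs with no matching provision."""
    gaps = {}
    for r in records:
        missing = r.identified_needs - r.current_provision
        if missing:
            gaps[r.pupil_ref] = missing
    return gaps


records = [
    PupilRecord("P001", {"small-group reading", "speech therapy"}, {"speech therapy"}),
    PupilRecord("P002", {"working memory support"}, {"working memory support"}),
]
print(flag_provision_gaps(records))  # {'P001': {'small-group reading'}}
```

The point of the sketch is the safeguard described in the paragraph: the comparison runs over data that already exists inside the school's systems, and only an anonymised reference ever appears in the output.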

Generating template documents for differentiation strategies and provision planning is a lower-risk starting point for SENCOs who want to experiment with AI. Asking a general AI tool to produce a template SEN support plan structure, or a list of evidence-based strategies for a pupil with working memory difficulties, does not require inputting any pupil data. The output is then edited by a professional who knows the specific child. This approach keeps the data risk low while building staff confidence in evaluating AI output critically.
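One way to keep this workflow honest is a simple pre-send guard that refuses to build a prompt containing pupil-identifying text. The sketch below is hypothetical: the identifier list and prompt wording are invented for illustration, not a feature of any named product.

```python
# Hypothetical local blocklist of identifiers; a real one would be maintained
# by the school and never shared with any external tool.
PUPIL_IDENTIFIERS = {"jane doe", "jd2019"}


def build_safe_prompt(request: str) -> str:
    """Refuse to build an AI prompt that contains a known pupil identifier."""
    lowered = request.lower()
    for token in PUPIL_IDENTIFIERS:
        if token in lowered:
            raise ValueError("Prompt contains pupil-identifying data")
    return f"You are assisting a SENCO. {request}"


print(build_safe_prompt(
    "List evidence-based strategies for pupils with working memory difficulties."))
```

A guard like this does not make a public AI tool GDPR-compliant; it simply enforces the "no pupil data in the prompt" discipline the paragraph describes.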

Report drafting for parents and teachers is the most commonly reported use case in SENCO networks. SENCOs use AI to generate first drafts of progress reports, communication letters, and target summaries. These drafts require significant editing to ensure they reflect the specific child rather than a generic profile. The SEND Network reported in late 2025 that template-based AI output frequently defaults to standard language rather than the specified, quantified provision that EHCPs and SEN support plans legally require. A first draft that sounds plausible but is generic is worse than a blank page: it takes longer to correct than to write from scratch.

Data analysis within your school's existing systems is a fourth application that is underused. If your school uses an MIS that includes attendance data, intervention records, and teacher assessments, AI-assisted analysis can identify patterns that manual review misses. A pupil whose working memory difficulties are consistently most pronounced on Monday mornings, for example, shows a pattern a SENCO might notice intuitively but an AI can confirm statistically across a larger cohort. This supports the formative assessment cycle by making the "assess" stage more systematic.
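The weekday-pattern example can be made concrete with a small sketch that counts incident records by day of the week. The record format and dates are invented for illustration; they are not any real MIS export.

```python
from collections import Counter
from datetime import date


def incidents_by_weekday(incidents: list[date]) -> Counter:
    """Count recorded incidents per weekday name."""
    return Counter(d.strftime("%A") for d in incidents)


# Three Monday entries and one Wednesday entry (invented dates)
log = [date(2025, 9, 1), date(2025, 9, 8), date(2025, 9, 15), date(2025, 9, 3)]
print(incidents_by_weekday(log).most_common(1))  # [('Monday', 3)]
```

A real analysis would run this kind of aggregation across attendance, intervention, and assessment records together, but the principle is the same: the AI confirms statistically what the SENCO suspects intuitively.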

Where AI Falls Short

There are aspects of SENCO work that AI cannot replicate, and it is worth being precise about what those are. Vague statements that "AI can't do everything" are not useful. Clear statements about specific limitations help you decide where to use tools and where not to.

AI cannot observe a child. The observations that feed an effective assess-plan-do-review cycle under the graduated approach require a human in the room, reading physical and social cues that no current AI system can interpret. An EHCP built entirely on AI-generated text, without grounding in direct observation of the child, risks legal challenge at the SEND Tribunal (SENDIST). The SEND Code of Practice requires plans to describe the child's needs "in detail" and to be based on the assessments of professionals who have worked with the child. An AI tool has not worked with the child.

AI cannot build the relationships that make SEND provision work. The trust between a SENCO, a family, and a child is central to the co-production principle that runs through the SEND Code of Practice. Families of children with ADHD, autism, or PDA profiles in particular report that the quality of the SENCO relationship directly affects their engagement with the plan and with the school. No AI system can substitute for that relationship, and parents who discover that their child's plan was drafted primarily by an AI may reasonably feel that the process has become impersonal.

AI hallucination is a concrete risk in SEND administration specifically. Large language models generate plausible-sounding text that can be factually incorrect. In a general context, a hallucinated fact is an inconvenience. In an EHCP, a hallucinated provision commitment or a misquoted statutory requirement is an error in a legal document. The Special Needs Jungle investigation (September 2025) found that AI-generated EHCP sections routinely used generic language that failed the specificity test required by the SEND Code of Practice. Jadu, whose technology underpins some local authority EHCP tools, acknowledged directly: "You can speed up the process of creating an EHCP using AI, but if the inputs to the plan don't meet the needs, the benefits and the quality of that plan will be diminished."

AI cannot exercise professional accountability. When an EHCP is inadequate and a family appeals to SENDIST, the responsible professional is the one who signed the document. If that document was drafted by an AI tool and not sufficiently reviewed by the SENCO or LA officer, the professional accountability still rests with the human signatory. Neither the tool vendor nor the AI is named on the plan.

The DfE Position on AI in Schools

The Department for Education published non-statutory guidance, "Generative AI in Education Settings," in June 2025. This is the most current official position and it applies directly to SEND administration, though the guidance does not address SEND specifically.

The DfE guidance sets out four core requirements for AI use in schools. Schools must be open and transparent with all stakeholders, including parents, pupils, governors, and staff, about how they use AI and how personal data is processed. Data protection compliance is described as non-negotiable. Human oversight is required for all AI-generated content before it is used. Staff need adequate training both in producing effective prompts and in evaluating AI output critically.

For SEND administration, these requirements translate into specific practice. An AI-drafted section of an EHCP must be reviewed by a qualified professional before it is included in the final plan. This is not a formality. The reviewer must have sufficient knowledge of the child to identify errors, omissions, or generic language that does not accurately describe the individual's needs and provision. Parents should be informed if AI tools are used to process their child's information, and that information must be reflected in the school's privacy notice. Any AI tool that processes pupil data must be covered by the school's data protection policy and reviewed by the data protection officer before adoption.

The Hansard record of the July 2025 parliamentary debate on "Generative AI: Schools" confirmed the government's position: the duty to protect children's data and ensure human oversight cannot be delegated to a technology provider. Schools remain responsible for the quality and legality of everything in a child's official record. AI tools for teachers more broadly are subject to the same principle: the professional is accountable, not the tool.

GDPR and SEND Data: What You Need to Know

SEND data is special category data under UK GDPR Article 9. This is the highest level of data protection, the same category as health data, racial or ethnic origin, and religious beliefs. Processing special category data requires a lawful basis beyond the standard bases that apply to ordinary personal data.

The lawful bases most likely to apply to AI processing of SEND data in schools are substantial public interest (Schedule 1, Part 2, DPA 2018) and the performance of a task in the public interest. However, relying on these bases does not mean processing is unrestricted. The ICO's guidance on AI and data protection requires that AI processing of special category data includes appropriate technical and organisational safeguards, data minimisation (only the data strictly necessary for the specific task), and clear documentation.

A Data Protection Impact Assessment is required before introducing any new AI tool that processes special category data. This is not optional guidance; it is a legal requirement under UK GDPR Article 35. The DPIA must be completed before the tool is deployed, not after an incident has occurred. Your data protection officer must be involved in this process.

| Data Type | AI Processing Acceptable? | Safeguards Required |
| --- | --- | --- |
| Anonymised provision data (aggregated, no names) | Yes, with caution | Ensure genuinely anonymised; small-cohort risk of re-identification |
| Named pupil SEN support data | Only with lawful basis and DPIA | DPIA completed, privacy notice updated, DPO sign-off, secure processing environment |
| EHCP content (Sections B to I) | Only with lawful basis, DPIA, and parental awareness | As above, plus data residency confirmed, data minimisation applied, human review before any use |
| Professional reports (EP, SaLT, OT) | Only in secure, school-controlled environment | Not to be uploaded to public AI tools; M365 Copilot or equivalent secure system only |
| Generic provision template (no pupil identifiers) | Yes | Output reviewed before applying to any named pupil's plan |
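The small-cohort re-identification risk noted against anonymised data can be checked mechanically before any aggregate is shared. The sketch below applies a minimum-group-size rule; the threshold of five and the grouping keys are illustrative assumptions, not an ICO-mandated figure.

```python
from collections import Counter


def unsafe_groups(rows: list[tuple], k: int = 5) -> list[tuple]:
    """Return grouping keys whose cohort size falls below k (re-identification risk)."""
    sizes = Counter(rows)
    return [key for key, n in sizes.items() if n < k]


# (year_group, need_type) pairs from a hypothetical aggregated export
rows = [("Y7", "ASD")] * 6 + [("Y7", "PDA")] * 2
print(unsafe_groups(rows))  # [('Y7', 'PDA')]: only two pupils, so plausibly identifiable
```

In a one-form-entry primary school, almost every breakdown by year group and need type will fail a check like this, which is exactly why "anonymised" aggregates from small SEND cohorts deserve the caution the table gives them.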

The practical implication of the table above is that using a free ChatGPT account to draft a section of a named pupil's EHCP, based on content pasted from their educational psychology report, is a UK GDPR breach. It does not matter that the information may not be stored permanently by the tool or that the intent was to save time. The processing itself requires a lawful basis, a DPIA, and a secure environment that free consumer AI tools do not provide.

If a parent submitted a Subject Access Request and discovered that their child's EHCP information had been processed by an external AI tool they were not told about, the school would have significant difficulty justifying that decision to the ICO. Invision360, one of the more established EHCP AI vendors, states explicitly: "Before any data is processed by VITA, LAs must confirm they have obtained appropriate consent or have a lawful basis for processing under GDPR and the Data Protection Act 2018." The burden of confirming that lawful basis sits with the school or LA, not with the tool.

Evaluating AI Tools: A SENCO's Procurement Checklist

Before adopting any AI tool for SEND administration, work through these ten questions with your data protection officer. If a vendor cannot answer a question clearly and in writing, treat that as a significant concern.

  1. Data storage and residency: Where is the data stored? Is it processed within the UK or EEA? Who has access to it beyond your school, and under what contractual terms?
  2. Lawful basis: What lawful basis does the vendor rely on for processing special category data? Can they provide documentation and a Data Processing Agreement that meets UK GDPR requirements?
  3. DPIA support: Does the vendor provide a Data Protection Impact Assessment template or completed DPIA for their product? Are they willing to be named in your school's DPIA as a processor?
  4. Privacy notice: What information do you need to add to your school's privacy notice to cover use of this tool? Does the vendor provide template wording for schools?
  5. Human oversight design: Is the product designed to require human review of all AI output before use? Or does it allow AI-generated content to flow into records without a review step?
  6. UK legislation alignment: Has the AI been trained on or validated against the SEND Code of Practice (2015), the Children and Families Act 2014, and current DfE guidance? Can the vendor confirm this specifically, not just "UK education data"?
  7. Hallucination safeguards: How does the tool signal uncertainty or low confidence in its output? What happens when input data is incomplete, contradictory, or missing key professional assessments?
  8. MIS integration: Does the tool integrate with your existing management information system (SIMS, Arbor, ScholarPack, Bromcom)? If not, what data re-entry does that require, and who is responsible for accuracy?
  9. Data portability and exit: If you stop using the tool, can you export all your school's data in a standard, reusable format? What happens to data held by the vendor after the contract ends?
  10. Staff training and accountability: What training does the vendor provide and at what cost? Who in your school is named as accountable for reviewing AI output before it enters a child's official record?

This checklist is not exhaustive, but any tool that cannot satisfy these ten points should not be handling SEND data. The procurement conversation should happen before the SENCO trial period, not after a school has committed to a subscription. Involve your data protection officer from the first conversation with a vendor, not at the point of contract review.
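For schools that want to record vendor answers systematically, the ten-point check can be treated as a gate: no written answer, no procurement. The sketch below uses shortened labels of my own for the ten questions; they are not official terms.

```python
# Shortened labels for the ten checklist questions above (my own naming)
CHECKLIST = [
    "data_residency", "lawful_basis", "dpia_support", "privacy_notice",
    "human_oversight", "uk_legislation", "hallucination_safeguards",
    "mis_integration", "data_portability", "training_accountability",
]


def procurement_ready(answers: dict[str, bool]) -> list[str]:
    """Return checklist items lacking a clear written vendor answer."""
    return [q for q in CHECKLIST if not answers.get(q, False)]


vendor = {q: True for q in CHECKLIST}
vendor["hallucination_safeguards"] = False  # no written answer received
print(procurement_ready(vendor))  # ['hallucination_safeguards']
```

An empty list means every question has a documented answer; anything else names the gaps to raise with the vendor and the DPO before trialling the tool.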

The 2026 White Paper and Digital ISPs

The Schools White Paper "Every Child Achieving and Thriving," published in February 2026, introduces changes to SEND administration that every SENCO needs to understand now, regardless of their current position on AI tools.

The White Paper introduces mandatory digital Individual Support Plans for pupils identified as having SEND. These replace the current informal SEN support records, which vary significantly between schools in format, content, and quality. A standardised digital format, specified by the DfE, will be required from 2027 onwards. This is a direct AI on-ramp: a structured digital format with defined fields is exactly the environment in which AI drafting assistance works most reliably, because the output structure is specified and the human reviewer knows precisely what to check.

The White Paper also introduces Specialist Provision Packages, which are designed by multi-agency expert teams and potentially AI-assisted in their construction. An "Experts at Hand" service, funded at GBP 1.8 billion and commissioned by local authorities and integrated care boards, will be tasked with delivering these packages to schools without requiring a formal EHCP assessment. The design of that service has not yet been specified in detail, but AI-assisted needs analysis and provision matching is widely anticipated in the SEND sector.

The Inclusive Mainstream Fund (GBP 1.6 billion over three years) will be paid directly to schools from 2026/27 for early targeted interventions with no formal assessment required. This is significant for SENCOs because it shifts the emphasis towards rapid, evidence-based identification and response at the SEN support stage, precisely the stage where AI-assisted provision mapping could add most value. Schools that have already developed clear AI data policies and staff competency in evaluating AI output will find the 2027 transition to mandatory digital ISPs significantly less disruptive than those that have not.

A Phased Implementation Plan for Schools

The evidence base for AI in SEND administration is still developing. The tools are new, the regulatory environment is evolving, and the long-term effects on SENCO professional expertise and family trust are not yet known. A phased approach is appropriate and professionally defensible.

Phase one covers the lowest-risk tasks: using AI within a secure school environment to summarise documents that contain no individual pupil identifiers, to generate generic provision mapping templates, and to draft parent communication templates that are then edited to reflect the individual child. This requires no DPIA, no vendor contract review, and no change to privacy notices, provided the AI tool is part of the school's existing secure software environment (for example, Microsoft Copilot within M365 for Education). Start here. The SENCO annual calendar provides a useful framework for identifying which administrative tasks fall at each point in the year and where time-saving tools would have the greatest impact.

Phase two involves selecting and evaluating a dedicated SEND administration tool. Use the procurement checklist above. Complete the DPIA before trialling the tool with any named pupil data. Update your privacy notice to reflect the new processing. Inform parents. Run a controlled trial with a small number of cases where the professional knows the child well and can evaluate AI output accurately. Evaluate whether the time saving is real and whether the output quality meets the specificity standards required by the SEND Code of Practice.

Phase three is adoption of AI-assisted processes in areas where phase two has demonstrated clear benefit and safe practice. At this stage, schools should document their AI use explicitly, maintain the human oversight requirement on all AI-generated content, and review the tool annually against updated DfE guidance and ICO decisions on AI and data protection.

AI and the SENCO's Professional Expertise

A question that SENCO networks are beginning to raise is whether routine use of AI drafting tools will, over time, erode the professional skills that make a SENCO effective. This is not a hypothetical concern. It is analogous to the documented effects of GPS on spatial navigation: when a tool reliably performs a cognitive task, the underlying skill may not be maintained.

The SENCO role requires the ability to construct provision records that are specific, quantified, and individualised. That skill is developed through practice: writing the same type of document repeatedly, receiving feedback from reviews and tribunal decisions, and refining judgement over time. If a SENCO routinely edits AI output rather than drafting from professional knowledge of the child, the analytical process that makes plans legally adequate and practically useful is shortened. This matters most for newly qualified SENCOs. The National Award for SEN Coordination, the mandatory qualification for SENCOs in maintained schools in England, includes needs analysis, provision design, and plan writing as core competencies. Schools should consider whether their AI adoption decisions support or undermine the development of those competencies in newly appointed staff.

The executive function demands of SENCO work, particularly the planning, monitoring, and self-evaluation components, are precisely the cognitive functions that AI risks replacing rather than supporting. The SEND Network's 2025 position is worth quoting directly: "Rather than jumping on the AI bandwagon, we doubled down on the importance of a human-centred and expert-led approach to developing individualised plans." That is not a rejection of AI in principle. It is a commitment to maintaining the professional quality that makes SEND plans legally and practically effective.

What the Research Evidence Shows

The OECD working paper "Leveraging AI to support students with SEN" (2024) provides the most comprehensive international evidence review currently available. The paper finds that AI has demonstrated potential in three areas: personalised learning pathways for pupils with autism spectrum conditions, assistive technology integration including text-to-speech and AAC support, and early identification of learning difficulties through data pattern analysis. The evidence for AI in SEND administration specifically, as distinct from direct learning support, is much thinner.

The paper cautions that most AI applications in SEN have been developed in the United States, where the legislative framework differs significantly from England's SEND Code of Practice. The Individuals with Disabilities Education Act (IDEA), which governs US special education, has different requirements for plan content, timelines, and family rights than the Children and Families Act 2014. Direct transfer of US-developed tools to UK school contexts carries risk. The legal requirements for EHCP content, the graduated approach structure, and the co-production principle with families are specific to English legislation and are not reflected in tools trained primarily on US special education data.

Leicestershire County Council's published approach to automated processing in EHCNA decisions represents one of the few documented UK local authority implementations. Their experience highlights both the time efficiencies available and the governance challenges of maintaining professional accountability when AI is involved in decisions about children's legal entitlements. For SENCOs evaluating tools, asking vendors whether their product has been validated specifically against English SEND legislation, not just international SEN principles, is an essential due diligence step.

Connecting AI Tools to Wider School SEND Practice

AI tools in SEND administration do not exist in isolation. They sit within a broader framework of differentiation strategies and formative assessment practice that shapes how well any individual pupil's needs are identified and met. The most effective use of AI in SEND administration is one that connects to, rather than replaces, strong classroom practice.

A SENCO who uses AI to generate a provision mapping template, for example, still needs classroom teachers who can accurately identify which strategies are being implemented and with what effect. The working memory demands on a pupil with SEND are unlikely to be captured well in any AI-generated plan if the classroom teacher has not developed the observational skills to identify when those demands are causing difficulty. AI tools can speed up the documentation process, but they cannot substitute for the professional development of teachers in identifying and responding to SEND needs in real time.

This means that a school's investment in AI tools for SEND administration should be accompanied by investment in staff understanding of SEND. CPD on identifying and supporting special educational needs, understanding the profile of pupils with different conditions, and implementing effective provision in the classroom is what makes any SEND plan useful. Without that, even a well-drafted AI-assisted plan describes provision that does not happen effectively in practice.

Further Reading: Key Sources on AI and SEND Administration

The following policy documents and research papers provide the evidence base for this guide.

Generative AI in Education Settings (DfE, June 2025)

The Department for Education's non-statutory guidance on AI use in schools. Covers transparency requirements, data protection obligations, and the human oversight mandate that applies to all AI-generated content in educational settings.

National SENCO Workforce Survey (Bath Spa University / nasen)

The primary source for SENCO workload data cited throughout this guide. Documents the 74% administrative burden figure, time allocation problems across primary and secondary phases, and the impact on direct pupil support.

The risks and benefits of using AI to power EHCPs (Special Needs Jungle, September 2025)

Investigative journalism examining Agilisys EHCP Tool, Invision360 VITA, and Jadu. Raises substantive concerns about data handling, consent, and the quality gap between AI output and SEND Code of Practice specificity requirements.

Leveraging AI to support students with SEN (OECD CERI Working Paper, 2024)

The most comprehensive international evidence review on AI and SEN. Covers personalised learning, assistive technology, and early identification. Cautions against uncritical transfer of US-developed tools to UK legislative contexts where EHCP requirements differ substantially.

Guidance on AI and Data Protection (Information Commissioner's Office)

The ICO's regulatory guidance on AI and UK GDPR compliance. Essential reading for DPOs and SENCOs evaluating AI tools. Covers Article 22 on automated decision-making, Article 35 DPIA requirements, and data minimisation principles as they apply to school settings.

Before adopting any AI tool for SEND administration, complete the procurement checklist in this guide with your data protection officer and share it with any vendor you are evaluating. Start with the lowest-risk tasks, within your school's existing secure software environment, and build staff competency in evaluating AI output before moving to tools that process named pupil EHCP data.
