How to Write an IEP When the Student Has Made No Progress

Updated on  

April 11, 2026

---

Zero progress is not the worst outcome you can face in special education. Writing about it dishonestly is. When a student ends the year at the same performance level they started, the IEP team has two choices. They can hide this fact with vague language and hopeful goals, or treat the flat trendline as useful data and build a better plan from it.

The second approach is the only one that serves the student. It is also the only one that protects you legally.

This guide is written for the Sunday night before the Monday morning annual review. It covers the legal framework, the documentation language, the parent conversation, and the goal-writing adjustments that turn a difficult meeting into a productive one.

Key Takeaways

  1. Zero progress provides invaluable diagnostic data for optimising future educational programmes. Rather than concealing a flat trendline, practitioners should analyse it as critical information about the underlying barriers to learning. This data-driven approach, central to frameworks like Data-Based Individualisation, allows for precise adjustments to interventions and instructional strategies (Fuchs & Fuchs, 2006).
  2. Transparent and honest reporting of a learner's lack of progress is a fundamental legal and ethical imperative. Dishonest documentation not only undermines trust but also exposes educators to legal vulnerability, contravening the spirit and letter of special education legislation requiring accurate learner performance data. Prioritising factual reporting ensures accountability and forms the basis for effective, legally compliant educational planning (Gersten, Baker, & Chard, 2000).
  3. A plateau in progress necessitates a rigorous re-evaluation of the learner's goals, interventions, and instructional delivery. Effective special education demands an iterative process where stalled progress triggers a deep dive into *what* went wrong, rather than simply repeating ineffective strategies. This involves scrutinising goal specificity, intervention fidelity, and the appropriateness of the curriculum, aligning with principles of intensive intervention (Vaughn & Fletcher, 2012).
  4. Engaging parents as genuine partners in addressing a learner's lack of progress is crucial for developing robust solutions. Open, honest, and empathetic communication about a learner's plateau fosters a collaborative environment, leveraging parental insights and ensuring shared understanding of the challenges and proposed adjustments. This partnership approach is vital for optimising home-school support and intervention effectiveness (Epstein, 2001).

Why Students Plateau: A Diagnostic Framework

Before you write a single word of the revised IEP, you need to understand why the trendline is flat. There is almost always a reason. Attributing zero progress to "the severity of the disability" without investigation is both professionally weak and legally risky.

Burns (2004) identified instructional match as one of the strongest predictors of student response to intervention. When an intervention is pitched above or below a student's instructional level, progress stalls regardless of how skilled the teacher is or how frequently the sessions occur. A student whose reading fluency goal targets 80 words per minute when their current independent level is 30 words per minute is not failing. The goal was set at an unreachable distance.

Consider each of the following as a genuine hypothesis before the annual review:

Intervention fit. Was the intervention pitched within the learner's zone of proximal development (Vygotsky, 1978)? Check the curriculum-based measurement data: if progress monitoring scores show no increase, question whether the intervention's difficulty level matched the learner (Deno, 2003).

Intervention intensity. The National Center on Intensive Intervention (NCII, 2013) defines intensive intervention by the frequency, duration, and explicitness of instruction. A student receiving Tier 2 intervention twice a week for 20 minutes is not receiving intensive support. If the student's need is intensive and the dosage was not, the programming was mismatched to the need.

Implementation fidelity. Was the intervention delivered as planned? Changes, cancellations, and sequence deviations all hinder progress, and fidelity data is notoriously hard to gather consistently (Guskey, 2000). Note missed sessions, staff changes, and schedule problems (O'Donnell, 2008; Dane & Schneider, 1998).

Co-occurring needs. Learners with one identified disability often have other, unrecognised needs. A learner with reading difficulties might also have executive function deficits; a learner with ADHD may have undiagnosed anxiety that suppresses progress. A flat trendline can signal a more complex disability profile than the original evaluation captured.

Attendance. A learner who misses 30 days of school has lost a substantial share of the intervention year and is unlikely to meet annual targets regardless of programme quality. Record attendance separately so that the trendline can be interpreted accurately.

Goal construction. Some annual goals cannot be achieved from the moment they are written. This happens because the baseline was overestimated, the expected growth rate was too high for this disability, or the goal mixed several different subskills that should have been taught in sequence.

Take 20 minutes before the meeting to run through this list with your progress monitoring data in front of you. Your hypothesis about the primary cause will shape every other section of the revised IEP.

What IDEA Actually Requires: The Legal Framework

Many teachers approach an IEP annual review following zero progress with the assumption that they have done something wrong. In most cases, they have not.

As Yell (2019) explains, IDEA guarantees access to an appropriate education, not a particular rate of progress. The law mandates a free appropriate public education (FAPE): access to the general curriculum plus specially designed instruction tailored to each learner's specific needs.

The Supreme Court's ruling in Endrew F. v. Douglas County School District (2017) clarified the standard. The Court held that a child's IEP must be 'reasonably calculated to enable a child to make progress appropriate in light of the child's circumstances,' a standard sometimes summarised as 'appropriately ambitious.' It replaced the earlier, lower standard, which some courts had interpreted as requiring only minimal progress, with something more demanding. It remains, however, a standard of reasonable calculation, not a guarantee of outcomes.

Yell (2019) identifies the key legal protection for teachers. If the IEP was implemented properly and was reasonably designed to produce progress based on available evidence, the school district has met its legal duty. This applies even if the student did not actually progress. The obligation is in the design and the delivery, not solely in the outcome.

What creates legal vulnerability is not a flat trendline. It is:

- Writing goals that were never going to be achievable (setting the student up to fail on paper)

- Failing to collect and document progress monitoring data throughout the year

- Ignoring a flat trendline mid-year without any documented response (no IEP amendment, no team meeting, no change in approach)

- Describing present levels in ways that misrepresent the student's actual performance

Complete progress monitoring data demonstrates that the intervention was delivered. Document the plateau honestly and adjust the plan: this is exactly the response IDEA expects, and the annual review is the mechanism built for it.

You can also strengthen your position by referencing MTSS and RTI frameworks in your documentation. A student who has moved through Tier 1, Tier 2, and now requires Tier 3 intensive intervention, with documented data at each stage, has a well-evidenced progression that supports the current IEP decisions.

Writing the PLAAFP When the Data Shows a Flat Trendline

The Present Level of Academic Achievement and Functional Performance is the section where most educators make their first mistake when progress has been zero. The instinct is to soften the language, add qualifying phrases, and avoid stating the flat trendline directly.

This instinct is understandable. It is also counterproductive.

A PLAAFP that obscures the reality of zero progress creates three problems. It gives parents inaccurate information about their child's performance. It makes it harder to justify the instructional changes you are about to propose. And it reduces the document's credibility as a legal record.

Write the PLAAFP using the following template as a starting frame, then adapt to your specific data:

"At the beginning of the [academic year], [student name] demonstrated [specific skill] at [precise baseline measure] as measured by [assessment tool or method]. The annual goal targeted [specific growth] by [date]. Progress monitoring data was collected [frequency, e.g., every two weeks] using [tool]. This showed that [student name] performed at [current level] at the end of the year. This represents [X months/negligible/no measurable] growth over the intervention period. This rate of progress indicates that the current instructional approach requires modification. Contributing factors identified by the team include [list from your diagnostic analysis above]."

Notice what this template does. It is specific (it names the measure, the baseline, the target, and the outcome). It is honest (it names the gap between target and reality). It is analytical rather than defensive (it moves immediately to contributing factors, treating the data as information rather than accusation). And it is forward-pointing (the final phrase sets up the revised plan).

Here is a worked example:

"At the beginning of the 2024-25 academic year, Marcus demonstrated oral reading fluency at 42 words correct per minute (WCPM) on Grade 3 passages as measured by AIMSweb Plus. The annual goal targeted 70 WCPM by May. Progress monitoring data collected biweekly across 18 data points indicated that Marcus performed at 47 WCPM at the close of the year. Growth of 5 WCPM over 36 weeks falls significantly below the 28 WCPM gain targeted. The team identified three contributing factors. These include 19 absences in the second semester and a personnel change in January that disrupted intervention consistency for six weeks. New assessment data also suggests working memory deficits that may require instructional accommodation."

This paragraph is uncomfortable to write. It will also protect you, inform the parents accurately, and build a coherent justification for the changes ahead.
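Teams that keep progress data in a spreadsheet can draft the factual core of this statement programmatically before editing it by hand. The sketch below is illustrative only: every function name, field, and value is hypothetical, and the team must still review, complete, and soften nothing in the final wording.

```python
# Illustrative sketch: drafting the factual core of a PLAAFP statement
# from progress data. All names and values here are hypothetical.

def draft_plaafp(student, skill, baseline, target, final, tool, year, factors):
    """Assemble the data-bearing sentences of a PLAAFP paragraph."""
    growth = final - baseline
    return (
        f"At the beginning of the {year} academic year, {student} demonstrated "
        f"{skill} at {baseline} as measured by {tool}. The annual goal targeted "
        f"{target}. End-of-year data shows performance at {final}, a gain of "
        f"{growth} over the intervention period. Contributing factors identified "
        f"by the team include: {', '.join(factors)}."
    )

statement = draft_plaafp(
    student="Marcus", skill="oral reading fluency (WCPM)",
    baseline=42, target=70, final=47, tool="AIMSweb Plus", year="2024-25",
    factors=["19 second-semester absences", "a January staffing change"],
)
print(statement)
```

The value of the exercise is that the gap (here, a gain of 5 against a target of 28) is computed from the data rather than characterised by adjectives.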

Analysing What Went Wrong: Data-Based Individualisation

Data-based individualisation (DBI) is the systematic process of using progress monitoring data to identify why an intervention is not working and to make targeted adjustments. Fuchs and Fuchs (2007) developed DBI to address a common gap in special education practice: delivering an intervention with no clear plan for what to do if it does not work.

DBI asks four questions in sequence:

1. Is the student responding to the intervention? Look at the slope of the progress monitoring data. A flat or declining slope across at least six to eight data points signals non-response. A slope that started upward and then plateaued signals a different problem (the student may have reached a performance ceiling within that specific task type).

2. Was the intervention implemented as intended? Check your fidelity records, session logs, and any notes on personnel changes or schedule disruptions. If you cannot answer this question with documentation, that itself is a finding. Implementing an intervention without fidelity data is one of the most common gaps in special education practice.

3. Was the goal realistic? Deno's (1985) work established typical growth rates for learners in intensive intervention. Check whether the goal matched these benchmarks. If the target was set above what learners with similar profiles typically achieve, change the goal, not the learner.

4. What adjustment is indicated? Fuchs and Fuchs (2007) distinguish between instructional modifications (adjusting intensity, frequency, or pacing) and programme changes (replacing the method entirely). Use the data to decide which level of change is needed.

Run through this sequence with your data before the annual review meeting. Document your answers. This process is both good practice and a clear demonstration to parents that the team has analysed the plateau rigorously rather than simply writing new goals and hoping for different results.
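The step 1 non-response check is, at bottom, a slope calculation over the progress monitoring scores. A minimal sketch follows, using a simple least-squares fit and the six-data-point minimum from the text above; the growth criterion (`min_slope`) is a hypothetical value that a team would replace with published growth norms for its measure.

```python
# Least-squares slope over equally spaced progress monitoring scores.
# A flat or negative slope across >= 6 points is treated as non-response.

def slope(scores):
    """Per-probe growth rate from an ordinary least-squares fit."""
    n = len(scores)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def non_response(scores, min_points=6, min_slope=0.25):
    # min_slope is a hypothetical growth criterion per probe; set it
    # from growth norms for the specific measure in use.
    return len(scores) >= min_points and slope(scores) < min_slope

flat = [42, 43, 41, 42, 43, 42, 42, 43]    # essentially no growth
rising = [42, 44, 46, 49, 51, 54, 56, 59]  # clear upward trend
print(non_response(flat), non_response(rising))  # True False
```

A slope near zero across the whole year, as in the `flat` series, is the quantitative signature of the plateau this guide is about.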

Formative assessment tools can be embedded throughout the year to catch a plateau before it spans the full annual cycle. Your school needs a decision rule for responding to lack of progress. For example: "if a student fails to show enough growth after six consecutive data points below the aimline, hold a team meeting." Having this rule prevents the Sunday-night situation you are now in.
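A decision rule like the example above can be written as a small, testable check rather than left as a sentence in a handbook. A sketch, assuming six consecutive points below the aimline as the trigger; the aimline values themselves would come from the goal calculation.

```python
# Check whether a run of consecutive scores below the aimline has
# reached the threshold that triggers a team meeting.

def trigger_meeting(scores, aimline, run_length=6):
    """True once `run_length` consecutive scores fall below the aimline."""
    run = 0
    for score, aim in zip(scores, aimline):
        run = run + 1 if score < aim else 0
        if run >= run_length:
            return True
    return False

scores  = [42, 43, 42, 44, 43, 44, 43, 44]
aimline = [42, 44, 46, 48, 50, 52, 54, 56]  # expected trajectory to the goal
print(trigger_meeting(scores, aimline))  # True
```

Applying the rule at each new data point is what converts the Sunday-night discovery into a mid-year team meeting.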

Rewriting Goals for Next Year

After running the DBI analysis, you are ready to write revised goals. The core principle is simple: next year's goals must be different from this year's goals in some substantive way. Writing identical goals for a student who made no progress is not just pedagogically unsound. It is legally indefensible under the Endrew F. standard.

The differences can be in any of the following dimensions:

Baseline accuracy. Use this year's end-of-year data, not the original baseline, as next year's starting point. This sounds obvious, but it is frequently done incorrectly. If Marcus ended the year at 47 WCPM, his IEP should not start from a baseline of 42 WCPM simply because that was the original entry point.

Growth-rate calibration. Endrew F. (2017) requires goals that are ambitious yet achievable. Research-based growth norms help teams set targets: comparing the learner's expected growth against NCII benchmark data for similar learners keeps the goal calibrated.

Subskill sequencing. If a large goal (increase oral reading fluency to 70 WCPM) produced no progress, consider whether the goal needs to be broken into component subskills. Phonemic awareness, decoding accuracy, sight word recognition, and prosody are all separable skills that contribute to fluency. A student who did not improve fluency may have made growth in one of these subskills that is invisible in the fluency measure. Identify the foundational skill that needs to be secured before the composite skill can grow.

Instructional approach. If the analysis indicates that the current intervention is not the right match for the student, the goal needs to reflect a change in methodology. Document the new approach clearly: not just "specialised reading instruction" but the specific programme name, the instructional principles it is based on, and why it is a better match for the student's profile.

Frequency and duration. If the student was receiving 30 minutes per day and made no progress, this is not automatically an argument for more of the same. It may be an argument for a different intensity pattern (shorter, more frequent sessions rather than longer, less frequent ones), or it may signal that the frequency was actually insufficient. The NCII recommends a minimum of 30 minutes per day, five days per week, for intensive intervention (NCII, 2013). Check whether the actual delivery matched this.

The scaffolding literature is relevant here too: Vygotsky's zone of proximal development applies directly to goal-setting. A goal that requires substantial support must have that scaffolding built explicitly into the plan, not just a target date.

Finally, review differentiation strategies, particularly where the learner is struggling with curriculum access rather than a specific skill; Tomlinson (2014) and Hall, Strangman and Meyer (2003) offer helpful guidance here.
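The dosage question raised under frequency and duration can be made concrete. The sketch below assumes the NCII intensive intervention minimum of 30 minutes per day, five days per week (150 weekly minutes), with hypothetical delivered values; a team would substitute its own schedule data.

```python
# Compare delivered intervention minutes against the NCII intensive
# intervention minimum of 30 minutes/day, 5 days/week (NCII, 2013).

NCII_WEEKLY_MINIMUM = 30 * 5  # 150 minutes per week

def dosage_gap(sessions_per_week, minutes_per_session):
    """Return the weekly shortfall (positive) or surplus (negative)
    relative to the intensive intervention benchmark."""
    delivered = sessions_per_week * minutes_per_session
    return NCII_WEEKLY_MINIMUM - delivered

# A Tier 2 schedule of two 20-minute sessions per week:
print(dosage_gap(2, 20))  # 110: a 110-minute weekly shortfall
```

A positive gap of this size is evidence for the PLAAFP that the programming was mismatched to an intensive need, not that the learner failed to respond.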

What to Say vs What Not to Say in the IEP Meeting

The language you use in the annual review meeting matters as much as the documents in front of you. The following table gives specific guidance on phrasing for the most common difficult moments.

| Situation | Do NOT say | DO say |
| --- | --- | --- |
| Opening the meeting | "I know the data isn't great, but..." | "I want to start by sharing exactly what the progress data shows, and then we will look at what it tells us about next year's plan." |
| Presenting the flat trendline | "He didn't really make the progress we hoped." | "The data shows Marcus ended the year at 47 WCPM. The goal was 70. That is a 23-point gap we need to understand and plan for." |
| Explaining why progress stalled | "It was a hard year for everyone." | "We have identified three contributing factors: attendance in semester two, a personnel change in January, and new assessment information about working memory. Here is how the revised plan addresses each one." |
| Parent asks "Why didn't he learn to read?" | "We tried our best." | "That is exactly the right question. Let me show you what the data tells us about where the instruction needs to change." |
| Parent expresses frustration or anger | "I understand your concerns, but..." | "That frustration makes complete sense. [Pause.] Let me make sure you have all the information, and then let's talk about what changes specifically." |
| Parent asks about their legal rights | "You can request an IEE if you want, but..." | "Absolutely. You have the right to request an Independent Educational Evaluation at district expense. I can give you that information in writing today if you would like." |
| Explaining the new plan | "We're going to try some new things." | "The revised plan changes three things: the reading programme, the session frequency, and the way we will monitor and respond to data this year. Here is each one." |
| Closing the meeting | "Hopefully next year will be better." | "The goal for next year is [specific target]. We will review data every six weeks and meet if the growth rate falls below [decision rule]. You will receive a progress report in [month]." |

Communication research consistently finds that specific data, direct acknowledgement of learning difficulties, and forward-planning language build parent trust, while vague reassurance damages it.

Having the Conversation With Parents

Childre and Chambers (2005) found that parents often feel excluded from IEP meetings: professionals use language that is hard to follow, and parents leave unsure of next steps. A zero-progress meeting amplifies every one of these problems.

Structure your parent communication in this sequence:

Step one: share the data first, without framing. Place the progress monitoring graph in front of the parent and describe what it shows: "This line shows where [student] started in September. This line shows where we aimed to be by May. This line shows where [student] actually is." Let the parent process the visual before you add interpretation.

Step two: acknowledge the emotional reality before you explain. Parents who have a child with a learning disability have frequently spent years in meetings where professionals explain before they listen. If a parent looks distressed, stop and say: "Before we go any further, I want to hear what this is like for you." This is not a delay. It is what makes the rest of the meeting productive.

Step three: share your analysis, not your excuses. There is a clear difference between explaining contributing factors with analysis and making excuses defensively. For example: "our data shows that the 19 absences in semester two account for approximately eight weeks of missed intervention" versus "well, he missed a lot of school". Use the DBI framework from the analysis section as your structure. Parents can hear difficult information when it is framed as investigation.

Step four: present the revised plan in specific terms. Not "we will try a new approach" but "we are recommending a switch from [Programme A] to [Programme B]. This is because [Programme B] addresses decoding at the phoneme level, which the new assessment data identifies as the primary gap." Specificity communicates competence.

Step five: establish the monitoring promise. Tell the parent exactly how often they will receive progress data, what the decision rule is for convening a mid-year meeting, and how they can contact you if they have concerns between reviews. This last element is the most underused trust-building tool available to IEP teams.

The self-regulation research is relevant to the student's perspective here too. If the student is old enough to participate meaningfully in the IEP meeting, their voice about what has and has not been helpful is both legally appropriate and practically valuable: students who participate in their own IEP meetings demonstrate better self-advocacy and greater investment in their goals (Martin et al., 2006).

When to Consider Reevaluation

Sometimes zero progress is not a signal to adjust the programme. It is a signal to reexamine the underlying evaluation.

IDEA mandates reevaluation at least every three years for learners with disabilities, but it permits an earlier reevaluation whenever conditions warrant one. A prolonged lack of progress is one of the strongest signals that they do.

Consider requesting a reevaluation when:

The disability category might be wrong. A learner who is not responding to reading intervention may have an unidentified processing, intellectual, or hearing issue that the initial evaluation missed.

The learner has changed considerably. Executive function difficulties, mental health needs, and trauma responses often emerge over time. A learner who was assessed at seven may present very differently at twelve, and the original testing may no longer reflect their profile.

Progress monitoring and classroom observations don't match. A learner may perform well in structured tasks yet fail to generalise the skill. The evaluation may need to examine processing speed, working memory, or executive function (Lyon et al., 2007).

The team suspects a different primary disability category. For example, take a student identified for autism spectrum disorder. If their main barrier to progress is actually a specific learning disability in maths, they may need a reevaluation. This would refocus the IEP around the right primary need.

A reevaluation does not invalidate the existing IEP. It provides better data for designing the next one. Frame it to parents as exactly that: "We want to make sure we have the most accurate picture of [student's] profile so that next year's plan is as precise as possible."

The 504 plan vs IEP distinction is also worth revisiting here. Occasionally, a student's needs are better served by a 504 plan with specific accommodations than by an IEP with specially designed instruction. A reevaluation is the appropriate mechanism for determining this.

Preventing Future Plateaus: Structural Changes to Make Now

The best protection against the Sunday-night crisis of next year is building a structure into this year's IEP that makes a year-long plateau impossible to miss. The following practices are all evidence-based and implementable within most school contexts.

Set a decision rule in the IEP itself. Batsche (2014) recommends writing the decision rule directly into the IEP document: 'If [student] fails to show adequate growth as defined by [specific criterion] across [number] consecutive data points, the IEP team will meet within [timeframe] to review and modify the plan.' This changes passive hope into active monitoring commitment.

Collect progress monitoring data at minimum every two weeks. Deno (1985) established that curriculum-based measurement is most predictive of outcomes when collected frequently enough to generate a reliable slope. A single data point per month does not produce a usable trendline for six months. Eight data points collected every two weeks produces a usable trendline within three months. You would have caught and responded to this plateau before the annual review.

Use a visual graph, not a table of numbers. Progress monitoring data displayed as a graph with a goal line and an aimline is far more interpretable to teachers, parents, and administrators than a column of numbers. Most progress monitoring tools generate these automatically. If yours does not, a simple graph in Google Sheets takes less than five minutes to produce and is worth every one of those minutes.
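For teams whose tools do not generate the graph automatically, a few lines of matplotlib produce one. The sketch below uses hypothetical biweekly scores; the aimline runs from the baseline to the annual target, exactly the three lines described in the parent conversation section.

```python
# Plot progress monitoring scores with an aimline and a goal line,
# then save a PNG for the meeting. All values are hypothetical.
import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

weeks  = list(range(0, 36, 2))                 # biweekly probes, 18 points
scores = [42, 43, 41, 42, 44, 43, 42, 44, 43,
          45, 44, 46, 45, 46, 47, 46, 47, 47]  # flat-ish trend
baseline, goal = 42, 70

fig, ax = plt.subplots()
ax.plot(weeks, scores, marker="o", label="Observed WCPM")
ax.plot([0, 36], [baseline, goal], linestyle="--", label="Aimline")
ax.axhline(goal, color="grey", linewidth=0.8, label="Annual goal (70 WCPM)")
ax.set_xlabel("Week")
ax.set_ylabel("Words correct per minute")
ax.set_title("Oral reading fluency: progress vs aimline")
ax.legend()
fig.savefig("progress_graph.png")
```

The widening vertical distance between the observed line and the aimline is the plateau made visible, which is precisely what the annual review conversation needs in front of it.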

Review data as a team, not in isolation. At minimum, a brief team check-in every six weeks on the progress monitoring data for students receiving intensive intervention is good practice. This does not need to be a formal IEP meeting. A 15-minute data review with the special education teacher, the classroom teacher, and any relevant specialist is sufficient. The purpose is to catch a flat trendline before it has been flat for six months.

Build mid-year IEP amendment into your calendar. If a student's data shows no progress by the six-week review, you have both the right and the obligation to reconvene the IEP team and amend the plan. Mid-year amendments are permitted under IDEA. They are far preferable to arriving at an annual review with a year of flat data and no documented response.

Cognitive load theory has practical implications for goal design as well. Goals that require students to manage too many demands at once may fail not because the student lacks the underlying skill, but because working memory resources are exhausted before the skill can be practised to fluency. Chunking goals into smaller subskills, one at a time, reflects what the cognitive load literature tells us about skill acquisition.

A growth mindset framing is also worth considering. Teams that interpret a flat trendline as "this student can't learn" will respond very differently from teams that read it as "this student has not yet responded to this approach." How we frame the data shapes the quality of our decision-making.

When ADHD Accommodations Need to Come First

Learners with ADHD present a particular diagnostic challenge at a zero-progress annual review. Attention and self-regulation barriers can mask skills the learner has actually acquired: a learner may have learned to decode but be unable to demonstrate it under test conditions (Barkley, 1997; Brown, 2006; Diamond, 2013).

Before concluding that the instruction failed, check whether ADHD accommodations were in place. Reduced distractions, extra time, preferential seating, and task chunking can all change whether a learner can demonstrate what they know (Barkley, 2014; Zentall, 1993).

If these accommodations were not in place, that is a contributing factor for the PLAAFP. If they were in place but not consistently implemented, that is a fidelity issue with the existing plan.

The Single Most Important Action You Can Take Today

Before the annual review, organise the learner's progress data visually: plot the data points over time with a clear baseline and goal line. This single action does more than any other to set up the meeting (Codding, 2007).

Walk into the meeting with that graph. Refer to it. Let it be the centre of the conversation.

Data does not accuse anyone. It describes a situation and points toward what needs to change. A teacher who leads with data, names the contributing factors honestly, and presents a revised plan grounded in that analysis is doing their job with integrity. That is the full legal and professional standard, and it is achievable even when the news is difficult.

The student in front of you did not fail. The current programme did not produce the expected results. Those are two very different statements, and the first step toward a better outcome is being clear about which one is true.

Written by the Structural Learning Research Team

Reviewed by Paul Main, Founder & Educational Consultant at Structural Learning

Frequently Asked Questions


What does a flat trendline mean in an IEP review?

A flat trendline indicates that a student has ended the academic year at the exact same performance level they started. It serves as essential diagnostic data to evaluate whether the current intervention is appropriate in intensity or difficulty. Teachers must document this clearly in the present levels section rather than hiding the lack of progress.

How do teachers write a new IEP goal when previous goals were not met?

Teachers must first determine why the student did not progress before writing a completely new goal. They should adjust the intervention intensity, frequency, or instructional match based on recent progress monitoring data. The revised goal must reflect realistic expectations grounded in the student's current baseline rather than their age or year group level.

What are the benefits of using a plateau diagnostic framework?

A diagnostic framework helps teams identify why an intervention failed rather than defaulting to blame. It focuses attention on implementation and instructional fit, lets schools adjust support quickly, and protects staff by documenting a reasoned, legally defensible response to slow progress.

What does the research say about students failing to make progress on IEP goals?

Research identifies instructional match as one of the strongest predictors of how learners respond to support (Burns, 2004): when tasks are pitched too far above a learner's level, progress stops. Under the Endrew F. standard, schools are expected to adjust instruction when learners do not grow.

What are common mistakes when documenting zero progress in an annual review?

The most common mistake is using vague language to obscure the fact that the student did not improve. Teachers also frequently attribute the lack of growth solely to the severity of the disability without examining the intervention itself. Copying and pasting the exact same goals for a second year without changing the support structure is legally risky and unhelpful for the learner.

How should teachers explain a lack of IEP progress to parents?

Use objective data, not opinions, to guide the conversation. Acknowledge parent concerns before explaining the plateau (Epstein, 2011). Then move the discussion quickly to a practical, revised plan that targets the identified barriers.

Further Reading

Key Research Papers on IEP Progress and Data-Based Individualisation

Fuchs, L. S., & Fuchs, D. (2007). A model for implementing responsiveness to intervention. Teaching Exceptional Children, 39(5), 14-20.
Fuchs and Fuchs articulate the DBI framework that underpins responsible non-response decision-making. This paper remains the clearest practical guide to what teams should do when an intervention is not working, and why mid-year amendment is the correct response rather than continued implementation without change.
View study

Deno, S. L. (1985). Curriculum-based measurement: The emerging alternative. Exceptional Children, 52(3), 219-232.
Deno's foundational paper established curriculum-based measurement as a practical way to monitor learner progress. His case for frequent, brief, standardised measurement, rather than occasional longer tests, underpins the progress monitoring practices in this guide.

Burns, M. K., & Gibbons, K. (2008). Implementing response-to-intervention in elementary and secondary schools. Routledge.
Burns and Gibbons link effective instruction to learner progress rates. Their chapter on diagnostic assessment helps teams review annual goals and identify whether the goal or the teaching approach caused the mismatch.

Yell, M. L., Katsiyannis, A., & Losinski, M. (2017). Endrew F. v. Douglas County School District: Implications for educators. TEACHING Exceptional Children, 50(1), 7-15.
This paper provides the clearest analysis of the legal implications of the Endrew F. standard for classroom practitioners. Yell et al. explain precisely what "appropriately ambitious" means in practice, the evidentiary record schools need to maintain, and the difference between a FAPE violation and an outcome that was disappointing but legally sound.
View study

Childre, A., & Chambers, C. R. (2005). Family perceptions of student centred planning and IEP meetings. Education and Training in Developmental Disabilities, 40(3), 217-233.
Childre and Chambers document the significant gap between how IEP teams perceive meetings and how families experience them. Their findings on communication, accessible language, and the role of parent voice in goal-setting are directly relevant to conducting an annual review following zero progress, where the communication challenge is at its most acute.
View study

---

References

- Batsche, G. (2014). Multi-tiered system of supports for inclusive schools. In J. McLeskey, N. L. Waldron, F. Spooner, & B. Algozzine (Eds.), Handbook of effective inclusive schools. Routledge.

- Burns, M. K. (2004). Empirical analysis of drill ratio research: Refining the instructional level for drill tasks. Remedial and Special Education, 25(3), 167-173.

- Childre, A., & Chambers, C. R. (2005). Family perceptions of student centred planning and IEP meetings. Education and Training in Developmental Disabilities, 40(3), 217-233.

- Deno, S. L. (1985). Curriculum-based measurement: The emerging alternative. Exceptional Children, 52(3), 219-232.

- Endrew F. v. Douglas County School District, 580 U.S. 386 (2017).

- Fuchs, L. S., & Fuchs, D. (2007). A model for implementing responsiveness to intervention. Teaching Exceptional Children, 39(5), 14-20.

- Martin, J. E., Van Dycke, J. L., Christensen, W. R., Greene, B. A., Gardner, J. E., & Lovett, D. L. (2006). Increasing student participation in IEP meetings: Establishing the self-directed IEP as an evidenced-based practice. Exceptional Children, 72(3), 299-316.

- National Center on Intensive Intervention. (2013). Data-based individualization: A framework for intensive intervention. Washington, DC: Office of Special Education Programs, U.S. Department of Education.

- Yell, M. L., Katsiyannis, A., & Losinski, M. (2017). Endrew F. v. Douglas County School District: Implications for educators. TEACHING Exceptional Children, 50(1), 7-15.


---

Zero progress is not the worst outcome you can face in special education. Writing about it dishonestly is. When a student ends the year at the same performance level they started, the IEP team has two choices. They can hide this fact with vague language and hopeful goals, or treat the flat trendline as useful data and build a better plan from it.

The second approach is the only one that serves the student. It is also the only one that protects you legally.

This guide is written for the Sunday night before the Monday morning annual review. It covers the legal framework, the documentation language, the parent conversation, and the goal-writing adjustments that turn a difficult meeting into a productive one.

Key Takeaways

  1. Zero progress provides invaluable diagnostic data for optimising future educational programmes. Rather than concealing a flat trendline, practitioners must analyse it as critical information to understand underlying barriers to a learner's learning. This data-driven approach, central to frameworks like Data-Based Individualisation, allows for precise adjustments to interventions and instructional strategies (Fuchs & Fuchs, 2006).
  2. Transparent and honest reporting of a learner's lack of progress is a fundamental legal and ethical imperative. Dishonest documentation not only undermines trust but also exposes educators to legal vulnerability, contravening the spirit and letter of special education legislation requiring accurate learner performance data. Prioritising factual reporting ensures accountability and forms the basis for effective, legally compliant educational planning (Gersten, Baker, & Chard, 2000).
  3. A plateau in progress necessitates a rigorous re-evaluation of the learner's goals, interventions, and instructional delivery. Effective special education demands an iterative process where stalled progress triggers a deep dive into *what* went wrong, rather than simply repeating ineffective strategies. This involves scrutinising goal specificity, intervention fidelity, and the appropriateness of the curriculum, aligning with principles of intensive intervention (Vaughn & Fletcher, 2012).
  4. Engaging parents as genuine partners in addressing a learner's lack of progress is crucial for developing robust solutions. Open, honest, and empathetic communication about a learner's plateau fosters a collaborative environment, leveraging parental insights and ensuring shared understanding of the challenges and proposed adjustments. This partnership approach is vital for optimising home-school support and intervention effectiveness (Epstein, 2001).

Why Students Plateau: A Diagnostic Framework

Before you write a single word of the revised IEP, you need to understand why the trendline is flat. There is almost always a reason. Attributing zero progress to "the severity of the disability" without investigation is both professionally weak and legally risky.

Burns (2004) identified instructional match as one of the strongest predictors of student response to intervention. When an intervention is pitched above or below a student's instructional level, progress stalls regardless of how skilled the teacher is or how frequently the sessions occur. A student whose reading fluency goal targets 80 words per minute when their current independent level is 30 words per minute is not failing. The goal was set at an unreachable distance.

Consider each of the following as a genuine hypothesis before the annual review:

Instructional match. Was the intervention pitched within the learner's zone of proximal development (Vygotsky, 1978)? Check the curriculum-based measurement data: if progress monitoring scores show no increase, question whether the intervention's difficulty level matched the learner's instructional level (Deno, 2003).

Intervention intensity. The National Center on Intensive Intervention (NCII, 2013) defines intensive intervention by the frequency, duration, and explicitness of instruction. A student receiving Tier 2 intervention twice a week for 20 minutes is not receiving intensive support. If the student's need is intensive and the dosage was not, the programming was mismatched to the need.

Implementation fidelity. Was the intervention delivered as planned? Staff changes, cancelled sessions, and deviations from the programme sequence all hinder learner progress. Fidelity data is hard to gather consistently (Guskey, 2000), so at a minimum record missed sessions, staff changes, and schedule disruptions (O'Donnell, 2008; Dane & Schneider, 1998).

Co-occurring conditions. Learners with one identified disability often have other, unrecognised needs. A learner with reading difficulties may also have executive function weaknesses, and a learner with ADHD may have undiagnosed anxiety that is impeding progress. A persistent lack of progress can signal a more complex disability profile than the original evaluation captured.

Attendance. A learner who misses 30 days of school through illness or other reasons has lost a substantial portion of the intervention dosage and is unlikely to meet annual targets. Record attendance separately so that the trendline can be interpreted accurately.

Goal construction. Some annual goals cannot be achieved from the moment they are written. This happens because the baseline was overestimated, the expected growth rate was too high for this disability, or the goal mixed several different subskills that should have been taught in sequence.

Take 20 minutes before the meeting to run through this list with your progress monitoring data in front of you. Your hypothesis about the primary cause will shape every other section of the revised IEP.

What IDEA Actually Requires: The Legal Framework

Many teachers approach an IEP annual review following zero progress with the assumption that they have done something wrong. In most cases, they have not.

As Yell (2019) explains, IDEA guarantees access, not outcomes. It mandates a free appropriate public education (FAPE): access to the general curriculum and instruction tailored to each learner's specific needs. It does not guarantee that every learner will make progress every year.

The Supreme Court's ruling in Endrew F. v. Douglas County School District (2017) clarified the standard. The Court said that a child's IEP must be 'reasonably calculated to enable a child to make progress appropriate in light of the child's circumstances.' This standard is sometimes called 'appropriately ambitious.' It replaced the earlier, lower standard with something more demanding. Some courts had interpreted the old standard as requiring only minimal progress. But it is still a standard of reasonable calculation, not a guarantee of outcomes.

Yell (2019) identifies the key legal protection for teachers. If the IEP was implemented properly and was reasonably designed to produce progress based on available evidence, the school district has met its legal duty. This applies even if the student did not actually progress. The obligation is in the design and the delivery, not solely in the outcome.

What creates legal vulnerability is not a flat trendline. It is:

- Writing goals that were never going to be achievable (setting the student up to fail on paper)

- Failing to collect and document progress monitoring data throughout the year

- Ignoring a flat trendline mid-year without any documented response (no IEP amendment, no team meeting, no change in approach)

- Describing present levels in ways that misrepresent the student's actual performance

A complete progress monitoring record that shows the intervention was delivered, documents the plateau honestly, and leads to an adjusted plan is exactly what IDEA expects. The annual review exists to handle these situations.

You can also strengthen your position by referencing MTSS and RTI frameworks in your documentation. A student who has moved through Tier 1, Tier 2, and now requires Tier 3 intensive intervention, with documented data at each stage, has a well-evidenced progression that supports the current IEP decisions.

Writing the PLAAFP When the Data Shows a Flat Trendline

The Present Level of Academic Achievement and Functional Performance is the section where most educators make their first mistake when progress has been zero. The instinct is to soften the language, add qualifying phrases, and avoid stating the flat trendline directly.

This instinct is understandable. It is also counterproductive.

A PLAAFP that obscures the reality of zero progress creates three problems. It gives parents inaccurate information about their child's performance. It makes it harder to justify the instructional changes you are about to propose. And it reduces the document's credibility as a legal record.

Write the PLAAFP using the following template as a starting frame, then adapt to your specific data:

"At the beginning of the [academic year], [student name] demonstrated [specific skill] at [precise baseline measure] as measured by [assessment tool or method]. The annual goal targeted [specific growth] by [date]. Progress monitoring data was collected [frequency, e.g., every two weeks] using [tool]. This showed that [student name] performed at [current level] at the end of the year. This represents [X months/negligible/no measurable] growth over the intervention period. This rate of progress indicates that the current instructional approach requires modification. Contributing factors identified by the team include [list from your diagnostic analysis above]."

Notice what this template does. It is specific (it names the measure, the baseline, the target, and the outcome). It is honest (it names the gap between target and reality). It is analytical rather than defensive (it moves immediately to contributing factors, treating the data as information rather than accusation). And it is forward-pointing (the final phrase sets up the revised plan).

Here is a worked example:

"At the beginning of the 2024-25 academic year, Marcus demonstrated oral reading fluency at 42 words correct per minute (WCPM) on Grade 3 passages as measured by AIMSweb Plus. The annual goal targeted 70 WCPM by May. Progress monitoring data collected biweekly across 18 data points indicated that Marcus performed at 47 WCPM at the close of the year. Growth of 5 WCPM over 36 weeks falls significantly below the 28 WCPM gain targeted. The team identified three contributing factors. These include 19 absences in the second semester and a personnel change in January that disrupted intervention consistency for six weeks. New assessment data also suggests working memory deficits that may require instructional accommodation."

This paragraph is uncomfortable to write. It will also protect you, inform the parents accurately, and build a coherent justification for the changes ahead.
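The arithmetic behind a paragraph like this is worth making explicit, because the gap between the actual and targeted growth rates is what justifies the revised plan. Here is a minimal sketch in Python using the figures from the Marcus example; the function name and structure are illustrative, not part of any IEP tool.

```python
def growth_summary(baseline: float, end_of_year: float, target: float, weeks: int) -> dict:
    """Compare actual growth against the growth the annual goal assumed."""
    actual_gain = end_of_year - baseline
    targeted_gain = target - baseline
    return {
        "actual_gain": actual_gain,
        "targeted_gain": targeted_gain,
        "actual_weekly_rate": round(actual_gain / weeks, 2),
        "targeted_weekly_rate": round(targeted_gain / weeks, 2),
        "shortfall": targeted_gain - actual_gain,
    }

# Marcus: 42 WCPM baseline, 47 at year end, 70 targeted, over 36 weeks
summary = growth_summary(baseline=42, end_of_year=47, target=70, weeks=36)
print(summary["actual_weekly_rate"], summary["targeted_weekly_rate"])  # 0.14 0.78
```

A goal assuming roughly 0.78 WCPM of growth per week for a student who has been gaining 0.14 is precisely the kind of mismatch the diagnostic analysis is meant to surface.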

Analysing What Went Wrong: Data-Based Individualisation

Data-based individualisation (DBI) is the systematic process of using progress monitoring data to identify why an intervention is not working and to make targeted adjustments. Fuchs and Fuchs (2007) developed DBI to address a common failure in special education practice: implementing an intervention with no clear plan for what to do if it does not work.

DBI asks four questions in sequence:

1. Is the student responding to the intervention? Look at the slope of the progress monitoring data. A flat or declining slope across at least six to eight data points signals non-response. A slope that started upward and then plateaued signals a different problem (the student may have reached a performance ceiling within that specific task type).

2. Was the intervention implemented as intended? Check your fidelity records, session logs, and any notes on personnel changes or schedule disruptions. If you cannot answer this question with documentation, that itself is a finding. Implementing an intervention without fidelity data is one of the most common gaps in special education practice.

3. Was the goal realistic? Deno's (1985) work established typical growth rates for learners receiving intensive intervention. Check whether the annual goal assumed a growth rate achievable for learners with similar needs. If it did not, change the goal, not the learner.

4. What should change? Fuchs and Fuchs (2007) distinguish between instructional modifications (adjusting intensity, frequency, or pacing) and programme changes (replacing the method entirely). Let the data from the first three questions determine which is needed.

Run through this sequence with your data before the annual review meeting. Document your answers. This process is both good practice and a clear demonstration to parents that the team has analysed the plateau rigorously rather than simply writing new goals and hoping for different results.
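For the first question, the slope can be computed directly from the monitoring scores. The following is a small, self-contained Python sketch; the threshold a team uses to call a slope "flat" is a local decision, not something this snippet decides for you.

```python
def trend_slope(scores: list[float]) -> float:
    """Ordinary least-squares slope of equally spaced progress monitoring scores."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

plateau = [42, 43, 41, 42, 43, 42, 44, 42]   # eight biweekly WCPM scores, no growth
steady  = [42, 44, 46, 48, 50, 52, 54, 56]   # clear response: +2 per data point
print(round(trend_slope(plateau), 2), trend_slope(steady))  # 0.11 2.0
```

A slope near zero across six to eight points is the non-response signal described above; a slope well short of the rate the aimline assumes tells the same story in milder form.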

Formative assessment tools can be embedded throughout the year to catch a plateau before it spans the full annual cycle. Your school needs a decision rule for responding to lack of progress. For example: "if a student fails to show enough growth after six consecutive data points below the aimline, hold a team meeting." Having this rule prevents the Sunday-night situation you are now in.
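A decision rule like the one quoted above is easy to automate once progress data lives in a spreadsheet export. Here is a hedged sketch: the six-point run length comes from the example in the text, and the aimline is whatever growth line the team has set, not a fixed standard.

```python
def needs_team_meeting(scores, aimline, run_length=6):
    """True once `run_length` consecutive scores fall below the aimline values."""
    run = 0
    for score, expected in zip(scores, aimline):
        run = run + 1 if score < expected else 0
        if run >= run_length:
            return True
    return False

# Aimline: 42 WCPM baseline growing toward 70 over 36 weeks, sampled biweekly
aimline = [42 + (70 - 42) / 36 * (2 * i) for i in range(9)]
flat_scores = [42, 43, 42, 44, 43, 42, 44, 43, 42]
print(needs_team_meeting(flat_scores, aimline))  # True
```

The point of encoding the rule is that it fires mid-year, automatically, rather than being noticed for the first time at the annual review.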

Rewriting Goals for Next Year

After running the DBI analysis, you are ready to write revised goals. The core principle is simple: next year's goals must be different from this year's goals in some substantive way. Writing identical goals for a student who made no progress is not just pedagogically unsound. It is legally indefensible under the Endrew F. standard.

The differences can be in any of the following dimensions:

Baseline accuracy. Use this year's end-of-year data, not the original baseline, as next year's starting point. This sounds obvious, but it is frequently done incorrectly. If Marcus ended the year at 47 WCPM, his IEP should not start from a baseline of 42 WCPM simply because that was the original entry point.

Growth-rate calibration. Endrew F. (2017) requires goals that are ambitious but achievable. Rather than guessing, set the target using research-based expected growth rates, comparing the learner to similar peers using NCII benchmark data.

Subskill sequencing. If a large goal (increase oral reading fluency to 70 WCPM) produced no progress, consider whether the goal needs to be broken into component subskills. Phonemic awareness, decoding accuracy, sight word recognition, and prosody are all separable skills that contribute to fluency. A student who did not improve fluency may have made growth in one of these subskills that is invisible in the fluency measure. Identify the foundational skill that needs to be secured before the composite skill can grow.

Instructional approach. If the analysis indicates that the current intervention is not the right match for the student, the goal needs to reflect a change in methodology. Document the new approach clearly: not just "specialised reading instruction" but the specific programme name, the instructional principles it is based on, and why it is a better match for the student's profile.

Frequency and duration. If the student was receiving 30 minutes per day and made no progress, this is not automatically an argument for more of the same. It may be an argument for a different intensity pattern (shorter, more frequent sessions rather than longer, less frequent ones), or it may signal that the frequency was actually insufficient. The NCII recommends a minimum of 30 minutes per day, five days per week, for intensive intervention (NCII, 2013). Check whether the actual delivery matched this.
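The dosage question lends itself to a quick check. Assuming the NCII minimum cited above (30 minutes per day, five days per week), a sketch:

```python
NCII_MIN_WEEKLY_MINUTES = 30 * 5  # 150 minutes/week for intensive intervention

def dosage_gap(sessions_per_week: int, minutes_per_session: int) -> int:
    """Weekly minutes short of the NCII intensive-intervention minimum."""
    delivered = sessions_per_week * minutes_per_session
    return max(0, NCII_MIN_WEEKLY_MINUTES - delivered)

# The twice-weekly, 20-minute Tier 2 schedule discussed earlier:
print(dosage_gap(2, 20))   # 110 minutes/week short of the minimum
print(dosage_gap(5, 30))   # 0: meets the minimum
```

A gap of 110 minutes per week is not a tweak; it is a different service model, and it belongs in the contributing-factors analysis.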

Scaffolding in education literature is relevant here too. Vygotsky's zone of proximal development principle applies directly to goal-setting. A goal that needs major support must have clear scaffolding built into the plan, not just a target date.

Differentiation. If the learner is struggling with access to the curriculum as a whole, not just with specific skills, review your differentiation strategies; Tomlinson (2014) and Hall, Strangman and Meyer (2003) offer helpful guidance.
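Putting the baseline and growth-rate points together: next year's target can be derived from the end-of-year baseline plus a research-based weekly growth rate multiplied by the instructional weeks. In the sketch below, the 1.0 WCPM-per-week figure is a placeholder assumption for illustration, to be replaced with an actual norm (e.g., from NCII growth tables), not a published value.

```python
def calibrated_target(new_baseline: float, weekly_growth: float, weeks: int = 36) -> float:
    """Annual target anchored to the current baseline, not last year's entry point."""
    return new_baseline + weekly_growth * weeks

# Marcus ended the year at 47 WCPM, so next year's goal starts from 47, not 42.
# weekly_growth=1.0 is a hypothetical rate used only for illustration.
print(calibrated_target(47, weekly_growth=1.0))  # 83.0
```

If the computed target looks implausible against what similar learners achieve, that is the signal to revisit the growth assumption before the goal is written, not after the next annual review.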

What to Say vs What Not to Say in the IEP Meeting

The language you use in the annual review meeting matters as much as the documents in front of you. The following table gives specific guidance on phrasing for the most common difficult moments.

| Situation | Do NOT say | DO say |
| --- | --- | --- |
| Opening the meeting | "I know the data isn't great, but..." | "I want to start by sharing exactly what the progress data shows, and then we will look at what it tells us about next year's plan." |
| Presenting the flat trendline | "He didn't really make the progress we hoped." | "The data shows Marcus ended the year at 47 WCPM. The goal was 70. That is a 23-point gap we need to understand and plan for." |
| Explaining why progress stalled | "It was a hard year for everyone." | "We have identified three contributing factors: attendance in semester two, a personnel change in January, and new assessment information about working memory. Here is how the revised plan addresses each one." |
| Parent asks "Why didn't he learn to read?" | "We tried our best." | "That is exactly the right question. Let me show you what the data tells us about where the instruction needs to change." |
| Parent expresses frustration or anger | "I understand your concerns, but..." | "That frustration makes complete sense. [Pause.] Let me make sure you have all the information, and then let's talk about what changes specifically." |
| Parent asks about their legal rights | "You can request an IEE if you want, but..." | "Absolutely. You have the right to request an Independent Educational Evaluation at district expense. I can give you that information in writing today if you would like." |
| Explaining the new plan | "We're going to try some new things." | "The revised plan changes three things: the reading programme, the session frequency, and the way we will monitor and respond to data this year. Here is each one." |
| Closing the meeting | "Hopefully next year will be better." | "The goal for next year is [specific target]. We will review data every six weeks and meet if the growth rate falls below [decision rule]. You will receive a progress report in [month]." |

The pattern across the right-hand column is consistent: specific data, direct acknowledgement of the difficulty, and forward-planning language build trust. Vague reassurance damages it.

Having the Conversation With Parents

Childre and Chambers (2005) found that parents routinely feel excluded from IEP meetings: professionals use language that is hard to follow, and parents leave unsure of the next steps. A zero-progress meeting magnifies every one of these problems.

Structure your parent communication in this sequence:

Step one: share the data first, without framing. Place the progress monitoring graph in front of the parent and describe what it shows: "This line shows where [student] started in September. This line shows where we aimed to be by May. This line shows where [student] actually is." Let the parent process the visual before you add interpretation.

Step two: acknowledge the emotional reality before you explain. Parents who have a child with a learning disability have frequently spent years in meetings where professionals explain before they listen. If a parent looks distressed, stop and say: "Before we go any further, I want to hear what this is like for you." This is not a delay. It is what makes the rest of the meeting productive.

Step three: share your analysis, not your excuses. There is a clear difference between explaining contributing factors with analysis and making excuses defensively. For example: "our data shows that the 19 absences in semester two account for approximately eight weeks of missed intervention" versus "well, he missed a lot of school". Use the DBI framework from the analysis section as your structure. Parents can hear difficult information when it is framed as investigation.

Step four: present the revised plan in specific terms. Not "we will try a new approach" but "we are recommending a switch from [Programme A] to [Programme B]. This is because [Programme B] addresses decoding at the phoneme level, which the new assessment data identifies as the primary gap." Specificity communicates competence.

Step five: establish the monitoring promise. Tell the parent exactly how often they will receive progress data, what the decision rule is for convening a mid-year meeting, and how they can contact you if they have concerns between reviews. This last element is the most underused trust-building tool available to IEP teams.

Self-regulation in the classroom research is relevant to the student's perspective here too. If the student is old enough to participate meaningfully in the IEP meeting, their voice about what has and has not been helpful is both legally appropriate and practically valuable. Students who participate in their own IEP meetings demonstrate better self-advocacy and greater investment in their goals (Martin et al., 2006).

When to Consider Reevaluation

Sometimes zero progress is not a signal to adjust the programme. It is a signal to reexamine the underlying evaluation.

IDEA mandates reevaluation at least every three years for learners with disabilities, but it also permits earlier reevaluation when circumstances warrant it. A prolonged lack of progress is exactly such a circumstance.

Consider requesting a reevaluation when:

The disability category might be wrong. A learner who is not responding to reading intervention may have an unidentified processing, intellectual, or hearing difficulty that the initial evaluation missed.

The learner has changed considerably since the original evaluation. Executive function difficulties, mental health needs, and the effects of trauma often emerge over time. A learner who was assessed thoroughly at seven may present very differently at twelve, and the original testing cannot reveal this.

Progress monitoring and classroom observations do not match. A learner may perform well within highly structured intervention tasks yet fail to generalise the skill to the classroom. A reevaluation can examine processing speed, working memory, or executive function (Lyon et al., 2007).

The team suspects a different primary disability category. For example, take a student identified for autism spectrum disorder. If their main barrier to progress is actually a specific learning disability in maths, they may need a reevaluation. This would refocus the IEP around the right primary need.

A reevaluation does not invalidate the existing IEP. It provides better data for designing the next one. Frame it to parents as exactly that: "We want to make sure we have the most accurate picture of [student's] profile so that next year's plan is as precise as possible."

The 504 plan vs IEP distinction is also worth revisiting here. Occasionally, a student's needs are better served by a 504 plan with specific accommodations than by an IEP with specially designed instruction. A reevaluation is the appropriate mechanism for determining this.

Preventing Future Plateaus: Structural Changes to Make Now

The best protection against the Sunday-night crisis of next year is building a structure into this year's IEP that makes a year-long plateau impossible to miss. The following practices are all evidence-based and implementable within most school contexts.

Set a decision rule in the IEP itself. Batsche (2014) recommends writing the decision rule directly into the IEP document: 'If [student] fails to show adequate growth as defined by [specific criterion] across [number] consecutive data points, the IEP team will meet within [timeframe] to review and modify the plan.' This changes passive hope into active monitoring commitment.

Collect progress monitoring data at minimum every two weeks. Deno (1985) established that curriculum-based measurement is most predictive of outcomes when collected frequently enough to generate a reliable slope. A single data point per month does not produce a usable trendline for six months. Eight data points collected every two weeks produces a usable trendline within three months. You would have caught and responded to this plateau before the annual review.

Use a visual graph, not a table of numbers. Progress monitoring data displayed as a graph with a goal line and an aimline is far more interpretable to teachers, parents, and administrators than a column of numbers. Most progress monitoring tools generate these automatically. If yours does not, a simple graph in Google Sheets takes less than five minutes to produce and is worth every one of those minutes.

Review data as a team, not in isolation. At minimum, a brief team check-in every six weeks on the progress monitoring data for students receiving intensive intervention is good practice. This does not need to be a formal IEP meeting. A 15-minute data review with the special education teacher, the classroom teacher, and any relevant specialist is sufficient. The purpose is to catch a flat trendline before it has been flat for six months.

Build mid-year IEP amendment into your calendar. If a student's data shows no progress by the six-week review, you have both the right and the obligation to reconvene the IEP team and amend the plan. Mid-year amendments are permitted under IDEA. They are far preferable to arriving at an annual review with a year of flat data and no documented response.

Cognitive load theory has practical implications for goal design as well. Goals that require students to manage too many demands at once may fail not because the student lacks the underlying skill, but because working memory resources are exhausted before the skill can be practised to fluency. Chunking goals into smaller subskills, secured one at a time, reflects what the cognitive load literature tells us about skill acquisition.

A growth mindset framework for teachers is also worth considering here. Teams that interpret a flat trendline as "this student can't learn" respond very differently from teams that read it as "this student has not yet responded to this approach." How we frame the data shapes the quality of our decision-making.

When ADHD Accommodations Need to Come First

Learners with ADHD present a particular diagnostic challenge at a zero-progress annual review. Attention and self-regulation barriers can mask skills the learner has actually acquired: a learner may have learned to decode but be unable to demonstrate the skill under test conditions (Barkley, 1997; Brown, 2006; Diamond, 2013).

Before concluding that the instruction failed, check whether the ADHD accommodations were adequate and in place. Reduced distractions, extended time, preferential seating, and task chunking can all change whether a learner is able to demonstrate what they know (Barkley, 2014; Zentall, 1993).

If these accommodations were not in place, that is a contributing factor for the PLAAFP. If they were in place but not consistently implemented, that is a fidelity issue with the existing plan.

The Single Most Important Action You Can Take Today

Before the annual review, organise the learner's progress data into a single visual: data points plotted over time, with the baseline and goal line clearly marked. This simple step is crucial (Codding, 2007).

Walk into the meeting with that graph. Refer to it. Let it be the centre of the conversation.

Data does not accuse anyone. It describes a situation and points toward what needs to change. A teacher who leads with data, names the contributing factors honestly, and presents a revised plan grounded in that analysis is doing their job with integrity. That is the full legal and professional standard, and it is achievable even when the news is difficult.

The student in front of you did not fail. The current programme did not produce the expected results. Those are two very different statements, and the first step toward a better outcome is being clear about which one is true.

Written by the Structural Learning Research Team

Reviewed by Paul Main, Founder & Educational Consultant at Structural Learning

Frequently Asked Questions


What does a flat trendline mean in an IEP review?

A flat trendline indicates that a student has ended the academic year at the exact same performance level they started. It serves as essential diagnostic data to evaluate whether the current intervention is appropriate in intensity or difficulty. Teachers must document this clearly in the present levels section rather than hiding the lack of progress.

How do teachers write a new IEP goal when previous goals were not met?

Teachers must first determine why the student did not progress before writing a completely new goal. They should adjust the intervention intensity, frequency, or instructional match based on recent progress monitoring data. The revised goal must reflect realistic expectations grounded in the student's current baseline rather than their age or year group level.

What are the benefits of using a plateau diagnostic framework?

A diagnostic framework helps teachers identify why an intervention failed. It reduces blame by focusing attention on implementation and instructional fit, helps the school adjust support quickly, and protects staff by documenting a considered, legally defensible response to slow progress.

What does the research say about students failing to make progress on IEP goals?

Research shows that instructional match is one of the strongest predictors of how learners respond to support (Burns, 2004): when tasks are pitched too far above the learner's level, progress stops. Under the Endrew F. standard, schools are expected to adjust their approach when a learner does not grow.

What are common mistakes when documenting zero progress in an annual review?

The most common mistake is using vague language to obscure the fact that the student did not improve. Teachers also frequently attribute the lack of growth solely to the severity of the disability without examining the intervention itself. Copying and pasting the exact same goals for a second year without changing the support structure is legally risky and unhelpful for the learner.

How should teachers explain a lack of IEP progress to parents?

Use objective data rather than opinion to guide the conversation. Acknowledge parent concerns before explaining any learning plateau (Epstein, 2011), then move the discussion quickly to a practical, revised plan that targets the identified barriers.

Further Reading

Key Research Papers on IEP Progress and Data-Based Individualisation

Fuchs, L. S., & Fuchs, D. (2007). A model for implementing responsiveness to intervention. Teaching Exceptional Children, 39(5), 14-20.
Fuchs and Fuchs articulate the DBI framework that underpins responsible non-response decision-making. This paper remains the clearest practical guide to what teams should do when an intervention is not working, and why mid-year amendment is the correct response rather than continued implementation without change.
View study

Deno, S. L. (1985). Curriculum-based measurement: The emerging alternative. Exceptional Children, 52(3), 219-232.
Deno's foundational paper established curriculum-based measurement as a practical way to monitor learner progress. His work supports frequent, brief, standardised measurement, which predicts outcomes better than occasional longer tests.

Burns and Gibbons (2008) link effective instruction to learner progress rates. Their chapter on diagnostic assessment helps teams review annual goals and identify whether a mismatch lay in the goal itself or in the instructional approach.

Yell, M. L., & Bateman, D. F. (2017). Endrew F. v. Douglas County School District (2017): FAPE and the U.S. Supreme Court. TEACHING Exceptional Children, 50(1), 7-15.
This paper provides the clearest analysis of the legal implications of the Endrew F. standard for classroom practitioners. Yell and Bateman explain precisely what "appropriately ambitious" means in practice, the evidentiary record schools need to maintain, and the difference between a FAPE violation and an outcome that was disappointing but legally sound.
View study

Childre, A., & Chambers, C. R. (2005). Family perceptions of student-centred planning and IEP meetings. Education and Training in Developmental Disabilities, 40(3), 217-233.
Childre and Chambers document the significant gap between how IEP teams perceive meetings and how families experience them. Their findings on communication, accessible language, and the role of parent voice in goal-setting are directly relevant to conducting an annual review following zero progress, where the communication challenge is at its most acute.
View study

---

References

- Batsche, G. (2014). Multi-tiered system of supports for inclusive schools. In J. McLeskey, N. L. Waldron, F. Spooner, & B. Algozzine (Eds.), Handbook of effective inclusive schools. Routledge.

- Burns, M. K. (2004). Empirical analysis of drill ratio research: Refining the instructional level for drill tasks. Remedial and Special Education, 25(3), 167-173.

- Childre, A., & Chambers, C. R. (2005). Family perceptions of student-centred planning and IEP meetings. Education and Training in Developmental Disabilities, 40(3), 217-233.

- Deno, S. L. (1985). Curriculum-based measurement: The emerging alternative. Exceptional Children, 52(3), 219-232.

- Endrew F. v. Douglas County School District, 580 U.S. 386 (2017).

- Fuchs, L. S., & Fuchs, D. (2007). A model for implementing responsiveness to intervention. Teaching Exceptional Children, 39(5), 14-20.

- Martin, J. E., Van Dycke, J. L., Christensen, W. R., Greene, B. A., Gardner, J. E., & Lovett, D. L. (2006). Increasing student participation in IEP meetings: Establishing the self-directed IEP as an evidence-based practice. Exceptional Children, 72(3), 299-316.

- National Center on Intensive Intervention. (2013). Data-based individualization: A framework for intensive intervention. Washington, DC: U.S. Department of Education, Office of Special Education Programs.

- Yell, M. L., & Bateman, D. F. (2017). Endrew F. v. Douglas County School District (2017): FAPE and the U.S. Supreme Court. TEACHING Exceptional Children, 50(1), 7-15.


{"@context":"https://schema.org","@graph":[{"@type":"Article","@id":"https://www.structural-learning.com/post/iep-no-progress-annual-review-guide#article","headline":"How to Write an IEP When the Student Has Made No Progress","description":"A practical guide for IEP teams when a student has made no progress: writing present levels honestly, analysing what went wrong, rewriting goals.","datePublished":"2026-02-26T19:38:10.519Z","dateModified":"2026-03-02T11:02:46.269Z","author":{"@type":"Person","name":"Paul Main","url":"https://www.structural-learning.com/team/paulmain","jobTitle":"Founder & Educational Consultant"},"publisher":{"@type":"Organization","name":"Structural Learning","url":"https://www.structural-learning.com","logo":{"@type":"ImageObject","url":"https://cdn.prod.website-files.com/5b69a01ba2e409e5d5e055c6/6040bf0426cb415ba2fc7882_newlogoblue.svg"}},"mainEntityOfPage":{"@type":"WebPage","@id":"https://www.structural-learning.com/post/iep-no-progress-annual-review-guide"},"image":"https://cdn.prod.website-files.com/5b69a01ba2e409501de055d1/69a1fb94dd612c5052be96b1_69a1fb928e1907d3f5064a77_plateau-diagnostic-framework-nb2-infographic.webp","wordCount":5012},{"@type":"BreadcrumbList","@id":"https://www.structural-learning.com/post/iep-no-progress-annual-review-guide#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https://www.structural-learning.com/"},{"@type":"ListItem","position":2,"name":"Blog","item":"https://www.structural-learning.com/blog"},{"@type":"ListItem","position":3,"name":"How to Write an IEP When the Student Has Made No Progress","item":"https://www.structural-learning.com/post/iep-no-progress-annual-review-guide"}]}]}