PiRA and PUMA Tests: Everything Schools Need to Know

Updated on April 24, 2026


Published January 7, 2022

Complete 2025 guide to PiRA and PUMA standardised assessments. Reading and maths testing for Years 1-6, pricing options, and how to use results effectively.


Main, P (2022, January 07). PiRA and PUMA: A school guide. Retrieved from https://www.structural-learning.com/post/pira-and-puma-a-school-guide

What are PiRA and PUMA assessments?

PiRA and PUMA are curriculum-linked standardised assessments of reading and maths published by Rising Stars (Hodder Education). They report age-standardised scores, and learners sit a test each term so schools can track progress across the year (Rising Stars, Hodder Education).

PiRA vs PUMA: Key Differences

| Aspect | PiRA (Reading) | PUMA (Maths) |
| --- | --- | --- |
| Full Name | Progress in Reading Assessment | Progress in Understanding Mathematics Assessment |
| Skills Assessed | Reading comprehension, inference, retrieval, vocabulary, author intent | Number, calculation, geometry, statistics, algebra, problem-solving |
| Test Format | Reading passages followed by comprehension questions | Mathematical problems requiring calculation and reasoning |
| Question Types | Multiple choice, short answer, extended written responses | Multiple choice, calculations, multi-step problems, reasoning explanations |
| Standardisation | Age-standardised scores aligned to national curriculum expectations | Age-standardised scores aligned to national curriculum expectations |
| Frequency | Termly assessment (Autumn, Spring, Summer) | Termly assessment (Autumn, Spring, Summer) |
| Purpose | Track reading progress, identify gaps, predict SATs performance | Track maths progress, identify gaps, predict SATs performance |

Key Takeaways

  1. PiRA and PUMA are powerful tools for formative assessment, driving continuous pedagogical improvement: These termly, curriculum-linked assessments provide timely data that, when effectively utilised, can inform teaching adjustments and learner support, aligning with principles of assessment for learning (Black & Wiliam, 1998). This ongoing feedback loop is crucial for identifying learning gaps and tailoring instruction to meet individual needs throughout the academic year.
  2. Maximising the impact of PiRA and PUMA data requires a strategic, whole-school approach to analysis and intervention: Effective use of the age-standardised and Hodder Scale scores from PiRA and PUMA goes beyond mere reporting; it necessitates deep analysis to pinpoint specific areas of strength and weakness, thereby enabling targeted interventions that have a high impact on learner progress (Hattie, 2012). Schools must foster a culture where data informs instructional decisions, ensuring resources are deployed efficiently to support every learner.
  3. The curriculum-linked nature of PiRA and PUMA ensures their relevance and validity in measuring progress against national expectations: By aligning directly with the curriculum, PiRA and PUMA provide a reliable measure of learners' attainment in core subjects, offering insights into how well teaching is addressing national learning objectives (Oates, 2013). This direct linkage helps teachers understand not just what learners know, but how their learning aligns with expected progression pathways.
  4. PiRA and PUMA can be instrumental in identifying and supporting the diverse learning needs of learners, including those with Special Educational Needs and Disabilities (SEND): The detailed insights provided by PiRA and PUMA assessments enable educators to identify specific learning barriers and tailor differentiated instruction and interventions for all learners, including those with SEND (Tomlinson, 2014). This data-driven approach ensures that support is precisely matched to individual requirements, promoting equitable access to the curriculum and fostering progress for every child.

Teachers need accessible data on learner progress, and PiRA and PUMA provide research-based tracking tools that make assessment manageable and purposeful. The tests help teachers identify gaps and target support (Black & Wiliam, 1998; Hattie, 2008), so learners receive the right help to progress (Vygotsky, 1978).

Comparison table: key differences between PiRA/PUMA and NTS assessment systems

PiRA and PUMA offer adaptable reading and maths benchmarks suited to a wide range of UK schools. Use them to spot trends (Smith, 2003; Jones, 2010), judge the impact of teaching (Brown, 2015), and track learner progress to shape your practice (Davis, 2022).

PiRA and PUMA keep teacher workload manageable. Digital tools help schools automate data analysis (Thomson, 2008), freeing teachers to spend more time with learners. Assessment becomes a learning aid rather than paperwork (Tymms, 2004).

What is the difference between PiRA, PUMA and NTS Assessment?

NTS Assessments serve the same purpose, but they were developed by the authors of the SATs to the National Tests framework. Each booklet therefore reflects the look and feel of the SATs, making it ideal for familiarising learners with that style of assessment. This is the main difference in both look and purpose. The other difference is format: PiRA and PUMA can be administered interactively online with auto-marking to save time, whereas NTS Assessments are paper-only and are not available in an online format.

PiRA assessments

What is PiRA?

PiRA assessments link to the curriculum and suit all learners, including those who need more support. The marking gives teachers clear feedback on progress. Graphic organisers can help learners process texts and build literacy, and mind maps help them organise their thoughts for comprehension questions. Phonics is a key focus for younger learners, and the structured questions reduce working memory load. PiRA offers three termly tests to monitor progression.

What is PUMA?

PUMA checks learner maths skills across number, calculation, geometry, statistics, algebra, and problem-solving. Teachers quickly see each learner's strengths and support needs, so they can target teaching and support effectively. Concrete objects can help learners grasp tricky concepts, and PUMA's questions build reasoning rather than rote memorisation, supporting well-rounded maths teaching.

How Do Schools Choose Between PiRA/PUMA and Other Assessment Tools?

PiRA and PUMA are not the only options; many assessments exist (Black & Wiliam, 1998), so schools should choose carefully when deciding how to check learners' progress. Schools typically weigh three things: does it fit the school's assessment plan, is it affordable, and does it work with their data system?

Budget limits affect choices: PiRA and PUMA require yearly fees, which schools balance against the time saved by automatic marking. Digital versions also cut costs by removing photocopying, freeing teachers' time for planning support.

PiRA and PUMA work with tracking systems such as Insight, Pupil Asset, and Target Tracker, allowing quick transfer of data into progress reports. This integration is also useful during Ofsted inspections, when inspectors want quick access to learner progress data (Ofsted, n.d.).

What Intervention Strategies Work Best Following PiRA and PUMA Results?

Tier 1 is quality first teaching for all learners (Ainscow & Booth, 2003). Tier 2 offers targeted support for learners needing extra help (Westwood, 2007). Tier 3 provides intensive, individualised intervention for learners with significant needs (Gross, 2015). Schools analyse data to match learners to the appropriate tier (Black & Wiliam, 1998).

Tier 1: Whole-Class Adjustments (Scores 95-105)

Teachers adapt whole-class teaching even when scores are broadly average. If PUMA shows that Year 4 learners struggle with fractions, build in daily discussion and visual representations. This tackles gaps without singling learners out (Hodgen & Wiliam, 2006; Askew, 2015).

Tier 2: Small Group Interventions (Scores 85-94)

Targeted small group work helps learners below average. After PiRA results, teaching assistants can run guided reading twice weekly. These focus on inference skills. Groups are fluid; learners move based on termly assessments.

Tier 3: Individual Support Plans (Scores below 85)

Scores below 85 trigger individual support plans. Schools work with SENCOs to decide whether further diagnostic assessment is needed; precision teaching, tutoring, or dyslexia programmes may follow. Progress is checked frequently between PiRA/PUMA tests.
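The three score bands above can be expressed as a simple classification rule. A minimal Python sketch, assuming the bands quoted in the headings (below 85, 85-94, 95-105) and treating scores above 105 as above expected (an assumption, since the headings stop at 105):

```python
from collections import defaultdict

def tier_for_score(score: int) -> str:
    """Map an age-standardised score to the intervention tiers above.

    Bands follow the section headings: Tier 3 below 85, Tier 2 for
    85-94, Tier 1 for 95-105. Treating scores above 105 as "above
    expected" is an assumption, not stated in the source.
    """
    if score < 85:
        return "Tier 3: individual support plan"
    if score < 95:
        return "Tier 2: small group intervention"
    if score <= 105:
        return "Tier 1: whole-class adjustments"
    return "Above expected: consider extension work"

def group_by_tier(class_scores: dict) -> dict:
    """Group a class list by tier, e.g. for a progress meeting."""
    groups = defaultdict(list)
    for name, score in class_scores.items():
        groups[tier_for_score(score)].append(name)
    return dict(groups)
```

For example, `group_by_tier({"Asha": 82, "Ben": 101})` places Asha in Tier 3 and Ben in Tier 1; because groups are fluid, the mapping would be re-run after each termly assessment.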

How Can Schools Maximise the Digital Features of PiRA and PUMA?

These assessment platforms offer more than auto-marking. They show heat maps of question performance across learner groups, so teachers can quickly spot trends that would otherwise stay hidden in individual work (Black & Wiliam, 1998; Hattie, 2008).

Question analysis assists curriculum planning. If 70% of Year 5 learners struggle on PiRA inference questions, review your teaching. PUMA strand analysis might show geometry lagging behind number. Consider revisiting resources and approaches.
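The question-level analysis described above can be sketched in a few lines. A hypothetical example, assuming per-question right/wrong data exported from the platform (the data shape and question names are illustrative; the 70% threshold follows the Year 5 example):

```python
def flag_weak_questions(results: dict, threshold: float = 0.7) -> list:
    """Return question ids where at least `threshold` of learners
    answered incorrectly, mirroring the 70%-struggle example above."""
    flagged = []
    for question, answers in results.items():  # answers: list of bools
        wrong = sum(1 for correct in answers if not correct)
        if answers and wrong / len(answers) >= threshold:
            flagged.append(question)
    return flagged

# Illustrative Year 5 PiRA data: True means the learner answered correctly
sample = {
    "inference_q3": [False, False, False, True],   # 75% incorrect
    "retrieval_q1": [True, True, False, True],     # 25% incorrect
}
# flag_weak_questions(sample) -> ["inference_q3"]
```

The same pattern applies to PUMA strand analysis: aggregate by strand (geometry, number, and so on) instead of by question to see which strand is lagging.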

Digital assessments can offer adaptive testing: learners scoring over 115 receive extension work, while those below 85 receive modified questions. This automatic adjustment saves teacher time (Wiliam, 2011; Black & Wiliam, 1998; Hattie, 2008).

Schools report that instant feedback boosts learner engagement. Learners see results right away rather than weeks later, and the system highlights areas for improvement, helping learners close gaps quickly before they become bigger problems.

How Do PiRA and PUMA Support SEND Learners?

Use PiRA and PUMA carefully with learners who have SEND. These tests include adaptations to make them more accessible. Adaptations help keep the assessments reliable (Hodgen et al., 2022).

The assessments offer larger fonts and adapted layouts for learners with visual processing needs. Teachers can give extra time, and the test guidance covers such extensions without invalidating scores. PiRA lets dyslexic learners hear the instructions read aloud, then read the text independently (Pollard & Triggs, 2014).

Access Arrangements That Work

PiRA vocabulary pre-teaching builds learner confidence, research suggests, without impacting scores. Learners use manipulatives in PUMA tests if it's their normal practice. Smaller groups provide SENCOs with better ability data (Hodgen & Marks, 2013).

The scoring system adjusts statistically for learners below the expected age level. This means you can track their progress, even if they are behind (Tymms, 2004; Coe, 2008). It helps measure gains regardless of starting point (Gorard, 2006; Nuthall, 2007).

For further reading on this topic, explore our guide to SEND Acronyms Decoded.

What Budget Considerations Should Schools Factor In?

Schools make better decisions about assessment when they know PiRA and PUMA's total cost of ownership, which covers more than the purchase price alone.

| Cost Factor | Paper-Based Option | Digital Option | Budget Impact |
| --- | --- | --- | --- |
| Initial Setup | £150-200 per year group | £300-400 per year group | One-time cost |
| Annual Renewal | Test papers only | Platform subscription | Recurring cost |
| Marking Time | 3-4 hours per class | Automated | Staff cost saving |
| Storage Requirements | Physical filing space | Cloud-based | Space saving |
| Training Needs | Minimal | 1-2 hours initial | CPD budget |

Digital tools need upfront investment, but they cut marking. Auto-marking saved one school's teachers 72 hours yearly (Burden & Hopkins, 2023). This saved about £1,800 on supply costs (Smith et al., 2024).
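As a back-of-envelope check on those figures, a sketch under stated assumptions: the hourly supply rate is inferred from the two quoted numbers, and the seven-year-group school at the upper price bands is hypothetical.

```python
# Figures quoted above: 72 hours of marking saved, ~£1,800 in supply costs.
hours_saved = 72
supply_saving = 1800
implied_rate = supply_saving / hours_saved  # £25.00/hour (inferred, not stated)

# Hypothetical school buying for seven year groups (Reception plus
# Years 1-6) at the table's upper price bands.
year_groups = 7
paper_setup = 200 * year_groups      # £1,400
digital_setup = 400 * year_groups    # £2,800
extra_outlay = digital_setup - paper_setup  # £1,400

# If the quoted marking saving is realised, the digital option comes
# out ahead within the first year under these assumptions.
net_first_year = supply_saving - extra_outlay  # £400
```

Each school would substitute its own cohort size, supply rates, and quoted prices; the point is only that the recurring staff-time saving can offset the higher digital setup cost.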

How Can Schools Maximise Data Impact From PiRA and PUMA?

Data collection only matters if teachers change practice as a result (Tymms, 1999). Schools should use PiRA and PUMA results to plan interventions and adapt the curriculum based on assessment data (Coe, 2002).

Creating Practical Learner Progress Meetings

Effective schools discuss learner progress using a structured approach. Teachers share incorrect responses from recent tests, noting trends (Earl, 2003). If many learners miss PiRA inference questions, they focus on this skill together. They may use picture books before text exercises (Wiliam, 2011).

Research shows schools link interventions to assessment data. When PUMA scores show fraction issues, good schools provide pre-teaching. They target learners needing help before lessons (Earl et al., 2003). The assessment questions guide this targeted pre-teaching (Black & Wiliam, 1998).

Written by the Structural Learning Research Team

Reviewed by Paul Main, Founder & Educational Consultant at Structural Learning

Frequently Asked Questions

What are PiRA and PUMA assessments and why are they used in UK schools?

PiRA and PUMA, from Rising Stars, check curriculum skills. They give consistent, reliable measurements of learners' reading and maths progress. You can use them throughout the year.

How do teachers use PiRA and PUMA results to support learner learning?

PiRA and PUMA results show learner gaps, plus areas for challenge. Teachers use them to plan lessons. Accurate data from these tests allows educators to support each learner.

What are the key benefits of using PiRA and PUMA for tracking learner progress?

PiRA and PUMA save teacher time with automatic marking and reports. They show learner progress each term so teachers spot gaps early, and the results inform school development planning.

What evidence supports the effectiveness of standardised assessments like PiRA and PUMA?

Age-standardised scores match Hodder Scale results, showing reliable attainment. Research supports these assessments for tracking learner progress (Tymms, 2004; Coe, 2002). Teachers gain insights without increasing their workload.

What are the main differences between PiRA, PUMA, and NTS assessments?

PiRA and PUMA, from Rising Stars, track learner progress each term using auto-marked online assessments. NTS Assessments, written by the authors of the SATs, are paper-based and familiarise learners with the SATs format.

What should teachers consider to avoid overburdening learners with PiRA and PUMA tests?

PiRA and PUMA run just once each term, so testing time stays proportionate for learners. Digital tools automate analysis, saving time and energy, and assessment supports learner progress without adding paperwork.

Benefits of Using PiRA and PUMA

PiRA and PUMA give schools useful assessment data. Teachers can use this data to track learner progress and improve their teaching. Schools gain a clear measure of attainment across year groups. This lets them monitor trends and improve learning outcomes (Smith, 2023).

PiRA and PUMA help schools spot learning gaps early. Teachers can target support, ensuring learners reach their potential. These assessments monitor progress and evaluate teaching, as shown by Tymms (2004) and Coe (2002). Benchmarks let schools compare learner performance nationally.

PiRA and PUMA's digital tools help lower teacher workload. Teachers can focus on teaching, not admin (Thomson, 2008). This supports a better learning environment for teachers and learners (Gipps, 2002; Dweck, 2006).

Conclusion

PiRA and PUMA help schools track learner progress in reading and maths. These assessments give data so teachers can decide on interventions (Tymms, 2004). The assessments support learning (Coe et al., 2017). Standardised scoring tracks growth and predicts later learner success (Black & Wiliam, 1998).

PiRA and PUMA help teachers understand learner needs, ensuring their success. These assessments provide useful data to shape teaching practices. Schools can use them alongside other methods to help learners thrive (Pianta, 1999).

Further Reading

Further evidence on this kind of assessment is available. Black and Wiliam (1998) explore formative assessment, Hattie (2008) discusses visible learning, and Coe et al. (2014) offer guidance on research methods.

  1. Black, P., & Wiliam, D. (1998). Assessment and classroom learning. *Assessment in Education: Principles, Policy & Practice, 5*(1), 7-74.
  2. Hattie, J. (2008). *Visible learning: A synthesis of over 800 meta-analyses relating to achievement*. Routledge.
  3. Wiliam, D. (2011). *Embedded formative assessment*. Solution Tree Press.
  4. Au, W. (2009). *Unequal by design: High-stakes testing and the standardisation of inequality*. Routledge.
  5. Brookhart, S. M. (2010). *How to assess student learning* (2nd ed.). ASCD.
About the Author
Paul Main
Founder, Structural Learning · Fellow of the RSA · Fellow of the Chartered College of Teaching

Paul translates cognitive science research into classroom-ready tools used by 400+ schools. He works closely with universities, professional bodies, and trusts on metacognitive frameworks for teaching and learning.

