Science Pedagogy: Evidence-Based Frameworks for Teaching Science
Updated on March 24, 2026
Science pedagogy is the study of how science is taught, not merely what is taught. It is the difference between a teacher who transmits facts about photosynthesis and one who builds the reasoning skills pupils need to think like scientists. When lessons feel disconnected from the way science actually works, the gap between curriculum coverage and genuine understanding is usually a pedagogy problem.
This article draws on four decades of classroom research, from Rosalind Driver's pioneering work on pupils' scientific conceptions to the 5E Instructional Model, to give you a practical account of what effective science teaching looks like across primary and secondary phases. Each section connects the evidence to decisions you face every lesson.
Science pedagogy refers to the principled decisions teachers make about how to sequence content, structure enquiry, and build conceptual understanding over time. It sits between subject knowledge and classroom management: it is the translation layer that turns what a teacher knows about science into what pupils understand about science.
The distinction matters because subject knowledge and pedagogical knowledge are not the same thing. Shulman (1986) introduced the concept of pedagogical content knowledge (PCK) to describe the specific knowledge teachers need to represent difficult scientific ideas in ways pupils can access. A teacher who understands osmosis at A-level does not automatically know which analogies help Year 8 pupils grasp it, which representations confuse them, or which prior misconceptions need to be addressed first.
Effective science pedagogy, then, is grounded in three intersecting questions: What are pupils likely to already think? What representations will build or challenge that thinking? And how will I know whether understanding is developing? The frameworks discussed below provide structured answers to each of these questions.
One of the most robust findings in science education research is that pupils arrive in classrooms with their own explanatory frameworks for natural phenomena. These are not random errors. They are coherent, internally consistent theories built from everyday experience, and they are often highly resistant to conventional instruction (Driver and Easley, 1978).
Examples are well documented across every science domain. Pupils routinely believe that heavier objects fall faster than lighter ones, that plants get their food from the soil rather than the air, that electricity is used up as it travels around a circuit, and that evolution is a deliberate process organisms undergo when they need to adapt. These alternative frameworks persist even after teachers have taught the correct explanation, because pupils often interpret new information through their existing conceptual lens rather than replacing it.
Driver et al. (1994) argued that effective science teaching must begin with elicitation, bringing pupils' existing ideas to the surface before introducing the scientific account. In practice, this means starting a unit on forces not with Newton's Laws but with questions designed to expose what pupils currently think: "Will a bowling ball and a tennis ball hit the ground at the same time if dropped from the same height? Why?" The answers reveal the starting point for instruction.
This approach draws directly on Vygotsky's concept of the zone of proximal development. Understanding where pupils currently are makes it possible to pitch instruction at the productive level of challenge rather than either repeating what they already know or presenting content that bears no connection to their existing thinking.
The 5E Model, developed by Roger Bybee and colleagues at BSCS (Biological Sciences Curriculum Study), provides one of the most widely adopted frameworks for structuring science lessons (Bybee et al., 2006). It organises instruction into five phases: Engage, Explore, Explain, Elaborate, and Evaluate.
The sequence is pedagogically deliberate. Each phase serves a distinct cognitive function and prepares pupils for the next. The model is grounded in constructivist learning theory but structured enough to be practically manageable.
| Phase | Teacher Action | Cognitive Purpose |
|---|---|---|
| Engage | Presents a puzzling question or discrepant event | Activates prior knowledge, surfaces misconceptions, creates intellectual need |
| Explore | Gives pupils direct experience with the phenomenon | Builds concrete referents pupils can anchor explanation to |
| Explain | Introduces scientific vocabulary and formal concepts | Connects experience to disciplinary language and theory |
| Elaborate | Applies understanding to a new context or problem | Deepens transfer and tests robustness of the concept |
| Evaluate | Assesses understanding through explanation or problem-solving | Reveals gaps and consolidates learning |
A Year 9 chemistry lesson on rates of reaction illustrates the model in action. The Engage phase might open with a discrepant event: the teacher drops a whole indigestion tablet into cold water and a crushed tablet into hot water, and asks pupils to predict which will react faster and why. The Explore phase has pupils systematically vary one factor at a time, recording observations without yet having the vocabulary to explain them. The Explain phase introduces activation energy, collision theory, and the effect of surface area and temperature. Elaborate asks pupils to apply the same reasoning to why food rots more slowly in a refrigerator. Evaluate asks them to design an experiment to test a new variable.
The model maps onto inquiry-based learning but gives it a pedagogical spine. This matters because unstructured inquiry, where pupils are simply told to "investigate", often fails to build conceptual understanding. The sequence ensures that concrete experience precedes formal explanation, which is consistent with what cognitive science tells us about how new knowledge is built onto existing mental models.
Cognitive Acceleration through Science Education (CASE) is a curriculum intervention developed by Philip Adey and Michael Shayer at King's College London in the 1980s. It is grounded in Piagetian theory and aims to accelerate the development of formal operational thinking, the ability to reason about abstract relationships, control variables, and think proportionally (Adey and Shayer, 1994).
CASE lessons, known as "Thinking Science" activities, follow a specific pedagogical sequence: concrete preparation, cognitive conflict, construction, metacognitive reflection, and bridging. The cognitive conflict phase is particularly distinctive. Teachers deliberately present pupils with problems that their current reasoning cannot easily resolve. A pupil who believes that a wider, shorter container holds less liquid than a taller, thinner one is confronted with the need to revise that intuition through systematic measurement.
The intervention produced striking results in longitudinal studies. Adey and Shayer (1990) found that Year 7 pupils who participated in CASE lessons showed significantly higher attainment at GCSE in science, mathematics, and English two years after the intervention ended. The gains were not limited to science, which suggests that the programme was developing general reasoning capacity rather than just science knowledge. The metacognitive reflection stage, where pupils discuss how they solved a problem and why their earlier approach was insufficient, appears to be a key mechanism of transfer.
The bridging phase is equally important. After each activity, teachers ask pupils to identify where the same reasoning pattern appears in other subjects or everyday life. This deliberate rehearsal in familiar contexts, where cognitive load is low, allows the reasoning schema to become portable rather than remaining attached to a specific science context.
Practical work occupies a privileged position in science education, yet the evidence on its effectiveness is more nuanced than its cultural status suggests. Millar (2004) reviewed the research extensively and concluded that practical work is effective when it is designed with a clear cognitive purpose but often fails when it is used simply to make lessons "engaging" or to provide pupils with experience of scientific procedures.
The distinction between procedural understanding (knowing how to do something) and conceptual understanding (knowing why it works) is central here. Pupils can learn to use a titration apparatus with considerable skill without understanding the stoichiometry underpinning the procedure. Practical work that does not explicitly connect manipulations to the underlying concept builds the former at the expense of the latter.
Osborne and Dillon (2008) identified three pedagogically distinct purposes for practical work in science lessons: illustrating a concept already taught, testing a hypothesis derived from a theory, or allowing pupils to gather data from which a pattern will emerge. Each purpose requires a different pedagogical approach. Illustration works best after explanation; hypothesis testing requires pupils to have a conceptual framework to bring to the practical; data-gathering from observation is most appropriate at the start of a topic when the goal is to create the cognitive conflict that motivates explanation.
In practice, this means that the briefing before a practical and the discussion after it are as pedagogically significant as the practical itself. A teacher who says "We are doing this experiment to test whether temperature affects enzyme activity. Before we start, tell me what your prediction is and why" is using the practical to develop reasoning. A teacher who says "Follow the method on the sheet and record your results" is developing procedure. Both have a place, but knowing which one you are doing and why is the mark of sound science pedagogy.
Linking practical tasks explicitly to concrete, pictorial, and abstract representations strengthens the conceptual yield of the activity. Pupils who sketch what they observed, then diagram what they think is happening at the particle level, then encounter the formal equation are processing the same phenomenon across three levels of abstraction.
Questioning is the primary tool through which science teachers probe understanding, surface misconceptions, and build reasoning. Harlen (2006) distinguished between productive questions, which generate pupil thinking, and closed questions, which only check recall. Science pedagogy depends heavily on the productive kind.
Productive questions in science take several forms. Attention-focusing questions draw pupils' observation to a specific feature: "What do you notice about the colour of the precipitate compared to what you expected?" Measuring and counting questions require quantification: "How much gas is produced in the first 30 seconds compared to the next 30?" Comparison questions build analytical thinking: "In what ways is this reaction similar to the one we studied last week?" Prediction questions require pupils to apply their understanding: "If we increase the concentration of the acid, what would you expect to happen to the rate and why?"
Wait time matters considerably in science questioning. Rowe (1974) found that extending wait time after a question from an average of one second to three to five seconds increased the length and complexity of pupil responses, reduced the number of failures to respond, and increased the frequency of speculative thinking. Science questions often require pupils to retrieve a concept, apply it to the specific context, and formulate a prediction. One second is simply not enough time for that cognitive sequence.
Linking questions to formative assessment strategies such as hinge questions is particularly powerful in science. A hinge question is a multiple-choice question placed at a conceptual turning point in the lesson, where each wrong answer reveals which misconception the pupil holds. A Year 10 teacher covering electricity might ask: "Two lamps are connected in parallel with a battery. When one lamp is switched off, the current through the other lamp (a) stays the same, (b) increases, (c) decreases." The pattern of responses tells the teacher exactly which alternative framework is active in the room before moving to the next concept.
Science is a representationally rich discipline. Graphs, diagrams, equations, models, and symbolic notation are not decoration; they are the language through which scientific knowledge is communicated and extended. Pupils who cannot fluently move between representations of the same phenomenon are operating at a significant disadvantage.
Mayer's theory of multimedia learning (2009) provides a scientific basis for the systematic use of multiple representations in science teaching. When visual and verbal representations of the same information are presented simultaneously, pupils build stronger mental models than when they receive either alone. This is because they can build connections between the pictorial model and the verbal explanation, creating what Mayer calls a coherent mental representation.
In a science classroom, this means that explaining diffusion verbally while simultaneously pointing to a diagram of particles moving from high to low concentration is more effective than either the verbal explanation or the diagram alone. It also means that asking pupils to draw what they think is happening, then compare their drawing with a scientific diagram, is a powerful exercise in conceptual alignment. The act of comparing exposes gaps between the pupil's mental model and the scientific one.
Dual coding in science also extends to the use of graphic organisers. Concept maps that show relationships between scientific ideas (rather than just listing them) require pupils to make the connections between concepts explicit and visible. A concept map connecting photosynthesis, respiration, glucose, ATP, and oxygen forces pupils to articulate the direction and nature of each relationship rather than simply listing the terms.
Scientific reasoning involves a cluster of domain-general skills, including controlling variables, identifying patterns in data, constructing evidence-based explanations, and evaluating the validity of conclusions. These skills do not develop automatically through exposure to science content; they require deliberate teaching and structured practice.
Scaffolding scientific reasoning means making the structure of the reasoning process visible and then gradually withdrawing the support as pupils internalise it. A commonly used scaffold for extended scientific explanation is the Claim, Evidence, Reasoning (CER) framework, which requires pupils to state their conclusion, cite the specific data that supports it, and explain why that data counts as evidence (McNeill and Krajcik, 2012).
Before introducing CER independently, a teacher might model it explicitly: "My claim is that increasing temperature increases the rate of reaction. My evidence is that the gas was produced twice as fast at 40°C as at 20°C. My reasoning is that higher temperature gives particles more kinetic energy, so they collide more frequently and with enough energy to overcome the activation energy barrier." Pupils then practise with sentence starters, then frames, and then independently as the scaffold is removed.
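For teachers who want to connect this modelled reasoning to its post-16 formalisation, the temperature dependence it describes is captured by the Arrhenius equation (not part of the GCSE specification; included here only as background subject knowledge):

```latex
% Arrhenius equation: the rate constant k rises as temperature T rises,
% because a larger fraction of collisions exceeds the activation energy E_a.
k = A \, e^{-E_a / (RT)}
% k   : rate constant
% A   : frequency (pre-exponential) factor
% E_a : activation energy
% R   : gas constant
% T   : absolute temperature (in kelvin)
```

The qualitative CER explanation and the equation say the same thing at different levels of abstraction: raising T makes the exponent smaller in magnitude, so k, and with it the rate, increases.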
Rosenshine's Principles (2012) are directly relevant here. The recommendation to use guided practice before independent practice applies clearly to scientific reasoning: pupils need to work through examples with teacher support before being asked to construct explanations on their own. The scaffolding does not do the thinking for pupils; it shows them what thinking looks like and provides a structure they can internalise.
Science curricula accumulate quickly. By the time a Year 11 pupil reaches their GCSE examinations, they must recall and apply knowledge from five years of science teaching across three disciplines. Without a planned programme of retrieval practice, the majority of that knowledge will not be accessible at the point when pupils need it most.
Karpicke and Blunt (2011) demonstrated that retrieval practice produces stronger long-term retention than elaborative study in science contexts. Pupils who were asked to retrieve information about a scientific passage after reading it remembered significantly more one week later than pupils who read and re-read the passage or created concept maps without retrieval. The mechanism is straightforward: every act of retrieval strengthens the memory trace and identifies gaps that targeted re-teaching can address.
In science classrooms, retrieval practice takes many forms. Low-stakes quizzes at the start of lessons covering material from two or three lessons ago are one of the most effective. Brain dumps, where pupils write everything they can recall about a topic on a blank sheet before opening their notes, are another. Interleaved practice, where a retrieval task on forces appears in the middle of a unit on electricity, builds the flexible access to knowledge that exam performance requires. The spacing effect means that retrieval is most productive when it comes after a gap, not immediately after initial learning.
The connection to metacognition is important here. Pupils who monitor their own retrieval, noticing which concepts they cannot bring to mind without assistance, are in a position to direct their revision purposefully. Teaching pupils to use the "I know / I think I know / I don't know" distinction when completing retrieval tasks builds the self-assessment habit that underpins effective independent study.
Science classrooms generate a continuous flow of evidence about pupil understanding: the answers pupils give to questions, the predictions they make before a practical, the graphs they draw from data, the written explanations they produce. The question is whether teachers have systems for reading that evidence and acting on it within the lesson.
Harlen and James (1997) distinguished sharply between assessment for learning and assessment of learning. The former is diagnostic, feeding directly into instructional decisions. The latter is summative, recording what pupils have achieved. Both are necessary, but science pedagogy has historically been dominated by the summative end, with formal tests at the end of units serving as the primary measure of learning.
Effective formative assessment in science requires teachers to build systems for gathering evidence from the whole class rather than a vocal minority. Mini-whiteboards allow every pupil in the room to show their answer simultaneously, making the distribution of understanding visible rather than leaving it to be inferred from the three pupils who put their hands up. Exit tickets on a specific concept applied to a novel context reveal whether transfer is occurring or whether pupils have only learned to apply the concept in the exact way it was modelled.
Dylan Wiliam's framework for formative assessment (Wiliam, 2011) maps directly onto science teaching. Clarifying learning intentions ("By the end of this lesson, you should be able to explain why enzymes stop working above their optimum temperature") gives pupils a cognitive target for self-monitoring. Engineering effective discussions through carefully sequenced questions builds the shared understanding that written work consolidates. Providing feedback that tells pupils specifically what their reasoning is missing, rather than simply whether their answer is right or wrong, develops the scientific thinking that produces better reasoning next time.
Effective science pedagogy is not uniform across biology, chemistry, and physics. Each discipline has characteristic ways of constructing knowledge, representing ideas, and reasoning from evidence, and the most effective pedagogy is sensitive to those differences.
In biology, the challenge is often one of scale and abstraction. Pupils can observe an organism but not the cell processes underpinning its behaviour. Representations that move between levels of organisation, from the organism to the organ system to the cell to the molecule, require careful scaffolding. The use of analogy is particularly important in biology but also carries particular risks: cells are not factories in any precise sense, and the analogy can obscure as much as it reveals if it is not carefully managed (Treagust, 1993).
In chemistry, the three-level model proposed by Johnstone (1982) identifies the macroscopic (what we observe), the sub-microscopic (what is happening at the particle level), and the symbolic (equations and notation) as three distinct representational levels that pupils must navigate simultaneously. Johnstone argued that much confusion in chemistry arises from pupils being expected to work at all three levels at once without explicit pedagogical bridging. Teaching that makes the movement between levels explicit, and that gives pupils time at each level before connecting them, significantly reduces this confusion.
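As an illustration of the three levels (a generic classroom example, not one drawn from Johnstone's paper), a simple neutralisation can be described at each level in turn:

```latex
% Macroscopic: two colourless solutions are mixed; the temperature rises slightly.
% Sub-microscopic: hydrogen ions and hydroxide ions combine to form water molecules.
% Symbolic:
\mathrm{HCl(aq) + NaOH(aq) \rightarrow NaCl(aq) + H_2O(l)}
```

Giving pupils time at the macroscopic and sub-microscopic levels before presenting the symbolic equation is exactly the explicit bridging Johnstone's model calls for.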
In physics, mathematical reasoning plays a central role that it does not play in the same way in the other sciences. Pupils who can manipulate equations without understanding the physical relationship they describe are producing a surface performance that fails under novel conditions. Effective physics pedagogy builds conceptual understanding of the relationship first, uses proportional reasoning to develop it, and introduces the equation as a precise statement of a relationship that pupils already understand qualitatively. Higher-order thinking in physics requires that conceptual understanding be secure before symbolic manipulation is introduced.
Before your next science lesson, identify one moment in the lesson where you will stop and ask pupils to predict what will happen next and why. Write down the prediction you expect them to make, then write down the misconception-driven prediction that some of them will make instead. Use that moment to surface both, then teach explicitly to the gap between them.
These studies and reports form the evidence base for effective science pedagogy.
Making Science Education Relevant to Pupils' Lives
Osborne, J. and Collins, S. (2001)
This study examined why many pupils disengage from science during secondary school, identifying the perceived irrelevance of curriculum content as the primary barrier. Teachers can use these findings to frame science content through real-world contexts that connect abstract principles to problems pupils care about.
Practical Work in School Science: Which Way Now?
Millar, R. (2004)
Millar's systematic review of practical work in school science distinguishes clearly between procedural and conceptual learning outcomes. The paper provides a framework for designing practical tasks that produce genuine conceptual understanding rather than procedural competence alone.
The BSCS 5E Instructional Model: Origins and Effectiveness
Bybee, R. et al. (2006)
This report provides the theoretical and empirical foundations of the 5E Model, drawing on constructivist learning theory and decades of curriculum development. Teachers planning inquiry-based units will find the phase descriptions and classroom examples directly applicable.
Children's Ideas in Science
Driver, R. et al. (1994)
Driver's landmark collection documents pupils' alternative conceptions across every major science topic, providing science teachers with a diagnostic map of the misconceptions they are likely to encounter. The findings remain the most comprehensive catalogue of pupils' scientific thinking available.
Retrieval Practice Produces More Learning than Elaborative Studying with Concept Mapping
Karpicke, J. D. and Blunt, J. R. (2011)
This controlled study directly compared retrieval practice with concept mapping in science contexts and found that retrieval produced significantly greater long-term retention. The findings support the use of low-stakes testing and retrieval activities as a core component of science lesson design.
Linking practical tasks explicitly to concrete, pictorial, and abstract representations strengthens the conceptual yield of the activity. Pupils who sketch what they observed, then diagram what they think is happening at the particle level, then encounter the formal equation are processing the same phenomenon across three levels of abstraction.
Questioning is the primary tool through which science teachers probe understanding, surface misconceptions, and build reasoning. Harlen (2006) distinguished between productive questions, which generate pupil thinking, and closed questions, which only check recall. Science pedagogy depends heavily on the productive kind.
Productive questions in science take several forms. Attention-focusing questions draw pupils' observation to a specific feature: "What do you notice about the colour of the precipitate compared to what you expected?" Measuring and counting questions require quantification: "How much gas is produced in the first 30 seconds compared to the next 30?" Comparison questions build analytical thinking: "In what ways is this reaction similar to the one we studied last week?" Prediction questions require pupils to apply their understanding: "If we increase the concentration of the acid, what would you expect to happen to the rate and why?"
Wait time matters considerably in science questioning. Rowe (1974) found that extending wait time after a question from an average of one second to three to five seconds increased the length and complexity of pupil responses, reduced the number of failures to respond, and increased the frequency of speculative thinking. Science questions often require pupils to retrieve a concept, apply it to the specific context, and formulate a prediction. One second is simply not enough time for that cognitive sequence.
Linking questions to formative assessment strategies such as hinge questions is particularly powerful in science. A hinge question is a multiple-choice question placed at a conceptual turning point in the lesson, where each wrong answer reveals which misconception the pupil holds. A Year 10 teacher covering electricity might ask: "Two identical lamps are connected in series. Compared with the current through the first lamp, the current through the second lamp (a) is the same, (b) is larger, (c) is smaller." Pupils who choose (c) are working with the familiar "current gets used up" framework. The pattern of responses tells the teacher exactly which alternative framework is active in the room before moving to the next concept.
Science is a representationally rich discipline. Graphs, diagrams, equations, models, and symbolic notation are not decoration; they are the language through which scientific knowledge is communicated and extended. Pupils who cannot fluently move between representations of the same phenomenon are operating at a significant disadvantage.
Mayer's theory of multimedia learning (2009) provides a scientific basis for the systematic use of multiple representations in science teaching. When visual and verbal representations of the same information are presented simultaneously, pupils build stronger mental models than when they receive either alone. This is because they can build connections between the pictorial model and the verbal explanation, creating what Mayer calls a coherent mental representation.
In a science classroom, this means that explaining diffusion verbally while simultaneously pointing to a diagram of particles moving from high to low concentration is more effective than either the verbal explanation or the diagram alone. It also means that asking pupils to draw what they think is happening, then compare their drawing with a scientific diagram, is a powerful exercise in conceptual alignment. The act of comparing exposes gaps between the pupil's mental model and the scientific one.
Dual coding in science also extends to the use of graphic organisers. Concept maps that show relationships between scientific ideas (rather than just listing them) require pupils to make the connections between concepts explicit and visible. A concept map connecting photosynthesis, respiration, glucose, ATP, and oxygen forces pupils to articulate the direction and nature of each relationship rather than simply listing the terms.
Scientific reasoning involves a cluster of domain-general skills, including controlling variables, identifying patterns in data, constructing evidence-based explanations, and evaluating the validity of conclusions. These skills do not develop automatically through exposure to science content; they require deliberate teaching and structured practice.
Scaffolding scientific reasoning means making the structure of the reasoning process visible and then gradually withdrawing the support as pupils internalise it. A commonly used scaffold for extended scientific explanation is the Claim, Evidence, Reasoning (CER) framework, which requires pupils to state their conclusion, cite the specific data that supports it, and explain why that data counts as evidence (McNeill and Krajcik, 2012).
Before pupils use CER independently, the teacher models it explicitly: "My claim is that increasing temperature increases the rate of reaction. My evidence is that the gas was produced twice as fast at 40°C as at 20°C. My reasoning is that higher temperature gives particles more kinetic energy, so they collide more frequently and with enough energy to overcome the activation energy barrier." Pupils then practise with sentence starters, then with frames, and finally independently as the scaffold is withdrawn.
Rosenshine's Principles (2012) are directly relevant here. The recommendation to use guided practice before independent practice applies clearly to scientific reasoning: pupils need to work through examples with teacher support before being asked to construct explanations on their own. The scaffolding does not do the thinking for pupils; it shows them what thinking looks like and provides a structure they can internalise.
Science curricula accumulate quickly. By the time a Year 11 pupil reaches their GCSE examinations, they must recall and apply knowledge from five years of science teaching across three disciplines. Without a planned programme of retrieval practice, the majority of that knowledge will not be accessible at the point when pupils need it most.
Karpicke and Blunt (2011) demonstrated that retrieval practice produces stronger long-term retention than elaborative study in science contexts. Pupils who were asked to retrieve information about a scientific passage after reading it remembered significantly more one week later than pupils who read and re-read the passage or created concept maps without retrieval. The mechanism is straightforward: every act of retrieval strengthens the memory trace and identifies gaps that targeted re-teaching can address.
In science classrooms, retrieval practice takes many forms. Low-stakes quizzes at the start of lessons, covering material from two or three lessons ago, are among the most effective. Brain dumps, where pupils write everything they can recall about a topic on a blank sheet before opening their notes, are another. Interleaved practice, where a retrieval task on forces appears in the middle of a unit on electricity, builds the flexible access to knowledge that exam performance requires. The spacing effect means that retrieval is most productive when it comes after a gap, not immediately after initial learning.
The connection to metacognition is important here. Pupils who monitor their own retrieval, noticing which concepts they cannot bring to mind without assistance, are in a position to direct their revision purposefully. Teaching pupils to use the "I know / I think I know / I don't know" distinction when completing retrieval tasks builds the self-assessment habit that underpins effective independent study.
Science classrooms generate a continuous flow of evidence about pupil understanding: the answers pupils give to questions, the predictions they make before a practical, the graphs they draw from data, the written explanations they produce. The question is whether teachers have systems for reading that evidence and acting on it within the lesson.
Harlen and James (1997) distinguished sharply between assessment for learning and assessment of learning. The former is diagnostic, feeding directly into instructional decisions. The latter is summative, recording what pupils have achieved. Both are necessary, but science pedagogy has historically been dominated by the summative end, with formal tests at the end of units serving as the primary measure of learning.
Effective formative assessment in science requires teachers to build systems for gathering evidence from the whole class rather than a vocal minority. Mini-whiteboards allow every pupil in the room to show their answer simultaneously, making the distribution of understanding visible rather than leaving it to be inferred from the three pupils who put their hands up. Exit tickets on a specific concept applied to a novel context reveal whether transfer is occurring or whether pupils have only learned to apply the concept in the exact way it was modelled.
Dylan Wiliam's framework for formative assessment (Wiliam, 2011) maps directly onto science teaching. Clarifying learning intentions ("By the end of this lesson, you should be able to explain why enzymes stop working above their optimum temperature") gives pupils a cognitive target for self-monitoring. Engineering effective discussions through carefully sequenced questions builds the shared understanding that written work consolidates. Providing feedback that tells pupils specifically what their reasoning is missing, rather than simply whether their answer is right or wrong, develops the scientific thinking that produces better reasoning next time.
Effective science pedagogy is not uniform across biology, chemistry, and physics. Each discipline has characteristic ways of constructing knowledge, representing ideas, and reasoning from evidence, and the most effective pedagogy is sensitive to those differences.
In biology, the challenge is often one of scale and abstraction. Pupils can observe an organism but not the cell processes underpinning its behaviour. Representations that move between levels of organisation, from the organism to the organ system to the cell to the molecule, require careful scaffolding. The use of analogy is particularly important in biology but also carries particular risks: cells are not factories in any precise sense, and the analogy can obscure as much as it reveals if it is not carefully managed (Treagust, 1993).
In chemistry, the three-level model proposed by Johnstone (1982) identifies the macroscopic (what we observe), the sub-microscopic (what is happening at the particle level), and the symbolic (equations and notation) as three distinct representational levels that pupils must navigate simultaneously. Johnstone argued that much confusion in chemistry arises from pupils being expected to work at all three levels at once without explicit pedagogical bridging. Teaching that makes the movement between levels explicit, and that gives pupils time at each level before connecting them, significantly reduces this confusion.
In physics, mathematical reasoning plays a central role that it does not have in the same way in the other sciences. Pupils who can manipulate equations without understanding the physical relationship they describe are producing a surface performance that fails under novel conditions. Effective physics pedagogy builds conceptual understanding of the relationship first, uses proportional reasoning to develop it, and introduces the equation as a precise statement of a relationship that pupils already understand qualitatively. Higher-order thinking in physics requires that conceptual understanding to be secure before symbolic manipulation is introduced.
Before your next science lesson, identify one moment in the lesson where you will stop and ask pupils to predict what will happen next and why. Write down the prediction you expect them to make, then write down the misconception-driven prediction that some of them will make instead. Use that moment to surface both, then teach explicitly to the gap between them.
These studies, reviews, and reports form the evidence base for effective science pedagogy.
Making Science Education Relevant to Pupils' Lives
Osborne, J. and Collins, S. (2001)
This study examined why many pupils disengage from science during secondary school, identifying the perceived irrelevance of curriculum content as the primary barrier. Teachers can use these findings to frame science content through real-world contexts that connect abstract principles to problems pupils care about.
Practical Work in School Science: Which Way Now?
Millar, R. (2004)
Millar's systematic review of practical work in school science distinguishes clearly between procedural and conceptual learning outcomes. The paper provides a framework for designing practical tasks that produce genuine conceptual understanding rather than procedural competence alone.
The BSCS 5E Instructional Model: Origins, Effectiveness, and Applications
Bybee, R. et al. (2006)
This report provides the theoretical and empirical foundations of the 5E Model, drawing on constructivist learning theory and decades of curriculum development. Teachers planning inquiry-based units will find the phase descriptions and classroom examples directly applicable.
Making Sense of Secondary Science: Research into Children's Ideas
Driver, R. et al. (1994)
Driver's landmark collection documents pupils' alternative conceptions across every major science topic, providing science teachers with a diagnostic map of the misconceptions they are likely to encounter. The findings remain the most comprehensive catalogue of pupils' scientific thinking available.
Retrieval Practice Produces More Learning than Elaborative Studying with Concept Mapping
Karpicke, J. D. and Blunt, J. R. (2011)
This controlled study directly compared retrieval practice with concept mapping in science contexts and found that retrieval produced significantly greater long-term retention. The findings support the use of low-stakes testing and retrieval activities as a core component of science lesson design.