Diagnostic Analysis of KS2 Science Tests

What is Diagnostic Analysis of National Tests?

In 2001 a team at CRIPSAT, in partnership with a team at the University of Sheffield, was successful in gaining the contract to carry out a diagnostic analysis of national tests at Key Stage 2 and Key Stage 3 (2001-2003). CRIPSAT has carried out this work at Key Stage 2 since the introduction of national tests in 1995. The opportunity to work collaboratively with the University of Sheffield is nevertheless welcome, as it allows the team to examine cross key stage assessment and curriculum issues.

Project Aims

The project provides the team with an opportunity to explore the ways in which the different aspects of the scientific enquiry strand of the national curriculum in England might be assessed across different age groups. A key part of the project is an examination of how children's scientific enquiry skills develop within key stages 2 and 3.

Review of the Standards Report
TES, Curriculum Special
28 December 2001

Standards reports are to get a new look to offer the feedback on national tests that teachers have asked for.

"These reports are really worthy, but they're not very useful".

"We know what our pupils do well and badly at; what we want to know is: what should we do about it?"

These are the views of science teachers in focus groups looking at the annual reports on pupils' performance in the national tests at key stages 2 and 3. The feedback was that teachers wanted the reports not only to summarise very briefly pupils' strengths and weaknesses but also to identify, summarise and stress what science teachers should do in the light of the findings. In other words, to identify the implications for teaching and learning that arise from the analysis of pupils' performance.

The QCA heeded these opinions and the information being sent to schools has changed. A leaflet now gives examples of progress and continued success from the analysis of the tests for KS2 and KS3. These have been summarised on two pages of the leaflet so that they make an A3 poster which can be pinned up on boards in staffrooms or classrooms.

In January, every school will receive the "new look" Standards reports. The new look will reinforce the move from the tradition of reporting a summative evaluation of test performance to providing feedback on what the evaluation of pupils' assessment performance can do for teaching and learning. At the same time, the reports will be put on the QCA website, and this new-look version will contain more illustrative examples.

The range of pupils' understanding is often revealed in responses to national tests. We believe these responses can yield rich diagnostic information to inform teaching and learning. One example is pupils' understanding of shadow formation, which most pupils will have investigated by the end of KS2.

This year, KS2 pupils were asked to: "Explain how the shadow is formed from the light of the lamp". The most frequent response was to state that the light is blocked. Some pupils wrote, "the chair blocks the light"; others answered more generally by saying, "the light is blocked". Both kinds of response gained credit, although the generality of the second may suggest deeper understanding.

Some explanations that fail to gain credit are noteworthy because of their frequency. Many lower achievers confused "shadow" and "reflection", either because they have not considered the features that distinguish shadows from reflections or because they do not understand that light may behave differently in different circumstances. Pupils who think of light as a static entity (a floating cloud of coloured gas, rather than something that travels) have particular difficulties in discriminating between light images and the shapes of shadows.

Another frequent non-scoring response suggests that it is the light shining that causes the shadow to be seen. Pupils making this type of response understand that light has a role to play in shadow formation, but their reasoning is incomplete, failing to mention that the shadow results from the absence of light.

The proportion of pupils revealing creditworthy understanding of shadow formation has increased from under a half to two-thirds of successive Year 6 cohorts since 1996. Lower overall achievers are more likely to reveal difficulties, yet the fact that some at this level succeed suggests there is scope for further understanding.

From this scrutiny of pupils' responses we suggest that teaching and learning about shadows at KS2 could include a range of opportunities for pupils to observe, make and record shadows using a variety of objects and light sources. Encouraging pupils to observe, record and discuss the position of the light source, the object blocking the light, and the shadow will develop pupils' appreciation of how shadows are formed. Asking them to explain their ideas about shadow formation and to justify their ideas to each other is especially helpful in developing understanding.

At KS3, all pupils were asked to "draw a ray of light to show how light from the headlamp reaches the driver so that he can see the cyclist". Very few pupils received full credit for this question. Most responses indicated an understanding of reflection and the paths of light rays, but the diagrams were not precise enough to gain full credit: light rays were not straight, for example. Some responses indicated confusion between shadows and reflection, as at KS2.

However, most KS3 pupils were able to match the correct light path with the correct optical object, and this involved understanding transmission, reflection and refraction. As we say in the January report: "This indicates that pupils have a good understanding of the behaviour of rays and optical devices but do not understand the need for precision when drawing ray diagrams to explain optical phenomena." We go on to say: "Children's performance could be improved if they had clear criteria for the use of scientific terms, accuracy and precision in their assessed work".

These criteria do not always have to be given by the teacher; they may become an integral part of the teaching and learning. For example, a class could use, diagnostically or formatively, past questions from national tests (www.testbase.co.uk) to discuss one another's answers, arrive at a consensus about strong and weak answers and then compare their marking criteria with the mark scheme.

Drawing implications for teaching and learning from an analysis of pupils' performance has to be a sensitive job. It would be trivial and insulting to say that teachers must teach better and pupils learn better. The implications for teaching and learning are suggestions about what teachers could do and not prescriptions for what they should do. We can use our experience from this year to start to compare and contrast across the key stages and inform policy and practice for continuity and progression in science.

We will learn more from this year's report writing and the QCA's monitoring of teachers' responses about how we can improve the use of the national tests to develop and enhance the teaching and learning of science. This new-look Standards report and its accompanying leaflet are a welcome development in the feedback to teachers about pupils' performance in the national tests.

Mick Nott
Terry Russell
Linda McGuigan

CRIPSAT, University of Liverpool, 126 Mount Pleasant, Liverpool L69 3GR - 0151 794 3270