
TIMSS 2006/07: Trends in Year 5 science achievement 1994 to 2006

Publication Details

This report describes the science achievement of Year 5 students in TIMSS 2006/07. Trends in New Zealand’s achievement over the 12 years from 1994 to 2006 are examined, along with comparisons with other countries. Analyses of achievement by sub-groupings (such as gender and ethnicity) and background information are also presented. The report was originally published in December 2008 and revised in September 2009 due to the mislabelling of the cognitive domains knowing and applying. The current version rectifies this error.

Author(s): Robyn Caygill [Ministry of Education]

Date Published: December 2008

Trends in New Zealand science achievement 1994 to 2006

Trends in means and ranges since 1994

New Zealand has participated in TIMSS since its inception in 1994. In 1998, although no assessment was offered internationally at the middle primary level, New Zealand opted to repeat the 1994 assessment. Therefore, we now have information from four different assessments of science achievement. Figure 1 presents the distributions of science achievement of New Zealand Year 5 students over the four cycles of TIMSS.

The results from an examination of science achievement since 1994 (see Figure 1) show that mean science achievement in 2006 was about the same as in 1994, the first cycle of TIMSS. However, the 2006 mean was significantly lower (by 18 scale score points) than that of 2002.1 Although the mean score for 2006 was numerically lower than that for 1998, the difference between 1998 and 2006 is not statistically significant. Overall, the mean science achievement of Year 5 students shows a steady increase from 1994 to 2002; however, this pattern was not maintained in 2006, when results returned to 1994 levels.

It is also useful to look at the range of achievement as represented by the outer limits of achievement. The lowest outer limit presented in Figure 1 is the 5th percentile – the score at which only five percent of students achieved a lower score and 95 percent of students achieved a higher score. The highest outer limit is the 95th percentile – the score at which only five percent of students achieved a higher score and 95 percent of students a lower score. In addition, the 25th and 75th percentiles are also presented in Figure 1, along with the inter-quartile range.
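The percentile definitions above can be made concrete with a small Python sketch. The scores below are randomly generated for illustration only (not actual TIMSS data), and the simple nearest-rank rule here is a simplification of the plausible-values methodology used in the actual study.

```python
import random

# Illustrative only: simulated scores, not actual TIMSS data.
random.seed(0)
scores = sorted(random.gauss(500, 100) for _ in range(5000))

def percentile(sorted_scores, p):
    """Score below which p percent of students fall (nearest-rank rule)."""
    index = max(0, int(round(p / 100 * len(sorted_scores))) - 1)
    return sorted_scores[index]

# The outer limits and quartiles described above.
p5, p25, p75, p95 = (percentile(scores, p) for p in (5, 25, 75, 95))
iqr = p75 - p25  # inter-quartile range
```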

As shown in Figure 1, the range of achievement was narrower in 2006 than in both 1994 and 1998, although not as narrow as in 2002. A positive aspect of this narrowing is that fewer students demonstrated very low achievement; however, a smaller proportion of New Zealand students gained very high scores.

Figure 1: Distribution of New Zealand Year 5 science achievement in TIMSS from 1994 to 2006


Note:
For trend purposes, only students tested in English are included in the results for 2002.
Standard errors are presented in parentheses.

Trends in benchmarks for science

In order to describe more fully what achievement on the science scale means, the TIMSS international researchers have developed benchmarks. These benchmarks link student performance on the TIMSS science scale to performance on science questions and describe what students can typically do at set points on the science achievement scale. The international science benchmarks are four points on the science scale: the advanced benchmark (625), the high benchmark (550), the intermediate benchmark (475), and the low benchmark (400). The performance of students reaching each benchmark is described in relation to the types of questions they answered correctly. Table 1 presents the descriptions of the international benchmarks of science achievement.

Table 1: TIMSS 2006/07 international benchmarks of science achievement

Advanced international benchmark – 625
Students can apply knowledge and understanding of scientific processes and relationships in beginning scientific inquiry. Students communicate their understanding of characteristics and life processes of organisms as well as of factors relating to human health. They demonstrate understanding of relationships among various physical properties of common materials and have some practical knowledge of electricity. Students demonstrate some understanding of the solar system and Earth’s physical features and processes. They show a developing ability to interpret the results of investigations and draw conclusions as well as a beginning ability to evaluate and support an argument.
High international benchmark – 550
Students can apply knowledge and understanding to explain everyday phenomena. Students demonstrate some understanding of plant and animal structure, life processes, and the environment and some knowledge of properties of matter and physical phenomena. They show some knowledge of the solar system, and of Earth’s structure, processes, and resources. Students demonstrate beginning scientific inquiry knowledge and skills, and provide brief descriptive responses combining knowledge of science concepts with information from everyday experience of physical and life processes.
Intermediate international benchmark – 475
Students can apply basic knowledge and understanding to practical situations in the sciences. Students recognize some basic information related to characteristics of living things and their interaction with the environment, and show some understanding of human biology and health. They also show some understanding of familiar physical phenomena. Students know some basic facts about the solar system and have a developing understanding of Earth’s resources. They demonstrate some ability to interpret information in pictorial diagrams and apply factual knowledge to practical situations.
Low international benchmark – 400
Students have some elementary knowledge of life science and physical science. Students can demonstrate knowledge of some simple facts related to human health and the behavioural and physical characteristics of animals. They recognize some properties of matter, and demonstrate a beginning understanding of forces. Students interpret labelled pictures and simple diagrams, complete simple tables, and provide short written responses to questions requiring factual information.

Source: Exhibit 2.1 from Martin, Mullis, and Foy, 2008.
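The four cut-scores in Table 1 partition the science scale, and a short sketch shows how a scale score maps to the highest benchmark reached (the function name is illustrative, not part of TIMSS reporting):

```python
# Benchmark cut-scores on the TIMSS science scale, from the text above.
BENCHMARKS = [("advanced", 625), ("high", 550), ("intermediate", 475), ("low", 400)]

def highest_benchmark(score):
    """Return the highest benchmark a scale score reaches, or None if the
    score falls below the low benchmark."""
    for name, cut in BENCHMARKS:
        if score >= cut:
            return name
    return None
```

For example, a student scoring 500 reaches the intermediate benchmark (and, by definition, the low benchmark), while a student scoring 380 reaches none.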

Table 2 presents the proportions of New Zealand Year 5 students that reached each of the benchmarks in each cycle from 1994 to 2006. Note that the proportion shown for the low benchmark also includes students who performed at the advanced, high, and intermediate benchmarks. This is because, by definition, students who could do the more complex questions associated with, for example, the high benchmark, would also be able to complete the easier questions associated with the intermediate and low benchmarks.
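This cumulative convention can be illustrated with a sketch using the 2006 proportions: subtracting successive cumulative percentages recovers the exclusive bands, and the remainder below 100 is the group that did not reach the low benchmark.

```python
# Cumulative percentages of students reaching at least each benchmark
# (the 2006 row of Table 2).
cumulative = {"advanced": 8, "high": 32, "intermediate": 65, "low": 87}

# Recover the exclusive bands: e.g. students who reached the high
# benchmark but not the advanced benchmark.
order = ["advanced", "high", "intermediate", "low"]
bands = {}
previous = 0
for level in order:
    bands[level] = cumulative[level] - previous
    previous = cumulative[level]

# Students who did not reach even the low benchmark.
below_low = 100 - cumulative["low"]
```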

Eight percent of students reached the advanced benchmark in 2006, which was significantly fewer than in 1998 and 1994. While the proportion of students reaching the advanced benchmark peaked in 1998 (12%), the proportions reaching the high, intermediate and low benchmarks peaked in 2002 (39%, 74%, and 92% respectively). Significantly fewer students reached the high, intermediate and low benchmarks in 2006 compared with 2002.

There was also a group of Year 5 students in each cycle who did not reach the low benchmark. In terms of the benchmark definitions, these were students who did not demonstrate some elementary knowledge of life science and physical science. This group was proportionally largest in 1994 (15%) and smallest in 2002 (8%).

Table 2: Trends in proportions of Year 5 students at each benchmark from 1994 to 2006

                Percentage of Year 5 students reaching each benchmark
Year        Advanced      High          Intermediate      Low
2006        8 (0.5)       32 (1.0)      65 (1.2)          87 (1.0)
2002        9 (0.7)       39 (1.3)      74 (1.3)          92 (0.7)
1998        12 (1.4)      38 (2.3)      68 (2.4)          87 (1.6)
1994        11 (1.2)      35 (1.8)      66 (1.8)          85 (1.7)

Note: Standard errors are presented in parentheses.

Trends on the test questions

At the end of each cycle of TIMSS, test questions are released into the public domain. At the beginning of the next cycle, new questions are developed to replace the released questions. In addition, in order to provide a trend measure over time, each cycle of TIMSS includes some questions from the previous cycle(s). This section presents an analysis of the trend questions included in both TIMSS 2002/03 and TIMSS 2006/07. Note that no questions from TIMSS 1994/95 were included in the TIMSS 2006/07 assessment.

There were 75 questions common to both the 2002/03 and 2006/07 cycles. Of these, 9 questions had similar proportions of students answering correctly across the two cycles (as shown in Table 3). For 45 questions, proportionally fewer students answered correctly in 2006 than in 2002, while for 21 questions proportionally more students answered correctly in 2006. Averaged across all the common questions, the change in the proportion of students answering correctly was a decrease of 2 percentage points.

While this analysis demonstrates a fairly small decrease overall compared with the decrease of 18 scale score points, it should be remembered that the scale scores are calculated across all countries. Although New Zealand Year 5 students performed about the same when averaged across the questions common to the two cycles, relative to other countries their performance decreased significantly between 2002 and 2006.
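The bookkeeping behind this comparison can be sketched in Python. The per-question changes below are hypothetical values placed at the midpoint of each band reported for the common questions, not the real item-level data, so the averaged change is only indicative.

```python
# Hypothetical per-question changes (percentage-point difference in the
# proportion answering correctly, 2006 minus 2002) for the 75 common
# questions, using band midpoints; real item-level data are not shown here.
changes = [-6.0] * 21 + [-3.0] * 24 + [0.0] * 9 + [3.0] * 17 + [6.0] * 4

decreased = sum(1 for c in changes if c < 0)   # fewer answered correctly in 2006
increased = sum(1 for c in changes if c > 0)   # more answered correctly in 2006
mean_change = sum(changes) / len(changes)      # small average decrease overall
```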

Table 3: Trends in the proportions of students correctly answering science questions common to 2002/03 and 2006/07

Change between 2002/03 and 2006/07        Number of questions
Decrease by 5% or more                    21
Decrease by between 1% and 5%             24
Increase or decrease by 1% or less        9
Increase by between 1% and 5%             17
Increase by 5% or more                    4

It is interesting to note that physical and earth science questions were over-represented, relative to life science questions, among the 21 questions where the proportion of students answering correctly in 2006 was at least 5 percent lower than in 2002. In contrast, the group of questions that proportionally more students answered correctly contained proportionally more life science questions and far fewer physical science questions.

Trends in science content and cognitive domains

The science assessment in TIMSS is organised around two dimensions, a content dimension and a cognitive dimension, as described in the 'TIMSS 2007 assessment frameworks' (Mullis, Martin, Ruddock, O’Sullivan, Arora, & Erberber, 2005). The content dimension comprises three content domains that describe the subject matter to be assessed:

  • life science;
  • physical science; and
  • earth science.

The life science domain is similar to the Living World strand in the New Zealand curriculum and the earth science domain is similar to the Planet Earth and Beyond strand. The physical science domain encompasses both the Material World and Physical World strands of the New Zealand curriculum.

The cognitive dimension comprises three cognitive domains that describe the thinking processes that students must use as they engage with the content:

  • knowing;
  • applying; and
  • reasoning.

TIMSS assessment questions were categorised by the content and cognitive domains, and content and cognitive achievement scales were constructed separately for each domain. In order to simplify comparisons across domains, the scales were constructed to have the same average difficulty (set at 500 scale score points). As well as looking at achievement in each of these domains, the results can be used to ascertain relative strengths for participating countries.

As Table 4 shows, New Zealand Year 5 students achieved relatively better on earth science questions and relatively worse on physical science questions in 2006. This is the same pattern as observed in TIMSS 2002/03 (see Caygill, Sturrock, & Chamberlain, 2007). However, the differences were more pronounced in 2006, with a difference of 17 scale score points between earth science (the highest) and physical science (the lowest). In comparison, in 2002 the difference between earth science (522) and physical science (516) was 6 scale score points.

In the cognitive domains, New Zealand Year 5 students achieved relatively better at tasks that required them to demonstrate their knowledge and relatively worse at questions that required them to apply their knowledge. Year 5 mean science scores on the cognitive domains were not investigated in 2002 so it is not possible to present trend comparisons.

Table 4: Year 5 mean science scores on the content and cognitive domains in 2006

Content domain        Mean domain score      Cognitive domain      Mean domain score
Life science          506 (2.5)              Knowing               511 (2.5)
Physical science      498 (2.5)              Applying              500 (2.1)
Earth science         515 (2.6)              Reasoning             505 (2.9)

Note: Standard errors are presented in parentheses.

Table 5 shows the number of test questions (and the associated raw score points) in each of the content and cognitive domains. As can be seen from the table, score points were not evenly distributed across domains. This distribution of questions across domains reflects the content and cognitive emphasis of many of the curricula of participating countries.

Looking at Tables 4 and 5 together, it is important to note that the content domain in which New Zealand Year 5 students showed the greatest strength, earth science, had the fewest questions. In contrast, the cognitive domain of greatest strength, knowing, had the most questions. The distribution of science questions across the content domains was very similar in 2006 and 2002.

Table 5: Number of questions in each of the content and cognitive domains

Content domain        Total questions    Total score points    Cognitive domain    Total questions    Total score points
Life science          74                 85                    Knowing             77                 89
Physical science      64                 67                    Applying            63                 68
Earth science         36                 42                    Reasoning           34                 37
Total                 174                194                   Total               174                194

Note: In scoring the tests, correct answers to most questions were awarded one point. However, responses to some constructed-response questions were evaluated for partial credit with a fully correct answer awarded two points. Thus, the number of score points exceeds the number of questions in the test.
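The scoring rule in the note, taken with the totals in Table 5, implies how many questions carried a two-point maximum: since 174 questions yield 194 score points, 20 questions must have been worth up to two points. A minimal sketch of this arithmetic:

```python
# Totals from Table 5.
total_questions = 174
total_score_points = 194

# Most questions are worth one point; some constructed-response questions
# allow partial credit with a maximum of two points. Each two-point
# question adds exactly one extra score point beyond its question count.
two_point_questions = total_score_points - total_questions
one_point_questions = total_questions - two_point_questions
```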


Footnote

  1. As mentioned in the introduction, only those students tested in English are included in trend comparisons.

