Science literacy achievement: senior secondary schooling
Scientific literacy assists students to participate as responsible and informed members of society, and as productive contributors to New Zealand's economy and future.
Attainment at senior secondary level contributes to preparation for successful participation in tertiary education, and the ability to contribute to, and participate in, a changing labour market and an increasingly knowledge-based society. Attainment level is also related to individual well-being.
Methodology behind the indicator
Each cycle of the Programme for International Student Assessment (PISA) has one major domain, and scientific literacy was the major domain in PISA 2015. The science scores were summarised on a combined scientific literacy scale. The domain assesses three scientific competencies (explaining phenomena scientifically, evaluating and designing scientific enquiry, and interpreting data and evidence scientifically), three scientific knowledge types (content, procedural and epistemic knowledge) and three content knowledge areas (physical, living and earth/space systems). Science scores are available from PISA 2006, 2009, 2012 and 2015. However, due to changes in the way scientific literacy has been assessed, no comparison can be made with the results for PISA 2000 and 2003.
The Item Response Theory (IRT) scaling approach and plausible values methodology are used in PISA. This involves estimating the parameters for each item (question) and examining the background characteristics of the students. From this, estimates of proficiency for each student and IRT scales for reporting student achievement were generated in aggregate and for each major content domain. Finally, the resulting values were placed on a reporting scale in PISA 2006 with a mean of 500 and a standard deviation of 100. The PISA 2012 scale was anchored against the PISA 2006 scale, enabling change to be measured from 2006 to 2009 and 2012. The IRT analysis provides a common scale on which the performances of students within and across countries may be compared.
In 2015 each student had 10 estimates of ability called plausible values (PV1-PV10). From 2006 to 2012 each student had 5 plausible values. For each student, the plausible values represent a set of random draws from the estimated ability distribution of students with similar item response patterns and backgrounds. They are intended to provide good estimates of parameters of student populations, for example, country mean scores, rather than estimates of individual student proficiency.
For any group of 15-year-old students, for example, the New Zealand population, Māori, or girls, the numerator and denominator are defined as follows:
- Numerator: Sum of the mean science literacy scores for each plausible value for that group. [Where the mean for each plausible value is defined as:
  - Numerator: Weighted sum of scores for that group.
  - Denominator: Sum of weights for that group.]
- Denominator: n (number of plausible values).
(Data Source: OECD Programme for International Student Assessment (PISA))
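The calculation above can be sketched in a few lines of code. This is an illustrative sketch only: the function name and the toy data are hypothetical, and real PISA analysis uses the OECD's published student data files and survey weights (and, for standard errors, replicate weights not shown here).

```python
def group_mean(plausible_values, weights):
    """Mean science literacy score for a group of students.

    plausible_values: one inner list per student, holding that student's
        plausible values (PV1-PV10 in 2015; PV1-PV5 in 2006-2012).
    weights: one sampling weight per student.
    """
    n_pv = len(plausible_values[0])
    total_weight = sum(weights)
    # Mean for each plausible value: weighted sum of scores / sum of weights.
    pv_means = [
        sum(w * pvs[k] for pvs, w in zip(plausible_values, weights)) / total_weight
        for k in range(n_pv)
    ]
    # Group mean: sum of the per-plausible-value means / n (number of PVs).
    return sum(pv_means) / n_pv

# Two hypothetical students with 5 plausible values each (2006-2012 style).
pvs = [[510, 505, 512, 508, 515], [490, 495, 488, 492, 485]]
w = [1.0, 1.0]
print(round(group_mean(pvs, w), 1))  # 500.0
```

Averaging over the plausible values, rather than picking one, is what keeps the population estimate unbiased.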
Mean PISA scores for the New Zealand population and sub-populations are based on scores generated using Item Response Theory. These scores are reported on an international scale with an international mean of 500 and a standard deviation of 100 for OECD countries so that approximately two-thirds of all students internationally have a score between 400 and 600.
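The "two-thirds between 400 and 600" figure follows from the scale's design, assuming the scores are approximately normally distributed with mean 500 and standard deviation 100. A quick check with the Python standard library:

```python
from statistics import NormalDist

# Reporting scale: OECD mean 500, standard deviation 100 (as stated above).
scale = NormalDist(mu=500, sigma=100)

# Share of students within one standard deviation of the mean.
share = scale.cdf(600) - scale.cdf(400)
print(round(share, 3))  # 0.683, i.e. roughly two-thirds
```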
The scientific literacy domain has undergone considerable expansion and change since being a minor domain in PISA 2000 and PISA 2003. It is therefore not possible to compare science outcomes from PISA 2006 to PISA 2015 with these earlier PISA assessments.
In PISA 2015, proficiency levels relate to the difficulty of the tasks that students were assessed on, with each content area having its own set of proficiency levels. These range from Level 1 for the simplest tasks to Level 6 for the most complex. The following information on the proficiency levels for each content area was sourced from: OECD (2016). PISA 2015 Results: Excellence and Equity in Education (Volume I). Paris: OECD.
|Level|Lower score limit|Characteristics of Tasks|
|---|---|---|
|6|708|At Level 6, students can draw on a range of interrelated scientific ideas and concepts from the physical, life and earth and space sciences and use content, procedural and epistemic knowledge in order to offer explanatory hypotheses of novel scientific phenomena, events and processes or to make predictions. In interpreting data and evidence, they are able to discriminate between relevant and irrelevant information and can draw on knowledge external to the normal school curriculum. They can distinguish between arguments that are based on scientific evidence and theory and those based on other considerations. Level 6 students can evaluate competing designs of complex experiments, field studies or simulations and justify their choices.|
|5|633|At Level 5, students can use abstract scientific ideas or concepts to explain unfamiliar and more complex phenomena, events and processes involving multiple causal links. They are able to apply more sophisticated epistemic knowledge to evaluate alternative experimental designs and justify their choices and use theoretical knowledge to interpret information or make predictions. Level 5 students can evaluate ways of exploring a given question scientifically and identify limitations in interpretations of data sets, including sources and the effects of uncertainty in scientific data.|
|4|559|At Level 4, students can use more complex or more abstract content knowledge, which is either provided or recalled, to construct explanations of more complex or less familiar events and processes. They can conduct experiments involving two or more independent variables in a constrained context. They are able to justify an experimental design, drawing on elements of procedural and epistemic knowledge. Level 4 students can interpret data drawn from a moderately complex data set or less familiar context, draw appropriate conclusions that go beyond the data and provide justifications for their choices.|
|3|484|At Level 3, students can draw upon moderately complex content knowledge to identify or construct explanations of familiar phenomena. In less familiar or more complex situations, they can construct explanations with relevant cueing or support. They can draw on elements of procedural or epistemic knowledge to carry out a simple experiment in a constrained context. Level 3 students are able to distinguish between scientific and non-scientific issues and identify the evidence supporting a scientific claim.|
|2|409|At Level 2, students are able to draw on everyday content knowledge and basic procedural knowledge to identify an appropriate scientific explanation, interpret data, and identify the question being addressed in a simple experimental design. They can use basic or everyday scientific knowledge to identify a valid conclusion from a simple data set. Level 2 students demonstrate basic epistemic knowledge by being able to identify questions that can be investigated scientifically.|
|1a|335|At Level 1a, students are able to use basic or everyday content and procedural knowledge to recognise or identify explanations of simple scientific phenomena. With support, they can undertake structured scientific enquiries with no more than two variables. They are able to identify simple causal or correlational relationships and interpret graphical and visual data that require a low level of cognitive demand. Level 1a students can select the best scientific explanation for given data in familiar personal, local and global contexts.|
|1b|261|At Level 1b, students can use basic or everyday scientific knowledge to recognise aspects of familiar or simple phenomena. They are able to identify simple patterns in data, recognise basic scientific terms and follow explicit instructions to carry out a scientific procedure.|
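A small helper can map a PISA 2015 science score to these proficiency levels using the lower score limits. The function name is hypothetical; the Level 3 cut score (484) comes from the OECD (2016) report cited above, and students below the Level 1b cut are reported here simply as "below 1b".

```python
from bisect import bisect_right

# Lower score limits for PISA 2015 scientific literacy proficiency levels.
LOWER_LIMITS = [(261, "1b"), (335, "1a"), (409, "2"), (484, "3"),
                (559, "4"), (633, "5"), (708, "6")]

def proficiency_level(score):
    """Return the proficiency level whose band contains `score`."""
    cutoffs = [limit for limit, _ in LOWER_LIMITS]
    i = bisect_right(cutoffs, score)  # number of cut scores at or below `score`
    return LOWER_LIMITS[i - 1][1] if i > 0 else "below 1b"

print(proficiency_level(700))  # 5 (700 falls below the Level 6 cut of 708)
print(proficiency_level(250))  # below 1b
```

Because the bands are half-open intervals bounded below, a binary search over the sorted cut scores is all that is needed.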