# Reading literacy achievement: senior secondary schooling

## Rationale

Reading literacy achievement at senior secondary level contributes to preparation for successful participation in tertiary education and training. Achievement level is also related to people’s well-being and influences their ability to contribute to, and participate in, a changing labour market and increasingly knowledge-based society.

Literacy involves the ability of individuals to use written information to fulfil their goals, and the consequent ability of complex modern societies to use written information to function effectively.

The Programme for International Student Assessment (PISA) study assessed 15 year-old students’ reading ability across three processes: accessing and retrieving information, integrating and interpreting texts, and reflecting on and evaluating texts.

## Indicator

The reading scores from PISA 2000, 2003, 2006, 2009 and 2012 can be summarised on a combined reading literacy scale. This enables a comparison to be made between the reading literacy achievements of 15 year-old students in each of these years.

The Item Response Theory (IRT) scaling approach and plausible values methodology are used in PISA. This involves estimating the parameters for each item (question) and examining the background characteristics of the students. From these, estimates of proficiency for each student and IRT scales for reporting student achievement were generated, both in aggregate and for each major content domain. Finally, the resulting values were placed on a reporting scale in PISA 2000 with a mean of 500 and standard deviation of 100. Subsequent cycles (2003, 2006, 2009 and 2012) were anchored against the PISA 2000 scale. This enables a comparison to be made between the reading literacy achievement of 15 year-old students in each of 2000, 2003, 2006, 2009 and 2012.
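The final step, placing estimates on a reporting scale with mean 500 and standard deviation 100, amounts to a linear transformation of the raw ability estimates. A minimal sketch, assuming a simple unweighted sample (the real PISA scaling standardises against the weighted OECD student population):

```python
import statistics

def to_pisa_scale(thetas):
    """Place raw IRT ability estimates on a reporting scale with
    mean 500 and standard deviation 100 (illustrative only: PISA's
    actual scaling uses weighted OECD population statistics)."""
    mean = statistics.mean(thetas)
    sd = statistics.pstdev(thetas)
    return [500 + 100 * (t - mean) / sd for t in thetas]

# Hypothetical raw ability estimates (logits) for five students.
scores = to_pisa_scale([-1.2, -0.3, 0.0, 0.4, 1.1])
```

By construction the transformed scores have mean 500 and standard deviation 100, so later cycles can be compared once they are anchored to the same scale.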

The IRT analysis provided a common scale on which the performances of students within and across countries may be compared.

Each student has 5 estimates of ability called plausible values (PV1–PV5). For each student the plausible values represent a set of random draws from the estimated ability distribution of students with similar item response patterns and backgrounds. They are intended to provide good estimates of parameters of student populations, for example, country mean scores, rather than estimates of individual student proficiency.

For any group of 15 year-old students, for example, the New Zealand population, Māori, or girls, the numerator and denominator are defined as follows:

**Numerator:** Sum of the mean reading literacy scores for each plausible value for that group, where the mean for each plausible value is itself defined as:

- Numerator: weighted sum of scores for that group.
- Denominator: sum of the weights for that group (equivalent to the estimated number of students in that group).

**Denominator:** 5 (the number of plausible values).

(Data source: OECD: Programme for International Student Assessment (PISA))
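The group mean defined above can be sketched in a few lines. This is an illustrative implementation only; the tuple layout and variable names are assumptions, not PISA's actual data format:

```python
def group_mean(students, n_pv=5):
    """Mean reading score for a group: the average over plausible
    values of the weighted mean score for each plausible value.
    `students` is a list of (weight, [pv1, ..., pv5]) tuples."""
    total_weight = sum(w for w, _ in students)
    pv_means = [
        sum(w * pvs[i] for w, pvs in students) / total_weight
        for i in range(n_pv)
    ]
    return sum(pv_means) / n_pv

# Two hypothetical students with sampling weights and five PVs each.
sample = [(1.5, [510, 505, 512, 508, 515]),
          (2.0, [480, 492, 488, 485, 490])]
mean_score = group_mean(sample)
```

Note that the weighting happens within each plausible value before the five means are averaged, matching the numerator/denominator definition above.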

## Interpretation Issues

Mean PISA scores for the New Zealand population and sub-populations are based on scores generated using Item Response Theory. These scores are reported on an international scale with a mean of 500 and standard deviation of 100 for OECD countries so that approximately two-thirds of all students in the OECD have a score between 400 and 600.
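The "approximately two-thirds between 400 and 600" claim follows from the normal distribution: about 68.3% of values fall within one standard deviation of the mean. A quick check using the standard normal CDF expressed via the error function:

```python
import math

def normal_share(lo, hi, mean=500.0, sd=100.0):
    """Share of a normal distribution falling between lo and hi,
    computed from the error function."""
    z = lambda x: (x - mean) / (sd * math.sqrt(2))
    return 0.5 * (math.erf(z(hi)) - math.erf(z(lo)))

share = normal_share(400, 600)  # roughly two-thirds
```

This is an idealised calculation; actual score distributions are only approximately normal.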

In PISA 2012, proficiency levels related to the difficulty of the tasks that students were assessed on, with each content area having its own set of proficiency levels. These range from Level 1 for the simplest tasks to Level 6 for the most complex. For information on the proficiency levels for each content area see: OECD (2013). PISA 2012 Results: What Students Know and Can Do: Student Performance in Mathematics, Reading and Science (Volume I). OECD: Paris.