Literacy and numeracy at work

Publication Details

This report looks at the use of literacy and numeracy skills at work, and how this relates to the skills and education of employees. It uses data from the Adult Literacy and Lifeskills (ALL) survey to look at how well employees’ skills match the literacy and numeracy practices that they undertake at work. It looks at how skills and education relate to different sets of practices, such as financial literacy and numeracy. It also identifies which groups of employees are more likely to have a skills shortfall or skills excess, and some of the barriers to further training for those with a skills shortfall.

Author(s): David Earle, Tertiary Sector Performance Analysis and Reporting, Ministry of Education.

Date Published: February 2011


Chapter 2: Literacy and numeracy job practices

Using the ALL survey it is possible to develop measures of literacy and numeracy job practices which characterise different types of jobs.

The ALL survey included a set of questions about the reading, writing and mathematics activities that respondents undertook in their main job. Respondents were asked how often they undertook each activity on a four-point scale from "at least once a week" to "never". These questions provide information on the frequency and range of literacy and numeracy activities.

2.1 Previous approaches to analysis

Several approaches have been taken to analysing this information in previous research. Krahn and Lowe (1998) used similar questions from the International Adult Literacy Survey (IALS) 1996 to develop a simple index. The index was constructed by assigning a value from 1 to 5 to each response and taking the average for each respondent across each of the reading, writing and mathematics tasks.
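The Krahn and Lowe index can be sketched as follows. This is an illustrative implementation only: it assumes each response has already been coded from 1 to 5, and the example response values are hypothetical, not taken from the IALS data.

```python
def practice_index(responses):
    """Krahn and Lowe (1998) style index: the average of the 1-5
    frequency codes across all reading, writing and mathematics
    items answered by one respondent."""
    return sum(responses) / len(responses)

# Hypothetical respondent who undertakes most tasks frequently:
codes = [5, 4, 5, 3, 4, 5]
print(practice_index(codes))  # average frequency code, here about 4.33
```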

Lane (2010) looked at the number of regular reading and writing activities that employees undertook. Each question that a person answered "at least once a week" counted as one regular activity. This provided a scale of the range of reading and writing activities undertaken at work.
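Lane's count can be sketched in a few lines. The response labels below are illustrative stand-ins for the survey's actual categories.

```python
def regular_activity_count(responses):
    """Lane (2010) style measure: count the items a respondent
    answered 'at least once a week'."""
    return sum(1 for r in responses if r == "at least once a week")

# Hypothetical respondent with two regular activities:
answers = ["at least once a week", "never",
           "at least once a week", "less than once a month"]
print(regular_activity_count(answers))  # → 2
```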

OECD and Statistics Canada (2005) developed a more sophisticated approach to scaling the data. They created four scales: a reading scale, a writing scale, a numeracy scale and a combined scale. The scales were developed through a three-step process of exploratory factor analysis (to explore and model the data), confirmatory factor analysis (to validate the models and indices) and scale development using the Rasch item response model.

Ryan and Sinning (2009a) also used item response theory to look at this data from both the IALS and ALL surveys. They modelled literacy and numeracy use as unobserved variables that can be measured through their effect on the observed data. They used an extension of Rasch modelling that allows for ordered, multiple response categories. They derived a literacy use scale based on the reading and writing questions and a numeracy use scale based on the mathematics questions.

2.2 Approach used in this study

This study follows a similar analytical path to that taken by OECD and Statistics Canada (2005). An exploratory factor analysis was undertaken to explore and model the data. This analysis was conducted across the entire set of 17 activities to establish which groupings of activities were evident within the responses, rather than working from the original categorisation of the questions. This approach led to identifying three underlying factors that represent three different combinations of literacy and numeracy job practices. These factors have been called:

  • Financial literacy and numeracy – working with bills, invoices and prices
  • Intensive literacy – reading and writing letters, emails, reports and manuals
  • Practical literacy and numeracy – reading diagrams and directions, writing directions, measuring and estimating size and weight, and using numbers to keep track of things.
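The exploratory step above can be illustrated with a standard factor-analysis routine. The sketch below uses scikit-learn's maximum-likelihood `FactorAnalysis` on simulated responses; it is not the study's actual method (which, as noted below, does not assume normally distributed responses) or data, and the simulated items are purely illustrative.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Simulate 500 respondents answering 17 activity items on the
# four-point frequency scale, coded 1-4 (illustrative data only;
# the actual questions are set out in Appendix B of the report).
responses = rng.integers(1, 5, size=(500, 17)).astype(float)

# Extract three factors, mirroring the three practice groupings
# identified in the study.
fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(responses)

print(scores.shape)          # (500, 3): one score per factor per respondent
print(fa.components_.shape)  # (3, 17): each item's loading on each factor
```

In the real analysis, the loadings would be inspected to see which activities cluster together, and the clusters would then be named (financial, intensive, practical).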

Each respondent was scored on these factors according to the answers they provided to the related questions. The scores are presented on a range from 1.0 to 4.5 and represent the frequency of practices at work. A score of 1.0 can be taken to mean never undertaking any of the practices at work and 4.5 as undertaking all of the practices at least once a week.

The document literacy scores from the ALL data have been converted to a standardised scale with a mean of 0 and standard deviation of 1. The standardisation was done across the entire data set, including those aged 16 to 24 and those not in employment. One standard deviation roughly equates to one level on the ALL scale.
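Standardising to a mean of 0 and standard deviation of 1 works as sketched below. The input values are hypothetical, not actual ALL document literacy scores.

```python
def standardise(xs):
    """Convert raw scores to z-scores: subtract the mean and
    divide by the (population) standard deviation."""
    n = len(xs)
    mean = sum(xs) / n
    sd = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5
    return [(x - mean) / sd for x in xs]

# Hypothetical raw literacy scores:
raw = [250.0, 275.0, 300.0, 325.0, 350.0]
z = standardise(raw)
print(z)  # centred on 0 with unit spread
```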

This analysis looks just at people aged 25 to 65 who were currently employed at the time of the survey. This excludes younger people who may have been working part-time while studying and/or still building experience in the workforce.

2.3 Strengths and limitations

The advantage of the approach used in this study is that it provides a precise score of job practices for each individual that is independent of other variables, such as occupation, industry and qualifications. These scores are derived from the patterns of literacy and numeracy use reported by respondents. In particular, these scores can be used to examine within-occupation differences and how these are distributed by gender, age and other characteristics.

The approach is limited by the nature of the questions. There are three major limitations in the questions.

First, the most frequent response category was "at least once a week", which attracted a very high response rate on a number of the questions. It would have been more useful to break this category into "daily" and "one or more times each week". The factoring method used makes some allowance for this problem by not assuming that responses are normally distributed.

Second, the difficulty or complexity of the activity is not assessed. For example, there could be a vast range in literacy demands for people "writing letters, memos and emails", from basic notes through to formal business correspondence. The factors need to be read as a representation of the type of literacy and numeracy practices workers undertake and not necessarily of the difficulty of the activities involved.

Related to this, the responses may be influenced by the respondent's own level of skill and perception of the activities. For example, a worker with low literacy, who struggles with reading and writing, may report undertaking frequent reading and writing as part of their job. A worker with high literacy doing the same job may consider the same tasks too trivial to report as meaningful reading and writing tasks.


  1. Fuller details of the questions are set out in Appendix B.
  2. Full details of the methodology are available in Appendix B.