Engagement is key
This report presents the findings of a small-scale piece of qualitative research with nine providers delivering programmes to young people. The purpose of the research is to understand providers' experiences of implementing the Assessment Tool. It supplements the quantitative analysis of first results from the Assessment Tool reported in Lane (2012). There is interest in both the results coming out of the Assessment Tool and the experiences of implementation.
Author(s): Jacqui Haggland, Ahikiwi Research and Consulting and David Earle, Tertiary Sector Performance Analysis, Ministry of Education.
Date Published: February 2013
The Literacy and Numeracy for Adults Assessment Tool (the Assessment Tool) was developed to provide reliable information on learners' reading, writing, numeracy and vocabulary skills against the Learning Progressions for Adult Literacy and Numeracy. The Tool was piloted in 2010 and rolled out to all tertiary education organisations in 2011. This is the first time that a diagnostic learning assessment tool has been rolled out and mandated across the New Zealand tertiary education system.
The research addressed two key questions:
- What was the experience of organisations and educators in implementing the Assessment Tool and using it to support teaching and learning?
- What needs to be taken into account from these experiences in analysing the data available from the Assessment Tool?
The interviews confirmed that successful implementation is more than just setting up learners to complete the assessments. It needs to be part of wider organisational engagement with developing the literacy and numeracy skills of learners. It requires:
- Educators who understand the importance of literacy and numeracy and have the skills to embed relevant tasks and activities into the vocational training they deliver
- Educators who also understand the use of assessment for learning and value assessment information for diagnostic purposes and for tracking whether their efforts are having an impact
- Learners who are engaged in the learning process, understand why the assessment matters and can make sense of the results for themselves
- Organisations that have the resources and processes to support educators in using the Assessment Tool in a timely manner.
A consistent issue raised by all organisations was the challenge of engaging learners with the assessment process. This was a particular concern with the end-of-programme assessments. Learners have often had negative prior experiences of assessment, and younger learners found the questions did not always relate to their life experience. They needed to understand the purpose and benefit of the assessments for their own learning and development. Learners tended to see the assessment as something they did for the educator: it does not contribute to their grades, and they can find it difficult to fit into their own understanding of their skills and learning. If learners are not convinced of its value and engaged with the process, the results may under-represent their actual skills.
The key message for data analysis from this research is that the Assessment Tool is administered within a 'real world' context. Organisations and educators have flexibility about which assessments to use with which groups of learners and when to assess. Assessments are undertaken in a range of different circumstances. The level of engagement of educators and learners with the Assessment Tool can be variable. All of these factors need to be considered in drawing conclusions from the data. The Assessment Tool results should be regarded as one source of information alongside others in understanding learner outcomes.
The research was undertaken with providers offering fees-free places within the Youth Guarantee programme in 2011. An online survey was conducted with all 35 providers who offered these places. Nine providers were selected for follow up case studies. Focus groups were also conducted with three groups of learners from two large providers. The commentary on data analysis was added after the research was completed.
Literacy and Numeracy for Adults
Improving the literacy, language and numeracy skills of adults is a government strategic priority. Current policy emphasises embedding literacy and numeracy provision within level one to three vocational education. The Learning Progressions for Adult Literacy and Numeracy (Tertiary Education Commission, 2008a) provide a resource for teaching adults by describing the learning that adults require to develop their skills.
The need for a common assessment tool for adult literacy and numeracy was identified in the first Adult Literacy Strategy in 2001 (Office of the Minister of Education, 2001). Following exploratory work by the Ministry of Education, the Tertiary Education Commission (TEC) commissioned the Literacy and Numeracy for Adults Assessment Tool in 2008 for implementation in 2010.
The primary purpose of the Assessment Tool is to support educators and learners in their teaching and learning of reading, writing, numeracy and vocabulary. It provides information on where the learner's skills sit against the Learning Progressions. Learners can be reassessed on the Tool to track their progress. The Tool can be run online, using a computer adaptive approach, or the assessments can be printed out. In 2011, a shorter version of the online tool was introduced, called the snapshot.
In general, the TEC expects that assessments will be undertaken with all learners in embedded and intensive provision as a condition of funding. The detailed requirements for each funding area vary. In 2011, 210,000 assessments were undertaken involving 77,000 learners (Lane, 2012).
Youth Guarantee is a lead policy of the Government. It provides new opportunities for 16 and 17 year olds to achieve education success and progress into further education, training and employment. One element of the programme is fees-free tertiary places. Around 3,500 learners participated in Youth Guarantee fees-free places in 2011. Some organisations ran specific programmes for these learners, and others spread the places across existing programmes.
Youth Guarantee places were used as the focus for this research because young people under 20 were a significant group of learners assessed in 2011 and the places had the most intensive use of the Tool (Lane, 2012).
Getting underway: choosing an assessment
Organisations and educators first need to choose which assessment strands (reading, writing, numeracy and/or vocabulary) to use with which learners. In doing so, they generally considered the learning demands of their course. In some organisations, educators made the decision; in others, it was organisation-wide.
Most organisations involved in this research assessed both reading and numeracy. The writing and vocabulary assessments were not used as much. Marking the writing assessment is time-consuming. However, the experience of marking can help educators understand more about writing and how this can inform their teaching.
There are several types of assessment available: the full online adaptive assessment, the shorter snapshot and a non-adaptive version for printing. Most organisations used the full adaptive and/or snapshot assessments. The use of the snapshot assessment has increased. Some organisations have continued to use their own vocationally contextualised assessments as well.
Getting underway: making it happen
In most organisations, both organisation administrators and educators administered the Assessment Tool. The most common place for holding assessments was a computer lab. Most organisations supervised learners while they were assessed. Organisations often had problems with Internet access offsite or at remote campuses. Not all learners were comfortable using an online tool; however, this was less of an issue with younger learners.
Most organisations gave the initial assessment three to four weeks after the beginning of the course. Organisations wanted educators to have information about their learners' skills early in the course. They also wanted learners to be fully enrolled and to have established a relationship with their educator before doing the assessment.
Educators needed support in the initial stage of using the Tool. Educators usually responded well when they had seen the reports generated by the Assessment Tool, and linked them to what they knew about embedding literacy and numeracy.
There was great variation both between and within organisations in what information was given to learners about the assessment before they took it. Most of the information dealt with how educators would use the assessment results to improve their teaching. Little of it focused on what learners might get from the assessment for themselves. Some organisations had developed guidelines and other resources to support their educators in informing learners.
A number of learners had had negative experiences with education and/or assessment. Younger learners found the questions did not always relate to their life experience. Educators needed to help them to feel comfortable with the assessment process. Having the educator stay in the room during the assessment was important to setting the learners at ease.
Informing learning and teaching
Many of the educators and organisation administrators interviewed in the research understood the benefits of the Assessment Tool for identifying learners' literacy and numeracy skills. They also recognised there was further room for improvement in the way they used the information to develop their teaching practice.
Some educators referred learners with low results to learning support specialists. Others made use of Pathways Awarua, an online learning system aligned with the progressions framework (Tertiary Education Commission, 2012a). Some used knowledge of learners' skill levels to group them together for activities.
Educators understood it was important to provide learners with feedback on their results. Practices for doing this varied. Most learners in Youth Guarantee seemed to have very little idea about the learning progression steps and what they mean. There was also confusion between the steps and the NZQF levels. This can make it difficult to present the results in a way that is meaningful and engaging to learners.
The Learning Progression Step Profiles were proving a useful tool for structuring feedback to learners and explaining what the steps mean, and what is required to get to the next step. Some organisations were using results to inform individual learning plans, particularly those with specific Youth Guarantee programmes.
Getting underway (again): progress assessments
Concerns about lack of learner engagement in the end-of-course assessments have led to widespread rethinking of the process. Many organisations have changed the timing to ensure the assessment is completed before learners begin their formal course assessments. End-of-course assessments were usually held three to four weeks before the end of the course. Some organisations have begun using the snapshot at the end of the course. It was felt that having a shorter assessment could improve learner engagement at this stage.
At the time of the research, some educators saw the end-of-course assessment as mostly a compliance activity. Learners did not see the value of doing another assessment, particularly when it did not contribute to their grades. The use of the assessment data to inform further development of teaching practice varied. Some learners did not receive feedback on their end-of-course assessment.
On the other hand, some organisations were developing resources to better inform learners about the purpose of the Assessment Tool and how they could use the information to improve their skills to achieve their future goals. This was seen as one way of improving learner engagement with the end-of-course assessment process.
Overall, learners displayed mixed feelings towards the Assessment Tool. While some found the questions interesting, others commented that the assessments were too long and contained too many questions. Individuals responded to the Assessment Tool in different ways depending on their literacy and numeracy skills, comfort with computers and previous experience with education and assessment.
Turning it into practice
Organisations involved in this research were working towards a sustainable approach for implementing the Assessment Tool. Most were moving towards a more educator-led approach. A number had taken steps to make educators more familiar with the processes of the Assessment Tool. All had recognised the importance of providing ongoing professional development. Most of the organisations saw the need to do further work to ensure that the results of the assessments were used to inform teaching and learning.
Implications for data analysis
In analysing data from the Tool, analysts need to take into account the 'real world' circumstances and choices that inform and shape the data.
Although there are funding requirements on organisations to use the Tool with their learners, organisations and educators can still make choices about which assessment strands they use with which courses and which groups of learners. This means that the results of one assessment strand are not necessarily representative of all learners in an organisation or course. For example, numeracy results may only relate to learners in courses with high numeracy demands.
Educators and organisations can also choose which type of assessment (full or snapshot) to use with each group of learners, including different types of assessments at the beginning and end of a course; for example, a full assessment at the start and a snapshot at the end.
Organisations and educators have flexibility in the circumstances and timing of the assessments at the beginning and end of courses. Some courses may have mid-course assessment results.
The degree of engagement of educators and learners with the Assessment Tool has been variable. Low engagement can result in scores that are lower than the full potential of the learner. Objective research has yet to be undertaken on the full extent of this 'engagement effect'.
It is possible that changes in assessment scores between the beginning and end of a programme could reflect a change in learner engagement, rather than an actual change in skill levels. Evidence of learner gain from the assessment scores needs to be considered alongside other information, including the length of participation, course pass rates and changes in learners' literacy and numeracy practices to get a full picture of learner outcomes.
It should not be assumed that the use of the Assessment Tool in itself is a sufficient indicator of the presence of embedded literacy and numeracy provision. The link between assessment and teaching practice has been quite variable.