Thinking outside the square: Innovative ways to raise achievement for at risk students
Publication Details
Thinking Outside The Square provides information for schools wanting to introduce innovative and effective programmes for raising the educational achievement of at risk students. The publication draws on the findings of two research projects that evaluated the success of programmes funded from the Ministry of Education's innovations funding pool between 1999 and 2002.
Author(s): Janet Clinton, Summary prepared by Janet Rivers. Report for the Ministry of Education.
Date Published: 2003
Summary
Devising innovative ways to raise achievement for at risk students: An overview
The Ministry of Education is working with schools to create environments to raise the achievement of children who are at risk of low achievement.
This includes encouraging schools to introduce new programmes and interventions to lift the performance of their at risk students.
However, while innovative school-based interventions for at risk children can and do work, success is not automatic.
This publication reports on the factors that contribute to effective school-based innovations. Principals, trustees and schools can use the information to help ensure the success of their own innovations.
How do we know what works?
In 1999, the Ministry of Education established the innovations funding pool (IFP) for schools to trial programmes to improve outcomes for students at risk of low levels of educational achievement. The first two funding rounds supported 32 programmes from 1999 to 2001.
The Ministry commissioned independent research to assess the merit of the innovative ideas and identify factors that could account for the success or failure of the innovations.
This booklet draws on the findings of that research to present some guidelines for other schools looking for new and effective ways to support their at risk students.
The booklet is divided into two sections. The first part looks at definitions of at risk students, details the innovations funding pool, and examines the characteristics of successful interventions. The second part presents three case studies of school-based innovations that successfully improved the educational outcomes of children at risk. They are: the Windley School FM Soundfield Innovation, the Project Early Charitable Trust, and the Mt Roskill Community Literacy Project. As well as presenting the research findings for the three case studies, this section also includes comments from key people on what made their programmes successful.
Defining at risk students
Before looking at how to introduce a successful intervention for at risk students it is important to consider what is meant by at risk.
Educationally, the term at risk refers to children and young people who are likely to have:
- poor educational outcomes, measured by indicators such as truancy and suspension statistics, low literacy and numeracy levels, alternative education enrolments, low school leaver qualifications, and school drop-out rates
- adverse non-educational outcomes, including abuse in the home, youth suicide, drug and alcohol abuse and poor mental health
- a negative impact on other students as a result of behaviours such as bullying, violence, aggression, and assault.
From an education perspective, an at risk student, without successful intervention, is likely to end up without the necessary skills, attitudes and knowledge to make a successful transition beyond school.
However, it is important to recognise that students at risk do not form a homogeneous group. Also, using the term at risk can lead to the impression that the cause of the risk always lies with the student, when, in fact, the cause may be more to do with the student's environment. Refugee children, for example, would often not be at risk had it not been for the events that led to their becoming refugees. Also, a school environment itself can be responsible for placing a student at risk.
Some major sources of risk for students include:
- an adverse social environment, in which children are subject to domestic and sexual abuse, dangerous neighbourhoods, poor health, racist attitudes, poor parenting and poverty
- social and economic shock, such as that affecting refugee children, children whose parents die, students who are pregnant at a young age, or students whose peers have experienced sudden trauma
- poor educational settings, such as poor teaching, low professional capability, an inappropriate curriculum, disrupted classrooms, low expectations and failing schools. So-called at risk students are often intellectually able, and large numbers of children who would otherwise be regarded at risk can achieve educational success when the educational environment changes.
- individual impairment where children are put at risk of underachievement because of psycho-social factors (motivation, personality), physical/physiological factors (disability/health) and behavioural factors (internalised and externalised behaviours, drug and alcohol abuse, disruptive behaviour).
There is often a significant overlap in the environments placing children at risk. For example, truancy or disengagement is more likely to occur in poor educational settings, but it is also linked with an adverse social environment.
There is an overlap between students considered at risk and special education or special needs students. The boundary between the two groups merges most obviously for students with behavioural or emotional difficulties.
There is a diverse range of factors that can place a student at risk of educational failure. It is important to recognise the term at risk applies not only, for example, to children from abusive homes with offending behaviours, but that it could equally apply to students who are placed at risk because their needs are not being met by their schools.
Data from the recent Programme for International Student Assessment (PISA), for example, show that although New Zealand is among the top performing countries in reading, maths and science literacy, there is also a very wide spread of scores within schools. This indicates every school is working with a diverse range of student abilities and every school will have its own group of at risk students. It is each school's responsibility to do a 'stocktake' of its at risk students as a first step towards establishing programmes that change the school environment and provide more effective support to help children improve their educational outcomes.
The innovations funding pool
Increasingly schools are responding to the needs of at risk students in ways that go beyond traditional responses such as exclusion, providing low-challenge programmes, or holding society or parents responsible for problem behaviour at school.
To encourage schools to continue to come up with new ideas and programmes to support at risk children, the Ministry of Education established the innovations funding pool. The pool helps schools to trial innovative programmes for students at risk of poor educational outcomes. Participating schools introduce programmes designed to improve, over time, the educational achievement of at risk students. The pool provides schools with money to get the programmes under way, with schools or their communities also contributing funding and other resources. The goal is for the programmes, if they prove to be effective, to become self-funding.
As well as improving the educational outcomes for at risk students, programmes funded by the innovations funding pool are also expected to:
- identify examples of excellent New Zealand practice that can be used by other schools at reasonable cost
- increase principals', teachers' and trustees' understanding of the principles of effective programmes
- increase networking between schools with similar interests and needs
- promote effective methods of meeting the needs of students at risk.
So far the IFP has funded more than 60 projects.
How does the innovations funding pool work?
Programmes must be school-based - the goal is to help schools to implement their ideas to change school environments to ensure they better meet the needs of students. This is done in conjunction with community input where appropriate.
Schools apply to the Ministry of Education for funding for a set period to support the introduction of an innovative programme. Schools must also provide some resources, and eventually, if the schools think it worthwhile to continue with their programmes in the long term, the programmes are to be self-sustaining.
Schools set up the project, develop the methodology (including evaluation tools, which the external evaluators can help set up), prepare staff and other participants to conduct the programme, implement and evaluate the programme, and report on the outcome. Delivery is monitored through external evaluation, milestone reports provided by the school, and site visits.
The full criteria for applications are available from the Ministry of Education website. However, because funding is limited, schools are encouraged to consider setting up a programme using their own and/or community resources.
What is an innovative programme?
In this context, an innovative educational programme can be broadly defined as one where the school is trying something different to improve the educational outcomes for at risk children. It is a programme that focuses on changing the school environment to make it more supportive of at risk children. That might mean a totally new and imaginative approach or a new combination of existing approaches to address the underachievement of at risk students. However, the programme must be based on sound educational principles as established in international and national research.
Programmes can be in-school or involve a cluster of schools. They can also involve the community or an external provider.
So far, the innovations funding pool has funded a diverse range of programmes throughout New Zealand catering for students from four years to 18 years of age.
Projects included mentoring programmes, outdoor education programmes, literacy and numeracy projects, programmes that included a focus on health, well-being or self-esteem, early intervention programmes, behavioural modification programmes, and more.
Although some programmes existed prior to the establishment of the innovations funding pool, they are considered 'new' in the sense that it is the first time their effectiveness in a school setting has been evaluated.
Establishing successful innovations for at risk children
The evaluations
To determine whether the programmes funded through the IFP were effective in raising student achievement and to identify the factors that contributed to successful innovations, the Ministry of Education commissioned UNITEC and subsequently the University of Auckland to evaluate the first two rounds.
The evaluation was carried out in two phases. The first evaluated the 17 projects funded in Round 1 while the second looked at the next 15 projects funded in Round 2.
This booklet does not attempt to summarise the evaluations, but draws on the findings that can be used to point to effective practice for other schools wanting to take a new approach to what they do for at risk students. Indeed, many of the principles can be applied to the implementation of innovative projects in general.
Schools looking for innovative ways to support at risk students can use this information in their programme planning to ensure their resources (time, staffing, and money) are used effectively.
How the evaluation was done
The evaluators looked at the effectiveness of each programme across a range of variables. The most significant of these were variables related to:
- the extent of change in students' academic achievement and behaviour
- the administration and management of the innovation
- the willingness and capability of programmes to undertake their own and external evaluation.
Data included standardised achievement tests and course results, school data such as attendance and suspension figures, teacher and parent ratings on behaviour and students' self-ratings on factors such as self-esteem.
Data on each variable were analysed to determine the size of the effect. 'Effect size' is a recognised statistical tool for measuring the magnitude of an effect, instead of simply measuring whether there was an effect. Using effect size is especially useful for comparing widely diverse programmes, as in this evaluation, and for overcoming problems associated with small samples.
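As a sketch of the idea, a standardised effect size such as Cohen's d divides the difference between group means by their pooled standard deviation, so results from programmes measured on different scales can be compared on a common footing. The scores below are hypothetical and for illustration only; they are not figures from the evaluation.

```python
from statistics import mean, stdev

def cohens_d(pre, post):
    """Cohen's d: difference in means scaled by the pooled standard deviation."""
    n1, n2 = len(pre), len(post)
    s1, s2 = stdev(pre), stdev(post)
    # Pooled standard deviation across the two sets of scores
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(post) - mean(pre)) / pooled

# Hypothetical pre- and post-intervention reading scores, illustration only
pre = [40, 45, 50, 55, 60]
post = [55, 60, 65, 70, 75]
print(round(cohens_d(pre, post), 2))  # a d of about 1.9: a large effect
```

A d of around 0.2 is conventionally read as a small effect, 0.5 as medium and 0.8 or above as large, which is why effect size conveys more than a bare yes/no significance test.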
For more information on how the evaluation was conducted, refer to Appendix 1.
Characteristics of successful innovations
The idea
The first step in introducing a successful innovation is to make sure the idea is a good one. This is achieved by ensuring the initiative is based on well-established research evidence. There must also be clear links between the innovation and the desired outcomes. For example, it is unrealistic to expect significant change in academic achievement in interventions that are socially-based and do not directly include academic subjects. It is also essential that the idea is adapted to the local context. Previous programmes should not necessarily be followed slavishly, but should be used to identify the key elements of success in an innovation.
Management
A high level of organisation is required if a project is to be successful, and it is essential to have appropriate levels of commitment, resources and planning.
It is important to have adequate resources to implement the programme, including high quality staff with time dedicated to carrying out the innovation.
Leadership and support from the principal are critical factors in ensuring the success of an innovation, although it is also important that the principal can delegate to a number of staff rather than to a single staff member. This is good risk management practice in case of staff changes or staffing shortages.
Good communication and regular meetings between all involved are essential, and it is important that all staff support the innovation.
Evaluation is essential and a programme must have both the willingness to evaluate and the expertise to carry the evaluation out. Without a clearly articulated and implemented plan of evaluation, it is difficult to determine the merit and worth of any innovation, or to provide evidence to further improve the programme. The research found that programmes that took a strong approach to evaluation and monitored progress regularly were more likely to be successful.
The research found that staff and stakeholder perceptions of success or failure of a programme did not necessarily correlate with hard data on its success or failure. Schools need to provide dependable evidence of changes in students, beyond statements of satisfaction.
The research also found that the cost of a programme had no bearing on its effectiveness and some of the most effective programmes cost the least. However, a programme needs to be fully costed and fully resourced, particularly in areas of staffing. Providers need to make provision in their planning for how they intend to make the programme self-sustaining.
Climate
School climate is important. It is essential that there is a match between the culture of the innovation and that of the school if the programme is to be integrated successfully into the wider school environment beyond the innovation stage. Research into effective schools suggests that an effective school environment is one that encourages experimentation and evaluation, and that the ability to integrate an innovation into the character of a school is a factor in predicting the innovation's success.
Summary
Factors which contribute to the success of an innovation are:
- the initial idea should be well-researched
- the management should include strong leadership and suitable resources
- the school climate should include a supportive organisational structure and a willingness and ability to evaluate the programme.
What can go wrong?
Many of the factors which contribute to programmes failing to raise the educational achievement of their students are the flip-side of factors that contribute to success. They include:
- not doing sufficient initial research
- not adapting the idea to local conditions
- lack of strong leadership
- not getting the support of the whole school
- being too ambitious
- failing to realise how much time and money is involved
- failing to monitor, modify and evaluate programmes.
An innovation can fail because it is introduced for reasons other than educational ones; for example, it could be introduced to appease community pressure, to appear innovative and/or to gain more resources.
Sometimes programmes can run into difficulty because key staff members leave, although this can be countered by sound management of the initiative, which would ensure the success of the programme did not rely on one person.
The most difficult areas of management for the programmes that were evaluated related to co-ordination, planning, timing and resources, including human resources. Most providers over-estimated their endurance and energy, and under-estimated the workload and time involved, particularly the time needed for co-ordination and administration.
It is important to provide administrative support for new programmes, yet few of the programmes evaluated were given extra support in this area. In many cases teachers had to do the paperwork; in others it was given to school administration assistants, which created further workload issues for the school organisation.
Co-ordinating complex innovations can be difficult. In some of the programmes evaluated, the co-ordinator had another teaching or executive role in the school, and many programmes did not have well-defined implementation plans, roles and job descriptions.
Schools must have the willingness and the capacity to evaluate if changes are to be effective and sustainable. Some schools were not used to evaluating new programmes, and did not keep useful or accurate data for assessing whether they were achieving their objectives. It is essential to have pre- and post-intervention test data in order to check if the ongoing programme has achieved its stated objectives or outcomes.
Funding was a challenge for many innovations. Only a few programmes were able to become an integral and sustainable part of the school, or be adopted by the community.
Examples of other problems encountered by programmes included:
- a high level of student absenteeism, making it impossible for the programme to deliver on its academic objectives
- a clash between an existing school culture and the vision of the innovation
- internal politics and negative perceptions among school staff towards an innovation
- lack of leadership and lack of a 'champion' for a project, which meant it was never fully implemented and hence could not meet its expected outcomes.
Case studies
This section looks at three innovative programmes that have been successfully introduced in schools. They were not the only effective programmes, but they were among the most successful across a wide range of measures. They have also all been successfully integrated into their respective schools or taken over and further developed by their communities.
Windley School FM Soundfield Innovation
This intervention illustrates how one school changed its environment - rather than changing its children - when it realised its teachers were, literally, not making themselves heard. The programme was primarily aimed at meeting the needs of five to eight year-olds with hearing difficulties but had the secondary effect of improving the acoustic conditions in junior classes for all students.
The innovation
In 1999, Windley School in Porirua installed a `phonic ear' FM sound amplification system in several junior classes to raise the educational achievement of children with hearing difficulties.
The system required the teachers to wear a microphone, and four high-quality speakers were strategically placed around the classrooms.
All students were assessed early in the year for both their achievement level and hearing and their progress was monitored through standard classroom assessment, based on national guidelines.
A total of 131 five to eight year-olds took part in 1999 and 157 five to eight year-olds took part in 2000. In 1999, the hearing tests took place in winter, and 55% of the children failed. In 2000, the hearing tests took place in summer, and 37% of the children failed.
While the sound system particularly supported children with a hearing difficulty, it also improved the listening environment for all junior school children.
Because of the nature of the innovation, all the junior school teachers were involved in the programme. However, there were also two positions specific to the implementation of the programme. These were the programme co-ordinator and a teacher aide who undertook administrative tasks such as data collection.
Results
The achievement of a sample of 96 students who had failed a hearing test was monitored extensively on a number of standard measures during 2000.
The tests included detecting rhymes, identifying syllables, detecting initial sounds, counting phonemes, comparing word lengths, phonemes to letters, BURT word recognition, pseudoword reading, knowledge of letter names, knowledge of consonant sounds, knowledge of vowel sounds, Daniels & Diack spelling, and pseudoword spelling.
Ten of the 13 assessments showed significant change over the year beyond what would be expected from natural maturation. Hearing assessments showed there had been only minor changes in the children's hearing, indicating any change in achievement levels could not be attributed to change in the children's physical ability to hear.
Teachers felt that the programme had a significant impact on the children both academically and behaviourally.
The researchers concluded that this innovation was successful and had achieved all its predicted outcomes.
What makes this a successful innovation?
The idea for the innovation was sound, simple, and effective. The programme was not designed to fix hearing difficulties, but to facilitate hearing for those with impaired hearing.
The school began with a clearly defined goal - to improve achievement for children through a direct intervention. Thus the goals and activities were clearly matched.
The initiative was well-managed. It was well-researched, well-resourced, well-organised and effectively monitored and evaluated.
The innovation was carried out in a supportive school climate, and all the teachers were involved. They were trained in using the amplification system, which required them to adapt to wearing a microphone. Thus the innovation had a shared vision.
Funding was provided to employ a co-ordinator with expertise in research. A teacher aide was also employed for assessment tasks.
The principal was an 'innovations champion' who employed appropriate staff when necessary, as well as supporting and encouraging the ongoing development of the programme.
In summary, this innovation was:
- a great idea
- efficiently managed
- well-researched
- effectively monitored and evaluated
- supported by the school climate.
Principal's view
If Windley School principal Columba Boyack was offered the choice of a computer or a sound system for his junior school classrooms, it'd be no contest. "I'd say the sound system because I know it will improve the children's learning. The results of the soundfield innovation show that after three years, children at Windley, a decile one school, are achieving at the same levels as nearby decile nine schools in some of the measures assessed," Columba said.
Columba said it was important teachers were trained to use the system, and they needed to be committed to wearing the microphone. "It's not good enough to say 'here it is' and leave it at that - the teachers have to be convinced of its value. We provide professional development in the use of the system at the beginning of every year."
The system also had to be supported by effective teaching practices. "The system needs to be used in conjunction with a junior school literacy system. It won't solve all literacy problems - but it puts children on an equal footing," Columba said.
He said anyone looking at introducing any innovation needed to get quality data on what they wanted to do. "Any innovation must be supported by good research." He also emphasised the need for ongoing data collection for measuring the effectiveness of the intervention.
The trial was so successful the school has since installed the sound system in every classroom. Funding for the extension came from the school's operational funding, at a cost of about $2500 per classroom. The school also budgeted a small amount for maintenance, such as replacing batteries or repairing accidental damage to the wires. "But it is virtually self-sustaining," Columba said.
Project Early
This programme existed prior to the establishment of the innovations funding pool, but the funding provided the first independent evaluation of the effectiveness of the programme. It is included here because it exemplifies the characteristics required for successful innovation. It continues to operate as the Project Early Charitable Trust, working in eight schools and several early childhood centres in eastern Christchurch.
The innovation
For the two years Project Early was under evaluation as an innovation, it operated in seven primary schools in Christchurch, and targeted children with severe anti-social behaviour disorders. Its prime focus was reducing academic and social risk through early intervention.
The programme used an integrated case study approach with the interventions taking place in both the school and the home. Children were referred by the school, the home, or both.
The programme helped children gain appropriate social skills by developing and implementing a comprehensive and integrated individual programme which involved all the significant adults in the child's life. The programme included training for teachers and caregivers to ensure they had the skills needed to bring about change for the children.
Teachers and parents reinforced expected standards of behaviour through a number of strategies they all agreed on. These included positive reinforcement where earned, and suitable discipline measures when a child's behaviour was inappropriate.
There were special programmes for selected students, such as a social skills programme which ran over five weeks, and there was also a positive parenting course.
A total of 107 students completed some part of the programme over the two years it was funded from the innovations pool - 53 in 1999 and 54 in 2000. Nearly two-thirds of the students taking part were male, and the children ranged in age from five to nine years.
The programme at school
Children were referred by the school for such behaviours as non-compliance, hitting or violence, disruption in class, stealing, lateness or non-attendance, and separation anxiety from parents.
The target behaviours were compliance, remaining on task, more appropriate playing with peers, no stealing, following instructions, working independently, leaving others alone, improving class behaviour and attendance, and speaking pleasantly.
The intervention procedures for teachers and caseworkers consisted of providing praise and reward, providing clear instructions, using time out, using buddies, teacher or principal awards, a social skills programme, a separate work area, checking the child was not still stealing, and privileges.
The programme at home
The main reasons for referrals from home were non-compliance, disorganisation (usually in the morning getting ready for school), swearing, hurting others, stealing, bedtime routine, running away, and separation anxiety.
The home target behaviours were compliance, speaking pleasantly, establishing routines, no hurting, no stealing, establishing bedtime routines, playing appropriately.
The intervention procedures were a combination of praise, rewards and time out.
Results
The evaluation found that for the school-based interventions, on average, there was a 126% increase in desirable behaviour (or decline in negative behaviours) between the baseline data and the end of the intervention.
For the home-based interventions, there was, on average, an 87% increase in desirable behaviours (or decline in negative behaviours) between the baseline and the end of the intervention, and a 222% increase in desirable behaviours between the baseline and the follow-up.
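The percentages above express change relative to the baseline measurement, so a figure over 100% means the desirable behaviour more than doubled. The ratings below are hypothetical, for illustration only, and are not data from the report.

```python
def percent_change(baseline, followup):
    """Percentage increase from the baseline measurement to the follow-up."""
    return (followup - baseline) / baseline * 100

# Hypothetical behaviour ratings, illustration only: a rating that rises
# from 2.7 at baseline to 6.1 at follow-up has more than doubled.
print(round(percent_change(2.7, 6.1)))  # prints 126
```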
The evaluation concluded that the programme had met all its expected outputs and outcomes in 1999 and 2000.
What made this a successful innovation?
The programme began with a strong research base: the case management approach for early intervention is a well-researched and established intervention.
The principals and staff of all the schools involved were clear about the goals for Project Early.
The programme was highly organised, with all aspects of policy and procedure clearly articulated in writing. It had a project office hosted by Aranui School.
The programme had a high commitment to evaluation and the capacity to carry it out.
Procedures and progress were monitored regularly by a monitoring committee and an executive committee. This meant that any issues that arose could be dealt with effectively and efficiently.
The programme systematically collected extensive information on individuals which was useful for evaluating the success of the interventions.
There were frequent meetings between caseworkers, principals and staff, ensuring full communication was maintained.
Caseworkers developed strong links with the community; for example, with specialist education service workers and community health nurses.
In summary, this innovation was successful because of its:
- strong idea
- clear goals
- strong implementation and evaluation plan.
Principal's view
Project Early started in 1995 after a group of principals decided they needed such a programme. It has since become a successful and established early intervention programme.
Ginnie Warren, current principal of Aranui Primary, the programme's host school, said factors contributing to the programme's success included the quality and expertise of the caseworkers, the management of the programme through a local executive committee and key stakeholders, and the ability to provide interventions in both the home and the school.
"To ensure the programme's sustainability, the participating schools formed an executive committee to provide consistent, effective management," she said.
"Other important issues have been to establish policies and procedures to ensure quality assurance, and to provide professional supervision for caseworkers," she said.
"It is also essential to collect and analyse data on a regular basis so you know the programme is effective, and to monitor the programme regularly."
Ginnie advised anyone thinking of introducing a similar innovation to take time to secure quality caseworkers, and said providing security of tenure helped in getting good staff.
The financial aspects needed to be carefully managed. "It is important to secure funding as this has been a huge hurdle for the executive committee," she said.
"We have ensured the programme is sustainable by building financial relationships with funding providers. But the main challenge we continue to face is securing permanent funding."
Mt Roskill Community Literacy Project
This is a programme involving a cluster of schools on one campus in Mt Roskill, Auckland that has fostered a strong partnership with the community. The programme has been handed over to the community and is now managed by the Mt Roskill Police Community Project.
The innovation
This programme involved the Mt Roskill Grammar, Middle and Primary schools.
The programme aimed to achieve literacy for students by means of a 'whole family' mentoring intervention. The underlying assumption was that fostering a home environment where literacy and learning were viewed as important would enhance student (and family) learning, subsequently reducing the overall at risk status of the participants.
A number of criteria were used to select students, including that they:
- were significantly behind their peers in reading
- displayed inadequate oral and written language skills for their year level
- had not completed a significant amount of their homework
- had poor school attendance
- displayed inadequate social skills
- had previously participated in school remedial programmes
- had parents with basic literacy levels
- had little or no parental support for their learning.
Parents and caregivers also had to make a 20-month commitment to the programme.
Nine families and 20 students were involved in the programme.
The programme consisted of mentor and tutor visits to participating families in their homes for approximately two hours a week. Each family had six terms on the programme, with two terms of follow-up. During each visit, the family and students were engaged in activities designed to improve and develop literacy skills. The families were also provided with access to resources such as reading materials.
The programme was monitored in various ways, including regular meetings with the principals' committee and the programme manager. School records provided the academic and social baselines, and detailed individual academic profiles were kept. These included information on vocabulary levels, reading comprehension, writing samples, and five-weekly running records for oral and written language. Tutors also kept formal written records and reported weekly to the literacy coordinator.
Results
In terms of change in student achievement, there were marked increases in comprehension and oral language. Most of the gains accrued between Terms 1 and 2, and were then maintained at the Term 2 level.
The evaluation also found there were improvements in social achievement such as improved self-concept.
All stakeholders were positive about the programme. Principals noted increased involvement of parents in the school, and teachers and support staff were positive about the way the programme broke the cycle of low literacy levels.
The evaluation concluded that overall the programme could be considered most successful.
What made this a successful innovation?
Again, this is a programme where the idea behind it was well developed, well researched, and suited a need in the community and schools.
The programme has been adapted to fit its environment and to suit a developing partnership with the community.
There was a clear vision, and it had the strong support of school principals.
There was extensive community consultation, and the programme fostered a strong partnership with, and ongoing support from, the community.
There was a high level of expertise, vision and management from the staff involved in the programme. Staff were carefully selected and given full training, and there was communication between all the staff involved.
The programme is well-known and appreciated by staff throughout the participating schools; that is, there is whole-school support.
There was sensitivity to the families who participated.
The programme had a high willingness and capacity to evaluate its work, and a culture of evaluation was evident in both participating schools and the project.
In summary, key factors in the success of this programme were:
- the high quality staff
- the vision and support of the schools' leaders
- co-operation between schools and with the community
- the willingness and capacity to evaluate.
Principal's view
The principal of Mt Roskill Grammar, Ken Rapson, puts the key to the Mt Roskill literacy project's success down to leadership, community support and having the right people to do the job.
Ken said the schools were always looking for ways to improve literacy, and being on the one campus provided the opportunity to work together rather than in isolation. The perception was that the students who were succeeding came from families with strong support for reading and literacy at home. "So we decided we needed to start in the home if we were to make changes."
The leadership and support of principals, senior staff and community leaders was particularly important in the early days of the programme. "The schools and the community have to take ownership, and then you have to get the right people in place to make it work," Ken said.
"We were lucky to get a good programme co-ordinator," he said. The co-ordinator then identified people from the local community and trained them to work with the families on the programme. "The tutors were amazingly dedicated ... they went far beyond what they were paid for, in order to make the programme work."
However, introducing such a programme was not easy. Above all, it required a lot of planning. "You need to be aware of the costs and the personnel demands. Getting the correct staff for any school initiative is important. With the financial aspects, we were careful to budget and stay within the parameters of the funding pool. But that was hard. You need careful financial management."
Appendix 1: Evaluation methodology
The researchers structured their evaluations on what is known as Stufflebeam's CIPP model for evaluation, which looks at four features in each programme: its context, the inputs, the process and the product.
The performance of each programme was rated against a cluster of variables, including:
- student change (socially, behaviourally and academically)
- parent, teacher and student satisfaction
- the administration of the innovation, including how well it was managed, the extent to which it was implemented, and the cost per child
- the willingness and capacity to evaluate, and the extent to which it completed its own evaluations
- the development, integration and sustainability of the innovation
- the amount of internal and external support the programme received
- the impact for Māori, Pākehā and other ethnic groups
- the effect of the innovation on teachers and on the whole school climate.
A range of data was collected from measurements, observations and interviews. This included standard achievement measures such as PAT scores, course levels achieved before and after the intervention, school data on attendance, truancy and suspension, and teacher ratings of the overall behaviour of students in the programme each year.
Each innovation was rated on a standard scale (1 to 10) for the depth, quality and effect of each of the variables.
'Effect sizes' were used, where feasible, to provide a standardised measure of the programmes' effectiveness across the range of variables.
'Effect size' is a recognised statistical and analytical tool. Instead of simply measuring whether or not there was an effect (usually expressed in terms of statistical significance), it measures the magnitude of the effect, and is thus useful for comparing widely diverse programmes, as in this research, and for overcoming problems associated with small samples.
An effect-size of 1.0 indicates a difference of one standard deviation between two measures. This could be between the average score for students in an experimental group and the average score of students in a control group, or it could be between the average score for a group of students at one point in time and the average score for the same or similar group of students at a later point in time.
An effect size of 1.0 would be considered a very large impact. A more common result for an innovation would be an improvement in student achievement with an effect size of 0.40, which has become the benchmark figure for measuring the effectiveness of a school innovation. However, this figure relates to innovations for all students; for innovations aimed specifically at at risk students, the benchmark figure is 0.50. That is, for a programme to have made a worthwhile difference in a particular area, such as improving student achievement, it would have to show an effect size of 0.50 or better.
The benchmark for at risk students is higher than for all students because such programmes have more specific outcomes, such as reducing truancy levels or increasing on-task behaviours, rather than more general goals such as increasing achievement in English literature.
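The effect-size calculation described above can be sketched as follows. This is an illustrative example only: the function names and the reading scores are hypothetical, not drawn from the research, and the formula shown is the standard standardised mean difference (Cohen's d) consistent with the definition above.

```python
import math

def pooled_sd(sd1, n1, sd2, n2):
    """Pooled standard deviation of two groups of sizes n1 and n2."""
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def effect_size(mean_after, mean_before, sd):
    """Difference between two means expressed in standard-deviation units."""
    return (mean_after - mean_before) / sd

# Hypothetical reading scores for a group before and after a programme:
sd = pooled_sd(sd1=10.0, n1=20, sd2=10.0, n2=20)   # pooled SD = 10.0
d = effect_size(mean_after=48.0, mean_before=42.0, sd=sd)
print(round(d, 2))  # a 6-point gain on a 10-point spread gives d = 0.6
```

On these illustrative numbers, the programme would clear the 0.50 benchmark for at risk students, while a gain of only 3 points (d = 0.30) would fall short of even the general 0.40 benchmark.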
The two evaluations of the innovations funding pool programmes are more complex and comprehensive than the references in this book suggest. The book draws only on those features of the research that are of direct relevance to promoting effective practice for schools wanting to introduce innovations to improve the educational achievements of at risk children.