Education Counts

Trends in measured research quality: An analysis of PBRF Quality Evaluation results

Publication Details

This report analyses the results of the 2003 and 2006 Performance-Based Research Fund (PBRF) Quality Evaluations to identify the demographic and employment-related characteristics associated with change in the measured quality of research produced by around 2,000 staff who participated in both Quality Evaluations. This report forms part of a series called 'Research and knowledge creation.'

Author(s): Warren Smart, Tertiary Sector Performance Analysis, Ministry of Education.

Date Published: July 2008

Summary

  • This study analysed the change in measured research quality of around 2,000 staff who participated in both the 2003 and 2006 PBRF Quality Evaluations.
  • All three research component scores increased on average for these staff, but the average peer esteem and contribution to the research environment scores rose much faster than the average research output score.
  • The study also found that the prior performance of these staff in the 2003 Quality Evaluation was a key factor associated with improvement in their measured research quality, controlling for other factors. Staff with lower measured research quality in the 2003 Quality Evaluation were more likely to improve their measured research quality in the 2006 Quality Evaluation.

This study used a mix of descriptive statistics and statistical modelling to identify how the demographic and employment-related characteristics of around 2,000 staff who participated in both the 2003 and 2006 Quality Evaluations were associated with changes in the level of their measured research quality.

Although other studies have also examined the change in measured research quality between the 2003 and 2006 Quality Evaluations,¹ this new study was more selective, focusing solely on staff who had evidence portfolios assessed in both Quality Evaluations and who were not identified as new and emerging researchers. These staff were the only participants in the 2006 Quality Evaluation who had their research quality measured by peer-review panels in both Quality Evaluations and who were assessed under a similar scoring system. While this means that some important groups are omitted from the analysis, focusing on this particular group of researchers gives a clearer picture of the components of change in measured research quality.

It should be noted that, because staff and tertiary education organisations ‘learned’ from the experience of the first Quality Evaluation in 2003, the standard of presentation of evidence portfolios improved in the 2006 Quality Evaluation, and this was a contributing factor to the improvement in measured research quality (Tertiary Education Commission, 2007). Because of this, part of any improvement in measured research quality identified in this study may be a consequence of better presentation of evidence portfolios rather than a ‘real’ lift in the quality of the research produced.

Two analytical approaches were used in this study to identify the factors associated with changes in measured research quality. First, descriptive statistics were used to examine these changes. The descriptive analysis showed that:

  • Between 2003 and 2006, the average peer esteem (PE) and contribution to the research environment (CRE) scores increased by much more than the average research output (RO) score. This greater improvement in the PE and CRE scores, compared with the RO score, suggests that the improvement in measured performance was at least partly due to better presentation of evidence portfolios, because there is a greater subjective element in the assessment of these two dimensions. Nonetheless, the RO score also rose, albeit to a lesser extent than the other research component scores. Given that this component is potentially less affected by improved presentation of evidence portfolios, this provides some evidence of a genuine increase in the quality of the research carried out by the staff selected in this study.
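The descriptive step described above amounts to comparing the mean change in each component score between the two Quality Evaluations. A minimal sketch, using made-up illustrative numbers (not actual PBRF scores or the real scoring scale):

```python
# Illustrative sketch only: synthetic component scores for the same four
# (hypothetical) staff in 2003 and 2006 -- not actual PBRF data.
scores_2003 = {"RO": [4.0, 3.5, 5.0, 2.5], "PE": [2.0, 1.5, 3.0, 1.0], "CRE": [2.0, 2.5, 1.5, 1.0]}
scores_2006 = {"RO": [4.2, 3.8, 5.1, 2.9], "PE": [3.0, 2.5, 4.0, 2.0], "CRE": [3.0, 3.5, 2.5, 2.0]}

def mean(xs):
    return sum(xs) / len(xs)

# The descriptive comparison: mean change per research component.
for component in ("RO", "PE", "CRE"):
    change = mean(scores_2006[component]) - mean(scores_2003[component])
    print(f"{component}: mean change = {change:+.2f}")
```

With these synthetic numbers the PE and CRE means rise by more than the RO mean, mirroring the pattern the study reports.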

However, a drawback of using descriptive statistics to identify the factors associated with change in research performance is that an apparent association between a factor of interest and research performance may in fact be due to the influence of other, confounding factors. To overcome this problem, the study then applied multiple regression to the sample dataset to further explore the association between the demographic and employment-related characteristics of staff and the change in measured research quality.
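The logic of that regression step can be sketched as follows: the change in measured quality is regressed on the 2003 score together with other staff characteristics, so each coefficient is read holding the remaining factors constant. This is a hypothetical illustration on synthetic data (the variable names, the single "age" covariate, and the ordinary-least-squares fit are assumptions for the sketch, not the study's actual model):

```python
import numpy as np

# Synthetic illustration of controlling for prior performance via
# multiple regression -- not the study's actual data or specification.
rng = np.random.default_rng(0)
n = 200
score_2003 = rng.uniform(0, 7, n)   # prior measured quality (hypothetical scale)
age = rng.uniform(25, 65, n)        # one employment-related covariate
# Synthetic "truth": lower 2003 scores improve more; older staff improve less.
change = 2.0 - 0.3 * score_2003 - 0.02 * age + rng.normal(0, 0.2, n)

# Design matrix with an intercept column; fit by ordinary least squares.
X = np.column_stack([np.ones(n), score_2003, age])
beta, *_ = np.linalg.lstsq(X, change, rcond=None)
print(f"intercept={beta[0]:.2f}, prior-score coef={beta[1]:.2f}, age coef={beta[2]:.3f}")
```

The fitted coefficients recover the negative effects of prior score and age simultaneously, which is exactly what a raw subgroup comparison cannot do when those factors are correlated.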

The results of the regression analysis showed that:

  • The prior performance of staff in the 2003 Quality Evaluation was a key factor associated with the change in their measured research quality between 2003 and 2006. Staff with lower performance in the 2003 Quality Evaluation achieved a greater level of improvement in their research quality, holding other factors constant. Because those who scored higher in the 2003 Quality Evaluation were less likely to have a lift in their quality category, this suggests that there is a nonlinear relationship between measured research quality and the quality categories.
  • Staff of higher academic rank achieved the greatest improvement in measured research quality, controlling for other factors.
  • Older staff achieved smaller improvements in measured research quality than younger staff, controlling for other factors.
  • Staff employed at the universities achieved greater improvements in measured research performance than staff at non-university tertiary education organisations, controlling for other factors.
  • Staff who submitted a greater number of research outputs in the 2006 Quality Evaluation than in the 2003 Quality Evaluation achieved greater improvement in measured research quality, controlling for other factors. However, the improvement associated with submitting more research outputs was relatively small and was mostly restricted to staff of lower academic rank.
  • In terms of subject area, staff in the ‘Māori knowledge and development’ panel showed smaller improvements in quality than staff assessed by other subject panels, controlling for other factors. However, the number of staff from this panel in the sample was small, so caution should be used in interpreting this result.

It is important to remember that this study analysed the performance of a select group of staff: those who participated in both the 2003 and 2006 Quality Evaluations, had an evidence portfolio assessed both times, and were not identified as new and emerging researchers in 2006. Therefore, it should not be assumed that the findings of the analysis apply to all participants in the PBRF Quality Evaluations.

Finally, the fact that the apparent associations between the various subgroups examined in this study and increases in research quality changed once regression analysis was applied shows that care should be taken when using raw summary statistics to analyse changes in performance. In particular, controlling for the level of measured research performance in 2003 was crucial to separating out the confounding effects of individual staff characteristics on measured research quality.

Footnote

  1. See Cinlar and Dowse (2008a, 2008b, 2008c) and White and Grice (2008).

Downloads

  • Full Report (PDF, 478.4 KB)
