The measurement of the research performance of Tertiary Education Organisations: An analysis of the impact of weightings in the PBRF
Analysis of the methodology used to assign quality categories in the 2003 quality evaluation found that the actual level of variation in the research performance of TEOs was less than that indicated in the published results, but that the published results reflect the outcomes sought from the Performance-Based Research Fund (PBRF) policy.
Author(s): Warren Smart, Tertiary Sector Performance Analysis and Reporting, Ministry of Education.
Date Published: June 2005
The Performance-Based Research Fund (PBRF), implemented in 2004, was designed to reward excellence in research. As a result, for the first time in New Zealand, a comprehensive measurement of the relative research performance of Tertiary Education Organisations (TEOs) was undertaken and then reported.
The measure used to rank the research performance of PBRF eligible staff at TEOs in the 2003 assessment, quality category score per full-time equivalent (FTE) staff member, indicated a significant degree of variation in performance1. Although some variation in performance is to be expected, the degree to which this variation reflects the actual research performance of TEOs, or is influenced by the methodology used to calculate the quality category score, is the subject of this paper. Generally, universities were the top performers, while polytechnics, colleges of education and some private training establishments received significantly lower quality scores.
Weightings are applied in several areas of the PBRF to formulate funding allocations for TEOs2. The key weighting involves the allocation of 60 percent of the total PBRF funding to the quality evaluation of researcher performance, 25 percent to research degree completions and 15 percent to external research income. Weightings are also applied in other areas. These include the weighting of graduate completions by subject area and graduate ethnicity, and the weighting of the various performance measures that make up the quality evaluation3.
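The three-way funding split described above can be sketched as follows. Only the 60/25/15 weights come from the PBRF design; the component shares and the helper function below are hypothetical, included purely to show how the weights combine into a TEO's overall funding share.

```python
# Sketch of the PBRF funding split: 60% quality evaluation,
# 25% research degree completions, 15% external research income.
# The component shares below are hypothetical illustrations.

WEIGHTS = {
    "quality_evaluation": 0.60,
    "degree_completions": 0.25,
    "external_income": 0.15,
}

def weighted_share(components: dict) -> float:
    """Combine a TEO's share of each component pool into one funding share."""
    return sum(WEIGHTS[k] * v for k, v in components.items())

# Hypothetical TEO holding 10% of every component pool:
example = {
    "quality_evaluation": 0.10,
    "degree_completions": 0.10,
    "external_income": 0.10,
}
print(weighted_share(example))
```

A TEO with an equal share of all three pools receives that same share of the total fund; the weights only matter when a TEO's performance differs across components.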
In this paper, two areas where weightings were applied in the PBRF are investigated to measure the degree to which they impact on the relative performance of TEOs.
Firstly, data for all PBRF eligible staff is used to analyse the impact of the weighting of A, B, C and R staff on the relative performance of TEOs in the quality evaluation. This analysis involves comparing the relative performance of TEOs using the published weighted measure of quality category score per FTE, with a measure where the weightings are removed.
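The weighted-versus-unweighted comparison described above can be sketched numerically. The category weights used here (A=10, B=6, C=2, R=0) are an assumption for illustration, as the extract does not state the official values, and the staff numbers are hypothetical; the point is only that the same staff profile yields different relative measures once the weightings are removed.

```python
# Sketch of the comparison: quality category score per FTE using
# category weightings versus an unweighted measure that counts each
# A/B/C staff member equally. The weights (A=10, B=6, C=2, R=0) are
# assumed for illustration; staff FTE counts are hypothetical.

WEIGHTED = {"A": 10, "B": 6, "C": 2, "R": 0}
UNWEIGHTED = {"A": 1, "B": 1, "C": 1, "R": 0}

def score_per_fte(staff_fte: dict, weights: dict) -> float:
    """staff_fte maps quality category -> FTE of staff in that category."""
    total_fte = sum(staff_fte.values())
    return sum(weights[c] * fte for c, fte in staff_fte.items()) / total_fte

teo = {"A": 20, "B": 80, "C": 60, "R": 40}  # hypothetical TEO, 200 FTE
print(score_per_fte(teo, WEIGHTED))    # weighted measure
print(score_per_fte(teo, UNWEIGHTED))  # weightings removed
```

Because the weighted measure rewards A-rated staff far more heavily than C-rated staff, two TEOs with the same proportion of research-active staff can rank very differently under the two measures.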
Then, we analyse the impact of the methodology used by the peer review panels in assigning quality categories to those staff that were assessed by a PBRF assessment panel in the 2003 quality evaluation. This includes analysing the effect of weighting research outputs, peer esteem and contribution to research environment within the quality evaluation and the process used to classify staff into their various quality categories.
The analysis of relative TEO performance, using solely information relating to those whose evidence portfolios were quality assessed, is useful in investigating the methodology used by the peer review panels. However, as not all staff had evidence portfolios forwarded to the peer review panels, those TEOs with a larger proportion of non-panel-assessed staff will be advantaged, as excluding these staff inflates their research scores on a per FTE basis4. Therefore, the relative performance of TEOs, using only panel assessed staff data, should not be compared with measures of TEO performance that include all PBRF eligible staff.
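The inflation effect described above can be illustrated with a small numeric example. The scores and staff numbers below are hypothetical; the sketch simply shows that dropping non-panel-assessed staff (who contribute no panel score) from the denominator raises a TEO's average.

```python
# Hypothetical illustration of per-FTE inflation: a TEO with 4
# panel-assessed staff and 6 staff who were not panel assessed
# (contributing a score of 0). Averaging over panel-assessed staff
# only inflates the measure relative to averaging over all staff.

def avg_score(scores):
    """Average quality category score per staff member."""
    return sum(scores) / len(scores)

panel_assessed = [6, 6, 2, 2]          # hypothetical panel scores
all_staff = panel_assessed + [0] * 6   # non-assessed staff score 0

print(avg_score(all_staff))        # average over all eligible staff
print(avg_score(panel_assessed))   # average over panel-assessed only
```

The panel-assessed-only average here is two and a half times the all-staff average, which is why the paper cautions against comparing the two kinds of measure directly.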
- For example, the best performing TEO, the University of Auckland, achieved an average quality category score of 3.94 per PBRF eligible FTE, whereas the lowest performing TEO, Bethlehem Institute of Education, received a score of 0.
- See Tertiary Education Commission (2003) Performance-Based Research Fund, Evaluating Research Excellence, the 2003 assessment, for a detailed explanation of the weightings and methodology used in the PBRF.
- See Ministry of Education (2005) An analysis of funding allocations for staff and research degree completions in the Performance-Based Research Fund, for an analysis of how the weightings applied in the PBRF impacted on the funding allocations to staff and research degree completions.
- For universities, the proportion of PBRF eligible staff that were not panel assessed ranged from 66 percent for the Auckland University of Technology to 8 percent for the University of Canterbury.