The PBRF Quality Evaluation 2025 Recommendations Introduce New Impact Considerations for New Zealand Institutions

Jun 24, 2022 | Blog, National Assessments

On 20 June 2022, New Zealand’s Tertiary Education Commission (TEC) agreed in principle to the recommendations made by the PBRF Sector Reference Group (SRG) on changing the research definitions and the design of Evidence Portfolios for the upcoming national Quality Evaluation (QE) 2025.

The report recommends that, for the first time since the PBRF’s inception in 2002, its definition of research excellence should be extended to include impact – amongst other areas such as heightened inclusion and consideration of Māori and Pacific research. Excellence will then be assessed on “the production and creation of knowledge… [and] the dissemination and application of that knowledge within academic and/or other communities and its impact outside the research environment”.

The recommendations include the following definition of impact, which is in line with comparable national assessment frameworks from around the world:

“For the purposes of the Quality Evaluation, the impact of research is defined as a positive effect on, change, or benefit to society, culture, the environment, or the economy at any level, outside the research environment.

“Impacts on scholarship, research, or the advancement of knowledge within the research environment are not included.”

In short, this means that all New Zealand institutions that submit to the Quality Evaluation 2025 (the major funding component of the PBRF) should be prepared to demonstrate – with evidence – the impact that their research has had beyond the research environment.

Understanding the PBRF and QE

The PBRF – the Performance-Based Research Fund – is New Zealand’s process for assessing the research performance of Tertiary Education Organisations (TEOs), including universities, colleges and research institutions, and allocating funding accordingly. It is designed to “increase the quality of research by encouraging and rewarding excellent research in Aotearoa New Zealand’s degree-granting organisations”.

Importantly, the PBRF does not fund research directly, but rather supports research capacity and capability. It is the second-largest fund administered by the TEC, after the Student Achievement Component funding; the TEC is a Crown entity that invests over $3 billion NZD of government funding into tertiary education every year, supporting more than 700 institutions.

The PBRF comprises three funding components: the Quality Evaluation, which accounts for 55% of the funding allocation; postgraduate research degree completions, which make up a further 25%; and the annually assessed external research income, which constitutes the final 20%.
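As a rough illustration of how those percentages divide the overall pool (using the approximate $315 million figure cited below, and ignoring how performance is actually measured within each component), a back-of-the-envelope calculation looks like this – the figures and names here are purely illustrative:

```python
# Illustrative only: splits a hypothetical PBRF pool across the three
# funding components using the headline percentages (55% / 25% / 20%).
# The real allocation within each component depends on measured performance.

POOL_NZD_MILLIONS = 315  # approximate current pool cited in this post

COMPONENT_SHARES = {
    "Quality Evaluation": 0.55,
    "Research degree completions": 0.25,
    "External research income": 0.20,
}

for component, share in COMPONENT_SHARES.items():
    print(f"{component}: ~${POOL_NZD_MILLIONS * share:.1f}m")
```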

Since the first Quality Evaluation round in 2003, there have been four rounds of QE, with the most recent taking place in 2018. In that time the budget has risen from a relatively modest $18.2 million for the 2004 funding year to $315 million in the current pool. The potential rewards for successful individuals and institutions are therefore very high.

This model is somewhat analogous to cyclical national assessment exercises such as the UK’s REF (2014 and 2021) and Hong Kong’s RAE (2018), which allocate funding based on institutional performance. One key difference, however, is that the PBRF assesses the Evidence Portfolios of individual researchers rather than groups, while still allocating the funding to the institution.

Interestingly, since the PBRF was introduced, over half of its funding has gone to just two universities; so will the changing definitions of research excellence create an opportunity for other institutions to receive a greater slice of the pie?

Preparing for Impact in New Zealand

The SRG will continue to operate until the publication of the final QE 2025 Guidelines in June 2023. This draws obvious parallels with both Australia and the UK, where the guidelines for what needs to be submitted to the ERA/EI and REF respectively are not announced until relatively close to the submission deadline.

With huge bodies of work to draw from, this frequently causes big problems for researchers, administrators and other teams involved in the submission process. Which are the best projects to showcase? How can these be positioned in the best light? And where is the evidence of impact for projects that were carried out years ago, much earlier in the funding cycle?

All too often the tight timescales between the announcement of the guidelines and the actual submission deadline lead to last-minute rushes and can create huge amounts of stress and overwork for all involved.

So what can be done to resolve this and ensure that institutions and individuals are prepared for the introduction of impact?

One of the most useful techniques is to gather evidence as you go. Having to pore over years of past work, and then search for supporting information that may since have been removed or deleted (which can easily happen with material posted online), is hugely inefficient and can leave significant gaps in the supporting material.
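As a minimal sketch of the “gather evidence as you go” idea – not a description of any particular tool, and with file layout and field names assumed purely for illustration – you could capture a timestamped copy of an online item and tag it with the project it supports at the moment you find it:

```python
# Minimal sketch: snapshot an online evidence item as soon as you find it,
# so a copy survives even if the original page is later edited or removed.
# The file layout and field names are illustrative assumptions only.
import json
import pathlib
from datetime import datetime, timezone

import requests  # third-party: pip install requests


def capture_evidence(url: str, project_id: str, note: str, store: str = "evidence") -> pathlib.Path:
    """Save the page content plus minimal metadata to a local evidence store."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()

    record = {
        "project_id": project_id,
        "url": url,
        "note": note,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "content": response.text,
    }

    folder = pathlib.Path(store) / project_id
    folder.mkdir(parents=True, exist_ok=True)
    path = folder / f"{record['captured_at'].replace(':', '-')}.json"
    path.write_text(json.dumps(record, indent=2), encoding="utf-8")
    return path


# Example (hypothetical URL and project name):
# capture_evidence("https://example.com/news-item", "coastal-erosion-study",
#                  "Local press coverage of the community workshop")
```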

A popular way of responding to this challenge is to use the EvidenceVault within ImpactTracker, which provides a simple web/email clipping tool alongside comprehensive personal and institution-wide collections of evidence. It also enables you to easily associate evidence items with research projects for simple retrieval and demonstration of impact when required.

Another useful method is to structure research projects so that impact can be captured as it develops. Some impact can take years to manifest, but by choosing appropriate indicators it becomes much easier to track these changes over time. You may prefer to adapt your existing project management techniques to incorporate certain impact indicators, or to use the planning, indicator selection and SDG-alignment features within ImpactTracker.
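As an illustration of what choosing indicators and tracking them over time might look like in practice – a hypothetical structure with assumed names and fields, not any tool’s actual data model – each project could carry a small set of indicators with dated observations and a pointer to the supporting evidence:

```python
# Hypothetical sketch of tracking an impact indicator over a project's life.
# Class names, fields and example values are illustrative assumptions only.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Observation:
    when: date
    value: float
    evidence: str  # e.g. a link or reference to a stored evidence item


@dataclass
class Indicator:
    name: str
    unit: str
    observations: list[Observation] = field(default_factory=list)

    def record(self, when: date, value: float, evidence: str) -> None:
        self.observations.append(Observation(when, value, evidence))

    def change_since_baseline(self) -> float:
        """Difference between the latest and earliest recorded values."""
        ordered = sorted(self.observations, key=lambda o: o.when)
        return ordered[-1].value - ordered[0].value if len(ordered) > 1 else 0.0


# Example: a hypothetical teaching-related indicator tracked across two years.
participation = Indicator("Schools running the mapping programme", "schools")
participation.record(date(2023, 3, 1), 2, "evidence/mapping/baseline-survey.json")
participation.record(date(2024, 11, 1), 9, "evidence/mapping/follow-up-survey.json")
print(participation.change_since_baseline())  # 7
```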

Based on the SRG’s recommendations, the highest possible quality rating for the assessment of Evidence Portfolios in 2025 – Quality Category A – will be awarded where the “EP contains evidence of activity that is recognised by peers as outstanding, representing the leading-edge in its field (including if appropriate through international publication or dissemination), demonstrates very significant contributions to the research environment, and/or has led to very significant impact.”

It is therefore critical that individuals and institutions prepare properly and take the necessary steps now to support the demonstration of social, economic, cultural and environmental impact – and eliminate the risk of the last-minute rush.

Examples of New Zealand Impact and Collaboration

In New Zealand, Crown Research Institutes (CRIs) already submit annual Impact Case Studies (ICS) under the Strategic Science Investment Fund (SSIF), which determines their funding.

But impactful research is of course already being carried out all across the country. One example is this collaboration between the University of Canterbury and Frontiers Abroad, published by Ako Aotearoa on the TrackImpact.org global collaboration platform.

Bring your own device (BYOD) to field class: Integrating digital and community mapping in field-based coursework
Ako Aotearoa – addressing SDG 4

A two-year project utilising a field data collection app to incorporate technology, improve spatial awareness and sense of place, and develop community mapping programmes in geology and geography field classes. A collaboration between the University of Canterbury and Frontiers Abroad.

Learn more on TrackImpact.org.