
REF 2021: Quality ratings hit new high in expanded assessment

Four in five outputs judged to be either ‘world-leading’ or ‘internationally excellent’.


The quality of UK scholarship as rated by the Research Excellence Framework has hit a new high following reforms that required universities to submit all research-active staff to the 2021 exercise.

For the first time in the history of the UK’s national audit of research, all staff with a “significant responsibility” for research were entered for assessment – a rule change that resulted in 76,132 academics submitting at least one research output, up 46 per cent from 52,000 in 2014.

Overall, 41 per cent of outputs were deemed world-leading (4*) by assessment panels and 43 per cent judged internationally excellent (3*), which was described as an “exceptional achievement for UK university research” by Steven Hill, director of research at Research England, which runs the REF.

In the 2014 assessment 30 per cent of research got a 4* rating, with 46 per cent judged to be 3*.


Analysis of institutional performance by Times Higher Education now puts the grade point average at UK sector level at 3.16 for outputs, compared with 2.90 in 2014. Scores for research impact have also increased, from 3.24 to 3.35.

At least 15 per cent of research was considered world-leading in three-quarters of the UK’s universities.

And analysis by THE suggests that institutions outside London have improved their performance the most, with several Russell Group universities from outside the “golden triangle” of Oxford, Cambridge and London making major gains.

REF 2021 results at a glance (see full results table).

The results of the REF will be used to distribute quality-related research funding by the UK’s four higher education funding bodies, the value of which will stand at around £2 billion from 2022-23.

The requirement to submit all research-active staff was introduced following the review of the REF conducted by Lord Stern in 2016 and was designed to reduce institutional “game-playing” over which staff members were submitted.

Outputs in REF 2014 and 2021 (chart)

The uptick in quality may be driven by universities focusing on which of their researchers’ outputs should be submitted, rather than on which researchers to enter, allowing greater flexibility to pick “excellent” scholarship.

In the 2014 exercise, each participating researcher was expected to submit four outputs, but this time the number of outputs per researcher could range between one and five, with an average of 2.5 per full-time equivalent researcher expected. In the 2021 exercise, a single output was submitted for 44 per cent of researchers who participated.

University staff welcomed the new submission rules, which removed the “emotional pressure” caused by deliberations over whether they would be “in or out” of the REF – a decision that often had consequences for future promotions, said David Sweeney, executive chair of Research England. “There is no longer that same pressure on individuals,” he reflected.


However, the rule change has been linked to universities’ decisions to move many staff on to teaching-only contracts in recent years, with the latest data showing that about 20,000 more academics are employed on such terms than five years ago.

This change represented a welcome clarification of academics’ roles rather than “game-playing” on behalf of institutions, insisted Mr Sweeney. “If these contracts represent the expectations of institutions and the responsibilities of academics, that is not game-playing, it is transparency,” he said.

David Price, vice-provost (research) at UCL and chair of the REF’s main panel B (physical sciences, engineering and mathematics), agreed. “The REF may have helped in resolving many contractual ambiguities. Game-playing has not been noticeable,” he said.

Dame Jessica Corner, pro vice-chancellor (research and knowledge exchange) at the University of Nottingham, said that “less focus on individuals with the partial separation of outputs from academics has been helpful”.

“That outputs can be returned by institutions where individuals worked if they move jobs has reduced, though not entirely eliminated, the academic transfer market,” she added.

James Wilsdon, Digital Science professor of research policy at the University of Sheffield, agreed. “The large-scale transfers of people between institutions that we saw in the lead-up to REF 2014 have definitely reduced, which is positive,” said Professor Wilsdon, adding that while “choices around inclusion and exclusion of individuals with ‘significant responsibility’ have been complex in some institutions – particularly less research-intensive universities – some have welcomed the clarity that this brought to different roles in terms of research, teaching and hybrid roles.”

“The game-playing, where it occurs, is often more subtle: it’s about the gradual sifting and reordering of what kinds of research, and what kinds of impact, are deemed ‘excellent’,” he explained.

Kieron Flanagan, professor of science and technology policy at the University of Manchester, questioned the extent to which REF game-playing had been eliminated.

“There is bound to have been some of this happening because habits are hard to break – people who run university research have come up in a management system informed by REFs over the past 10 to 20 years – some game-playing is inevitable,” he said.

However, Jane Millar, emerita professor of social policy at the University of Bath, who chaired the social sciences REF panel, believed the Stern review reforms had “worked well”.

“We saw a great diversity in the submissions, from very small to very large, from well-established and new units,” said Professor Millar, who added that there had been “examples of world-leading and internationally excellent quality across the range”.


Interdisciplinary research was also “well presented”, with the Stern reforms encouraging greater links between subjects via “sub-panels whose reach stretched through to design and engineering, physical and/or biological sciences, humanities, biomechanics, and medicine”, added Professor Millar.

With an international review body examining the future of the REF, there has been some speculation that this could be its final incarnation.

But Mr Sweeney said that the REF remained an important tool in justifying the £9 billion or so in open-ended research funding likely to flow from the exercise.

[email protected]

POSTSCRIPT:

Print headline: REF submission rules help push quality to new high



Results of Research Excellence Framework are published


12 May 2022

The results of the UK-wide assessment of university research, conducted through the latest Research Excellence Framework (REF), have been published.

The 2021 assessment process has identified a substantial proportion of world-leading research across all UK nations and English regions, and across the full range of subject areas.

For the first time, the assessment included the submission of all staff with significant responsibility for research.

This means the results provide a unique insight into the quality of research conducted across the breadth of university activity.

Recognising excellence

The REF has recognised the wide distribution of excellent research across the UK, with over 80% of research judged to be world-leading (4*) or internationally excellent (3*) in each UK nation and English region.

It has also recognised the wide distribution of excellent research across a broad group of universities, of all sizes and types, with world-leading quality identified in 99% of participating universities.

REF assessment process

The REF provides a robust and thorough assessment of the quality of universities’ research in all disciplines, providing accountability for public investment in research and demonstrating the benefits of that investment.

In total, 157 UK universities participated, submitting over 76,000 academic staff.

The submissions included:

  • research outputs
  • examples of the wider benefits of research
  • evidence about the research environment.

This material was assessed by a series of expert panels comprising:

  • UK and international researchers
  • external users of research and experts in interdisciplinary research.

Executive Chair for Research England, David Sweeney, said:

Changes to the exercise have meant that we have been able to capture more of the excellent research undertaken by our globally-facing universities and the detailed results indicate that world-leading research is distributed widely across subjects, types of university, and in all parts of the UK. This particular exercise evidences the significant contribution research across the whole of the UK makes to the government’s levelling up agenda and reiterates that the UK higher education research sector is indeed playing its role in supporting the government to achieve its ambition as a science super power.




Mehmet Pinar, Timothy J Horne, Assessing research excellence: Evaluating the Research Excellence Framework, Research Evaluation, Volume 31, Issue 2, April 2022, Pages 173–187, https://doi.org/10.1093/reseval/rvab042


Abstract

Performance-based research funding systems have been used extensively around the globe to allocate funds across higher education institutes (HEIs), which has led to a growing literature examining their use. The UK’s Research Excellence Framework (REF) uses a peer-review process to evaluate the research environment, research outputs and non-academic impact of research produced by HEIs to produce a more accountable distribution of public funds. However, carrying out such a research evaluation is costly. Given this cost, and suggestions that the evaluation of each component is subject to bias, among other criticisms, this article uses correlation and principal component analysis to evaluate the REF’s usefulness as a composite evaluation index. As the three elements of the evaluation—environment, impact and output—are highly and positively correlated, removing an element from the evaluation leads to relatively small shifts in the allocation of funds and in the rankings of HEIs. As a result, future evaluations may consider removing some elements of the REF, or reconsider how the different elements are evaluated so as to capture organizational rather than individual achievements.

1. Introduction

Performance-based research funding systems (PRFS) have multiplied since the United Kingdom introduced the first ‘Research Selectivity Exercise’ in 1986. Thirty years on from this first exercise, Jonkers and Zacharewicz (2016) reported that 17 of the EU28 countries had some form of PRFS, and this had increased to 18 by 2019 (Zacharewicz et al. 2019).

A widely used definition of what constitutes a PRFS is that they must meet the following criteria (Hicks 2012):

Research must be evaluated, not the quality of teaching and degree programmes;

The evaluation must be ex post, and must not be an ex ante evaluation of a research or project proposal;

The output(s) of research must be evaluated;

The distribution of funding from Government must depend upon the evaluation results;

The system must be national.

Within these relatively narrow boundaries, there is significant variation between both what is assessed in different PRFS, and how the assessment is made. With regards to ‘what’, some focus almost exclusively on research outputs, predominantly journal articles, whereas others, notably the UK’s Research Excellence Framework (REF), assess other aspects of research such as the impact of research and the research environment. With regards to ‘how’, some PRFS use exclusively or predominantly metrics such as citations, whereas others use expert peer review, and others still a mix of both methods (Zacharewicz et al. 2019).1

This article focuses on the UK’s REF, which originated in the very first PRFS, the Research Selectivity Exercise in 1986. This was followed by a second exercise in 1989 and a series of Research Assessment Exercises (RAEs) in the 1990s and 2000s. Each RAE represented a relatively gentle evolution from the previous one, but there was arguably more revolution than evolution between the last RAE in 2008 and the first REF in 2014 (REF 2014), with the introduction of the assessment of research impact into the assessment framework (see e.g., Gilroy and McNamara 2009; Shattock 2012; Marques et al. 2017 for a detailed discussion of the evolution of RAEs in the UK). Three elements of research, namely research outputs, the non-academic impact of research and the research environment, were evaluated in the REF 2014 exercise. Research outputs (e.g., journal articles, books and research-based artistic works) were evaluated in terms of their ‘originality, significance and rigour’. The assessment of the non-academic impact of research was based on the submission of impact case studies describing the ‘reach and significance’ of impacts on the economy, society and/or culture that were underpinned by excellent research. The research environment element consisted of both data relating to the environment and a narrative environment statement. The environment data comprised the number of postgraduate research degree completions and the total research income generated by the submitting unit. The narrative environment statement provided information on the research undertaken, the staffing strategy, infrastructure and facilities, staff development activities, and research collaborations and contribution to the discipline. The quality of the research environment was assessed in terms of its ‘vitality and sustainability’, on the basis of the environment data and narrative statements (see REF 2012 for further details).

There has been criticism of several aspects of the assessment of research excellence in the REF, including the cost of preparation and evaluation of the REF, the potential lack of objectivity in assessing its components and the effect of quasi-arbitrary or opaque value judgements on the allocation of quality-related research (QR) funding (see Section 2 for details). Furthermore, the use of multiple criteria in assessing university performance, which is the case for the REF (i.e., environment, impact and outputs), has long been criticized (see e.g., Saisana, d’Hombres and Saltelli 2011; Pinar, Milla and Stengos 2019). Such multidimensional indices are risky, as some of the index components have been considered redundant (McGillivray 1991; McGillivray and White 1993). For instance, McGillivray (1991), McGillivray and White (1993) and Bérenger and Verdier-Chouchane (2007) use correlation analysis to examine the redundancy of different components of well-being when such indices are constructed. The main argument of these papers is that if the index components are highly and positively correlated, then including additional dimensions in the index does not add new information beyond that provided by the other components. Furthermore, Nardo et al. (2008) point out that a composite index with highly correlated components double-weights, and so overweights, the information captured by those components. This literature therefore argues that excluding a component from the evaluation does not lead to a loss of information if the evaluation elements are highly and positively correlated. For instance, using correlation analysis, Cahill (2005) showed that excluding any single component from a composite index produces rankings and achievements similar to the full composite index. To overcome these drawbacks, principal components analysis (PCA) has been used to obtain indices (see e.g., McGillivray 2005; Khatun 2009; Nguefack‐Tsague, Klasen and Zucchini 2011 for the use of PCA for well-being indices, and Tijssen, Yegros-Yegros and Winnink 2016 and Robinson-Garcia et al. 2019 for the use of PCA for university rankings). PCA transforms the correlated variables into a new set of uncorrelated variables, derived from the covariance matrix, that explain most of the variation in the existing components (Nardo et al. 2008).

This article will contribute to the literature by examining the redundancy of the three components of the REF, using the correlation between them to assess the relevance of each component to the evaluation. If the three elements of the REF are highly and positively correlated, then excluding one component from the analysis should not result in major changes in the overall assessment of universities and the funding allocated to them. This article will examine whether this is the case. Furthermore, we will also carry out PCA to obtain weights that would produce an index explaining most of the variation in the three elements of the REF while still providing an overall assessment of higher education institutes (HEIs) and a basis for distributing funding across them.

The remainder of this article is structured as follows. In Section 2, we provide details on how the UK’s REF operates, review the literature on the REF exercise and outline the hypotheses of the article. In Section 3, we describe the data used in this article and examine the correlation between the environment, impact and output scores; we also provide the details of the QR funding formula used to allocate the funding, demonstrate the correlation between the funding distributed in the environment, impact and output pots, carry out PCA using the achievement scores and funding distributed in each element, and present an alternative approach to the calculation of overall REF scores and the distribution of QR funding based on the hypotheses of the article. Section 4 considers the effect on the distribution of QR funding for English universities2 and their rankings when each element is removed from the calculation one at a time and when PCA weights are used. Finally, Section 5 draws the conclusions of our analyses and their implications for how future REF assessment exercises might be structured.

2. Research Excellence Framework and related literature

Research assessment exercises have existed in the UK since the first Research Selectivity Exercise was undertaken in 1986. A subsequent exercise was held in 1989, followed by RAEs in 1996, 2001 and 2008. Each HEI’s submission to the 1986 exercise comprised a research statement in one or more of 37 subject areas, together with five research outputs per area in which a submission was made (see e.g., Hinze et al. 2019). The complexity of the submissions has increased since that first exercise, and in 2014 the requirement to submit case studies and a narrative template to allow for the assessment of research impact was included for the first time, and the exercise was renamed the REF.

The REF 2014 ‘Assessment Framework and Guidance on Submissions’ (REF 2011) indicated that a submission’s research environment would be assessed according to its ‘vitality and sustainability’, using the same five-point (4* down to unclassified) scale as for the other elements of the exercise.3

Following the 2014 REF exercise, there have been many criticisms of the REF. For instance, the effect of the introduction of impact as an element of the UK’s research assessment methodology has itself been the subject of many papers and reports discussing the issues and challenges it has brought (see e.g., Smith, Ward and House 2011; Penfield et al. 2014; Manville et al. 2015; Watermeyer 2016; Pinar and Unlu 2020a). Manville et al. (2015) and Watermeyer (2016) show that academics in some fields were concerned about how their research focus would be affected by the impact agenda, forcing them to produce more ‘impactful’ research rather than pursuing their own research agenda. Manville et al. (2015) also demonstrate that there were problems with the peer reviewing of the impact case studies, where reviewer panels struggled to distinguish between 2-star and 3-star and, most importantly, between 3-star and 4-star. Furthermore, Pinar and Unlu (2020a) demonstrate that the inclusion of the impact agenda in REF 2014 increased the research income gap across HEIs. Similarly, the literature identifies some serious concerns with the assessment of the research environment (Taylor 2011; Wilsdon et al. 2015; Thorpe et al. 2018a, b). Taylor (2011) considered the use of metrics to assess the research environment, and found evidence of bias towards more research-intensive universities in the assessment of the research environment in the 2008 RAE (see Pinar and Unlu 2020b for similar findings for REF 2014). In particular, he argued that the judgement of assessors may carry an implicit bias and be influenced by a ‘halo effect’, whereby assessors allocate relatively higher scores to departments with long-standing records of high-quality research, and showed that members of Russell Group universities benefited from such an effect after accounting for various important quantitative factors. Wilsdon et al. (2015), in a report for the Higher Education Funding Council for England (HEFCE), which ran the REF on behalf of the four countries of the UK, reported that those who had reviewed the narrative research environment statements in REF 2014 as members of the expert panels expressed concerns ‘that the narrative elements were hard to assess, with difficulties in separating quality in research environment from quality in writing about it.’ Thorpe et al. (2018a, b) examined environment statements submitted to REF 2014, and their work indicates that the scores given to the overall research environment were influenced by the language used in the narrative statements, and by whether or not the submitting university was represented amongst the experts who reviewed the statements. Finally, a similar peer-review bias has been identified in the evaluation of research outputs (see e.g., Taylor 2011). Overall, there have been criticisms about evaluation biases in each element of the REF exercise.

Another criticism of the REF 2014 exercise has been its cost. HEFCE commissioned a review (Farla and Simmonds 2015) which estimated the cost of the exercise to be £246 million (Farla and Simmonds 2015, 6), of which £212 million was the cost of preparing the REF submissions. It can be estimated that roughly £19–27 million was spent preparing the research environment statements,4 and £55 million preparing impact case studies, with the remaining cost of preparation associated with the output submission. Overall, the cost of preparing each element was significant. Since there is good agreement between bibliometric factors and peer review assessments (Bertocchi et al. 2015; Pidd and Broadbent 2015), it has been argued that the cost of evaluating outputs could be reduced by using bibliometric information (see e.g., De Boer et al. 2015; Geuna and Piolatto 2016). Furthermore, Pinar and Unlu (2020b) found that the use of ‘environment data’ alone could minimize the cost of preparing the environment part of the assessment, as the environment data (i.e., income generated by units, number of staff and postgraduate degree completions) explain a good percentage of the variation between HEIs in REF environment scores.

These criticisms sit alongside the work of Kelly (2016) and Pinar (2020), which shows that a key outcome of the REF, the distribution of ca. £1bn per annum of QR funding, is dependent upon somewhat arbitrary or opaque value judgements (e.g., the relative importance of world-leading research compared to internationally excellent research, and the relative cost of undertaking research in different disciplines). In this article, we contribute to the existing literature by using correlation analysis to examine the redundancy of each research element, and by using PCA to obtain weights for each element that overcome the high correlation between the three elements while explaining most of the variation in achievements and funding distribution in each element.

The three components of the REF are highly and positively correlated (see the next section for the correlation analysis), and, based on the redundancy literature, such a correlation would suggest that removal of one component from the REF would have only a small effect on the QR funding distribution and overall performance rankings (e.g., McGillivray 1991; McGillivray and White 1993; Bérenger and Verdier-Chouchane 2007). Therefore, based on the arguments put forward in the redundancy literature, we set the hypotheses of this article as follows:

Hypothesis 1: Exclusion of one of the REF elements from the distribution of the mainstream QR funding would lead to relatively small shifts in the allocation of funds if the three components of the REF are positively and highly correlated.

Hypothesis 2: Exclusion of one of the REF elements from the calculation of the overall REF grade point averages (GPAs) obtained by HEIs would result in relatively small shifts in the rankings of HEIs when the REF elements are positively and highly correlated.

Hypothesis 3: Obtaining overall REF GPAs and allocating funding with the PCA weights given to each element of the REF would result in small shifts in rankings and funding allocation when the three components of the REF are highly and positively correlated.

3. Methodology

In this section, we provide the details of the data sources for the REF results and the QR funding allocation based on the REF results. We also discuss alternative ways of obtaining overall REF scores and allocating QR funding.

3.1 REF results data

In REF 2014, each participating UK institution submitted in one or more disciplinary areas, known as ‘units of assessment’ (UOAs). Each submission comprised three elements:

A number of research outputs. The expected number of research outputs submitted by each UOA was four times the full-time equivalent (FTE) staff included in that submission, unless one or more staff members was allowed a reduction in outputs. Each FTE staff member was expected to submit four research outputs, but reductions were allowed for staff members with individual circumstances, including being an early career researcher, having taken maternity, paternity or adoption leave during the assessment period, or having had health problems.

A number of case studies demonstrating the impact of research undertaken within that UOA, and a narrative ‘impact template’ which included a description of the UOA’s approach to generating impact from its research. Each case study was a maximum of four pages, and the rules stipulated that the number of case studies required depended upon the number of FTEs submitted in the UOA, as did the length of the impact template. Ninety-five per cent of submissions by English universities comprised between two and seven case studies and narratives that were three or four pages long.5

Information about the research environment, which comprised a narrative ‘environment statement’ describing the research environment, together with data on research income and PhD completions. As with the impact narrative, the length of the environment statement was dependent upon the number of FTEs submitted, with 95% of submissions from English universities comprising narratives that were between 7 and 12 pages long.

After the submission of UOAs, each individual component in these elements (e.g., a research output, an impact case study) was given a score by the peer reviewers on a five-point ‘star’ scale, namely 4* (world-leading), 3* (internationally excellent), 2* (internationally recognized), 1* (nationally recognized) and unclassified (for components below the 1* standard). From the scores for each individual component in each element, a profile for each element was obtained, and this was the information released by HEFCE. This profile for each element, obtained from REF (2014), gives the percentage of the research in each element (i.e., research outputs, environment and impact) that was rated 4*, 3*, 2*, 1* or unclassified. Finally, an overall research profile of the UOA was calculated, in which the element scores were weighted 65:20:15 for outputs:impact:environment.
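To make the scoring arithmetic concrete, here is a minimal Python sketch of the profile-to-GPA calculation described above. The profile percentages are invented for illustration and are not real REF data; only the star values and the 65:20:15 weighting come from the text.

```python
# Sketch of the REF 2014 scoring arithmetic; profile numbers are illustrative.
STAR_VALUES = {"4*": 4, "3*": 3, "2*": 2, "1*": 1, "unclassified": 0}

def element_gpa(profile):
    """GPA of one element from its star profile (percentages summing to 100)."""
    return sum(STAR_VALUES[star] * pct for star, pct in profile.items()) / 100

# Hypothetical profiles for one submission.
outputs = {"4*": 35, "3*": 30, "2*": 20, "1*": 15, "unclassified": 0}
impact = {"4*": 40, "3*": 40, "2*": 15, "1*": 5, "unclassified": 0}
environment = {"4*": 50, "3*": 30, "2*": 15, "1*": 5, "unclassified": 0}

# Official REF 2014 weighting: outputs 65%, impact 20%, environment 15%.
overall_gpa = (0.65 * element_gpa(outputs)
               + 0.20 * element_gpa(impact)
               + 0.15 * element_gpa(environment))

print(element_gpa(outputs))   # 2.85, matching the worked example in note 6
print(round(overall_gpa, 2))  # 2.97 for these illustrative profiles
```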

To test whether the quality of the research environment, impact and outputs are correlated, we obtain each individual submission’s weighted average environment, impact and output scores.6 Table 1 provides a correlation matrix between the GPA scores of the different elements. This table shows that the GPA scores are positively and significantly correlated with each other at the 1% level. Table 2 shows the results of PCA of the three elements when the GPA scores in each element are used. The first principal component accounts for approximately 79.0% of the variation in the three elements, and the first two principal components together account for approximately 92.5%. Clearly, the first principal component contains most of the statistical information embedded in the three elements. Moreover, the first principal component has roughly similar eigenvector entries, suggesting that overall GPA scores could be obtained by giving roughly equal weights to each element when the eigenvector is normalized so that the weights sum to 1.

Table 1. Correlation matrix between different element GPAs

Note: Asterisk (*) represents significance at the 1% level.

Table 2. Results of PCA of the three elements using GPA scores
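This redundancy check is straightforward to reproduce. The sketch below runs on synthetic GPA data generated around a common latent quality score (the published tables cannot be regenerated here); it computes the pairwise correlations and a covariance-based PCA, mirroring the quantities reported in Tables 1 and 2.

```python
import numpy as np

# Synthetic element GPAs for 200 submissions; illustrative only, since the
# article's Tables 1 and 2 are computed from the actual REF 2014 profiles.
rng = np.random.default_rng(0)
quality = rng.normal(3.0, 0.4, size=200)
gpas = np.column_stack([
    np.clip(quality + rng.normal(0, 0.15, 200), 0, 4),  # outputs
    np.clip(quality + rng.normal(0, 0.25, 200), 0, 4),  # impact
    np.clip(quality + rng.normal(0, 0.20, 200), 0, 4),  # environment
])

# Pairwise correlations between the three element GPAs (cf. Table 1).
print(np.corrcoef(gpas, rowvar=False).round(3))

# PCA via eigendecomposition of the covariance matrix (cf. Table 2).
cov = np.cov(gpas, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]        # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
print("variance explained:", (eigvals / eigvals.sum()).round(3))

# Normalizing the first eigenvector so its entries sum to 1 gives the element
# weights implied by the first principal component.
pc1_weights = eigvecs[:, 0] / eigvecs[:, 0].sum()
print("PC1 weights:", pc1_weights.round(3))
```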

Since all the elements are positively and significantly correlated with each other, removing one of the elements from the REF assessment, or combining the REF elements in an alternative way (via PCA weights), might have little overall effect on the distribution of QR income and overall achievement.

3.2 QR funding allocation data based on REF results

Research England (2019a) describes how it distributes QR funding on the basis of the REF results obtained by UOAs. In brief, QR funding comprises six elements: (1) mainstream QR funding; (2) the QR research degree programme supervision fund; (3) the QR charity support fund; (4) the QR business research element; (5) QR funding for National Research Libraries; and (6) the Global Challenges Research Fund. The mainstream QR funding is the largest, comprising approximately two-thirds of the overall QR funding, and is the element most directly related to an institution’s performance in REF 2014. The data for the mainstream QR funding allocations across panels, UOAs and HEIs during the 2019–20 funding period are obtained from Research England (2019b).

In calculating an institution’s mainstream QR funding, Research England follows a four-stage process:

1. The mainstream QR funding is separated into three elements, for outputs, impact and environment, with 65% of funding for outputs, 20% for impact and 15% for environment.

2. The funding for each of the three elements is distributed amongst the four ‘main subject panels’7 in proportion to the volume of research in each main panel that was rated 3* or above, weighted to reflect an assumed relative cost of research in different disciplines.

3. Within each main panel, mainstream QR funding is distributed to each UOA according to the volume of research at 3* or above and the cost weights (which reflect the relative cost of undertaking research in different disciplines), with an additional multiplier of 4 given to research rated as world-leading (4*) relative to internationally excellent (3*) research.

4. The mainstream QR funding for each element in each UOA is then distributed to individual HEIs according to the volume of research at 3* or above produced by that HEI, with the cost and quality weights taken into account.

Therefore, a university’s total QR mainstream funding comprises an amount for each element of outputs, impact and environment, for each UOA in which it made a submission.
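To illustrate the cascade, the following simplified Python sketch collapses the four stages into a single pro rata step from each funding pot directly to submissions. Every number in it (FTE volumes, cost weights, star profiles, the £1 million pot) is invented; the real exercise applies the same logic successively at main panel, UOA and HEI level using Research England’s official volumes and cost weights.

```python
# Simplified sketch of the mainstream QR allocation logic described above.
POT_SHARES = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}
QUALITY_WEIGHTS = {"4*": 4.0, "3*": 1.0}  # 4* research attracts 4x the funding of 3*

def weighted_volume(sub, element):
    """Cost- and quality-weighted volume of 3*-and-above research for one element."""
    profile = sub["profiles"][element]  # fractions of the element at each star level
    quality = sum(QUALITY_WEIGHTS[s] * profile.get(s, 0.0) for s in QUALITY_WEIGHTS)
    return sub["fte"] * sub["cost_weight"] * quality

def allocate(total_funding, submissions):
    """Split the total pot by pot, then pro rata by weighted volume."""
    awards = {sub["name"]: 0.0 for sub in submissions}
    for element, share in POT_SHARES.items():
        pot = total_funding * share
        volumes = {sub["name"]: weighted_volume(sub, element) for sub in submissions}
        total_volume = sum(volumes.values())
        for name, vol in volumes.items():
            awards[name] += pot * vol / total_volume
    return awards

# Two hypothetical submissions with invented volumes, cost weights and profiles.
submissions = [
    {"name": "HEI A, Clinical Medicine", "fte": 50, "cost_weight": 1.6,
     "profiles": {"outputs": {"4*": 0.30, "3*": 0.50},
                  "impact": {"4*": 0.45, "3*": 0.40},
                  "environment": {"4*": 0.55, "3*": 0.35}}},
    {"name": "HEI B, History", "fte": 20, "cost_weight": 1.0,
     "profiles": {"outputs": {"4*": 0.25, "3*": 0.45},
                  "impact": {"4*": 0.30, "3*": 0.55},
                  "environment": {"4*": 0.20, "3*": 0.60}}},
]
for name, amount in allocate(1_000_000, submissions).items():
    print(f"{name}: £{amount:,.0f}")
```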

Since the allocation of the mainstream QR funding in each pot (environment, impact and output) is closely related to the performance of the UOAs in the respective research element, we also found positive and significant correlation coefficients, at the 1% level, between the mainstream QR funding distributed to the UOAs in the environment, impact and output pots (see Table 3). Similarly, when we carried out PCA, we found that the first principal component accounts for approximately 97% of the variation in the components and has roughly similar eigenvector entries (see Table 4), suggesting that equal funding could be distributed in the environment, impact and output pots.

Table 3. Correlation matrix between different funding pots

Table 4. Results of PCA of the three elements using funds distributed in each pot

3.3 Alternative ways of allocating QR funding and obtaining overall REF scores

Based on the arguments in the redundancy literature, we examine the effects of excluding one element of the evaluation when distributing QR funding and calculating overall REF scores. Initially, as described in Section 3.2, the mainstream QR funding is distributed across three pots (i.e., output, environment and impact), where 65%, 20% and 15% of the mainstream QR funding is distributed based on the performance of the submissions in the output, impact and environment elements of REF 2014, respectively (i.e., step 1 of the funding formula). Similarly, the overall REF scores of units and HEIs were obtained as a weighted average of the three elements, with the output, impact and environment performances weighted 65%, 20% and 15%, respectively. If one of the elements is excluded, the weight given to it should be reallocated to the other two elements so that the weights sum to 100%. In the first scenario, we exclude the environment element and reallocate its weight to output and impact in proportion to their initial weights of 65:20, which become 76.5% and 23.5%.8 In the second scenario, we exclude the impact element and reallocate its weight to environment and output in proportion to their initial weights of 15:65, which become 18.75% and 81.25%. In the third scenario, we exclude the output element, so the environment and impact elements are allocated 43% and 57% weights based on their initial weight ratio of 15:20. Finally, in the fourth scenario, we rely on the results obtained with the PCA: each element is kept in the calculation of the overall GPA and the distribution of QR funding, but each element is given an equal weight (i.e., 33.33%).
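The scenario weights quoted above follow from a simple renormalization, which a few lines of Python can verify:

```python
# Reproduce the renormalized weights for Scenarios 1-3 described above.
official = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

def renormalize(weights, dropped):
    """Drop one element and rescale the remaining weights to sum to 1."""
    kept = {k: v for k, v in weights.items() if k != dropped}
    total = sum(kept.values())
    return {k: v / total for k, v in kept.items()}

for dropped in ("environment", "impact", "outputs"):
    scenario = {k: round(v, 4) for k, v in renormalize(official, dropped).items()}
    print(f"without {dropped}: {scenario}")
# without environment: outputs 0.7647 (76.5%), impact 0.2353 (23.5%)
# without impact:      outputs 0.8125 (81.25%), environment 0.1875 (18.75%)
# without outputs:     impact 0.5714 (~57%), environment 0.4286 (~43%)
```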

Based on the funding formula of the mainstream QR funding allocation (see Research England 2019a, 16–19 for details, or Section 3.2 of this article for the steps), we follow the same steps to redistribute the mainstream QR funding across the different panels, UOAs and HEIs under the alternative scenarios. For the overall REF scores of HEIs, the overall GPA of each unit is obtained by weighting the GPA of the output, impact and environment elements by 65%, 20% and 15%, respectively; under the alternative scenarios, we instead weight the elements by the respective scenario weights discussed above.

4.1 Alternative way of allocating QR funding

In this subsection, we examine the effect on the mainstream QR funding distributed to different panels, UOAs and HEIs in England under Scenarios 1–4, compared with the official mainstream QR funding allocation. To provide an overall picture, Table 5 shows the amount of mainstream QR funding distributed in each of the three pots under the official REF 2014 results and under the alternative scenarios proposed in this article. During the 2019–20 funding period, a total of £1,060 million (i.e., just over a billion pounds) was distributed as mainstream QR funding, of which roughly £159 million, £212 million and £689 million was distributed in the environment, impact and output pots across the English HEIs, respectively.9 Under Scenarios 1, 2 and 3, no mainstream QR funding is distributed in the environment, impact and output pots, respectively, whereas equal amounts are distributed in each pot under Scenario 4. Under Scenario 1, £249 million and £811 million of mainstream QR funding are distributed based on REF 2014 performance in the impact and output elements, an additional £37 million and £122 million in the impact and output pots compared with the official scenario, respectively. Under Scenario 2, £199 million and £862 million are distributed in the environment and output elements, an additional £39 million and £173 million compared with the official scenario, respectively. Under Scenario 3, £456 million and £605 million are distributed in the environment and impact pots, an additional £297 million and £392 million, respectively. Finally, under Scenario 4, an equal amount (i.e., £353.5 million) is distributed in each pot, so that more funding is allocated to the environment and impact pots and less to the output pot.

Table 5. Distribution of mainstream QR funding across different pots based on the REF 2014 results and alternative scenarios

Table 6 shows the allocation of the mainstream QR funding to the four main panels (i.e., Panel A: Medicine, Health and Life Sciences; Panel B: Physical Sciences, Engineering and Mathematics; Panel C: Social Sciences; Panel D: Arts and Humanities) under the REF 2014 results and under the alternative scenarios. The table also shows the change in the mainstream QR funding received by the four main panels from the official allocation to each alternative scenario, where a positive (negative) change indicates that the panel would have received more (less) funding under the alternative scenario than under the official allocation. The results suggest that panel B would have been allocated more funds, and panels A, C and D less, under Scenarios 1 and 2, suggesting that exclusion of the environment and impact elements would have benefited panel B. Conversely, panel B (panels A, C and D) would have generated less (more) QR funding under the third and fourth scenarios (i.e., when the output element is excluded, and when an equal amount of funds is distributed in each pot, respectively) than under the official scenario. Overall, only 0.34%, 0.64%, 2.29% and 1.08% of the total mainstream QR funding (i.e., £3.6 million, £6.8 million, £24.3 million and £11.5 million) would have been reallocated across the four main panels under Scenarios 1, 2, 3 and 4, respectively, compared with the official allocation.

Table 6. Allocation of the mainstream QR funding to the four main panels with the alternative scenarios

Note: Panels A (Medicine, Health and Life Sciences), B (Physical Sciences, Engineering and Mathematics), C (Social Sciences) and D (Arts and Humanities) consist of UOAs 1–6, 7–15, 16–26 and 27–36, respectively.

Table 7 reports the official QR funding allocation to the different UOAs and the changes between each alternative scenario and the official scenario, where a positive (negative) figure indicates that the UOA received relatively more (less) QR funding under the alternative scenario. We find, for example, that the Computer Science and Informatics unit would have received £2.0 million more, and the Public Health, Health Services and Primary Care unit £1.2 million less, QR funding when the environment element is excluded (Scenario 1). When the impact element is excluded (Scenario 2), the Biological Sciences and Clinical Medicine units would have generated £3.0 million more and £4.3 million less than under the official scenario, respectively. When the output element is excluded (Scenario 3), the Clinical Medicine and Biological Sciences units would have generated £11.7 million more and £7.2 million less, respectively. Finally, if all three elements are weighted equally (Scenario 4), the Clinical Medicine and Computer Science and Informatics units would have generated £5.1 million more and £3.5 million less, respectively. This evaluation clearly shows in which elements specific subjects perform better (or worse) than other subject areas. Even though the funds generated by each unit change under the alternative scenarios, the funding shift across units is limited. Overall, the total amounts reallocated across the different UOAs are £5.9 million, £11.5 million, £36.9 million and £17.2 million with Scenarios 1, 2, 3 and 4, corresponding to 0.55%, 1.08%, 3.48% and 1.62% of the total mainstream QR funding, respectively.

Table 7. Allocation of mainstream QR funding across different UOAs and changes in funding allocation with alternative scenarios compared with the official allocation

Note: A positive (negative) figure in the changes columns indicates that the UOA received relatively more (less) QR funding with the respective alternative scenario compared with the official case.

Finally, we examine the effect of the alternative QR funding allocations on the funding received by HEIs. Table 8 shows the five HEIs that would have seen the biggest increase (decrease) in mainstream QR funding under the alternative scenarios compared with the official allocation. The data show that the University of Leicester, University of Plymouth, University of East Anglia, University of Birmingham and the University of Surrey would have generated £745k, £552k, £550k, £522k and £464k more QR funding under the first scenario, whereas University College London, the University of Cambridge, the University of Oxford, the University of Manchester and the University of Nottingham would have generated £3.4 million, £2.1 million, £2 million, £1.5 million and £1.4 million less, respectively. On the other hand, the University of Cambridge would have generated £1.9 million more if the impact element were excluded (Scenario 2), and University College London would have generated £9.8 million and £5.6 million more if the output element were excluded (Scenario 3) and if each element were weighted equally (Scenario 4), respectively. In comparison, the University of Leeds, University of Birmingham and University of Leicester would have generated £1 million, £2.4 million and £1.3 million less with Scenarios 2, 3 and 4, respectively. Overall, the total amounts reallocated across HEIs are £15.5 million, £11.1 million, £46.7 million and £25.6 million with Scenarios 1, 2, 3 and 4, corresponding to just 1.46%, 1.05%, 4.41% and 2.42% of the total mainstream QR funding, respectively. Furthermore, only a handful of universities would have experienced a significant change in their funding allocation: 6, 3, 25 and 10 HEIs saw a difference of more than £1 million with Scenarios 1, 2, 3 and 4 compared with the official allocation, respectively (see Appendix Table A.1 for the allocation of the mainstream QR funding to the HEIs in the official case and the differences under the alternative scenarios).

Table 8. Five HEIs that would have generated more (less) mainstream QR funding with the alternative scenarios compared with the official scenario

4.2 Ranking of HEIs

Since the REF exercise is used in rankings of HEIs, in this subsection we evaluate the effect of the different scenarios on the overall GPA and rankings of HEIs. Table 9 reports the Spearman’s rank correlation coefficients between the GPA scores obtained with the official scenario and those obtained with the alternative scenarios. We find that the GPA scores obtained with the alternative scenarios are highly and positively correlated with the official GPA scores at the 1% level. Even so, some HEIs would have been ranked in relatively higher (lower) positions under the alternative scenarios. Amongst 111 HEIs, just 9, 5, 22 and 5 HEIs moved more than 10 positions in the rankings under Scenarios 1, 2, 3 and 4, respectively, compared with the official rankings. For instance, the Guildhall School of Music & Drama would have experienced a major improvement in its ranking under the third scenario, rising to 53rd when the output element is excluded (i.e., Scenario 3) from 89th under the official scenario. On the other hand, London Business School would have been ranked 32nd under the third scenario, but 7th under the official scenario (see Appendix Table A.2 for the GPA scores and respective rankings of HEIs in the official case and under Scenarios 1, 2, 3 and 4). However, with very few exceptions, the difference between the rankings under the alternative scenarios and the official rankings is relatively small.

Table 9. Spearman’s rank correlation coefficients between official and alternative scenario GPAs

Note: Asterisk (*) represents significance at the 1% level.
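For readers wishing to replicate this robustness check, the comparison reduces to a rank correlation between two vectors of overall GPAs. A minimal sketch, using hypothetical GPA values rather than the published scores, is:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical overall GPAs for six HEIs under the official weighting and
# under Scenario 3 (output element excluded); illustrative numbers only.
official_gpa = np.array([3.36, 3.28, 3.19, 3.05, 2.97, 2.84])
scenario3_gpa = np.array([3.30, 3.33, 3.12, 3.08, 2.90, 2.88])

rho, pvalue = spearmanr(official_gpa, scenario3_gpa)
print(f"Spearman rho = {rho:.3f} (p = {pvalue:.4f})")
```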

5. Conclusions

Given concerns over possible bias in the assessment of the three elements of the REF and the cost of preparing the REF return (Farla and Simmonds 2015), we evaluated the implications of excluding different elements from the REF. Since the three components of the REF are positively and highly correlated, each element could be considered redundant; this article therefore examined the implications for the QR funding allocated to different panels, UOAs and HEIs when one element (environment, impact or output) was excluded from the allocation of QR funding, and the effect on the overall REF GPAs. Furthermore, we used the PCA method to obtain weights that explain most of the variation in the funding distributed amongst the three elements, which suggested that using equal weights to distribute funds explains most of the variation in the funding distribution across the three pots.

We found that excluding one element from the REF, or using equal weights, would have benefited (disadvantaged) some HEIs, but at most £46.7 million (out of over £1 billion) would have been reallocated between HEIs, in the case where the output element is excluded from the evaluation. Furthermore, when different elements are excluded from the rankings and the weight of the excluded element is redistributed between the other two (in proportion to their original weightings) to produce new rankings, these rankings are highly and significantly correlated with the official rankings, suggesting that alternative ways of obtaining composite scores lead to rankings similar to the official one. Overall, the main argument of this article is that, given the high cost of preparing REF returns, the potential bias in assessing each component, and the relatively small effect on QR income distribution and universities’ relative rankings of removing some elements of the REF assessment, removal of some elements from the assessment process may be considered for future assessment exercises.

This article does not quantify the bias involved in the evaluation of each element of the REF exercise, and therefore, we do not provide any suggestion about which element should be removed from the REF. Instead, our findings demonstrate that excluding a component from the REF evaluation does not result in significant rank reversals in overall outcomes and reallocation of funds across units and HEIs.

In addition, the assessment of outputs and impact case studies in the REF is based on the submit-to-be-rated methodology dating from 1986, by which, in essence, the achievements of individuals, not of the organization, are summed. Based on the definition of organizational evaluation by BetterEvaluation (2021), the impact and output evaluations of the REF rest on the achievements of individuals; if the aim is to evaluate organizations, then the impact and output elements, which are in essence individual achievements, could be removed, and, as this article has found, their removal would not have significant effects. Therefore, if the REF aims to evaluate organizational performance, the choice of components should be further motivated by, and rely on, metrics that evaluate the organization rather than individual achievements.

Furthermore, if future evaluations include new metrics that aim to measure organizational achievement, these metrics should be carefully chosen to provide a new set of information beyond the existing indicators. Therefore, these indicators should not be highly correlated with the already existing indicator set so that new information is captured through their assessment.

There is a significant body of literature on PRFS, and for a review of these systems, the reader is directed to a number of papers and references ( Rebora and Turri 2013 ; Bertocchi et al. 2015 ; De Boer et al. 2015 ; Hicks et al. 2015 ; Dougherty et al. 2016 ; Geuna and Piolatto 2016 ; Sivertsen 2017 ; Zacharewicz et al. 2019 , amongst many others).

Education is a devolved matter in the UK, and university funding and oversight in 2014 was the responsibility of the Higher Education Funding Council (HEFCE) in England, the Scottish Funding Council (SFC) in Scotland, the Higher Education Funding Council for Wales (HEFCW) in Wales and the Department for Employment and Learning (DELNI) in Northern Ireland. The formulae which converted REF performance into QR funding were different in the different administrations, and this article only examines the QR distribution across English HEIs.

An environment that is conducive to producing research of world-leading quality, internally excellent quality, international recognized quality and nationally recognized quality is given 4*, 3*, 2* and 1* scores, respectively. On the other hand, an environment that is not conducive to producing research of at least nationally recognized quality is considered as unclassified.

The cost to UK HEIs of submitting to REF, excluding the impact element was estimated at £157 million ( Farla and Simmonds 2015 , 6). It is further estimated that 12% of time spent at the central level was on the environment template and 17% of time at the UOA level (see Figures 5 and 6 of Farla and Simmonds 2015 , respectively). The estimate of £19–27 million is obtained as 12–17% of the overall £157 million non-impact cost of submission. Furthermore, it was found that the panel members spent on average 533 h on panel duties, which represented an estimated cost to the sector of £23 million (see Farla and Simmonds 2015 , 40, 41).

As stated previously, because the devolved administrations of the UK used different methods to calculate QR income, this article focusses just on English institutions.

The scores for each individual output, environment or impact component are not given on the REF 2014 website (www.ref.ac.uk/2014); that is, the rating of each individual research output, research environment element and impact case study is not provided. Instead, the REF results provide the percentage of each research element (i.e., research output, environment and impact) rated 4*, 3*, 2*, 1* and unclassified. The weighted average score of each research element is therefore obtained as follows: if 35%, 30%, 20% and 15% of the research element of a given submission were rated 4*, 3*, 2* and 1*, respectively, then the weighted average score of this element would be (35 × 4 + 30 × 3 + 20 × 2 + 15 × 1)/100 = 2.85.
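A minimal sketch of this calculation (the function name and example profile are illustrative, not from the article):

```python
def weighted_score(profile):
    """Weighted average score of a REF quality profile.

    `profile` maps star level (4, 3, 2, 1; 0 for unclassified)
    to the percentage of the element rated at that level.
    """
    return sum(stars * pct for stars, pct in profile.items()) / 100

# The worked example from the text: 35% 4*, 30% 3*, 20% 2*, 15% 1*.
example = {4: 35, 3: 30, 2: 20, 1: 15, 0: 0}
print(weighted_score(example))  # 2.85
```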

The four main panels are groupings of individual UOAs which, broadly speaking, encompass medicine, health and biological sciences (Panel A); physical sciences and engineering (Panel B); social sciences (Panel C); and humanities and arts (Panel D).

These percentage weights are obtained by (0.65/0.85)×100 and (0.2/0.85)×100, respectively.
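This renormalization generalizes to any excluded component; a small sketch under the official 65:20:15 weighting (the function name is illustrative):

```python
# Official REF 2014 weights: output 65%, impact 20%, environment 15%.
WEIGHTS = {"output": 0.65, "impact": 0.20, "environment": 0.15}

def renormalize(excluded):
    """Redistribute the weights proportionally after dropping one component."""
    kept = {k: w for k, w in WEIGHTS.items() if k != excluded}
    total = sum(kept.values())
    return {k: round(100 * w / total, 1) for k, w in kept.items()}

print(renormalize("environment"))
# {'output': 76.5, 'impact': 23.5} -- i.e. (0.65/0.85)x100 and (0.2/0.85)x100
```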

Note that HEIs within the inner and outer London areas receive 12% and 8% additional QR funding, respectively, on top of their allocated mainstream QR funding. However, to examine the effect of the alternative exclusion scenarios on the allocation of mainstream QR funding, we do not consider the additional funding allocation based on an HEI's location.

We would like to thank the editor and three anonymous referees for very constructive and insightful reviews of earlier drafts of this article.

Conflict of interest statement. None declared.

Bertocchi G., Gambardella A., Jappelli T., Nappi C. A., Peracchi F. (2015) 'Bibliometric Evaluation vs. Informed Peer Review: Evidence from Italy', Research Policy, 44: 451–66.

Bérenger V., Verdier-Chouchane A. (2007) 'Multidimensional Measures of Well-Being: Standard of Living and Quality of Life across Countries', World Development, 35: 1259–76.

BetterEvaluation (2021) Evaluating the Performance of an Organisation <https://www.betterevaluation.org/en/theme/organisational_performance> accessed 15 October 2021.

Cahill M. B. (2005) 'Is the Human Development Index Redundant?', Eastern Economic Journal, 31: 1–6.

De Boer H., Jongbloed B., Benneworth P., Cremonini L., Kolster R., Kottmann A., Lemmens-Krug K., Vossensteyn H. (2015) 'Performance-Based Funding and Performance Agreements in Fourteen Higher Education Systems', report for the Ministry of Education, Culture and Science. The Hague: Ministry of Education, Culture and Science.

Dougherty K. J., Jones S. M., Lahr H., Natow R. S., Pheatt L., Reddy V. (2016) Performance Funding for Higher Education. Baltimore, MD: Johns Hopkins University Press.

Farla K., Simmonds P. (2015) 'REF Accountability Review: Costs, Benefits and Burden', report by Technopolis to the four UK higher education funding bodies <http://www.technopolis-group.com/report/ref-accountability-review-costs-benefits-and-burden/> accessed 14 March 2021.

Geuna A., Piolatto M. (2016) 'Research Assessment in the UK and Italy: Costly and Difficult, but Probably Worth It (at Least for a While)', Research Policy, 45: 260–71.

Gilroy P., McNamara O. (2009) 'A Critical History of Research Assessment in the United Kingdom and Its Post-1992 Impact on Education', Journal of Education for Teaching, 35: 321–35.

Hicks D. (2012) 'Performance-Based University Research Funding Systems', Research Policy, 41: 251–61.

Hicks D., Wouters P., Waltman L., de Rijcke S., Rafols I. (2015) 'Bibliometrics: The Leiden Manifesto for Research Metrics', Nature, 520: 429–31.

Hinze S., Butler L., Donner P., McAllister I. (2019) 'Different Processes, Similar Results? A Comparison of Performance Assessment in Three Countries', in Glänzel W., Moed H. F., Schmoch U., Thelwall M. (eds) Springer Handbook of Science and Technology Indicators, pp. 465–84. Cham: Springer.

Jonkers K., Zacharewicz T. (2016) 'Research Performance Based Funding Systems: A Comparative Assessment', JRC Science for Policy Report, European Commission: Joint Research Centre <https://publications.jrc.ec.europa.eu/repository/bitstream/JRC101043/kj1a27837enn.pdf> accessed 15 May 2021.

Kelly A. (2016) 'Funding in English Universities and Its Relationship to the Research Excellence Framework', British Educational Research Journal, 42: 665–81.

Khatun T. (2009) 'Measuring Environmental Degradation by Using Principal Component Analysis', Environment, Development and Sustainability, 11: 439–57.

McGillivray M. (1991) 'The Human Development Index: Yet Another Redundant Composite Development Indicator', World Development, 19: 1461–8.

McGillivray M. (2005) 'Measuring Non-Economic Wellbeing Achievement', Review of Income and Wealth, 51: 337–64.

McGillivray M., White H. (1993) 'Measuring Development? The UNDP's Human Development Index', Journal of International Development, 5: 183–92.

Manville C., Guthrie S., Henham M. L., Garrod B., Sousa S., Kirtley A., Castle-Clarke S., Ling T., et al. (2015) 'Assessing Impact Submissions for REF 2014: An Evaluation' <www.rand.org/content/dam/rand/pubs/research_reports/RR1000/RR1032/RAND_RR1032.pdf> accessed 7 April 2021.

Marques M., Powell J. J. W., Zapp M., Biesta G. (2017) 'How Does Research Evaluation Impact Educational Research? Exploring Intended and Unintended Consequences of Research Assessment in the United Kingdom, 1986–2014', European Educational Research Journal, 16: 820–42.

Nardo M., Saisana M., Saltelli A., Tarantola S. (2008) Handbook on Constructing Composite Indicators: Methodology and User Guide. Paris: OECD Publishing <https://www.oecd.org/sdd/42495745.pdf> accessed 15 October 2021.

Nguefack-Tsague G., Klasen S., Zucchini W. (2011) 'On Weighting the Components of the Human Development Index: A Statistical Justification', Journal of Human Development and Capabilities, 12: 183–202.

Penfield T., Baker M., Scoble R., Wykes M. (2014) 'Assessment, Evaluations, and Definitions of Research Impact: A Review', Research Evaluation, 23: 21–32.

Pidd M., Broadbent J. (2015) 'Business and Management Studies in the 2014 Research Excellence Framework', British Journal of Management, 26: 569–81.

Pinar M. (2020) 'It Is Not All about Performance: Importance of the Funding Formula in the Allocation of Performance-Based Research Funding in England', Research Evaluation, 29: 100–19.

Pinar M., Milla J., Stengos T. (2019) 'Sensitivity of University Rankings: Implications of Stochastic Dominance Efficiency Analysis', Education Economics, 27: 75–92.

Pinar M., Unlu E. (2020a) 'Evaluating the Potential Effect of the Increased Importance of the Impact Component in the Research Excellence Framework of the UK', British Educational Research Journal, 46: 140–60.

Pinar M., Unlu E. (2020b) 'Determinants of Quality of Research Environment: An Assessment of the Environment Submissions in the UK's Research Excellence Framework in 2014', Research Evaluation, 29: 231–44.

Rebora G., Turri M. (2013) 'The UK and Italian Research Assessment Exercises Face to Face', Research Policy, 42: 1657–66.

REF (2011) Assessment Framework and Guidance on Submissions <https://www.ref.ac.uk/2014/media/ref/content/pub/assessmentframeworkandguidanceonsubmissions/GOS%20including%20addendum.pdf> accessed 7 April 2021.

REF (2012) Panel Criteria and Working Methods <https://www.ref.ac.uk/2014/media/ref/content/pub/panelcriteriaandworkingmethods/01_12_1.pdf> accessed 21 May 2021.

REF (2014) Results and Submissions <https://results.ref.ac.uk/(S(ag0fd0kpw5wgdcjk2rh1cwxr))/> accessed 5 May 2021.

Research England (2019a) Research England: How We Fund Higher Education Institutions <https://re.ukri.org/documents/2019/research-england-how-we-fund-higher-education-institutions-pdf/> accessed 3 July 2020.

Research England (2019b) Annual Funding Allocations 2019–20 <https://re.ukri.org/finance/annual-funding-allocations/annual-funding-allocations-2019-20/> accessed 5 July 2020.

Robinson-Garcia N., Torres-Salinas D., Herrera-Viedma E., Docampo D. (2019) 'Mining University Rankings: Publication Output and Citation Impact as Their Basis', Research Evaluation, 28: 232–40.

Saisana M., d'Hombres B., Saltelli A. (2011) 'Rickety Numbers: Volatility of University Rankings and Policy Implications', Research Policy, 40: 165–77.

Shattock M. (2012) Making Policy in British Higher Education 1945–2011. Berkshire: McGraw-Hill.

Sivertsen G. (2017) 'Unique, but Still Best Practice? The Research Excellence Framework (REF) from an International Perspective', Palgrave Communications, 3: 1–6.

Smith S., Ward V., House A. (2011) '"Impact" in the Proposals for the UK's Research Excellence Framework: Shifting the Boundaries of Academic Autonomy', Research Policy, 40: 1369–79.

Taylor J. (2011) 'The Assessment of Research Quality in UK Universities: Peer Review or Metrics?', British Journal of Management, 22: 202–17.

Thorpe A., Craig R., Hadikin G., Batistic S. (2018a) 'Semantic Tone of Research "Environment" Submissions in the UK's Research Evaluation Framework 2014', Research Evaluation, 27: 53–62.

Thorpe A., Craig R., Tourish D., Hadikin G., Batistic S. (2018b) '"Environment" Submissions in the UK's Research Excellence Framework 2014', British Journal of Management, 29: 571–87.

Tijssen R. J. W., Yegros-Yegros A., Winnink J. J. (2016) 'University–Industry R&D Linkage Metrics: Validity and Applicability in World University Rankings', Scientometrics, 109: 677–96.

Watermeyer R. (2016) 'Impact in the REF: Issues and Obstacles', Studies in Higher Education, 41: 199–214.

Wilsdon J., Allen L., Belfiore E., Campbell P., Curry S., Hill S., Jones R., Kain R., Kerridge S., Thelwall M., Tinkler J., Viney I., Wouters P., Hill J., Johnson B. (2015) 'The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management'. DOI: 10.13140/RG.2.1.4929.1363.

Zacharewicz T., Lepori B., Reale E., Jonkers K. (2019) 'Performance-Based Research Funding in EU Member States—A Comparative Assessment', Science and Public Policy, 46: 105–15.

Allocation of mainstream QR funding to HEIs under the official and alternative scenarios

Notes: The Official column presents the allocation of mainstream QR funding across HEIs under the official funding allocation.

Scenario 1—Official to Scenario 4—Official: These columns give the difference in the mainstream QR funding allocated to each HEI between the respective scenario and the official case.

A positive (negative) figure in the change columns indicates that the HEI received relatively more (less) QR funding under the respective alternative scenario than under the official case.

GPA scores and respective rankings of HEIs under the official case and Scenarios 1, 2, 3 and 4


Oxford’s REF 2021 results show largest volume of world-leading research

The Research Excellence Framework (REF) assesses the quality of research in UK Higher Education Institutions

The UK Funding Bodies have published the outcomes of the recent national research assessment exercise, the Research Excellence Framework (REF) 2021. The REF 2021 results show Oxford's submission had the highest volume of world-leading research*.


The University of Oxford made the largest submission of any Higher Education Institution (HEI) in the UK, submitting over 3,600 researchers (3,405 full time equivalent) into 29 subject areas, over 8,500 research outputs in a range of formats from journal articles to compositions, and 220 case studies about the impact of Oxford research beyond academia.

Professor Louise Richardson, Vice-Chancellor at the University of Oxford, said:

'The REF 2021 results demonstrate once again that Oxford is a research powerhouse, and the impact case studies highlight our effectiveness in putting this research in service to society by making critical contributions to global health, economic prosperity and cultural life.'

'The REF is one of the few opportunities to see the remarkable breadth of our research and to draw together all parts of the collegiate University in a single collaborative effort, and I would like to thank everyone involved for their contribution.'

Professor Patrick Grant, Pro-Vice-Chancellor for Research at the University of Oxford, said:


'The publication of the REF 2021 results marks the culmination of the work done by staff across the University, especially in the past two years or so. Thank you to everyone involved in the submission — both those who, in a variety of ways, led and supported our research and impact activities, and also the many colleagues who coordinated the submission itself.

'We are pleased to have submitted work from researchers at a variety of career stages and from across the Collegiate university, thus giving us the opportunity to showcase the depth and breadth of our research. Our submission also shows how through our collaborations with external partners our research is used to benefit society, across the UK and around the world.'

Highlights of the submission can be found on the Oxford REF 2021 webpages.

*'Largest volume of world-leading research' is calculated as the sum of (overall %4* × submitted FTE) across all of an institution's submissions.
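A minimal sketch of that calculation (the submission data below are invented placeholders, not Oxford's actual figures):

```python
# "Volume of world-leading research" = sum over submissions of
# (overall %4* x submitted FTE). Numbers below are illustrative only.
submissions = [
    {"uoa": "Clinical Medicine", "pct_4star": 50.0, "fte": 300.0},
    {"uoa": "Physics",           "pct_4star": 45.0, "fte": 120.0},
]

volume = sum(s["pct_4star"] / 100 * s["fte"] for s in submissions)
print(f"{volume:.1f} FTE-weighted world-leading units")  # 204.0
```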


Q&A: what is the REF and how is the quality of university research measured?


Disclosure statement

Rama Thirunamachandran is a former director for research, innovation and skills at the Higher Education Funding Council for England. He is vice-chair and a board member of the Higher Education Academy.


Think of a researcher measuring the trajectory of a laser beam in a university physics lab or a history professor digging through a church’s long-lost archives and their work can often feel far removed from life’s daily reality. But as the amount of funding the government is able to spend on research comes under pressure, so pressure is mounting on academics to show the impact of their work on the real world.

On December 18, universities and their staff will be keenly analysing data published in the Research Excellence Framework (REF), a national review of university research. For the first time, university research will be judged on its impact outside the world of academia. These results will later be used to inform how much research funding different universities receive.

The Conversation asked the man who developed the REF back in 2008, Rama Thirunamachandran, vice-chancellor and principal at Canterbury Christ Church University, to talk us through it.

Why is the REF so important?

A significant amount of public funding goes into research, so government quite understandably needs to ensure that the research coming out of this significant investment is of high quality.

Since 1986 there has been a national review of research done roughly every five or six years. It was initially called the Research Selectivity Exercise, then in 1992 it became the Research Assessment Exercise (RAE) and in 2008 it became the Research Excellence Framework. The purpose is to assess the quality of research going on in universities through a profile of each subject area. These assessment profiles will then inform the amount of public funding going to subject areas at different universities.

What is new this year?

The important addition this year is the assessment of the impact of research. Panels of peer-reviewers assess the research submitted by universities for how much impact it has had on the real world.

How is it measured?

Research outputs, such as research publications, account for 65% of the profile, impact for 20%, and the research environment, which includes staff development and the training of postgraduate researchers, for 15% of the overall quality profile. It is not individual academics who receive a rating, but the research subject area at a particular university. Each is given a starred level from one (the lowest) to four (the highest), or listed as unclassified if the quality falls below the standard of nationally recognised research.
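As a rough sketch, the overall quality profile is simply a weighted blend of the three element profiles under these weights (the sub-profiles below are invented for illustration):

```python
# REF 2014 weights: outputs 65%, impact 20%, environment 15%.
WEIGHTS = {"outputs": 0.65, "impact": 0.20, "environment": 0.15}

# Illustrative sub-profiles: percentage at 4*, 3*, 2*, 1*, unclassified.
profiles = {
    "outputs":     [30, 45, 20, 5, 0],
    "impact":      [40, 40, 15, 5, 0],
    "environment": [50, 35, 15, 0, 0],
}

overall = [
    round(sum(WEIGHTS[el] * profiles[el][i] for el in WEIGHTS), 1)
    for i in range(5)
]
print(overall)  # weighted percentages at 4*, 3*, 2*, 1*, unclassified
```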

In the past, the results have shown a concentration of research excellence at a few universities. Will we see that same pattern this year?

The shift in the REF after the 2008 RAE, which I was responsible for, towards assessing universities through quality profiles allowed the system to identify pockets of excellence across a very wide range of universities. Each university has some pockets of excellence. But the majority of very high-quality research, defined as three star and four star – research which is world-leading, or research which is internationally excellent – is found in probably 20 or so universities in the country. But it’s important to note that the UK’s other universities do have a significant amount of research capability, some of which is internationally excellent.

What relation does the REF have to university funding?

The funding calculations are not made until the spring, by the Higher Education Funding Council for England (HEFCE), so universities won't know the funding implications when the REF results are announced. If their results are better than last time round, and better than the average improvement in the sector, they might infer that a bit more money could be coming into their particular university.

But university research funding [in England] is dependent on how much money the government provides for research to HEFCE, and that won't be known until the new year. There is a fixed pot of money for research which goes from the government to HEFCE, and HEFCE then develops a formula to allocate those limited resources using the REF results.

HEFCE always gives the highest priority to the highest-rated research: four-star research will receive the highest levels of funding and three-star the next highest. It is likely that no funding will be available for research rated two star, one star or unclassified, simply because there is not enough money to go around.
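A toy sketch of such an allocation formula is below. The 4:1 weighting of 4* over 3* (with nothing below) mirrors the selectivity described above but is an assumption here, as are the volumes and the size of the pot:

```python
# Hedged sketch: allocate a fixed pot in proportion to
# quality-weighted research volume. Weights and data are assumed.
QUALITY_WEIGHTS = {4: 4.0, 3: 1.0, 2: 0.0, 1: 0.0, 0: 0.0}

def weighted_volume(profile, fte):
    """Quality-weighted volume for one submission (profile in % per star)."""
    return fte * sum(QUALITY_WEIGHTS[s] * pct / 100 for s, pct in profile.items())

units = {
    "Uni A": weighted_volume({4: 40, 3: 45, 2: 10, 1: 5, 0: 0}, fte=100),
    "Uni B": weighted_volume({4: 20, 3: 50, 2: 25, 1: 5, 0: 0}, fte=100),
}

pot = 10_000_000  # £, illustrative fixed pot
total = sum(units.values())
for name, vol in units.items():
    print(f"{name}: £{pot * vol / total:,.0f}")
```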

How useful is the REF for students and potential students?

There is clearly a connection between research and teaching, so I’d like to think most academics who are teachers are also involved in research in some way. Not all of them will be involved in research of the quality which will get them submitted to the REF and there are many other factors which affect the quality of teaching and the experience students receive.

Some of the less research-intensive universities are strongly focused on the quality of their education and the student experience. Just looking at the REF results is not hugely helpful to a potential applicant: it says something about the quality of research in that unit, or that university, but it doesn't tell you anything about the quality of education, the nature of the curriculum, or the teaching and contact hours that students have.

But at postgraduate level, particularly for postgraduate research students, I think that the stronger research units provide a very good quality research environment for them to succeed and thrive.

What are the key things to look out for in this year’s REF results?

One interesting thing to see will be to look out for the pockets of excellence from 2008 and see whether they have been maintained. The other would be how the subject assessment panels have assessed impact and how this has played into the overall assessment of each unit.

Is the way the UK assesses research different from other countries?

We were probably the pioneers of this type of exercise, way back in the mid 1980s. Undoubtedly, some of the countries whose higher education is modelled on the UK system, such as Australia and Hong Kong, have modelled their research systems on very similar lines. But then there are very successful research systems in countries such as the US, where there isn’t anything comparable with this and assessment and peer review happens at different levels.



The Research Excellence Framework (REF)


The Research Excellence Framework (REF) is a national assessment of the research taking place across UK universities. REF takes place cyclically, typically every six years.

The framework is used by the four UK higher education funding bodies (Research England, the Scottish Funding Council, the Higher Education Funding Council for Wales, and the Department for the Economy, Northern Ireland) to assess the quality of research and to inform the distribution of research funding to UK universities, currently worth around £2 billion per year.

The main purpose of REF is to:

  • Act as an assessment for UK research funding allocations
  • Provide accountability for public funding of research and demonstrate the benefits
  • Provide significant benchmarking information on research reputation for universities 

The results for REF 2021 are available on the REF 2021 website.

How is the REF assessment carried out?

The REF looks at three areas of assessment, which together reflect the key characteristics of research excellence. For REF 2029, the next REF assessment, some changes are currently being finalised; you can read more on the REF 2029 website.

The three key components of REF are:

  • Contribution to knowledge and understanding (accounting for around 50% of the assessment)
  • Engagement and Impact  (accounting for 25% of the assessment)
  • People, Culture and Environment  (accounting for 25% of the assessment)

Each university is required to make an institutional submission to the REF, broken down into 34 disciplinary units known as Units of Assessment (UoAs). These submissions are assessed by an expert panel of academics and public, private and third-sector experts in each UoA. There are four main panels reflecting the following subject areas:

  • Main Panel A: Medicine, health and life sciences
  • Main Panel B: Physical sciences, engineering and mathematics
  • Main panel C: Social sciences
  • Main Panel D: Arts and humanities

These panels are appointed by the four UK funding bodies and oversee the assessment, ensuring that the assessment criteria and standards are applied consistently. They are supported by the Equality and Diversity Advisory Panel (EDAP) and the Interdisciplinary Research Advisory Panel (IDAP). All universities are also required to submit a code of practice detailing the policies and processes governing how they develop their REF submission.

Find out more about Exeter REF 2021.

The University of Manchester


Research Excellence Framework 2021

The University of Manchester's position as a research powerhouse has been confirmed in the results of the 2021 Research Excellence Framework (REF).


Key results

  • We have retained fifth place for research power.¹
  • Overall, 93% of the University’s research activity was assessed as ‘world-leading’ (4*) or ‘internationally excellent’ (3*).
  • We ranked in 10th place in terms of grade point average² (an improvement from 19th in the previous exercise, REF 2014).
  • The Times Higher Education places us even higher, at eighth on GPA (up from 17th place), as their analysis excludes specialist HE institutions.
  • We ranked in the top three nationally for nine subjects (by Unit of Assessment grade point average or research power).

The Research Excellence Framework (REF) is the system for assessing the quality of research in UK higher education institutions. Manchester made one of the largest and broadest REF submissions in the UK, entering 2,249 eligible researchers across 31 subject areas.

The evaluation encompasses the quality of research impact, the research environment, research publications and other outputs.

REF results

Overall, 93% of the University’s research activity was assessed as ‘world-leading’ (4*) or ‘internationally excellent’ (3*). The evaluation encompasses the quality of research impact (96% 3* or 4*), the research environment (99% 3* or 4*), research publications and other outputs (90% were 3* or 4*).

We ranked in 10th place in terms of grade point average, an improvement from 19th in the previous exercise, REF 2014. The Times Higher Education places us even higher at eighth on GPA (up from 17th place), as their analysis excludes specialist HE institutions. This result was built upon a significant increase in research assessed as ‘world leading’ (4*) between REF 2014 and REF 2021.

The University came in the top three for the following subjects (Unit of Assessment by grade point average or research power):

  • Allied Health Professions, Dentistry, Nursing and Pharmacy
  • Business and Management Studies
  • Drama, Dance, Performing Arts, Film and Screen Studies
  • Development Studies
  • Engineering

The University had 19 subjects in the top ten overall by grade point average and 15 when measured by research power.

Research impact

Social responsibility underpins research activity at Manchester, and we combine expertise across disciplines to deliver pioneering solutions to the world’s most urgent problems.

We’re ranked as one of the top ten universities in the world for delivering against the UN’s Sustainable Development Goals ( Times Higher Education Impact Rankings) and our research impact showcase includes examples of the positive impact we’ve made across culture and creativity, economic development and inequalities, health and wellbeing, innovation and commercialisation, and sustainability and climate change.

Professor Dame Nancy Rothwell, President and Vice-Chancellor of The University of Manchester, said: "These comprehensive and independent results confirm Manchester's place as a global powerhouse of research.

“We create an environment where researchers can thrive and exchange ideas. Most importantly the quality and impact of our research is down to the incredible dedication and creativity of our colleagues who work every day to solve significant world problems, enrich our society and train the next generation of researchers.

“The fact that our REF results are accompanied by examples of the real difference we’ve made in the world, all driven from this city makes me very proud.”

Research environment

The REF exercise also evaluated the University’s work to provide a creative, ambitious and supportive research environment , in which researchers at every career stage can develop and thrive as leaders in their chosen field.

In this category, the University achieved a result of 99% ‘internationally excellent’ or ‘world-leading’, making it one of the best places in the country to build a research career.

¹ Research power is calculated as grade point average multiplied by the number of FTE staff submitted (FTE: full-time equivalent) and gives a measure of scale and quality.

² Grade point average (GPA) is a measure of the overall or average quality of research, taking no account of the FTE submitted; it is calculated by multiplying the percentage of research at each grade by its rating, summing the results and dividing by 100.
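These two definitions translate directly into a short sketch (the figures are illustrative, not Manchester's):

```python
def research_power(gpa: float, fte: float) -> float:
    """Research power = grade point average x submitted FTE (note 1 above)."""
    return gpa * fte

# Illustrative: two units with the same GPA but different scale.
print(research_power(gpa=3.4, fte=100.0))   # 340.0
print(research_power(gpa=3.4, fte=1000.0))  # 3400.0 -- ten times the "power"
```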


The Research Excellence Framework (REF) confirms our place as a world-leading university. The results show our research and impact excellence across a broad range of disciplines and demonstrate that our research is having a significant positive impact on lives across the globe.


The REF assesses the quality and impact of research taking place in UK universities. The results inform the allocation of around £2 billion per year of public funding for universities’ research.

Research that changes lives

The Research Excellence Framework (REF) 2021 results confirm our research is changing lives and shaping the world we live in.

Our vision is to produce the highest quality research to drive intellectual advances and address global challenges. The REF results demonstrate that we are advancing towards this goal. 

The REF is a retrospective exercise looking back over seven years of incredible research at our University. We’re proud of everything that our researchers, and those who support our research environment, have achieved. But we’re also excited for the future and the inspiring discoveries we are yet to make.


Our REF results

92 per cent of our research is rated in the highest two categories in the REF 2021, meaning it is classed as world-leading or internationally excellent. The REF results demonstrate our research and impact excellence across a broad range of disciplines and confirm that our research is having a significant positive impact on lives across the globe.

We submitted to 25 different REF units of assessment comprising: 

  • All of our 1,690 independent researchers
  • 3,684 outputs (outputs can include peer-reviewed journal articles and books, as well as other types of research output such as designs and compositions)
  • 114 impact case studies (impact can include examples of improved technologies, public awareness, medical treatments, government policies or structural changes in society locally, nationally and internationally)

View all REF 2021 results

Our research impact showcase spans four themes: pushing the boundaries of innovation and technology; transforming health, health care and social care; addressing our food, energy and sustainability challenges; and developing solutions to our social and cultural challenges.


LSE results for the latest Research Excellence Framework (REF)

The REF 2021 results are a great achievement for the School and reflect the incredibly hard work done by colleagues over a number of years.

Professor Susana Mourato, Pro Director for Research

LSE’s outstanding contribution to social science research has once again been recognised by the 2021 Research Excellence Framework (REF). LSE is shown as the top university (of multiple submissions) in the UK based on the proportion of ‘world-leading’ (4*) research produced. LSE is also the joint second-ranking university in the UK overall, when considering research outputs, research impact and research environment.

58 per cent of LSE’s research was judged to be world-leading (4*) and 35 per cent was deemed to be internationally excellent (3*).

A number of LSE departments did particularly well, with their Units of Assessment coming top overall: the departments of Economics, Anthropology, Social Policy, Health Policy, and Media and Communications. Academics from the Departments of Gender Studies, Methodology, and Psychological and Behavioural Science also contributed to these top-scoring Units of Assessment.

Institutional percentages show weighted averages for 4* and 3* and weighted GPA.


The Research Excellence Framework (REF) is the system by which the UK’s higher education funding bodies assess the quality of research in publicly funded UK higher education institutions (HEIs).  REF 2021 comprised three elements:

academic outputs, comprising a portfolio based on the FTE of REF-eligible staff submitted;

research impact, submitted as a number of impact case studies (ICSs) in proportion to the total FTE of REF-eligible staff submitted;

research environment, comprising the total number of research degrees awarded between 2014 and 2020, total research income received over the same time period, and an environment statement detailing how the submitting unit(s) supported research and impact over the period.

Outputs, impact and environment were weighted 60:25:15 respectively. All three elements were graded on a scale from 0 (unclassified) to 4* (world-leading), and the results were published as quality profiles showing the percentage of outputs, impact and environment judged to meet each of the starred levels. Submissions were invited to 34 Units of Assessment (UoAs); LSE made 15 submissions to 13 UoAs across the SHAPE subjects.

For REF 2021, HEIs were required to submit research outputs by all eligible members of staff. Each submitted member of staff could submit between one and five outputs, with the total number of outputs per UoA calculated as the total FTE of staff multiplied by 2.5.
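A minimal sketch of the output-count rule (rounding to the nearest whole number is an assumption here, not taken from the REF guidance):

```python
# REF 2021 output requirement: a UoA returns FTE x 2.5 outputs
# in total, with 1-5 outputs per submitted researcher.
def required_outputs(total_fte: float) -> int:
    # Rounding behaviour is assumed for illustration.
    return round(total_fte * 2.5)

print(required_outputs(42.4))  # 106 (illustrative FTE)
print(required_outputs(10.0))  # 25
```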

Staff were eligible for REF2021 where they were on a teaching-and-research or research-only contract of at least 0.2 FTE on 31 July 2020 and had a substantive connection to the submitting HEI.  Research-only staff also had to be classified as independent researchers.  HEIs were also required to identify which eligible staff had significant responsibility for research.  LSE submitted 100% of its staff meeting these definitions, but other HEIs had eligible staff who did not have significant responsibility for research and hence had a submission rate of less than 100%.
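These eligibility rules can be expressed as a simple predicate; a sketch with invented field names (the authoritative definitions are in the REF 2021 guidance):

```python
from dataclasses import dataclass

@dataclass
class StaffMember:
    # Field names are illustrative, not from the REF guidance.
    contract: str             # "teaching-and-research" or "research-only"
    fte: float                # contracted FTE on the census date (31 July 2020)
    substantive_connection: bool
    independent_researcher: bool = False

def ref_eligible(s: StaffMember) -> bool:
    """Sketch of the REF 2021 staff eligibility test described above."""
    if s.contract not in ("teaching-and-research", "research-only"):
        return False
    if s.fte < 0.2 or not s.substantive_connection:
        return False
    if s.contract == "research-only" and not s.independent_researcher:
        return False
    return True

print(ref_eligible(StaffMember("teaching-and-research", 0.5, True)))  # True
```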



Research Quality—Lessons from the UK Research Excellence Framework (REF) 2021

David R. Thompson

1 School of Nursing and Midwifery, Queen’s University Belfast, Belfast BT7 1NN, UK

Hugh P. McKenna

2 School of Nursing, Ulster University, Newtownabbey BT37 0QB, UK

Research quality is a term often bandied around but rarely clearly defined or measured. Certainly, the nursing contribution to research and research quality has often been under-recognised, under-valued and under-funded, at least in the UK. For over 20 years, it has been argued that there should be more investment in, and acknowledgement of, nursing’s contribution to high-quality research [ 1 , 2 ].

One way of measuring the quality of research in nursing across different higher education institutions (HEIs), and its parity with other disciplines, is through periodic national research assessment exercises. In 1986, the first national Research Assessment Exercise (RAE) in higher education (HE) took place in the UK under the government of Margaret Thatcher. The purpose of the exercise was to determine the allocation of funding to UK universities at a time of tight budgetary restrictions. Further RAEs, in different iterations, took place in 1989, 1992, 1996, 2001 and 2008. In response to criticisms, the scale and assessment process changed markedly after the first RAE.

1. The Research Excellence Framework (REF)

In 2014, the first Research Excellence Framework (REF) replaced the RAE. In 2016, the Stern report [ 3 ] identified five purposes for the REF:

  • To provide accountability for public investment in research and produce evidence of the benefits of this investment.
  • To provide benchmarking information and reputational yardsticks for the HE sector and for public information.
  • To provide a rich evidence base to inform strategic decisions about national research priorities.
  • To create strong performance incentives for HEIs/researchers.
  • To inform decisions on the selective allocation of non-hypothecated funding for research.

REF 2021 has just reported its assessment, and its outcomes inform the research grant allocations from the four HE funding bodies with effect from 2022–2023. This quality-related (QR) research funding, totalling £2 billion per year, enables HEIs to conduct their own directed research, complementing the project funding subsequently won from the UKRI Research Councils and other bodies (charities, industry, the EU, etc.); together these form the Dual Support System. So, in essence, this is an important exercise for universities and their disciplines in terms of funding, but also reputation, image and prestige.

The REF2021, for the first time, included the submission of all staff with significant responsibility for research. A total of 157 UK HEIs participated, submitting over 76,000 academic staff. As the REF is a discipline-based expert review process, 34 expert sub-panels, working under the guidance of four main panels, reviewed the submissions and made judgements on their quality. The panels comprised 900 academics, including 38 international members, and 220 users of research.

The rigour, significance and originality of research outputs (185,594) were judged on a 5-point scale:

  • 4* (quality that is world-leading in terms of originality, significance and rigour);
  • 3* (quality that is internationally excellent in terms of originality, significance and rigour but which falls short of the highest standards);
  • 2* (quality that is recognised internationally in terms of originality, significance and rigour);
  • 1* (quality that is recognised nationally in terms of originality, significance and rigour);
  • Unclassified (below the quality threshold for 1* or does not meet the definition of research used for the REF).

Impact case studies (6781) were assessed in terms of reach and significance of impacts on the economy, society and/or culture [ 4 ].

Environment was assessed in terms of vitality and sustainability (strategy, people, income and collaboration).

Results were produced as “overall quality profiles”, which show the proportions of submitted activity judged to have met each quality level from 4* to unclassified. Submissions included research outputs, examples of the impact and wider benefits of research, and evidence about the research environment. The overall quality profile awarded to each submission is derived from three assessed elements: the quality of research outputs (contributing 60% of the profile); the social, economic and cultural impact of research (contributing 25%); and the research environment (contributing 15%). The panels reviewed the submitted environment statements and statistical data on research income and doctoral degrees. A statement about the overall institution’s environment was provided to inform and contextualise the panel’s assessment.

Key findings were that the overall quality was 41 per cent world-leading (4*) and 43 per cent internationally excellent (3*) across all submitted research activities. At least 15 per cent of the research was considered world-leading (4*) in three-quarters of the UK’s HEIs. In terms of impacts, the expert panels observed the significant gains made from university investment in realising research impact. However, changes between the REF2014 and REF2021 exercises limit the extent to which meaningful comparisons can be made across results, particularly outputs.

2. How Did Nursing Fare?

Nursing research was primarily, but not exclusively, submitted to the REF2021 sub-panel (Unit of Assessment) 3 (Allied Health Professions, Dentistry, Nursing and Pharmacy). SP3 received 89 submissions from 90 universities covering a very wide range of disciplines, thus making it difficult to focus specifically on research emanating from the discipline of nursing. Additionally, some research on nursing-related issues and interventions may have been included in returns to other units of assessment and not seen by SP3 members.

Nevertheless, some overall impressions of nursing emerged:

2.1. Strengths

There was evidence of a strong level of interdisciplinary collaboration and examples of strong academic nursing leadership in large multidisciplinary research teams. The scale of research activity in nursing ranged from modest or emerging centres of nursing research through to substantial, long-established units with mature research environments. A notable feature was high quality research addressing a wide range of nursing-related issues of critical importance to recipients of nursing care.

Many of the strongest outputs focused on people’s quality of life and health outcomes and included interventions designed to support older people and those with enduring health challenges, including symptom management, self-management and managing continence, mobility problems and pain. There were also signs of a growing emphasis on evaluating new approaches to care delivery and new or extended roles which aim to enhance access to care.

There were outstanding case studies that demonstrated clear links to the underpinning research related to the outputs, including in mental health, ageing, dementia, enduring health challenges and self-management of care. There was clear evidence of impact and reach on society, policy, practice and the economy, including changes to public perceptions of health. Evidence of reach was demonstrated through improvements to healthcare practice and delivery which enhanced health outcomes and quality of life.

There was evidence of highly developed research environments in which a significant volume of world leading or internationally excellent quality research was being generated within the nursing discipline. There was evidence of nurses leading large research centres and institutions where mature inter- and trans-disciplinary research was facilitated with a clear and joined up strategy. Most stronger submissions demonstrated clear institutional strategic commitment to and investment in furthering research in the discipline. There was also evidence of methodological developments, strong collaborations with non-academic partners, explicit attention to equality, diversity and inclusion and support for early career researchers. There was good evidence of national and international esteem demonstrated in the discipline through journal editorships, grant awards panel membership, significant positions in national institutions as well as some examples of national honours and other forms of recognition.

The early research assessment exercises were instrumental in sustaining research funding in elite universities, and they achieved that goal for decades. Many of these institutions had no nursing departments (e.g., Oxford, Cambridge, Imperial). However, the results of REF 2021 show that this is no longer the case. In REF 2021, the “golden triangle” universities lost 2.4 percentage points, and there appears to be a “levelling-up” of research, with islands of excellence across the UK, often in universities with large nursing departments. For example, Northumbria leaped from 52nd to 28th place in the market-share table for QR funding. Similar trends were seen at Manchester Metropolitan University (56th to 38th) and Portsmouth (60th to 47th).

2.2. Challenges

It was sometimes difficult to determine how research activity in nursing was organised, supported and resourced. This included a few stronger research-led universities where there was little evidence that nursing was being prioritised equitably with other disciplines in terms of supporting growth and sustainability.

There were some outputs that were iterative scoping and systematic reviews which did not generate new knowledge and had difficulty meeting the originality and significance criteria. Furthermore, we know that nurses are undertaking excellent pedagogic research, yet little or none of this was returned to SP3, even though the panel’s descriptor showed that it would be welcome. Perhaps it was submitted to other expert panels such as SP23 (Education). Similarly, SP3 received few public engagement impact case studies, even though we know that nurses undertake excellent research in partnership with the public and patients.

It was apparent that, in general, nursing appears to have gained limited access to major funding schemes beyond the National Institute for Health and Care Research (NIHR) compared with many of the other disciplines included in UoA3. Additionally, there was a paucity of career development and personal award schemes to support post-doctoral nurses, which may be a key factor influencing the overall low numbers of early career researchers included in the submissions. Another concern was the low percentage of eligible nursing staff returned by some institutions, which raises the question of the extent to which eligible nursing staff in these institutions are being given time and support to undertake research as part of their academic role. This was often accompanied by a low number of early career researchers, down 3% from the REF2014 exercise, and none returned by some nursing departments. The combination of lower early career researcher numbers with a low return of eligible staff in some universities raises questions about research capacity and capability support.

These issues suggest strongly that there needs to be significant investment in nursing and more attention to the research priorities of major grant awarding bodies and access to a broader base of research funding opportunities. Mentorship, coaching and support will also promote the development of research leaders and strengthen the growth and spread of high-quality research activity.

A major and enduring challenge for nursing has been building and sustaining research capacity and capability. Nursing has long suffered from a lack of proper investment, including funding, and failure to influence the priorities of research funders. These are issues that need to be addressed urgently, particularly as there remain concerns about the availability of nurses in universities and health services at all levels, but particularly early career researchers, in the pipeline [ 5 , 6 ]. National strategies for nursing research have long been articulated [ 1 ], but little progress has been made in their implementation. Such strategies need to raise the profile of nursing research, ensure it is valued and that nurses in universities and the health services have protected research time and access to the same level of funding and support as their peers in other disciplines.

A number of expert committees have been established to identify how the REF structure and process can be improved prior to the next iteration in 2028. Sir Peter Gluckman is chairing a group of international experts and they report in October 2022. There are also committees examining the use of quantitative metrics and artificial intelligence (AI) in research assessment. What is clear is that peer review will remain, but informed by AI, metrics, and citations. Another emerging theme is the greater use of qualitative data and an emphasis on research culture, both of which should benefit nursing.

3. Conclusions

Research assessment exercises in one form or another seem likely to stay. Though imperfect, they are an important, perhaps the best available, means of ensuring scrutiny and accountability of public investment in research and for benchmarking disciplines and universities. However, the REF is merely the tip of the iceberg regarding the research and impact of UK nursing. Therefore, we should not let the REF tail wag the research dog. It is important to rejoice in the improved outcomes of nursing, but all contributions should be celebrated. Many nurses have expertly undertaken heavy administration and teaching loads so that the very best research can be submitted.

The REF is important because such exercises, beginning in the UK in 1986, have been emulated or adapted by other countries across the world, including Australia, New Zealand, Hong Kong, The Netherlands, Germany, Italy, Romania, Denmark, Finland, Norway, Sweden and the Czech Republic. The results inform the allocation of public funding on which the viability and reputation of research in nursing depends. One of the criticisms of the RAE was that university presidents used poor ratings as a reason to close, cut or merge departments/disciplines. Nursing can ill-afford to be at the mercy of such decision making. It is incumbent on the discipline to demonstrate forcibly the quality and impact of its research on the health and wellbeing of the people and communities it serves.

Conflicts of Interest

The authors were member and chair, respectively, of the REF 2021 Unit of Assessment 3 sub-panel: Allied Health Professions, Dentistry, Nursing and Pharmacy. The views expressed are their personal ones.


  • CAREER COLUMN
  • 08 July 2022

Why the party is over for Britain’s Research Excellence Framework

  • Richard Watermeyer
  • Gemma Derrick

Richard Watermeyer is professor of higher education and co-director of the Centre for Higher Education Transformations at the University of Bristol, UK.


Gemma Derrick is associate professor of higher education at the University of Bristol, UK.

The past few weeks have generated a flurry of excitement for universities in the United Kingdom with the release of the latest assessment by the Research Excellence Framework, the country’s performance-based research-funding system.


doi: https://doi.org/10.1038/d41586-022-01881-y

This is an article from the Nature Careers Community, a place for Nature readers to share their professional experiences and advice. Guest posts are encouraged.

Competing Interests

The authors declare no competing interests.


NIH announces the Simplified Framework for Peer Review

NIH Peer Review

The mission of NIH is to seek fundamental knowledge about the nature and behavior of living systems and to apply that knowledge to enhance health, lengthen life, and reduce illness and disability . In support of this mission, Research Project Grant (RPG) applications to support biomedical and behavioral research are evaluated for scientific and technical merit through the NIH peer review system.

The Simplified Framework for NIH Peer Review initiative reorganizes the five regulatory criteria (Significance, Investigators, Innovation, Approach, Environment;  42 C.F.R. Part 52h.8 ) into three factors – two will receive numerical criterion scores and one will be evaluated for sufficiency. All three factors will be considered in determining the overall impact score. The reframing of the criteria serves to focus reviewers on three central questions they should be evaluating: 1) how important is the proposed research? 2) how rigorous and feasible are the methods? 3) do the investigators and institution have the expertise/resources necessary to carry out the project? 

•        Factor 1: Importance of the Research  (Significance, Innovation), scored 1-9

•        Factor 2: Rigor and Feasibility  (Approach), scored 1-9

•        Factor 3: Expertise and Resources  (Investigator, Environment), to be evaluated with a selection from a drop-down menu

o             Appropriate (no written explanation needed)

o             Identify need for additional expertise and/or resources (requires reviewer to briefly address specific gaps in expertise or resources needed to carry out the project) 

Simplifying Review of Research Project Grant Applications

NIH Activity Codes Affected by the Simplified Review Framework.

R01, R03, R15, R16, R21, R33, R34, R36, R61, RC1, RC2, RC4, RF1, RL1, RL2, U01, U34, U3R, UA5, UC1, UC2, UC4, UF1, UG3, UH2, UH3, UH5, (including the following phased awards: R21/R33, UH2/UH3, UG3/UH3, R61/R33).

Changes Coming to NIH Applications and Peer Review in 2025

•        Simplified Review Framework for Most Research Project Grants (RPGs )

•        Revisions to the NIH Fellowship Application and Review Process

•        Updates to NRSA Training Grant Applications (under development)

•        Updated Application Forms and Instructions

•        Common Forms for Biographical Sketch and Current and Pending (Other) Support (coming soon)

Webinars, Notices, and Resources

Apr 17, 2024 - NIH Simplified Review Framework for Research Project Grants (RPG): Implementation and Impact on Funding Opportunities Webinar Recording & Resources

Nov 3, 2023 - NIH's Simplified Peer Review Framework for NIH Research Project Grant (RPG) Applications: for Applicants and Reviewers Webinar Recording & Resources

Oct 19, 2023 - Online Briefing on NIH’s Simplified Peer Review Framework for NIH Research Project Grant (RPG) Applications: for Applicants and Reviewers. See  NOT-OD-24-010

Simplifying Review FAQs

Enhancing research accessibility and reuse: new study outlines strategic measures

Today, the European Commission published a study aimed at improving access to and reuse of research results, including publications and data for scientific purposes. This marks a significant step under the European Research Area Policy Agenda 2022-2024 on an EU copyright and data legislative and regulatory framework fit for research. 

The study has identified barriers and challenges to access and reuse of publicly funded research results, evaluated effects of the EU copyright framework on research, and identified relevant provisions for research in EU data and digital legislation. On this basis, it presents options for legislative and non-legislative measures to strengthen the free circulation of knowledge and thereby contribute to reinforce the  European Research Area .

Iliana Ivanova , Commissioner for Innovation, Research, Culture, Education and Youth, said: 

“The European Union has been pioneering open science policies and actions for over a decade. At the heart of our ambitious open science policy lies a simple but powerful belief: Publicly funded research should be a public resource. In our ongoing efforts under the European Research Area and its Policy Agenda, we collaborate closely with Member States, Associated Countries and stakeholders to create an environment where knowledge flows freely to benefit the society”.

The most common barriers encountered by researchers include lack of subscriptions by their organisations, inability to get permissions from the copyright owner, and fear of copyright infringement. Research performing organisations report challenges emerging from copyright law, not only in accessing and re-using publicly funded research results, but also in making results available in open access. 

Special focus has been placed in investigating the situation in EU Member States that had introduced a Secondary Publication Right (SPR), including Germany, France, Netherlands, Belgium and Austria. SPR grants authors the right to freely share their published articles under certain conditions, alongside the initial publication in scientific journals. The study found that most research performing organisations in these Member States consider SPR to have at least a moderate impact on their research activities, including the share of research publications in open access. However, the study indicates that many researchers remain unaware of this right and a majority of research performing organisations consider certain provisions of national SPR legislation to be limiting factors. For example, the need to respect embargo periods and the fact that SPR is applicable only to the author-accepted manuscript, not the version of record, for publication. 

The study presents options for legislative and non-legislative measures. It also outlines a diversity of stakeholders’ perspectives on the options proposed, indicating the need for further analysis and discussions. Measures explored encompass the introduction of an EU-wide Secondary Publication Right and provisions that could be included in such legislation, spanning from the type of scientific output to the embargo period to be allowed. Other proposed measures focus on strengthening open-ended and flexible research exceptions. This could be achieved by introducing a fully harmonised, mandatory, and general exemption for scientific research, by clarifying lawful forms of access, and by removing excessive barriers posed by technological protection measures. Lastly, options explored also include giving guidance on current text and data mining provisions, to raise awareness and facilitate implementation by the research community. 

Finally, the study provides an analysis of provisions relevant to researchers and research organisations in EU data and digital legislation, examine the interplay against different legislative instruments, and present the main opportunities and challenges. The findings identify a growing entanglement of provisions relating to research activities and put forward recommendations. 

The study was commissioned as part of Action 2 of the  ERA Policy Agenda 2022-2024 . It has been carried out by a consortium led by PPMI Group, and including as partners the Institute for Information Law of the University of Amsterdam, the Centre for IT & IP Law at KU Leuven, and the Sant'Anna School of Advanced Studies.

It also responds to the  Council Conclusions of May 2023 on ‘High-quality, transparent, open, trustworthy and equitable scholarly publishing’, which encouraged the Commission to examine and propose measures at EU level.

More information  

Study: Improving access to and reuse of research results, publications and data for scientific purposes

Executive Summary 

European Research Area Platform

EU open science policy

Press contact:

EC Spokesperson for Research, Science and Innovation

Share this page

Viewpoint: research versus the rest of the world is a false dilemma

In polarised times, it is critical that we take a nuanced approach to designing the next framework programme, FP10

the research excellence framework

Ricardo Miguéis, head of INESC Brussels HUB.

It’s an increasingly binary world, where two opposites are presented as the only options available - and this lack of nuance is now infecting research and innovation policy.

Take for instance: Research Infrastructures versus Technology Infrastructures; basic research versus applied research; Pillar 1 versus Pillar 2; cohesion versus excellence; the east versus the west, the north versus the south; universities versus research technology organisations; the research office versus the researcher; the researcher versus the entrepreneur; the EU versus the world.

As discussions on the shape of the next EU research programme, FP10, gather pace, this polarisation is of increasing concern. In particular, it must not be allowed to undermine the European Research Council (ERC). I am a staunch supporter of the hugely successful ERC, but find that nowadays 90% of the conversations about its future in FP10 start with some version of an ‘ERC versus the world’ argument.

But the inherent danger of binary thinking in the research and innovation landscape extends far beyond the ERC, manifesting in practical, often detrimental consequences for the advancement of knowledge and technology. This binary framing not only narrows the scope of funding and policy support but also diminishes the capacity of research and innovation to respond to complex global challenges that demand a multifaceted approach.

As one example of this, a researcher faced funding barriers in her interdisciplinary research on the impact of climate change on biodiversity. Another example is an EU project to develop water purification technologies which struggled due to its cross-disciplinary nature. Both examples highlight the need for funding that supports specific skills to transition fundamental knowledge to market readiness.

But all the examples above ignore the most dangerous binary view of all:  research and innovation versus the world. This is especially when we are talking about budgetary divisions - and in the end, everything hangs on this. By bickering between ourselves about who should get more from the ever-small pot we will have access to, we are overlooking the main issue, which is properly funding R&I and thus contributing to solving the main societal problems on planet earth (and beyond, it seems).

To overcome this, we need to start thinking objectively. We need to acknowledge that the research and innovation funded by framework programmes is not an end in itself. It is, as it has been since the first EU research programme Esprit (European Strategic Programme for Research and Development in Information Technology) was set up in 1984 about Europe’s competitiveness. As Michel Carpentier, director general of DG XIII and the leading force behind Esprit put it, the aim was, “to coordinate and focus pre-competitive research in Europe and so strengthen the ability of European companies to compete in world markets.”

Did this mean basic research was out of the equation and did not receive any funding? Not at all. As the 1991/1992 Esprit report says, “The year 1991/92 for basic research marks a watershed — the completion of actions started from the first call for proposals in 1989 and the recommendations for funding further work from a second call made in October 1991.”

“Almost all the 61 actions and 13 working groups launched as a result of the first call for proposals in 1989 were completed, and 108 new projects, working groups and networks of excellence were initiated. There was, however, a high degree of directionality and the awareness that specific methodologies, skills, tools and investment were needed to transfer basic research into precompetitive research or industrial development,”

This highlights another issue that should be at the forefront of our thoughts in the build up towards FP10: the assumption that we all have the same capacity to bring knowledge to pre-competitive phase. We don’t. But we have built an impressive edifice of research and innovation capabilities, and they are all needed.

Do we need more money invested in research and innovation and in FP10? No doubt.

Will we secure €200B ? I hope so and will fight for that (we hereby fully endorse the Research Matters campaign). But we also need a visionary perspective to increase investment in research. First, we need to clearly set out a vision for the role of the EU level funding through FP10 vis à vis national and regional funding.

There is not and will not be enough money to fund all the good practice through FP10, but we do have, and will fight to increasingly have, the capacity to build the meta-structure, that is, the connections and the knowledge flows to make it happen.

This is why, in second place, we need route maps to mobilise resources towards concrete strategic goals, especially when it comes to the use of EU cohesion funds for research and innovation related investments.

Third, we need synergies, not only in funding instruments regulation, but also in planning, especially in the strategic thinking of regional, national and EU priorities, and in particular in relation to the conditions required to obtain and spend cohesion funds on research.

This must be done when negotiating framework agreements with each of the member states, if we want to even get near the goal of investing 3% of GDP on R&D across the EU in a more balanced manner.

And finally, we need far better planning and synergies at an EU horizontal level as well. We need to set a much more articulated vision for the role of different directorates in the Commission in the funding of research and innovation.

Ricardo Miguéis is head of INESC Brussels HUB, which represents the largest research and technology organisation in Portugal

Never miss an update from Science|Business:   Newsletter sign-up

Related News

the research excellence framework

Get the free Science|Business newsletters

newsletter icon

  Sign up for the Funding Newswire

  Sign up for the Policy Bulletin

  Sign up for The Widening

Funding Newswire subscription

align-right

Follow where the public and private R&I money is going and which collaborative opportunities you can pursue. 

Subscribe today

Manage my subscription

Find out more about the Science|Business Network

Network icon

  Why join?

  Become a member

  • Funding Newswire
  • The Widening
  • R&D Policy
  • Why subscribe?
  • Testimonials
  • Become a member
  • Network news
  • Strategic advice
  • Sponsorships
  • EU Projects
  • Enquire today

the research excellence framework

IMAGES

  1. What is the Research Excellence Framework?

    the research excellence framework

  2. PPT

    the research excellence framework

  3. Research Excellence Framework 2021

    the research excellence framework

  4. PPT

    the research excellence framework

  5. Research Excellence Framework (REF)

    the research excellence framework

  6. Research Excellence Framework

    the research excellence framework

VIDEO

  1. Global Performance Excellence Benchmarking Framework: Program Implementation III [71S]

  2. Faculty Development Programme HIGHER EDUCATION

  3. Faculty Development Programme HIGHER EDUCATION

  4. Faculty Development Programme HIGHER EDUCATION

COMMENTS

  1. Research Excellence Framework

    The Research Excellence Framework (REF) is the UK's system for assessing the excellence of research in UK higher education providers (HEPs). The REF outcomes are used to inform the allocation of around £2 billion per year of public funding for universities' research. The REF was first carried out in 2014, replacing the previous Research ...

  2. Ref 2029

    The Research Excellence Framework (REF) is the UK's system for assessing the excellence of research in UK higher education institutions (HEIs). The REF outcomes are used to inform the allocation of around £2 billion per year of public funding for universities' research. The REF is a process of expert review, carried out by sub-panels ...

  3. What is the REF?

    The Research Excellence Framework (REF) is the UK's system for assessing the excellence of research in UK higher education providers (HEIs). The REF outcomes are used to inform the allocation of around £2 billion per year of public funding for universities' research. The REF was first carried out in 2014, replacing the previous Research ...

  4. Research Excellence Framework

    The Research Excellence Framework (REF) is a research impact evaluation of British Higher Education Institutions (HEIs). It is the successor to the Research Assessment Exercise and it was first used in 2014 to assess the period 2008-2013.

  5. REF 2021: research excellence framework results

    The quality of UK scholarship as rated by the Research Excellence Framework has hit a new high following reforms that required universities to submit all research-active staff to the 2021 exercise. For the first time in the history of the UK's national audit of research, all staff with a "significant responsibility" for research were ...

  6. Results of Research Excellence Framework are published

    12 May 2022. The results of the UK-wide assessment of university research, conducted through the latest Research Excellence Framework (REF), have been published. The 2021 assessment process has identified a substantial proportion of world-leading research across all UK nations and English regions, and across the full range of subject areas.

  7. Home

    REF 2021 was delivered by the REF team, based at Research England, on behalf of the four funding bodies. This report provides the REF Director's review of the operational delivery of the exercise by the REF team, across the period from its inception in 2017 until its completion in 2022. Analysis of inclusion for submission, representation in ...

  8. University of Oxford REF 2021

    University of Oxford REF 2021. The UK Funding Bodies have published the outcomes of the recent national research assessment exercise, the Research Excellence Framework (REF) 2021. The REF 2021 results show Oxford's submission had the highest volume of world-leading research. [1] The University of Oxford made the largest submission of any ...

  9. Assessing research excellence: Evaluating the Research Excellence Framework

    The UK's Research Excellence Framework (REF) uses a peer-review process to evaluate the research environment, research outputs and non-academic impact of research produced by HEIs to produce a more accountable distribution of public funds. However, carrying out such a research evaluation is costly. Given the cost and that it is suggested that ...

  10. PDF YOUR SIMPLE GUIDE TO REF

    The Research Excellence Framework (REF) is a national assessment of the research taking place across UK universities. REF takes place every six years. The framework is used by the four UK higher education funding bodies (Research England, the Scottish Funding Council, the Higher Education Funding Council ...

  11. Unique, but still best practice? The Research Excellence Framework (REF

    In seven major research assessment exercises, beginning in 1986 and concluding with the 2014 Research Excellence Framework (REF), the UK has used the peer review of individuals and their outputs ...

  12. Home : REF 2014

    The Research Excellence Framework (REF) is the new system for assessing the quality of research in UK higher education institutions. The results of the 2014 REF were published on 18 December 2014. REF 2014 - key links. 2014 REF Results and submissions ; Evaluation of the 2014 REF;

  13. Oxford's REF 2021 results show largest volume of world-leading research

    The UK Funding Bodies have published the outcomes of the recent national research assessment exercise, the Research Excellence Framework (REF) 2021. The REF 2021 results show Oxford's submission had the highest volume of world­­­-leading research*. The REF 2021 results demonstrate once again that Oxford is a research powerhouse, and the ...

  14. Q&A: what is the REF and how is the quality of university research

    It was initially called the Research Selectivity Exercise, then in 1992 it became the Research Assessment Exercise (RAE) and in 2008 it became the Research Excellence Framework. The purpose is to ...

  15. The Research Excellence Framework

    The Research Excellence Framework (REF) is a national assessment of the research taking place across UK universities. REF takes place cyclically, typically every six years. The framework is used by the four UK higher education funding bodies (Research England, the Scottish Funding Council, the Higher Education Funding Council for Wales, and the ...

  16. Research Excellence Framework 2021

    The Research Excellence Framework (REF) is the system for assessing the quality of research in UK higher education institutions. Manchester made one of the largest and broadest REF submissions in the UK, entering 2,249 eligible researchers across 31 subject areas.

  17. Research Excellence Framework 2021

    Research that changes lives. The Research Excellence Framework (REF) 2021 results confirm our research is changing lives and shaping the world we live in. Our vision is to produce the highest quality research to drive intellectual advances and address global challenges. The REF results demonstrate that we are advancing towards this goal.

  18. Mammoth UK research assessment concludes as leaders eye ...

    Mammoth UK research assessment concludes as leaders eye radical shake up. Funding councils are considering changes to the Research Excellence Framework to improve research culture. By. Holly Else ...

  19. REF 2021

    The Research Excellence Framework (REF) is the system by which the UK's higher education funding bodies assess the quality of research in publicly funded UK higher education institutions (HEIs). REF 2021 comprised three elements: research environment, comprising the total number of research degrees awarded between 2014 and 2020, total ...

  20. Research Quality—Lessons from the UK Research Excellence Framework (REF

    Research Quality—Lessons from the UK Research Excellence Framework (REF) 2021. Research quality is a term often bandied around but rarely clearly defined or measured. Certainly, the nursing contribution to research and research quality has often been under-recognised, under-valued and under-funded, at least in the UK.

  21. Why the party is over for Britain's Research Excellence Framework

    Mammoth UK research assessment concludes as leaders eye radical shake up. For international colleagues who might not be familiar with it, the Research Excellence Framework (REF) is a mechanism by ...

  22. NIH announces the Simplified Framework for Peer Review

    The Simplified Framework for NIH Peer Review initiative reorganizes the five regulatory criteria (Significance, Investigators, Innovation, Approach, Environment; 42 C.F.R. Part 52h.8) into three factors - two will receive numerical criterion scores and one will be evaluated for sufficiency. All three factors will be considered in determining ...

  23. Building a Foundation for Excellence : Advancing Evidence-Based

    The improvement of patient care outcomes hinges on the advancement of nursing knowledge development at the bedside. Nurse-generated research is a cornerstone of evidence-based practice (EBP) and a mark of nursing excellence. 1 In the 2023 Magnet Application Manual, an updated requirement includes providing a description with supporting evidence of an infrastructure that supports nursing ...

  24. Enhancing research accessibility and reuse: new study outlines

    Today, the European Commission published a study aimed at improving access to and reuse of research results, including publications and data for scientific purposes. This marks a significant step under the European Research Area Policy Agenda 2022-2024 on an EU copyright and data legislative and regulatory framework fit for research.

  25. Viewpoint: research versus the rest of the world is a false dilemma

    Viewpoint: research versus the rest of the world is a false dilemma. In polarised times, it is critical that we take a nuanced approach to designing the next framework programme, FP10. Ricardo Miguéis, head of INESC Brussels HUB. It's an increasingly binary world, where two opposites are presented as the only options available - and this ...

  26. 2024 Research Excellence Award Winners

    2024 Research Excellence Award Winners. Anne Converse Willkomm, Associate Dean of the Graduate College (right) presenting the Research Excellence Award for Most Original and Creative Work to Pratusha Reddy, a PhD student in biomedical engineering (left) at Graduate Student Day on June 1, 2023.

  27. Lean Six Sigma and Industry 4.0 implementation framework for

    The purpose of this study is, therefore, to develop a structured framework that directs organizations to successfully implement LSS4.0. This study follows a combined approach including a systematic literature review to identify existing gaps in recent research and an expert panel to provide valuable insights and validation during the ...

  28. The development of an organizational excellence architecture model to

    While various studies have examined different aspects of business excellence, there is still a lack of comprehensive research on the optimal organizational excellence architecture (OEA) for an award-winning business excellence journey. The absence of a unified framework has led to inconsistent practices across organizations.