
Assessing excellence in research impact

updated on 06 Aug 2023

HEFCE’s consultation on the REF closed on March 17th, 2017.  It invited suggestions for how to enhance the guidance about impact and public engagement. Richard Holliman (The Open University) had a big idea: inviting panels to assess the ‘rigour’ of the process described in a case study, as well as assessing the ‘reach and significance’ of the claimed impacts. His blog post is below.

Impact!

Introducing ‘rigour’ to ‘reach and significance’

Richard Holliman

In REF 2014, social and economic impacts from research were assessed according to their demonstrable reach and significance.  One of the important questions for the current HEFCE consultation is whether these criteria were fair and equitable across all research domains. Further, did these two criteria assess all the essential elements of an impact case study?  If not, what was missing from this assessment that could be added for REF 2021?

Having supported researchers across The Open University as they prepared for REF 2014, I encountered examples where Units of Assessment were risk averse about the evidence presented in impact case studies.  

In part, I put this down to a lack of confidence in how significance was to be assessed in relation to reach, an issue that colleagues and I have also encountered in relation to planning for pathways to impact (Holliman and Davies, 2015). Further, some researchers were struggling to assess why quality in process was being overlooked, when culture change programmes were encouraging researchers to plan more effectively for impact (Grand et al., 2015).  In effect, the cart and the horse did not seem to be connected.

Evidence of confusion in researchers’ perceptions of the way these criteria were to be assessed, combined with a healthy dose of caution manifest through self-censorship, isn’t that surprising given the relatively recent introduction of the research impact agenda.  Many researchers are, in effect, ‘muddling through’ and ‘learning on the job’, as they seek to embed how they plan for, enact, and collect evidence of research impact (Holliman and Warren, 2017; Grand et al., 2016).

What should change for REF 2021

My argument is that the REF 2014 criteria of reach and significance were not sufficient (and not equitable across all case studies) to genuinely assess the quality of the impacts from research.  Excellent examples of research impact were either not selected for submission to REF 2014, because they were deemed not to meet the guidance, or were presented so as to meet the requirements of reach and significance, thereby overlooking excellence in process.

My solution to the challenge, which I post here for discussion and comment, is that an assessment of rigour should be introduced in REF 2021 alongside reach and significance.  In effect, I’m asking for researchers to be given a chance to ‘show their working’ in terms of their planning, enacting and evidence collection for research impact.

In the short term, I argue that the introduction of rigour would result in Units of Assessment having more confidence in submitting a wider range of evidence to demonstrate research impact.  In the longer term it should improve the planning for pathways to impact, and therefore the quality of research impacts across the sector.

Assessing ‘rigour’

“Surely not another requirement to demonstrate excellence in the next REF,” I hear you shout.  “Not necessarily,” I reply.  Researchers in receipt of public funds since 2010 should find this additional requirement relatively straightforward.  They can call on their previous planning for Pathways to impact, for instance. 

For the assessors, there are some well-established indicators of what excellence (‘rigour’) looks like.  My suggestions below draw on my experience of developing and implementing an Award Scheme to assess the quality of Engaged Research at The Open University (Holliman et al., 2015):

  • People: Are the various participants in the engagement processes clearly identified?  Are they the most appropriate publics to be involved?  Have ethical issues and issues of equity, inclusion and opportunity been considered?
  • Purposes: To what extent are the aims and objectives clear, where relevant SMART, and appropriate?  Is there evidence that the aims and objectives are meaningful and relevant to all the participants?  Is there evidence that ‘publics’ have been involved in shaping the aims and objectives?
  • Processes: How have the engagement activities been conducted?  When, and how often, have the publics been involved, through what mechanisms, and to what ends?  Is there evidence that the various publics have been involved in meaningful ways at different stages of the research cycle, e.g. in shaping the research, the processes of conducting the research, and co-producing outputs?
  • Gathering evidence of impact: Was performance against the aims and objectives measured in an appropriate and effective way?  Is there evidence that the knowledge exchange activities made a difference to the participants (researchers and/or publics), in terms of effects, changes or benefits?
  • Reflective practices: Has the learning from these impact-generating research activities been consolidated and shared with relevant publics and academic communities?

Reviewing the options

I’m grateful that HEFCE have opened up this consultation about the way that REF 2021 will be organised, and to the NCCPE for commissioning and hosting this discussion.  Do the ideas introduced in this post look useful?  What would you change, revise or adapt to improve on the ideas briefly outlined in this post? Share your thoughts by leaving a comment below.

Acknowledgements

The ideas discussed in this post are informed by research funded by NERC through an Innovation Award (NE/L002493/1) and draw on research findings and experiences through an award made as part of the RCUK Public Engagement with Research Catalysts (EP/J020087/1) and a further award made through the RCUK School University Partnership Initiative (EP/K027786/1).  I am grateful to Paul Manners and Jane Perrone for comments on a draft of this post.