How can academic research generate impact? What support structures are in place to promote and support impact? And how can impact be measured? These were just some of the questions up for debate at a unique workshop on “Measuring the Impact of Academic Research”, hosted by Birkbeck’s Centre for Innovation Management Research (CIMR) on 2 December 2016.
Research Councils UK defines research impact as ‘the demonstrable contribution that excellent research makes to society and the economy’. This can involve academic impact, economic and societal impact or both. Academics are increasingly called upon to provide evidence of research impact, sometimes as a requirement to secure research funding and sometimes as part of formalised evaluation processes, which might involve providing quantitative evidence. As impact becomes increasingly important for academic visibility and even for the purposes of funding allocation, it is vital for the research community to better understand how it occurs and how it might be utilised to add value to the economy and society. Events such as the CIMR workshop provide a platform for collaboration and discussion, and Birkbeck was delighted to welcome experts, academics and policymakers from across the European research communities to join the debate.
The complex nature of impact
A much-discussed theme of the workshop was not so much how we can generate impact, but crucially how we can define it, and how researchers might benefit from a more concrete definition of what impact means in the context of their study.
Loet Leydesdorff, Professor in Dynamics of Scientific Communication and Technological Innovation at the University of Amsterdam, opened the first session with an insightful presentation on linear impact models and articulating societal demand. This provided an excellent starting point for participant discussion; he stressed the importance of understanding what impact is before it can be measured – after all, measuring is easy once we know what we are measuring and why. While there are many definitions of ‘impact’, we must bear in mind that the way in which we choose to define it influences the measurements we obtain. It is therefore paramount to establish a clear theoretical question to which researchers can refer back, after which an appropriate system for measuring impact can be developed.
Talks by Martyna Śliwa from the University of Essex, Anne-Wil Harzing from Middlesex University and Fernando Galindo-Rueda from the OECD brought further perspectives on how impact can be measured, drawing on the results of the Research Excellence Framework 2014 (REF), on academic impact metrics and on key data indicators. They explored how research might be organised in order to generate greater impact; one key advantage of using impact case studies to assess the impact of research, for example, is that they allow a shift in focus from the impact of a single piece of research to the impact of a whole research programme. A research programme could be considered a more appropriate unit of analysis for assessing the way in which research makes an impact on society, and probably on the development of science itself.
Promoting engagement and narrative
While all in attendance agreed that measuring impact must remain a high priority, both Loet Leydesdorff and Jonathan Adams, Chief Scientist at Digital Science, warned that current policy debate focuses too narrowly on measuring impact, at the expense of promoting it. A key issue for leading-edge university research is to identify “articulation points” at which different communities can meet. The workshop provided an opportunity to share tested methods and case studies for creating a system of incentives that encourages researchers to generate impact. Rick Delbridge and Tim Edwards from Cardiff University showcased two of their projects designed to tackle societal “grand challenges”: the Social Science Research Park (SSPARK) and the Responsible Innovation Networks (RIN). These initiatives build on the view that, in order to generate impact, the social sciences need to engage with stakeholders and give them a part in identifying the problems universities could be focusing on. Stakeholders want a voice throughout the process of generating and showcasing impact – in the deliberation, evaluation and dispute of research.
Narrative has become a crucial instrument for showcasing the impact of these kinds of processes, a point that also emerged from the case study presented by Federica Rossi from Birkbeck, Ainurul Rosli from the University of Westminster, Nick Yip from UEA, and Muthu de Silva from the University of Kent. This research group interviewed participants in Knowledge Transfer Partnerships (KTPs) and found that impact is achieved through sustained interactions within and outside the KTP, which result in knowledge co-production. The mutual benefits of the KTP start an organic ripple – the benefit of the KTP cannot be immediately established by stakeholders, but unfolds over a longer period of time. The broader economic and societal impact of knowledge co-production can be captured by asking key stakeholders to narratively reconstruct their interactions with academic research and to describe how these, over time, have changed their perspective – on the actors involved, and on their roles.
The CIMR workshop provided a space to consider and evaluate successful cases of academic impact, and to share ideas that might offer particular potential for impact academically and socially. The final panel discussion helped to draw out the key messages of the event:
- It is important to keep the definition of impact broad, as research impact can take many forms
- Universities must make space for impact by creating a system of rules and incentives that encourages academics to seek impact
- Universities must also implement incentives that encourage interdisciplinary research, because this kind of research is often the most impactful
- The impact of teaching must not be neglected, since, particularly in the social sciences, one of the key avenues for generating impact is by teaching students how to think about the world in different ways
Thank you again to our Workshop Panel Chairs and Speakers:
- Emanuela Todeva, BCNED
- Loet Leydesdorff, University of Amsterdam
- Jonathan Adams and Martin Szomszor, Digital Science
- Martyna Śliwa, University of Essex
- Anne-Wil Harzing, Middlesex University
- Fernando Galindo-Rueda, OECD
- Rosa Fernandez, NCUB
- Rick Delbridge and Tim Edwards, Cardiff University
- Nola Hewitt-Dundas, Queen's University Belfast
- Steve Roper, University of Warwick
- Federica Rossi, Birkbeck
- Ainurul Rosli, University of Westminster
- Nick Yip, University of East Anglia
- Muthu de Silva, University of Kent
- Jeremy Howells, Kellogg College Oxford
- Suma Athreye, University of Essex
- Steven Hill, HEFCE
- Gino Martini, Roche Innovation and King’s College