
Research impact – How to deal with it? Editorial impact series part 3

We seek the truth and will endure the consequences.

Charles Seymour

(American historian 1885–1963)

This publication is the last part of a three-editorial series about research impact. In the first (Sandes-Guimarães & Hourneaux Junior, 2020), we presented the main concepts and ideas that define research impact. In the second (Hourneaux Junior & Sandes-Guimarães, 2020), we discussed the most important models and frameworks for assessing research impact in the literature.

As if these two challenges – identifying and measuring – were not quite enough, another task presents itself to researchers, universities and society in general. It is related to the numerous constraints and difficulties that can be found in the literature regarding research impact assessment.

In this editorial, we aim to present the most critical (and controversial) problems and criticisms in the current debate on research impact and to discuss some initiatives and possible solutions to those problems.

Main problems and criticisms of research impact

Despite the importance of research impact and the extensive knowledge accumulated about it, the literature also presents diverse problems and some criticism on the subject. Some of these issues are discussed next.

Causality/attribution

The attribution and causality issues are repeatedly highlighted in the research impact literature as among the main limitations of impact assessment. Causality refers to the challenge of attributing impacts to a specific source or cause; attribution concerns the proportion of influence or impact that can be attributed to efforts derived from research projects, researchers or organisations (Derrick & Samuel, 2016; Morgan Jones, Manville, & Chataway, 2017).

Using a linear and unidirectional logic model, from inputs to impacts, one can have the impression that the research alone is responsible for impacting that group or community directly and measurably (Edwards & Meagher, 2020). However, it has been recognised that the path between scientific research and its impact is a complex and non-linear process, including several interactions among researchers, users and stakeholders and interconnections among research activities and their outcomes (Penfield, Baker, Scoble, & Wykes, 2014; Riley et al., 2018). In this sense, the impact or the perceived change derives not only from specific research but also from several other factors at stake, including “luck”, serendipity and the complex networks that interact in this interface between scientific knowledge and society (Penfield et al., 2014).

Morton (2015) pointed out that research results are integrated with current beliefs and understandings, so it is not reasonable to attribute change to a specific piece of research. Meagher, Lyall, and Nutley (2008) and Morgan Jones et al. (2017) also highlighted in their studies this complexity in attributing impact to the findings of any particular research. They further stated that, in some cases, it was more reasonable to associate the impact with a researcher’s expertise and body of research than with a specific research output.

In scientific disciplines undertaking fundamental research, such as pure mathematics, attribution poses unique challenges because the impact of research is unlikely to be foreseen. Research results can be absorbed into other science areas and further developed before impacts occur, at which stage attribution becomes a substantial problem. We must be able to evaluate the contribution of fundamental science and not undermine its impacts when compared to more applied research (Penfield et al., 2014).

Many researchers have now distanced themselves from this attempt to attribute an impact to specific research (linear causality). Instead, they seek to understand research’s contribution to the “change” it may be related to, recognising the dynamic nature of the research impact process and the many influencing factors that contribute to generating impact, research being only one of them (Morton, 2015). Analysing how research has contributed to a specific impact, rather than caused it, and understanding the role of contextual factors in this process is one way of addressing the attribution problem (Morton, 2015).

Temporality

The time lag between the research (the period when it was actually done) and its impact can vary widely, depending on the field of knowledge and other factors. For example, in the medical area, the average estimate is 17 years “from bench to bedside”, while in physics, impacts may take more than 50 years to emerge (Morgan Jones et al., 2017). Other examples cited in the literature (from DNA discovery to DNA fingerprinting, 30 years; treatments for cardiovascular diseases, 10–25 years) mark the difficulty of deciding on a timeframe for impact evaluation (Penfield et al., 2014). In this sense, the point at which the assessment takes place can influence the degree and importance of the impact observed (Penfield et al., 2014).

For researchers in basic sciences, such as physics and mathematics, the research impact may take several decades to occur. Such impacts are usually unknown when the research is being designed and carried out. In the UK context, the Research Excellence Framework (REF, 2019) values impact case studies more than published papers (one 4-star case study is worth as much as six 4-star scientific papers). As highlighted by McKenna (2021), this practice might discourage curiosity-driven, “blue skies” research (even though such research can be highly impactful in the medium to long term). Another consequence is that it can drive researchers to address actual grand challenges (George, Howard-Grenville, Joshi, & Tihanyi, 2016).

Morton (2015) presents some solutions to this problem based on examples from the literature: early documentary research followed, after a lag period, by a workshop-based follow-up; or immediate analysis to characterise local and short-term impacts combined with analysis after a time lag to evaluate long-term and broader impacts. It might also be necessary to adapt the time window of analysis for some knowledge areas, as explained before. In the REF, for example, impact case studies must generally be from the past 20 years, but the architecture area was granted five more years considering the specificities of the field (McKenna, 2021). The inclusion of continued case studies in the REF 2021 assessment also enables the improvement of previously submitted case studies, demonstrating the impact’s evolution.

Performance versus process

As we showed in the previous editorial (Hourneaux Junior & Sandes-Guimarães, 2020), research impact is commonly assessed in two ways, considering either the research process or the research performance. We will not dive into each one’s specificities, as we have explained them before. The point here is that the decision to focus either on the process or on the performance will have different implications for research assessment.

Focussing only on the performance – the results or outcomes – of research and on how it generated some impact on society fails to consider the paths and processes that led to this impact, especially the engagement with users during the research process. This engagement is essential, as knowledge is socially built, and this interaction facilitates the mobilisation of knowledge among different communities (Hughes, Webber, & O’Regan, 2019). Besides, focussing only on results disregards the mistakes and learning that occur during the process, which can help academics in future research.

Evaluators’ bias towards more direct impacts

The evaluators’ panels for assessing impacts on society in the REF (2014, 2019) consist of academics and users or stakeholders of academic research. For most of these evaluators, it is their first time participating in a panel for research impact assessment. These contrasting views are vital to understanding what counts as societal impact.

Samuel and Derrick (2015) studied these differences. They identified that some evaluators perceived impact only “after there had been a marked health, economic, or other similarly ‘final’ outcome”, while others perceived it as an incremental process, accounting for more than the outcome and considering the process and movements made towards that outcome. Overall, the authors concluded that the evaluators’ views favoured “impact as an outcome over a process involving several individual impact events”. A recent study showed that the REF cases that received the lowest impact scores focussed mainly on the process and the path to generating impact and less on the change or contribution itself, compared to cases with higher impact scores (Reichard et al., 2020). In this sense, there can be a certain bias among evaluators (even with users in the evaluation process) towards more direct and instrumental impacts, possibly focussing on causality and attribution.

Having different views about what societal impact means and how it should be evaluated is normal and beneficial: in this way, we can challenge our own opinions and come to think differently about the assessment process. Nevertheless, a panel of evaluators must reach a consensus on the impact of a specific case study. To give evaluators a common foundation, an alignment on what should be considered impact is necessary so that consensus can be reached in the evaluation, regardless of their diverse opinions and ideas. Derrick and Samuel (2017) suggest, for example, pre-evaluation training.

Another criticism regarding evaluators concerns the composition of such panels. The inclusion of societal impact in research evaluation processes should be accompanied by the inclusion of non-academic actors in the assessment process. Including only academics to assess the societal (hence non-academic) impact of research could decrease the credibility of the assessment process, as scholars might attribute more importance to the impacts most valued by their own group because of the values of academic culture. Thus, it is essential to include users and other stakeholders in the evaluation process (Gunn & Mintrom, 2017). The optimal percentage is still debatable: in the REF, 27% of the panel was composed of users, while in Australia this percentage was 70% (Morgan Jones et al., 2017). Still, regardless of the optimal ratio, research users should be included to reduce the bias towards more “academic-like” impacts.

Research impact: what has been done

Given the difficulties mentioned earlier regarding research impact, it is also important to discuss “what to do” against this challenging background. A series of recent movements and trends in dealing with research impact can be observed among the several actors involved in this process.

First, regarding scholars. It may be obvious that whatever happens in relation to research impact can have a huge impact, no pun intended, on scholars. The consequences of the new forms of assessment may be inevitable and profound for them in terms of academic practice, recognition and future rewards (Gunn & Mintrom, 2017; Joly & Matt, 2017). It may also require a change in their mindset and perhaps in their current research focusses.

In contrast, we can notice that scholars have also felt that they can “be a part of the solution”. One interesting academic initiative is the Responsible Research for Business and Management (RRBM) network, “dedicated to inspiring, encouraging and supporting credible and useful research in the business and management disciplines” (RRBM, 2020). RRBM’s signatory researchers are expected to follow seven principles, as we can see in Table 1.

Table 1.
RRBM’s seven principles

Most of RRBM’s principles are directly related to research impact itself. By acknowledging – and practising – these principles, scholars would naturally broaden the impact of their research activities. A similar network created by scholars is the Impact Scholar Community, founded by participants of the Organisations and the Natural Environment Division of the Academy of Management (AOM) but focussed on early-career management researchers (Impact Scholar Community, 2020).

Second, regarding higher education institutions (HEIs). HEIs and their post-graduate programs will undoubtedly be affected by changes in how research impact is defined and measured. We can expect natural internal changes in HEIs’ processes, from faculty selection to evaluation and reward (Gunn & Mintrom, 2017; Joly & Matt, 2017), as mentioned before. Further, with today’s impact-oriented funding systems, HEIs will be required to demonstrate how their research is not only adding new knowledge but also benefitting societal actors.

HEIs will also need to justify their worth to stakeholders and public funding agencies. This process will require new modes of research organisation and knowledge production, perhaps more interdisciplinary and, at least in part, coproduced with research users. HEIs should also start keeping better track of their own data and information, analysing their situation internally and contributing to changes in the evaluation process based on these data. One example is the Universities UK network, an initiative of 140 universities in England, Scotland, Wales and Northern Ireland for disseminating their impact activities (Universities UK, 2020). Another is the project called metricas.edu, developed by three universities in the State of São Paulo, Brazil, to monitor and understand the importance of the activities performed by these universities and to build their own metrics to assess their contribution to regional and national development (Metricas.edu, 2020).

Third, regarding government. The government can also participate in this debate by raising the bar and pushing for impact categories other than the traditional ones (number of publications, number of citations and so on) as criteria for evaluating post-graduate programs. Following international trends such as the REF (2014, 2019), the Brazilian government has recently shifted to emphasise research impact as essential, not only for evaluating academic performance but also for establishing criteria for project funding approval. The Coordination for the Improvement of Higher Education Personnel (CAPES) is the government post-graduation regulation agency that monitors and evaluates all the country’s research institutions. Recently, CAPES has increased the weight of social, cultural and economic impacts in its program evaluation criteria (CAPES, 2019). Moreover, the Brazilian Ministry of Science, Technology, Innovations and Communications and its research funding agency CNPq (National Council for Scientific and Technological Development) have restricted the criteria for selecting research funding, directing resources to projects more directly related to the “economic and social development of the country” (MCTIC, 2020).

Finally, the journals. As the prevalent channel for disseminating research, journals also have an essential role in this process. Journals can create requirements making the description of other types of impact, beyond the traditional theoretical and practical implications, mandatory for publication approval. For instance, since 2020, authors who submit their papers to RAUSP Management Journal must include a brief description of their research’s “social impact” in the extended abstract. This also signals to reviewers whether the article has an actual social impact; moreover, its absence can be a criterion for desk-rejecting the article (please see Hourneaux Junior, 2020).

Final remarks

It is quite clear that research impact represents a challenge for all the actors involved, in one way or another, in research activities. It configures a new reality that requires different behaviours and processes, resulting in new and, hopefully, better effects. With this editorial series, we hope to have helped the reader understand and tackle this challenge.

References

  • Coordenação de Aperfeiçoamento de Pessoal de Nível Superior [CAPES]. (2019). GT impacto e relevância econômica e social: Relatório final de atividades, Brasília, DF: CAPES.
  • Derrick, G. E., & Samuel, G. N. (2016). The evaluation scale: Exploring decisions about societal impact in peer review panels. Minerva, 54(1), 75–97. doi: 10.1007/s11024-016-9290-0.
    » https://doi.org/10.1007/s11024-016-9290-0
  • Derrick, G., & Samuel, G. (2017). The future of societal impact assessment using peer review: pre-evaluation training, consensus building and inter-reviewer reliability. Palgrave Communications, 3(1), 1–10. doi: 10.1057/palcomms.2017.40.
    » https://doi.org/10.1057/palcomms.2017.40
  • Edwards, D. M., & Meagher, L. R. (2020). A framework to evaluate the impacts of research on policy and practice: a forestry pilot study. Forest Policy and Economics, 114, 101975 doi: 10.1016/j.forpol.2019.101975.
    » https://doi.org/10.1016/j.forpol.2019.101975
  • George, G., Howard-Grenville, J., Joshi, A., & Tihanyi, L. (2016). Understanding and tackling societal grand challenges through management research. Academy of Management Journal, 59(6), 1880–1895. doi: 10.5465/amj.2016.4007.
    » https://doi.org/10.5465/amj.2016.4007
  • Gunn, A., & Mintrom, M. (2017). Evaluating the non-academic impact of academic research: design considerations. Journal of Higher Education Policy and Management, 39(1), 20–30. doi: 10.1080/1360080X.2016.1254429.
    » https://doi.org/10.1080/1360080X.2016.1254429
  • Hourneaux Junior, F. (2020). Eight factors for desk-review rejection at RAUSP’s management journal. RAUSP Management Journal, 55(2), 123–125. doi: 10.1108/RAUSP-04-2020-151.
    » https://doi.org/10.1108/RAUSP-04-2020-151
  • Hourneaux Junior, F., & Sandes-Guimarães, L. (2020). Research impact – how to evaluate it? Editorial impact series part 2. RAUSP Management Journal, 55(4), 427–433. doi: 10.1108/RAUSP-10-2020-227.
    » https://doi.org/10.1108/RAUSP-10-2020-227
  • Hughes, T., Webber, D., & O’Regan, N. (2019). Achieving wider impact in business and management: Analysing the case studies from REF 2014. Studies in Higher Education, 44(4), 628 doi: 10.1080/03075079.2017.1393059.
    » https://doi.org/10.1080/03075079.2017.1393059
  • Impact Scholar Community. (2020). Retrieved from https://www.impactscholarcommunity.com/
    » https://www.impactscholarcommunity.com/
  • McKenna, H. P. (2021). Research impact: Guidance on advancement, achievement and assessment, Cham: Springer Nature. available at: https://doi.org/10.1007/978-3-030-57028-6
    » https://doi.org/10.1007/978-3-030-57028-6
  • Meagher, L., Lyall, C., & Nutley, S. (2008). Flows of knowledge, expertise and influence: a method for assessing policy and practice impacts from social science research. Research Evaluation, 17(3), 163–173. doi: 10.3152/095820208X331720.
    » https://doi.org/10.3152/095820208X331720
  • Metricas.edu. (2020). Projeto de pesquisa em políticas públicas. Retrieved from https://metricas.usp.br/
    » https://metricas.usp.br/
  • Ministério da Ciência, Tecnologia, Inovações e Comunicações [MCTIC]. (2020). Portaria N° 1.122, de 19 de março de 2020, diário oficial da união, edição 57, seção 1, página 19.
  • Morgan Jones, M., Manville, C., & Chataway, J. (2017). Learning from the UK’s research impact assessment exercise: a case study of a retrospective impact assessment exercise and questions for the future. The Journal of Technology Transfer, doi: https://doi.org/10.1007/s10961-017-9608-6.
    » https://doi.org/10.1007/s10961-017-9608-6
  • Morton, S. (2015). Progressing research impact assessment: a “contributions” approach. Research Evaluation, 24(4), 405–419. doi: 10.1093/reseval/rvv016.
    » https://doi.org/10.1093/reseval/rvv016
  • Penfield, T., Baker, M. J., Scoble, R., & Wykes, M. C. (2014). Assessment, evaluations, and definitions of research impact: a review. Research Evaluation, 23(1), 21–32. doi: 10.1093/reseval/rvt021.
    » https://doi.org/10.1093/reseval/rvt021
  • Reichard, B., Reed, M. S., Chubb, J., Hall, G., Jowett, L., Peart, A., & Whittle, A. (2020). Writing impact case studies: a comparative study of high-scoring and low-scoring case studies from REF2014. Palgrave Communications, 6(1), doi: https://doi.org/10.1057/s41599-020-0394-7.
    » https://doi.org/10.1057/s41599-020-0394-7
  • Research Excellence Framework [REF]. (2014). About the REF. Retrieved from www.ref.ac.uk/2014/about/
    » www.ref.ac.uk/2014/about/
  • Research Excellence Framework [REF]. (2019). REF 2021: Panel criteria and working methods. Retrieved from www.ref.ac.uk/publications/panel-criteria-and-working-methods-201902/
    » www.ref.ac.uk/publications/panel-criteria-and-working-methods-201902/
  • Riley, B. L., Kernoghan, A., Stockton, L., Montague, S., Yessis, J., & Willis, C. D. (2018). Using contribution analysis to evaluate the impacts of research on policy: Getting to ‘good enough’. Research Evaluation, 27(1), 16–27. doi: 10.1093/reseval/rvx037.
    » https://doi.org/10.1093/reseval/rvx037
  • Responsible Research for Business and Management [RRBM]. (2020). Principles of responsible science. Retrieved from www.rrbm.network/position-paper/principles-of-responsible-science/
    » www.rrbm.network/position-paper/principles-of-responsible-science/
  • Samuel, G. N., & Derrick, G. E. (2015). Societal impact evaluation: Exploring evaluator perceptions of the characterisation of impact under the REF2014. Research Evaluation, 24(3), 229–241. doi: 10.1093/reseval/rvv007.
    » https://doi.org/10.1093/reseval/rvv007
  • Sandes-Guimarães, L., & Hourneaux Junior, F. (2020). Research impact – what is it, after all? Editorial impact series part 1. RAUSP Management Journal, 55(3), 283–288. doi: 10.1108/RAUSP-07-2020-202.
    » https://doi.org/10.1108/RAUSP-07-2020-202
  • Universities UK. (2020). Impact of universities. Retrieved from www.universitiesuk.ac.uk/facts-and-stats/Pages/impact-of-higher-education.aspx
    » www.universitiesuk.ac.uk/facts-and-stats/Pages/impact-of-higher-education.aspx

Publication Dates

  • Publication in this collection
    19 May 2021
  • Date of issue
    Jan-Mar 2021
Universidade de São Paulo Avenida Professor Luciano Gualberto, 908, sala F184, CEP: 05508-900, São Paulo , SP - Brasil, Telefone: (11) 3818-4002 - São Paulo - SP - Brazil
E-mail: rausp@usp.br