Altmetrics (alternative metrics)
Altmetrics are gaining momentum in higher education (Holmberg, 2016). This post is based on my master's thesis (Fraumann, 2017), which explores the usage of altmetrics with a focus on research funding. Altmetrics track and count mentions of scholarly outputs on social media, news sites, and social bookmarking sites, and in policy papers. Altmetrics data providers then aggregate these mentions, making it possible to observe how many times research has been viewed, discussed, followed, shared, and downloaded.
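To illustrate the aggregation step described above in miniature, here is a hypothetical sketch in Python. The mention records, DOIs, and source names are invented for illustration only; real altmetrics data providers work with far richer data and many more sources.

```python
from collections import Counter

# Hypothetical mention records: each pairs a scholarly output
# (identified by DOI) with the source where it was mentioned.
mentions = [
    {"doi": "10.1000/xyz123", "source": "twitter"},
    {"doi": "10.1000/xyz123", "source": "news"},
    {"doi": "10.1000/xyz123", "source": "twitter"},
    {"doi": "10.1000/abc456", "source": "policy"},
]

def aggregate(mentions):
    """Count mentions per output, and per (output, source) pair."""
    per_output = Counter(m["doi"] for m in mentions)
    per_source = Counter((m["doi"], m["source"]) for m in mentions)
    return per_output, per_source

per_output, per_source = aggregate(mentions)
print(per_output["10.1000/xyz123"])               # 3 mentions overall
print(per_source[("10.1000/xyz123", "twitter")])  # 2 of them on Twitter
```

The point of the sketch is simply that aggregated counts record volume of attention per output and per source; as discussed below, they say nothing about the quality of the work mentioned.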
Following this line of thought, one might relate these mentions to impact or attention in the wider public, that is, in society outside the scientific community. In principle, everyone with an internet connection can engage with scholarly outputs online, even if only a fraction of all users actually do so. Nevertheless, it is important to note that these mentions do not correlate with the quality of a scholarly output; they mostly visualise a community of attention, that is, internet users who engage in some way or other with a scholarly output, such as a journal article. Altmetrics are an innovation with potential for further development (Bornmann, 2014; CWTS, 2017; Holmberg, 2016; Liu & Adie, 2013; Piwowar, 2013; Priem, Taraborelli, Groth, & Neylon, 2010; Robinson-García, Torres-Salinas, Zahedi, & Costas, 2014; Thelwall, Haustein, Larivière, Sugimoto, & Bornmann, 2013).
Following this development, altmetrics have reached the highest levels of European policy debate and have been discussed, for instance, during the Open Science Mutual Learning Exercise (MLE) of the Horizon 2020 Policy Support Facility. MLEs are carried out under the Joint Research Centre's Research and Innovation Observatory (RIO) and aim to provide best-practice examples from European Union (EU) Member States and Associated Countries (European Commission, 2017b). Further evidence can be found in the EU high-level expert groups that advise the European Commission on, among other things, science, research, and innovation. From 2016 to 2017, altmetrics were discussed in several reports of these high-level advisory bodies (European Commission, 2017a).
For this study, representatives of a research funding organisation and policymakers were first interviewed. Second, reviewers of a research funding organisation and researchers registered with an institutional altmetrics system were invited to take an online survey. Overall, the survey respondents and interviewees were unaware of the usage of altmetrics, although the data suggest that a few respondents are well aware of the debates surrounding them. For anyone who closely follows the international debates on the usage of altmetrics, it might come as a surprise that the concept is so widely unused in this sample; it was expected that more respondents would be aware of it. In particular, if altmetrics are discussed in high-level EU research policy debates, researchers need to be made aware of them, because this might also affect their academic careers to some extent.
As discussed before, altmetrics seem to be on the rise in policy papers and in international initiatives, such as at the level of EU policy. In turn, the findings drawn from this sample of stakeholders suggest that altmetrics are not yet widespread; in fact, they were unknown to the vast majority of the study participants. Furthermore, findings from the interviews showed that different organisational types, academic disciplines, and other categories have to be treated differently. As shown in several technical studies, altmetrics are not yet ready for routine use in research evaluations, and several challenges need to be addressed (Erdt, Nagarajan, Sin, & Theng, 2016). Nevertheless, altmetrics make it possible to render a certain impact on society visible, or to visualise attention. How this impact is interpreted and set into context is essential.
Additionally, some interviewees suggested that altmetrics might play a larger role in reporting on funded research than in demonstrating impact in research funding applications. Some respondents also voiced criticisms of altmetrics. Further, altmetrics should only be seen as a complement to citation counts and, especially, peer review. For instance, the impact of sharing a research data set can be made visible in a more timely manner than through the citation counts of a journal article. As several scholars have suggested, altmetrics data and aggregated scores need to be analysed in context. As previously mentioned, the study findings for this sample of stakeholders in research funding indicate that altmetrics are mostly unknown. This needs to be considered if and when the usage of altmetrics is proposed by policymakers.
Grischa Fraumann is a recent graduate of the Master in Research and Innovation in Higher Education (MARIHE) programme at the University of Tampere (Finland) and Danube University Krems (Austria). This blog post is based on his master's thesis, 'Valuation of altmetrics in research funding'.