Metrics refer to the quantitative analysis of published scholarly information (e.g. books, journal articles, datasets) and its related metadata (e.g. abstracts, keywords, citations), using statistics to demonstrate its impact, show trends, or highlight relationships.
This page contains information about both bibliometrics (traditional metrics) and altmetrics (alternative metrics) and provides links to resources you can use.
Metrics should be used cautiously as they do not give a full picture of research impact. They also vary considerably between disciplines so cannot be compared like-for-like.
Given this, there is increasing focus on using metrics responsibly and understanding their limitations.
The San Francisco Declaration on Research Assessment (DORA) provides a framework for using metrics responsibly, responding to “a need to improve the ways in which researchers and the outputs of scholarly research are evaluated”. DORA calls for the quality of individual research outputs to be judged on their own merits, detached from the journal in which they are published, so that an output’s perceived quality is not unduly influenced by the reputation of the publication it appears in.
Article-level metrics refer to data attached to individual articles, such as the number of citations a single paper has received.
Journal-level indicators refer to metrics such as the Journal Impact Factor and the SCImago Journal Rank. These measure the average number of citations that articles in a specific publication receive over a given period.
JIF: The Journal Impact Factor is a measure reflecting the annual average (mean) number of citations to recent articles published in a given journal. “The annual JCR impact factor is a ratio between citations and recent citable items published.” (Taken from the Metrics Toolkit.)
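In formula terms, for a given year Y the calculation runs over the two preceding years (a standard statement of the JCR definition quoted above):

\[
\mathrm{JIF}_Y = \frac{\text{citations received in year } Y \text{ to items published in years } Y-1 \text{ and } Y-2}{\text{number of citable items published in years } Y-1 \text{ and } Y-2}
\]

For example, a journal that published 100 citable items across 2022 and 2023, and whose items received 250 citations in 2024, would have a 2024 JIF of 250 / 100 = 2.5.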
It’s important to understand that the JIF has significant limitations; the Metrics Toolkit provides more information on these.
Journal Acceptance Rate: the percentage of submitted manuscripts that a journal accepts, calculated by dividing the number of manuscripts accepted for publication in a given year by the number of manuscripts submitted in that same year.
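As a simple worked example of that arithmetic:

\[
\text{acceptance rate} = \frac{\text{manuscripts accepted in year } Y}{\text{manuscripts submitted in year } Y} \times 100\%
\]

so a journal that accepted 120 of the 800 manuscripts submitted in a year would have an acceptance rate of 120 / 800 × 100% = 15%.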
Eigenfactor Score: based on the weighted number of citations in a given year to citable publications published in the journal within the five preceding years.
(source: https://researchguides.library.wisc.edu/c.php?g=1226768&p=8979285 )
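Schematically, and as a simplification (the full algorithm weights each citation by the influence of the citing journal in a PageRank-style calculation, excludes journal self-citations, and scales scores so that the scores of all journals sum to 100):

\[
\mathrm{EF}_{J,Y} \propto \sum \text{influence-weighted citations in year } Y \text{ to items published in } J \text{ during years } Y-5 \text{ to } Y-1
\]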
SCImago Journal Rank: measures the average number of weighted citations received in a year by articles published in the journal in the previous three years.
(source: https://www.nottingham.ac.uk/library/research/research-intelligence/research-metrics.aspx)
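Expressed schematically, and again as a simplification (in the real calculation each citation is weighted by the prestige of the citing journal, computed iteratively in a PageRank-style algorithm):

\[
\mathrm{SJR}_{J,Y} \approx \frac{\sum \text{prestige-weighted citations in year } Y \text{ to articles published in } J \text{ during years } Y-3 \text{ to } Y-1}{\text{articles published in } J \text{ during years } Y-3 \text{ to } Y-1}
\]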
Author-level metrics refer to citation data analysed collectively across the work of a specific author.
H-index: An author-level metric (although it can also be calculated for any aggregation of publications, e.g. journals, institutions, etc.) calculated from the citation counts of an author’s set of publications: an author has an h-index of h if h of their publications have each been cited at least h times. The Metrics Toolkit has more information on these metrics.
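As an illustration of that definition, the h-index can be computed from a list of citation counts in a few lines of code (a minimal sketch; in practice the result depends on which citation database supplies the counts):

```python
def h_index(citations):
    """Return the largest h such that h publications have
    at least h citations each."""
    # Sort citation counts from highest to lowest.
    sorted_counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(sorted_counts, start=1):
        # The rank-th most-cited paper needs at least `rank` citations.
        if count >= rank:
            h = rank
        else:
            break
    return h

# An author with papers cited 10, 8, 5, 4 and 3 times has an h-index of 4:
# four papers have at least 4 citations each, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```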
Like bibliometrics, altmetrics aim to evaluate how much attention a piece of research has received, how widely it has been shared, and its overall influence. Unlike bibliometrics, altmetrics focus on web-driven interactions, tracking mentions, likes, and shares on sources such as social media, blogs, online news, reference managers, Wikipedia, patents, peer reviews, and citation data. This means they can give a broader view of the attention received by research outputs, beyond that of the academic community. In addition to quantitative data, such as how often a source is mentioned, altmetrics also present qualitative information, such as who mentioned the research, where they’re located, and what they said about it. However, altmetrics can be gamed, may reflect popularity rather than quality, and are hard to standardise, with contextless mentions and a bias towards recent papers.
By monitoring social media sites, newspapers, government policy documents, and other sources, Altmetric aggregates mentions of research outputs and produces an Altmetric Attention Score. The Altmetric Attention Score is a weighted aggregate that reflects the volume and sources of mentions a publication receives. It considers three main factors: the number of mentions, the type of sources, and the influence of the authors sharing the research. For instance, a mention in a newspaper carries more weight than a blog post, and a tweet from an academic account is valued more highly than one from a general account.
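The weighted-aggregate idea can be sketched as a weighted sum over mention counts. Note that the weights below are purely illustrative: Altmetric’s actual weights, and its adjustments for the reach and influence of whoever is sharing, are proprietary and more involved.

```python
# Purely illustrative source weights -- not Altmetric's real values.
SOURCE_WEIGHTS = {"news": 8, "blog": 5, "policy": 3, "x_post": 1}

def attention_score(mentions):
    """mentions: dict mapping a source type to its number of mentions."""
    return sum(SOURCE_WEIGHTS.get(source, 0) * count
               for source, count in mentions.items())

# A paper with 2 news stories, 3 blog posts and 40 posts on X:
print(attention_score({"news": 2, "blog": 3, "x_post": 40}))  # -> 71
```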
This data is visually represented through the Altmetric Doughnut, a coloured doughnut that displays the Altmetric Attention Score at its centre. The colours in the doughnut indicate the sources of the mentions, such as blue for X (formerly Twitter), yellow for blogs, and red for mainstream media. Researchers can access Altmetric data through a number of publisher websites, in the repository, and by downloading the free Altmetric bookmarklet tool.
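Altmetric also offers a free public API for looking up scores programmatically. Here is a minimal sketch, assuming the public v1 endpoint and using an illustrative DOI; the response is JSON whose fields include a "score" value:

```python
import requests  # third-party HTTP library

def altmetric_score(doi):
    """Fetch the Altmetric Attention Score for a DOI via the public API.
    Returns None when the DOI has no Altmetric record (HTTP 404)."""
    response = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")
    if response.status_code == 404:
        return None
    response.raise_for_status()
    return response.json().get("score")

# Illustrative DOI -- substitute the DOI of the output you want to check.
print(altmetric_score("10.1038/nature12373"))
```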
PlumX aggregates research metrics for various scholarly outputs, categorising them into five distinct areas: citations, usage, captures, mentions, and social media. It collects mentions from sources similar to Altmetric’s but integrates these with traditional bibliometrics. The PlumX visual offers a multicoloured representation of the different metrics: green for usage, purple for captures, yellow for mentions, blue for social media, and red for citations. Hovering over the PlumX visual gives a full breakdown of mentions and their contexts. For more information on PlumX, visit the PlumX help guide.
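As a sketch of that categorisation, individual metrics can be rolled up into the five PlumX areas; the example metrics listed under each category below are typical rather than an official or exhaustive list:

```python
# Illustrative grouping of individual metrics into PlumX's five categories.
PLUMX_CATEGORIES = {
    "citations":    ["citation indexes", "policy citations"],
    "usage":        ["abstract views", "downloads"],
    "captures":     ["bookmarks", "reference manager saves"],
    "mentions":     ["news stories", "blog posts", "Wikipedia links"],
    "social media": ["shares", "likes"],
}

def category_totals(metric_counts):
    """Roll individual metric counts up into per-category totals."""
    return {
        category: sum(metric_counts.get(metric, 0) for metric in metrics)
        for category, metrics in PLUMX_CATEGORIES.items()
    }

print(category_totals({"downloads": 150, "news stories": 2, "shares": 30}))
# -> {'citations': 0, 'usage': 150, 'captures': 0,
#     'mentions': 2, 'social media': 30}
```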
Overton is the world’s largest curated database of policy documents, parliamentary transcripts, government guidance, and think tank research. Accessible via the Library Databases page, Overton encompasses around eight million policy documents and five million academic papers, and maps connections between the two, enabling users to see where ideas, papers, reports, and staff are cited or mentioned. It serves as both a discovery tool for policy material, offering full-text searching and indexing, and a citation database for tracing the broader impact of research. This extensive resource helps users find and understand policy material and measure their influence on government policy, making it invaluable for academics seeking to track the impact of their work on policy and to explore the dynamics of global policymaking.