
In scholarly and scientific publishing, altmetrics (short for "alternative metrics") are non-traditional bibliometrics proposed as an alternative or complement to more traditional citation impact metrics, such as the impact factor and ''h''-index. The term altmetrics was proposed in 2010, as a generalization of article-level metrics, and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc.
Altmetrics use public APIs across platforms to gather data with open scripts and algorithms. Altmetrics did not originally cover citation counts, but instead calculate scholarly impact based on diverse online research outputs, such as social media, online news media, and online reference managers. They demonstrate both the impact and the detailed composition of that impact. Altmetrics could be applied to research filtering, promotion and tenure dossiers, grant applications, and the ranking of newly published articles in academic search engines.
Over time, the diversity of sources mentioning, citing, or archiving articles has declined. This happened because services ceased to exist, like Connotea, or because of changes in API availability. For example, PlumX removed Twitter metrics in August 2023.
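The open-script aggregation described above can be sketched as follows: the snippet tallies altmetric events per source platform from records shaped like those returned by a public events API such as Crossref Event Data (the `source_id` field name and the sample records are illustrative assumptions, not a documented schema):

```python
from collections import Counter

def aggregate_events(events):
    """Tally altmetric events per source platform (hypothetical record shape)."""
    return Counter(e["source_id"] for e in events)

# Illustrative sample records; real API responses will differ.
sample = [
    {"source_id": "twitter", "obj_id": "https://doi.org/10.xxxx/example"},
    {"source_id": "twitter", "obj_id": "https://doi.org/10.xxxx/example"},
    {"source_id": "wikipedia", "obj_id": "https://doi.org/10.xxxx/example"},
]

print(aggregate_events(sample))  # Counter({'twitter': 2, 'wikipedia': 1})
```

In a real pipeline the `sample` list would be replaced by paginated API responses, and the per-source tallies would feed the "detailed composition" view of an article's impact.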
Adoption
The development of Web 2.0 has changed how research publications are sought and shared both within and outside the academy, and it also provides new ways to measure the broad scientific impact of scholarly work. Although traditional metrics are useful, they may be insufficient to measure immediate and uncited impacts, especially outside the peer-review realm.
Projects such as ImpactStory, and various companies, including Altmetric, Plum Analytics, and Overton, calculate altmetrics. Several publishers have started providing such information to readers, including BioMed Central, Public Library of Science (PLOS), Frontiers, Nature Publishing Group, and Elsevier.
The NIHR Journals Library also includes altmetric data alongside its publications.
In 2008, the ''Journal of Medical Internet Research'' started to systematically collect tweets about its articles. Starting in March 2009, the Public Library of Science also introduced article-level metrics for all articles.
Funders have started showing interest in alternative metrics, including the UK Medical Research Council.
Altmetrics have been used in applications for promotion review by researchers.
Furthermore, several universities, including the University of Pittsburgh, are experimenting with altmetrics at an institutional level.
However, it has also been observed that an article needs little attention to jump to the upper quartile of ranked papers, suggesting that not enough sources of altmetrics are currently available to give a balanced picture of impact for the majority of papers.
To determine the relative impact of a paper, a service that calculates altmetric statistics needs a considerably sized knowledge base. The following table shows the number of artefacts, including papers, covered by such services:
Categories
Altmetrics are a very broad group of metrics, capturing the various kinds of impact a paper or other work can have. A classification of altmetrics was proposed by ImpactStory in September 2012, and a very similar classification is used by the Public Library of Science:
* Viewed – HTML views and PDF downloads
* Discussed – journal comments, science blogs, Wikipedia, Twitter, Facebook and other social media
* Saved – Mendeley, CiteULike and other social bookmarks
* Cited – citations in the scholarly literature, tracked by Web of Science, Scopus, CrossRef and others
* Recommended – for example used by F1000Prime
Viewed
One of the first alternative metrics to be used was the number of views of a paper. Traditionally, an author would wish to publish in a journal with a high subscription rate, so many people would have access to the research. With the introduction of web technologies it became possible to actually count how often a single paper was looked at. Typically, publishers count the number of HTML views and PDF views. As early as 2004, the ''
BMJ'' published the number of views for its articles, which was found to be somewhat correlated with citations.
Discussed
The discussion of a paper can be seen as a metric that captures its potential impact. Typical sources of data for this metric include Facebook, Google+, Twitter, science blogs, and Wikipedia pages. Some researchers regard mentions on social media as citations, which can be divided into two categories: internal citations, such as retweets, and external citations, such as tweets containing links to outside documents. The correlation between such mentions and likes and citation by the primary scientific literature has been studied, and at best a slight correlation was found, e.g. for articles in PubMed.
In 2008 the ''Journal of Medical Internet Research'' began publishing views and tweets. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "Twimpact factor", the number of tweets an article receives in the first seven days after publication, as well as a "Twindex", the rank percentile of an article's Twimpact factor.
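The two measures can be sketched directly from their definitions (a minimal illustration; the function names and the percentile convention are assumptions, not the journal's published method):

```python
from bisect import bisect_left
from datetime import datetime, timedelta

def twimpact_factor(published, tweet_times):
    """Twimpact factor: tweets received in the first seven days after publication."""
    cutoff = published + timedelta(days=7)
    return sum(published <= t < cutoff for t in tweet_times)

def twindex(score, peer_scores):
    """Twindex: rank percentile of an article's Twimpact factor among peers."""
    ranked = sorted(peer_scores)
    return 100.0 * bisect_left(ranked, score) / len(ranked)

pub = datetime(2024, 1, 1)
tweets = [pub + timedelta(days=d) for d in (0, 1, 2, 10)]  # last tweet is too late
print(twimpact_factor(pub, tweets))  # 3
print(twindex(3, [0, 1, 3, 8]))      # 50.0
```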
However, research shows Twimpact factor scores to be highly subject-specific, so comparisons should only be made between papers in the same subject area.
While past research has demonstrated a correlation between tweetations and citations, the relationship is not necessarily causal. It remains unclear whether higher citations occur as a result of greater media attention via Twitter and other platforms, or whether they simply reflect the quality of the article itself.
Recent research conducted at the individual level, rather than the article level, supports the use of Twitter and other social media platforms as a mechanism for increasing impact. Results indicate that researchers whose work is mentioned on Twitter have significantly higher h-indices than researchers whose work is not mentioned on Twitter. The study highlights the role of discussion-based platforms, such as Twitter, in increasing the value of traditional impact metrics.
Besides Twitter and other streams, blogging has proven to be a powerful platform for discussing literature. Various platforms keep track of which papers are being blogged about. Altmetric.com uses this information to calculate metrics, while other tools, such as ResearchBlogging and Chemical blogspace, simply report where discussion is happening.
Recommended
Platforms may even provide a formal way of ranking or otherwise recommending papers, such as Faculty of 1000.
Saved
It is also informative to quantify the number of times a page has been saved or bookmarked. Individuals typically choose to bookmark pages that have high relevance to their own work, so bookmarks may be an additional indicator of impact for a specific study. Providers of such information include science-specific social bookmarking services such as CiteULike and Mendeley.
Cited
The cited category has a narrower definition than the discussed category. Besides traditional metrics based on citations in the scientific literature, such as those obtained from Google Scholar, CrossRef, PubMed Central, and Scopus, altmetrics also count citations in secondary knowledge sources. For example, ImpactStory counts the number of times a paper has been referenced by Wikipedia. Plum Analytics also provides metrics for various academic publications, seeking to track research productivity, and PLOS provides tools for gathering information on engagement.
Numerous studies have shown that scientific articles disseminated through social media channels (e.g. Twitter, Reddit, Facebook, and YouTube) have substantially higher bibliometric scores (downloads, reads, and citations) than articles not advertised through social media. In fields such as plastic surgery and hand surgery, higher Altmetric scores are associated with better short-term bibliometrics.
Interpretation
There is limited consensus on the validity and consistency of altmetrics, and their interpretation in particular is debated. Proponents of altmetrics make clear that many of the metrics show attention or engagement rather than the quality of an impact on the progress of science.
Even citation-based metrics do not indicate whether a high score implies a positive impact on science: papers are also cited by papers that disagree with them, an issue addressed, for example, by the Citation Typing Ontology project.
Altmetrics can be more appropriately interpreted by providing detailed context and qualitative data. For example, to evaluate the contribution of a scholarly work to policy making, qualitative data, such as who is citing the work online and to what extent the online citation is relevant to policymaking, should be provided as evidence.
Given the relatively low correlation between traditional metrics and altmetrics, altmetrics might measure complementary aspects of scholarly impact, and it is reasonable to combine and compare the two types of metrics when interpreting societal and scientific impacts. Researchers have built a 2×2 framework based on the interactions between altmetrics and traditional citations; further explanation is needed for the two groups with high altmetrics/low citations and low altmetrics/high citations. Thus, altmetrics provide a convenient way for researchers and institutions to monitor the impact of their work while avoiding inappropriate interpretations.
Controversy
The usefulness of metrics for estimating scientific impact is controversial.
Research has found that online buzz can amplify the effect of other forms of outreach on researchers' scientific impact. For nano-scientists mentioned on Twitter, interactions with reporters and non-scientists positively and significantly predicted a higher h-index, whereas no such effect was found for the non-mentioned group.
Altmetrics expand the measurement of scholarly impact by offering rapid uptake, a broader range of audiences, and coverage of diverse research outputs. In addition, the community shows a clear need: funders demand measures of the impact of their spending, such as public engagement.
However, there are limitations that affect their usefulness, arising from technical problems and systematic biases, such as data quality, heterogeneity, and particular dependencies.
In terms of technical problems, the data might be incomplete, because it is difficult to collect online research outputs that lack direct links to their mentions (e.g. videos) and to identify different versions of the same research work. Additionally, it remains unresolved whether API limitations lead to missing data.
As for systematic bias, like other metrics, altmetrics are prone to self-citation, gaming, and other mechanisms for boosting one's apparent impact, such as citation spam in Wikipedia. Altmetrics can be gamed: for example, likes and mentions can be bought. [J. Beall, Article-Level Metrics: An Ill-Conceived and Meretricious Idea, 2013] Altmetrics can also be more difficult to standardize than citations. One example is the number of tweets linking to a paper, which can vary widely depending on how the tweets are collected. Moreover, online popularity may not equal scientific value: some popular online citations may contribute little to further research discoveries, while some theory-driven or minority-targeted research of great scientific importance may be marginalized online. For example, the most tweeted articles in biomedicine in 2011 concerned curious or funny content, potential health applications, and catastrophes.
Altmetric state that they have systems in place to detect, identify, and correct gaming. Finally, recent research has shown that altmetrics reproduce gendered biases found in disciplinary publication and citation practices: for example, journal articles authored exclusively by female scholars score 27% lower on average than exclusively male-authored outputs. At the same time, the same research shows that attention scores of zero are more likely for male-authored articles.
Altmetrics for more recent articles may be higher because of the increasing uptake of the social web and because articles may be mentioned mainly when they are published. As a result, it might not be fair to compare the altmetric scores of articles unless they were published at a similar time. Researchers have developed a sign test to avoid this usage-uptake bias by comparing the metrics of an article with those of the two articles published immediately before and after it.
It should be kept in mind that metrics are only one outcome of tracking how research is disseminated and used, and altmetrics should be interpreted carefully to overcome such biases. Even more informative than knowing how often a paper is cited is which papers are citing it: that information allows researchers to see how their work is impacting the field (or not). Providers of metrics also typically provide access to the underlying data from which the metrics were calculated. For example, Web of Science shows which papers cite a given paper, ImpactStory shows which Wikipedia pages reference it, and CitedIn shows which databases extracted data from it.
Another concern about altmetrics, as with any metrics, is how universities or institutions use them to rank their employees or to make promotion and funding decisions; arguably their use should be limited to measuring engagement.
Overall online research output is still small and varies among disciplines. This may be consistent with patterns of social media use among scientists: surveys have shown that nearly half of respondents held ambivalent attitudes about social media's influence on academic impact and never announced their research work on social media. With the ongoing shift toward open science and social media use, consistent altmetrics across disciplines and institutions are more likely to be adopted.
Ongoing research
The specific use cases and characteristics of altmetrics are an active research field in bibliometrics, providing much-needed data to measure the impact of altmetrics itself. The Public Library of Science has an Altmetrics Collection, and both ''Information Standards Quarterly'' and the ''Aslib Journal of Information Management'' have published special issues on altmetrics. A series of articles that extensively reviews altmetrics was published in late 2015. Other research examines the validity of individual altmetrics or makes comparisons across different platforms.
Researchers examine the correlation between altmetrics and traditional citations as a validity test, on the assumption that a positive and significant correlation indicates that altmetrics measure scientific impact as citations do. A low correlation (less than 0.30) leads to the conclusion that altmetrics serve a complementary role in measuring scholarly impact. By contrast, Lamba (2020) examined 2,343 articles, each having both altmetric attention scores and citations, published by 22 core health care policy faculty members at Harvard Medical School, and observed a significant, strong positive correlation (r > 0.4) between the aggregated ranked altmetric attention scores and ranked citation values for all faculty members in the study. However, it remains unresolved which altmetrics are most valuable and what degree of correlation between the two kinds of metrics produces a stronger measurement. Additionally, the validity test itself faces technical problems: for example, replication of the data collection is impossible because of the constantly changing algorithms of data providers.
See also
* Academic journal
* Open Researcher and Contributor ID – nonproprietary unique identifiers
* San Francisco Declaration on Research Assessment
* Scientometrics
* Webometrics
References
External links
* Altmetrics Manifesto
* NISO Altmetrics Standards Project White Paper
* Special issue of Research Trends, Issue 37 – June 2014