In September 2010, Jason Priem, a doctoral student at the University of North Carolina at Chapel Hill’s School of Information and Library Science, was interested in promoting the value of a set of metrics that could describe relationships between the social aspects of the web and the spread of scholarship online. He saw few terms available to describe this diverse group of analytics, so he coined the word “altmetrics.”
For practical purposes, the best-known definition of altmetrics, “the creation and study of new metrics based on the Social Web for analyzing and informing scholarship,” comes from altmetrics.org, a website set up by Priem and three of his colleagues. Since then, others have questioned the definition and the methods of calculating altmetrics in various scholarly contexts.
More than a decade earlier, changes in information technology and scholarly communication had made the idea of a set of web-based metrics for measuring impact a tempting proposition—not only for scholars but also for publishers, toolmakers, and librarians. However, Priem’s positioning of altmetrics as an alternative to citation-based bibliometrics created an immediate set of obstacles for the movement.
Bibliometrics, originally defined as a set of quantitative methods used to analyze scholarly literature, have been around since the early 1960s. These measures are largely concerned with counting and tracking journal article citations. Because journal articles tend to cite other journal articles, the major providers of bibliometrics are closely connected to established indexers of scholarly journals, such as Thomson Reuters, Scopus, and the increasingly popular Google Scholar Metrics.
In the STEM fields particularly, article-based productivity metrics are commonly accepted for purposes of evaluation and benchmarking. However, for scholars in areas that emphasize scholarly monographs over journal articles, such as the humanities, bibliometrics wield significantly less clout. The same goes for scholars whose research portfolios go beyond the bounds of traditional citation, such as the fine arts.
Although the field of altmetrics may have been positioned originally as an alternative to the filtering systems offered up by print- and citation-based bibliometrics, both approaches seek insight from the quantitative analysis of information related to scholarly output and publication. This similarity has not, however, prevented occasional periods of tension between the two fields’ respective followers.
As altmetrics have emerged and evolved, so too has the role that academic librarians play in supporting altmetrics, bibliometrics, and citation impact, a role that spans support, professional use, and advocacy. It is a familiar position. The impact factor, a measure of the average number of citations to recent articles in a given journal, was proposed as early as 1955 for use by librarians in making collection development and retention decisions. Libraries continue to bear primary responsibility for acquiring bibliometric tools, notably Web of Science, Journal Citation Reports, and Scopus, and for training researchers in their use. That support now extends to academic social networking, institutional repositories, and web services such as CiteULike and Mendeley.
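The impact factor calculation mentioned above can be sketched in a few lines. This is a minimal illustration of the standard two-year formula, not any vendor's implementation; the citation and article counts in the example are hypothetical.

```python
def journal_impact_factor(citations_to_prior_two_years: int,
                          citable_items_prior_two_years: int) -> float:
    """Two-year impact factor for year Y: citations received in Y
    to items the journal published in Y-1 and Y-2, divided by the
    number of citable items it published in Y-1 and Y-2."""
    if citable_items_prior_two_years == 0:
        raise ValueError("no citable items in the two-year window")
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 600 citations in 2014 to its 2012-2013
# output, which comprised 200 citable items.
print(journal_impact_factor(600, 200))  # → 3.0
```

The denominator counts only "citable items" (typically articles and reviews), which is one reason the measure is debated: editorials and letters can attract citations to the numerator without enlarging the denominator.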
Librarians are supporting and interacting with altmetrics in such areas as acquisition, evaluation, outreach, training, and collection development. As librarians, we are natural leaders in the field of altmetrics because of our skills with these tools and our relationships with various groups in the academic community. Our historic connections across departments make us a neutral voice, able and willing to advocate for the needs of others.