
Contribution Details

Type Journal Article
Scope Discipline-based scholarship
Title Unbiased evaluation of ranking metrics reveals consistent performance in science and technology citation data
Authors
  • Shuqi Xu
  • Manuel Mariani
  • Linyuan Lü
  • Matúš Medo
Item Subtype Original Work
Refereed Yes
Status Published in final form
Language
  • English
Journal Title Journal of Informetrics
Publisher Elsevier
Geographical Reach international
ISSN 1751-1577
Volume 14
Number 1
Article Number 101005
Date 2020
Abstract Text Despite the increasing use of citation-based metrics for research evaluation, we do not yet know which metrics best deliver on their promise to gauge the significance of a scientific paper or a patent. We assess 17 network-based metrics by their ability to identify milestone papers and patents in three large citation datasets. We find that traditional information-retrieval evaluation metrics are strongly affected by the interplay between the age distribution of the milestone items and the age biases of the evaluated metrics; their outcomes are therefore not representative of the metrics’ ranking ability. We argue in favor of a modified evaluation procedure that explicitly penalizes biased metrics and reveals performance patterns that are consistent across the datasets. PageRank and LeaderRank turn out to be the best-performing ranking metrics when their age bias is suppressed by a simple transformation of the scores they produce, whereas other popular metrics, including citation count, HITS, and Collective Influence, produce significantly worse rankings.
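For readers after the mechanics: the "simple transformation" mentioned in the abstract is plausibly the time-balanced rescaling the authors have used in related work, in which each paper's PageRank score is z-scored against the scores of papers published around the same time, removing the systematic advantage of older nodes. The sketch below illustrates that idea only; the function name, the window size, and the teleportation parameter alpha=0.5 are illustrative assumptions, not the paper's exact settings.

```python
# A minimal sketch of age-bias suppression by score rescaling
# (illustrative; window size and alpha below are assumptions,
# not the settings used in the paper).
import numpy as np
import networkx as nx

def rescaled_pagerank(G, pub_order, window=1000, alpha=0.5):
    """Z-score each node's PageRank against temporally nearby nodes.

    G         -- nx.DiGraph; edge (i, j) means paper i cites paper j
    pub_order -- nodes of G sorted by publication date, oldest first
    window    -- number of temporally adjacent papers used for the
                 local mean and standard deviation (hypothetical default)
    """
    pr = nx.pagerank(G, alpha=alpha)
    scores = np.array([pr[n] for n in pub_order])
    rescaled = np.empty_like(scores)
    half = window // 2
    for i in range(len(scores)):
        # Local window of papers published around the same time.
        lo, hi = max(0, i - half), min(len(scores), i + half)
        w = scores[lo:hi]
        std = w.std()
        # Z-score within the window; 0 if all window scores are equal.
        rescaled[i] = (scores[i] - w.mean()) / std if std > 0 else 0.0
    return dict(zip(pub_order, rescaled))
```

Because the rescaled score measures how much a paper outperforms its contemporaries rather than its raw centrality, rankings built from it no longer systematically favor old papers, which is the property the paper's modified evaluation procedure rewards.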
Digital Object Identifier 10.1016/j.joi.2019.101005
Other Identification Number merlin-id:19177