Article ID: 523097
Journal: Journal of Informetrics
Published Year: 2015
Pages: 17
File Type: PDF
Abstract

• Three aspects of ranking stability were analyzed for 26,411 WoS and Scopus journals.
• Five-year citation windows provide higher temporal stability of journal rankings.
• Source normalization partially controls for cross-discipline ranking fluctuations.
• Cross-indicator ranking stability differs considerably among subject fields and indicators.
• Journals in the social sciences and humanities generally have lower ranking stability.

The article presents a large-scale comparison of journal rankings based on seven impact measures: Impact Factor (2- and 5-year), SJR, IPP, SNIP, H index, and Article Influence Score. Three aspects of ranking stability in the 2007–2014 period were analyzed: temporal, cross-discipline, and cross-indicator. Impact measures based on five-year citation windows enable more stable journal rankings over time. Journal rankings based on the source-normalized indicator (SNIP) have the largest cross-discipline stability. Journals in the fields of social sciences and humanities have lower temporal and cross-discipline ranking stability than those in the "hard" sciences. Although correlation coefficients indicate relatively high agreement among the rankings based on different indicators, variations in quartile and percentile ranks suggest different conclusions. WoS journals almost linearly improve their ranking positions in Scopus lists, while many high-impact journals covered by Scopus are not available in WoS. An important element of ranking stability is the discriminability of impact measures. Beyond the segregation between the top- and bottom-ranked journals, our assessment of "quality" relies in most cases on the rather arguable assumption that a few citations more or less make a big difference.
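The contrast between high correlation coefficients and disagreeing quartile ranks can be illustrated with a small sketch. All data below are invented for illustration (not from the study): two hypothetical rankings of twelve journals that correlate almost perfectly by Spearman's rho, yet still move a third of the journals into a different quartile.

```python
# Hypothetical sketch: two indicator-based rankings can correlate
# strongly overall while many journals still change quartile.
# The journal ranks below are invented, not taken from the article.

def quartile(rank, n):
    """Quartile (1 = top) of a 1-based rank among n journals."""
    return (4 * (rank - 1)) // n + 1

def spearman_rho(ranks_a, ranks_b):
    """Spearman rank correlation for two tie-free rank lists."""
    n = len(ranks_a)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Ranks of 12 hypothetical journals under two impact measures,
# each journal shifted by just one position between the lists.
ranks_x = list(range(1, 13))                        # e.g. indicator A
ranks_y = [2, 1, 4, 3, 6, 5, 8, 7, 10, 9, 12, 11]  # e.g. indicator B

rho = spearman_rho(ranks_x, ranks_y)
moved = sum(quartile(a, 12) != quartile(b, 12)
            for a, b in zip(ranks_x, ranks_y))
print(round(rho, 3), moved)  # rho ≈ 0.958, yet 4 of 12 change quartile
```

Even a one-position shift per journal leaves the correlation near 1 while reassigning every journal sitting on a quartile boundary, which is why correlation alone can understate ranking instability.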
