
Tipping the Scales: Is Impact Factor Suppression Biased Against Small Fields?

Earlier this summer, three economic history journals were suppressed from the 2017 edition of the Journal Citation Reports (JCR), the annual metrics source that reports Journal Impact Factors along with other citation metrics.

The three journals were the Journal of the History of Economic Thought (JHET), The European Journal of the History of Economic Thought (EJHET) and the History of Economic Ideas (HEI).

The grounds for suppressing all three journals were “citation stacking,” a pattern wherein a large number of citations from one or more journals affects the performance and ranking of others. In some instances, I have referred to this pattern as a “citation cartel.” Clarivate Analytics, the publisher of the JCR, has defined its methods for detecting and evaluating citation stacking.


Shortly after the JCR release, Nicola Giocoli, Editor of HEI, posted an open letter dated June 21, 2018, to Tom Ciavarella, Manager of Public Relations for Clarivate, arguing that the anomalous citation pattern was caused by a single annual review of the literature. This type of review, Giocoli argued, was “a serious piece of research that we commission to junior scholars in the field (usually, scholars who just finished their PhD) to let them recognize and assess the research trends in the discipline.” HEI had published a similar review in 2016 and was planning to publish others in 2018 and 2019.

While I am not privy to Ciavarella’s response to the appeal, Clarivate did not reverse its editorial decision. Exasperated, Giocoli lashed out in a second open letter, dated July 3, 2018:

“By a single stroke, namely, by excluding one single item published in HEI, you could indeed publish a very meaningful 2017 metric for the JHET (and the EJHET, too!). I cannot believe your team is unable to (manually?) do this very simple operation and cancel all its references from your metrics for any journal the authors may have quoted. It is just one paper!”

Following a similar communications approach, seven presidents of history of economics societies, together with eight journal editors, published another open letter appealing Clarivate’s decision and arguing that its methods for detecting citation distortion were biased against small fields:

“In many areas of the natural sciences, or in economics, for example, none of those initiatives would cause any citation distortions because of their broad net and higher level of citations.”

The problematic paper, “From Antiquity to Modern Macro: An Overview of Contemporary Scholarship in the History of Economic Thought Journals, 2015-2016,” was published in 2017 and includes 212 references. Like its predecessor (“Annual Survey of Ideas in History of Economic Thought Journals, 2014-2015”), the review restricts itself entirely to papers that count toward Journal Impact Factor (JIF) calculations. While this is enough to raise some eyebrows, it was not the factor that resulted in three journal suppressions.

Figure 1. A 2017 literature review in History of Economic Ideas includes 212 references to recent papers in the history of economic thought.

Of the 212 references in this review, 41 were directed to papers published in JHET, 46 to EJHET, and just one to HEI itself. The remaining 124 citations referenced other journals.

According to the Web of Science, there were a total of 81 citations made in 2017 from indexed sources to JHET, 66 (81%) of which were directed to papers published in the previous two years. Between 2015 and 2016, JHET published 49 citable items. If we use these numbers to calculate JHET’s 2017 Impact Factor, we arrive at a score of 1.347 (66/49). If we remove the 41 references from HEI’s literature survey, JHET’s score would be just 0.510 (25/49). For context, JHET’s 2016 JIF score was 0.490. For EJHET, we calculate a 2017 JIF score of 1.200 (90/75) including citations from the review and 0.587 (44/75) excluding the review. For comparison, EJHET’s 2016 JIF score was 0.325.

Conspicuously, one journal that avoided suppression was History of Political Economy (HPE), which received 48 JIF-directed citations from the literature review, more than half of the 92 citations that counted toward its JIF score in 2017. With these citations, HPE received a 2017 JIF score of 1.415 (92/65); without them, just 0.677 (44/65).
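To make the arithmetic in the last two paragraphs concrete, the with-and-without calculations can be reproduced with a short script. The sketch below, in Python, uses only the figures quoted above; the function name and data layout are mine, not Clarivate’s.

# A Journal Impact Factor for year Y is the number of citations received in Y
# to items published in Y-1 and Y-2, divided by the citable items from Y-1 and Y-2.
def impact_factor(jif_citations, citable_items):
    return jif_citations / citable_items

# (JIF-directed citations in 2017, citations from the HEI review, citable items 2015-2016)
journals = {
    "JHET":  (66, 41, 49),
    "EJHET": (90, 46, 75),
    "HPE":   (92, 48, 65),
}

for name, (citations, from_review, items) in journals.items():
    with_review = impact_factor(citations, items)
    without_review = impact_factor(citations - from_review, items)
    print(f"{name}: {with_review:.3f} with the review, {without_review:.3f} without it")

Running this reproduces the scores above: 1.347 versus 0.510 for JHET, 1.200 versus 0.587 for EJHET, and 1.415 versus 0.677 for HPE.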


There is no doubt that the literature survey in HEI was responsible for greatly inflating the scores and rankings of three journals, and that this led to the suppression of two of the three citation recipients along with HEI itself. Apart from the obvious fact that the survey conveniently focused on papers published in the previous two years (as did an earlier HEI review), there were no other suspicious circumstances surrounding the paper: there was just one self-citation to HEI itself, the three journals were published by different organizations, and the authors did not concurrently serve as editors of the recipient journals. I could not find any conflicts of interest among the authors, journals, and publishers. The only apparent fault of the suppressed journals was that they suffered from a general lack of citation interest.

There are many fields that suffer from the same ailment. Consider Library Science.

The library journal Interlending & Document Supply (I&DS) was suppressed for high levels of self-citation. In 2014, 73% of its self-citations were directed at its JIF. When the journal was reinstated in the JCR in 2015, JIF-directed self-citation rates grew to 96%. Most of the citations came from its editor, Mike McGrath, who routinely published surveys on related library topics. While McGrath was comprehensive in reviewing the relevant literature, I&DS generally receives very few citations: its 2015 JIF was built on just 24 citations in the numerator, 18 of which came from McGrath’s reviews.

The Law Library Journal (LLJ) was also suppressed from the JCR from 2011 through 2013. In 2013, 90% of the citations defining its Impact Factor were self-citations. When LLJ was reinstated in the JCR in 2014, self-citations accounted for 64% of its score. In real terms, this was just 18 citations out of a total of 28.

In contrast, consider the following review paper, “Highlights of the Year in JACC 2013,” published in February 2014 by the Editor-in-Chief and associate editors of the Journal of the American College of Cardiology (JACC). This review cites 267 papers, 261 of which were JIF-directed self-citations. In 2014, JACC received 14,539 JIF-directed citations, meaning this one review contributed less than 2% of JACC’s JIF score: huge in absolute terms but minimal in relative terms.

Also consider Land Degradation & Development (LDD), a journal whose society determined that its editor had abused his position to coerce authors into citing LDD and his own publications. The journal went from rags to citation riches, quintupling its Impact Factor within four years and moving from 10th to 1st place in its field.

While both JACC and LDD are examples of citation behaviors that artificially load hundreds of citations onto the performance scale, neither practice was sufficient to tip Clarivate into suppressing either journal. Reverse-engineering the JCR’s suppression decisions illustrates that they are based on high proportional thresholds.

Clarivate relies upon several criteria when determining whether titles should be suppressed and “does not assume motive on behalf of any party.” To Clarivate, this is purely an exercise in measurement. Nevertheless, the very methods used to define citation distortion make the JCR highly sensitive to tiny aberrations when they affect specialist journals in small fields and largely insensitive to big distortions when they affect large, highly cited journals.
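Clarivate’s exact thresholds are not public, but the asymmetry can be illustrated with a hypothetical proportional rule. In the sketch below, the flag_for_review function and the 50% cutoff are my own assumptions for illustration, not Clarivate’s actual criteria; the citation counts are the ones quoted earlier in this post.

# Hypothetical check: flag a journal when a single source (one review article,
# or an editor's self-citations) supplies more than a set share of the citations
# counting toward its JIF. The 0.5 threshold is illustrative, not Clarivate's rule.
def flag_for_review(from_single_source, total_jif_citations, threshold=0.5):
    share = from_single_source / total_jif_citations
    return share, share > threshold

cases = {
    "JHET (HEI review)":       (41, 66),
    "EJHET (HEI review)":      (46, 90),
    "I&DS (editor's reviews)": (18, 24),
    "JACC (editors' review)":  (261, 14539),
}

for name, (single_source, total) in cases.items():
    share, flagged = flag_for_review(single_source, total)
    print(f"{name}: {share:.0%} of JIF citations -> {'flagged' if flagged else 'not flagged'}")

Under any such proportional rule, the small history and library journals are flagged at 51% to 75%, while JACC’s 261 self-citations amount to about 2% and pass unnoticed.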

It may be time to review Clarivate’s grounds for suppression and decide whether its scales require rebalancing.

