Elsevier’s Answer to the Impact Factor

“CiteScore,” which ranks twice as many journals as are assigned impact factors, makes its debut during a contentious time for journal metrics.

By | December 9, 2016


The impact factor (IF) has arguably never been less popular. But now, the controversial journal metric has another potential rival. On December 8, Elsevier introduced its own system for ranking journals, called CiteScore. It is similar to the existing IF metric, but covers twice as many journals and is based on the Scopus database, which is more comprehensive than the IF’s data source, Web of Science.

CiteScore has a few quirks in its methodology that will affect prominent journal rankings. The Lancet, for instance, ranks fourth in the world under the IF, according to Nature, but ranks below 200th in the CiteScore system. That’s because, while both metrics calculate impact by dividing the number of citations by the total number of articles published, CiteScore includes editorials, letters to the editor, corrections, and news items in its calculation. The Lancet loses points for publishing a good deal of these types of articles, which are seldom cited.
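The denominator difference described above can be made concrete with a small sketch. All figures below are invented for illustration; they are not real Lancet or CiteScore data, and `score` is simply the shared citations-per-document core of both metrics.

```python
# Hypothetical illustration of why counting non-research items
# (editorials, letters, corrections, news) lowers a journal's score.
# All numbers are invented for demonstration purposes.

def score(citations, documents):
    """Citations divided by documents published: the core of both IF and CiteScore."""
    return citations / documents

research_articles = 400   # peer-reviewed papers (hypothetical)
front_matter = 600        # editorials, letters, news items (hypothetical)
citations = 18000         # citations received in the counting window (hypothetical)

# IF-style: the denominator counts only research articles
if_style = score(citations, research_articles)

# CiteScore-style: the denominator also counts rarely cited front matter
citescore_style = score(citations, research_articles + front_matter)

print(f"IF-style score:        {if_style:.1f}")        # 45.0
print(f"CiteScore-style score: {citescore_style:.1f}")  # 18.0
```

A journal that publishes a large volume of front matter thus sees its score diluted under the CiteScore convention even though its citation count is unchanged.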

“As there is intense competition among top-tier journals, adoption of CiteScore will push editors to stop publishing non-research documents, or shunting them into a marginal publication or their society website,” Phil Davis, a publishing consultant in Ithaca, New York, told Nature.

Meanwhile, CiteScore will likely have to contend with the same criticisms scientists have leveled against the IF. Critics have argued that journal metrics contribute to a culture of journal worship, judging researchers based on the publications that accept their work, rather than doing so on the merits of their research.

“In my view, journal metrics should always be accompanied by health warnings that are at least as prominent as the ones you see on cigarette packets,” Stephen Curry, a structural biologist at Imperial College London, told Nature. “Such metrics are at the root of many of the current evils in research assessment.”

See “Opinion: The Impact Factor, Re-envisioned”

Comments

dumbdumb

December 9, 2016

The IF may have become less popular in the country clubs of science, but out there it is increasingly the only metric that counts, second only to the number of publications per year.

In the past few years, I have witnessed a rise in job advertisements from institutions where you are clearly required to list your IF, h-index, and any other element that can prove you are "productive", no matter what you produce.

I was recently denied a job because my publication output (the number) was not up to the "standards" of the institution, despite being told that my interview and expertise were satisfactory, and that the person I was replacing came from the same lab as me and was no more skilled, according to our shared former supervisor.

And BTW, the job is still vacant.
