Scientific papers are the recordkeepers of progress in research. Each year researchers publish millions of papers in more than 30,000 journals. The scientific community measures the quality of those papers in a number of ways, including the perceived quality of the journal (as reflected by the title’s impact factor) and the number of citations a specific paper accumulates. The careers of scientists and the reputation of their institutions depend on the number and prestige of the papers they produce, but even more so on the citations attracted by these papers.

In recent years, there have been several episodes of scientific fraud, including completely made-up data, massaged or doctored figures, multiple publications of the same data, theft of complete articles, plagiarism of text, and self-plagiarism. And some scientists have come up with another way to artificially boost the number of citations to their work. 


Citation cartels, where journals, authors, and institutions conspire to inflate citation numbers, have existed for a long time.

The advent of electronic publishing and authors’ need to find outlets for their papers resulted in thousands of new journals, frequently with the imprimatur of “international” and promises of open access and wide circulation. The birth of predatory journals wasn’t far behind. Recently a group of authors published a consensus definition of such publications: “Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices.”

These journals can act as milk cows: every single article in an issue may cite a specific paper or a series of papers. Sometimes the citations are more or less on topic, but in other instances there is absolutely no relationship between the content of the article and the citations. The peculiar part is that the journal the editor is supposedly working for does not profit at all; it simply provides citations to other journals. The practice would be easy enough to spot if someone at the journal paid attention. It can lead an article to accrue more than 150 citations in the same year that it was published.

How insidious is this type of citation manipulation? In one example, an individual—acting as author, editor, and consultant—was able to use at least 15 journals as citation providers to articles published by five scientists at three universities. The problem is rampant in Scopus, which includes a high number of the new “international” journals. In fact, a listing in Scopus seems to be a criterion to be targeted in this type of citation manipulation.

Why is this important? First, these individuals, who act not only as authors but also as editors and consultants, obtain hundreds of citations and outshine colleagues who play by the rules. Guess who’s more likely to get tenure or that plum promotion? Second, the numbers are staggering. For one university, I found that of the nearly 700 Scopus-listed papers its researchers published in 2019, the citation numbers of at least 20 appear to have been boosted in this way. Almost 60 percent of the citations to published studies from this university came from 15 manipulated journals, significantly padding the citation counts of those 20 articles. These suspect citations drove the university’s citations-per-paper (C/P) average up to 2.50 for the year, whereas without them the C/P would have been 1.08. Because citations per paper and/or citations per faculty are criteria in the Quacquarelli Symonds ranking and the Times Higher Education World University Rankings, this also artificially inflates the quantitative standing of the university.
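The arithmetic behind those C/P figures can be made explicit. The sketch below is illustrative only: the exact paper and citation totals are assumptions reconstructed from the rounded numbers in the text (700 papers at a C/P of 2.50 implies roughly 1,750 citations, of which about 994 would be suspect), not actual data from the university in question.

```python
def citations_per_paper(total_citations: int, total_papers: int) -> float:
    """Average citations per paper (C/P), rounded to two decimals."""
    return round(total_citations / total_papers, 2)

# Assumed figures, back-calculated from the rounded numbers in the text.
papers = 700                # "nearly 700 Scopus-listed papers"
total_citations = 1750      # implied by the reported C/P of 2.50
suspect_citations = 994     # implied difference; ~57%, i.e. "almost 60 percent"

inflated_cp = citations_per_paper(total_citations, papers)
honest_cp = citations_per_paper(total_citations - suspect_citations, papers)

print(inflated_cp)  # 2.5
print(honest_cp)    # 1.08
```

Because ranking formulas divide citations by papers (or by faculty), a relatively small number of manipulated journals can more than double an institution's headline metric.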

What can be done about citation cartels? First, editors and editorial boards of legitimate journals carry a responsibility to pay attention and to correct their colleagues. When every single article in a journal cites a specific article or group of articles, something is likely amiss; when the subjects of the cited and citing articles are unrelated, it is a dead giveaway. Citation patterns could be analyzed before publication of each issue; indeed, checking the appropriateness of citations is a major task for reviewers and editors. Journals should also reconsider the practice of using outside support-service consultants as editors of articles or of special issues that result from conferences.

Scopus itself has all the data necessary to detect this malpractice; a large number of citations to an article within its first year is a red flag. And for authors who wish to steer clear of citation cartel activities: when an editor, a reviewer, or a support service asks you to add inappropriate references, do not oblige, and report the request to the journal.

Sibrandes Poppema is the Tan Sri Jeffrey Cheah Distinguished Professor at Sunway University in Malaysia and is president emeritus of the University of Groningen in the Netherlands.
