Don't Fight to be Cited

Forget Science and Nature - submit your papers to the journals read by your grant reviewers.

By Steven Wiley | January 1, 2009

An essential part of our job is to publish our work. Unfortunately, it seems like not just any scientific journal will suffice. Both grant review panels and promotion committees appear to be most impressed by papers that have made their way past the editorial gatekeepers and persnickety reviewers of top-tier journals. Of course, our own egos usually feel the same way. We all like to think of our work as both exciting and cutting-edge and acceptance in a prestigious journal is one way to get validation. Unfortunately, editors and reviewers are frequently uncooperative.

When I was a young scientist, I also thought that top journals were the best places to publish. Unfortunately, I found that trying to publish in popular journals required an enormous amount of time and effort, from ultra-succinct writing of the manuscript to answering absurd requests by reviewers. I fought not only because of youthful moral righteousness, but also because I felt that publication in highly visible journals was necessary for my papers to get cited. Without highly cited papers, I thought, I would never be able to advance in my career.

What I did not appreciate sufficiently when I was young was that high citation numbers couldn't be achieved simply by targeting specific journals. Moreover, I found that the total number of citations garnered by an article is rarely an indication of its importance—review articles in popular fields are well-known citation magnets, and my most highly cited article (>360 citations to date) is a techniques paper published in the Journal of Biological Chemistry.

Playing the citation game would be of purely academic interest if the stakes were not so high. Our citation rankings do affect our careers, no matter how much we might protest the fact. Communicating the results of our research is one of our primary responsibilities as scientists, and our set of publications is evidence of our commitment to this idea. It is important, however, not to place too much emphasis on where your papers are published. Spending a year to get your work published in a journal such as Cell, instead of a couple of papers in less prestigious journals, could negatively impact your overall research productivity, and perhaps even your citation count. (Of course, sometimes where you publish does make a difference to citations. I still kick myself for succumbing to the entreaties of several new journals by submitting a new and interesting paper to them, only to see it disappear from the face of the scientific earth.)

The best publication advice I ever received was from my postdoctoral advisor, who suggested that I choose target journals based on which scientists would evaluate my grants and write letters of recommendation. What journals did they cite? Where did they publish? Presumably they published in journals that they respected, so if I published in those journals, they would see my work and could comment on my science and its impact. My peers are actually more likely to see my papers in good-quality specialty journals than in Science or Nature or Cell, which they often don't browse. Still, since everyone knows how hard it is to get accepted in top-tier journals, a paper there looks great on your CV.

Indeed, although the promotion committees I have served on consider the number and quality of the candidate's publications, they are usually most impressed by glowing letters of recommendation from prominent scientists in the candidate's field. This almost always requires publishing in the appropriate specialty journals and talking to those scientists at meetings. If you are serious about a field, your publication efforts should be specifically targeted to where the field publishes.

For the most part, I took my advisor's advice and only rarely tried to publish in the trendy journals. And I have found that many of the papers I published in places like the Journal of Biological Chemistry or Molecular Biology of the Cell have been cited more than those I managed to get into Science or Cell. By primarily targeting specialty journals, I have managed to publish more than 100 papers that have been cited more than 6,000 times. Knowing your target audience is a much easier way to get more citations than fighting with journal editors and reviewers.

Steven Wiley is a Pacific Northwest National Laboratory Fellow and director of PNNL's Biomolecular Systems Initiative.


anonymous poster

Posts: 2

January 5, 2009

This might be great advice for people who are already running their labs, but don't listen to it if you are looking to land a job at a top-tier institution. I have been told again and again that I need a paper in those highest-impact journals to be considered for those positions, and that this would have been preferable to the numerous papers I already have in good-quality specialty journals. This has been true for the recent hires at our institution as well, unfortunately. The problem is that you never know what they actually contributed to the work, or whether the big-name lab they came from helped to get it in.

I consider this a sad statement on the current state of the post-doc-to-faculty transition, but it is what I have been told over and over. Better a trendy article than solid research.

Ellen Hunt

Posts: 199

January 5, 2009

Getting glowing recommendations and excellent "product placement" for one's papers can also be a hallmark of a corrupt scientist working for a corrupt scientist. Yes, you know who you are.

Today, science has a don't-ask-don't-tell policy on corruption. Some who practice it proclaim publicly that "everybody does it"; those are the more sophisticated intellectual thieves. Others who practice corruption must, of necessity, be silent or confrontational when questioned, because they publish papers based on fraudulent or cherry-picked data in order to make sure their next R01 comes in.

This makes it even more difficult for honest scientists. Like Bernie Madoff, some manage to always publish and never make a bad call, despite (in some cases) having a less than thorough grasp of their subject matter. In finance, this is called over-correlation of year-on-year results. It is one thing to bury failures and mistakes, and quite another to renovate them into something they are not.


Morgan

Posts: 11

January 5, 2009

@Anonymous poster

This article *is* good advice, and it is not just about Science/Nature papers. I am presently on a search committee for a high-profile faculty job at a Tier 1 research university.

Candidate evaluation has very little to do with counting the number of Science/Nature articles. Those don't hurt, but they do not by any means make or break an application. There are much more important things, like the topic of the research, the clarity of the research plan, the letters of recommendation, and publishing topical papers in respectable journals in the field.

So, please listen to this article, not the anonymous poster.

anonymous poster

Posts: 2

January 5, 2009

@Morgan

What I said was top tier, not Tier 1, of which there is a great variety.

I am sure there are Tier 1 institutions that have more sensible internal guidelines for their committees, as you mentioned.

The top-20 schools that my post-doc colleagues have been applying to are not interested in them without those high-impact (i.e., Science, Nature) papers.

My point was that if you are interested in those schools, you will need those papers. That doesn't mean you will be happier if you get hired there! Indeed, it may be the opposite.

anonymous poster

Posts: 4

January 5, 2009

The whole business of high-impact journals damages the scientific enterprise. Many of these journals reject papers on the basis of editorial triage rather than submitting all of them to peer review. Ceding to the editors of these journals the ability to decide which scientific topics are, or are not, interesting will also subvert scientific progress.

None of this is new. See, for example:

Seglen PO. "Why the impact factor of journals should not be used for evaluating research." BMJ. 1997;314:498-502.

Lachmann PJ and Rowlinson JS. "It's what, not where, you publish that matters." Science and Public Affairs, Winter 1997/98.

anonymous poster

Posts: 125

January 6, 2009

"By primarily targeting specialty journals, I have managed to publish more than 100 papers that have been cited more than 6,000 times. Knowing your target audience is a lot easier way to get more citations than fighting with journal editors and reviewers." \n\nFirst, Steve, check your ego and admit that you CO-published 100 papers with others who probably did the bulk of experimenting and writing - like your students, postdocs, junior faculty, and other collaborators. Second, you are a consumate scientist-politician who play the odds game to get published by knowing who will review "your" papers, rather than how those will be reviewed and judged on scientific merits. Although there's a definite benefit to your strategy, it requires that one knows who the reviewers are before submitting his or her manuscript, which isn't easy since sometimes the journals keep their reviewers anonymous.
