
Opting Out of the Numbers Game

By Eugene Garfield | February 23, 1987

As a long-time student of the scientific journal, I have witnessed instances of unwarranted co-authorship, repeated publication of the same work, and the practice of "salami science"—the slicing of a single research project into its least publishable units. In large part, such behavior by authors can be ascribed to a growing and already excessive pressure to publish in great quantity. This pressure has also been cited as contributing to recent, notorious cases of scientific fraud. Unfortunately, our academic review and reward system, which too often focuses on numbers, may occasionally encourage misconduct both great and small.

A good alternative has been suggested. Marcia Angell, deputy editor of the New England Journal of Medicine, and DeWitt Stetten, of the National Institutes of Health, have independently suggested that a ceiling be placed on the number of works a committee considers in making promotion and grant decisions. Angell has proposed that a peer-review board examine "only, at most, the three articles the candidate considers to be his or her best in any given year, with a maximum of perhaps ten in any 5-year period. Other publications should not even be listed." (Annals of Internal Medicine 104, February 1986, 262.) Stetten has suggested, "Let the applicant select, say, one dozen of his bibliographic citations that are most meaningful to him." He added, "in this regard it may be pointed out that … nomination to membership in the National Academy of Sciences requires a selective bibliography of no more than 12 publications." (Science 232, 4 April 1986, 11.)

Whatever the ceiling, the idea is the same: to emphasize the quality of publications rather than their quantity. If a committee were faced with tens instead of hundreds of papers, its members would be more likely to actually read the work before them and judge its substance. "Each publication would then receive commensurately more attention, both from the researcher and those evaluating the work," wrote Angell.

Imagine the potential result of implementing a quality-oriented review process: researchers would produce fewer, better, and more thoughtful papers, and in doing so would clear some of the clutter now clogging the journal literature. Investigators, choosing a research project without regard to its probability of yielding rapid and numerous publications, might feel freer to tackle more difficult questions, whose answers offer greater rewards. Although obviously not a panacea, a ceiling might lessen the excess pressure to publish by degrees.

Neither Angell nor Stetten knows of a single instance in which their suggestion has been adopted by a promotion or grant committee. This raises the question of what might be preventing its implementation. Tradition is probably one inhibiting factor, but there are others.

Some have argued that certain fast-growing fields require the quick, preliminary article to establish priority for the researcher, and that such an article is written chiefly for this reason, not as a substantive investigation of a problem. However, the famous two-page paper by Watson and Crick in 1953 describing the structure of DNA proves that establishing priority in a brief and substantive fashion is not impossible.

Another argument against the proposal involves research reported in a series of articles that reveal the substance of the work only when considered as a group. But a ceiling of one or two dozen papers is certainly high enough to meet this objection.

Still others think it unfair to base a judgment on a sample. Although the point does not wholly refute this objection, I note that the researcher, not the committee, would select the work being judged. In any case, a sample that is read seems more satisfactory than a large corpus that is skimmed over or not read at all.

No doubt some flexibility and certain refinements, such as including a mechanism to evaluate what a person has accomplished recently, can be built into the system.

I would be most interested to hear from readers who have other objections, as well as from any committee that has actually adopted a quality-oriented peer-review system of the type described here.

Finally, a personal note. I have acquired over the years a reputation as "the great quantifier," owing to the citation-based analyses we at ISI® publish. It might therefore seem ironic to have me endorse subjective over quantitative measures. But there is no irony. I have always emphasized the difference between the simple-minded counting of articles or citations as indicators of quality and the in-depth analyses that can and should be carried out. I have repeatedly warned against the cavalier use of citation data. But, sadly, many have found it simpler to "do their additions," in spite of the pernicious implications of this practice. Bibliometric studies can contribute to an evaluation, but ought not to substitute for other more detailed measures.

I use citation analysis as a step toward identifying publications that have elicited great attention among peers—"Citation Classics"—which are highlighted each week in Current Contents®. Researchers also can employ citation analysis to help them choose their own influential works to submit under an Angell-Stetten model of review. Since identification of quality contributions has occupied so much of my energy, my endorsing an emphasis on quality in peer review should come as no surprise.

Eugene Garfield is Publisher and Editor-in-Chief of The Scientist,
and President of the Institute for Scientific Information, Philadelphia, PA 19104.
