
Citation amnesia: The results


June 25, 2009

Citing past scientific work in present-day research papers can be a slippery business. Contributions from competing labs can be glossed over, pertinent studies accidentally left out, or similar research not mentioned in an attempt to give the study at hand a sheen of novelty. We at __The Scientist__ often hear complaints from our readers concerning what they regard as either honest or purposeful omissions in the reference lists of high-profile scientific papers. So we conducted a study (http://www.the-scientist.com/citationamnesia/survey/) of our own to try and quantify the prevalence of these types of slights and ask our readers how the problem might be fixed.
Image: Wikimedia
Indeed, the vast majority of the survey's roughly 550 respondents (http://www.the-scientist.com/citationamnesia/results/) -- 85% -- said that citation amnesia in the life sciences literature is an already-serious or potentially serious problem. A full 72% of respondents said their own work had been regularly or frequently ignored in the citations list of subsequent publications. Respondents' explanations of the causes range from maliciousness to laziness.

"It certainly shows a lot of frustration out there," Geoffrey Bilder, director of strategic initiatives at the UK-based non-profit association CrossRef, said of the survey results and the accompanying anonymous comments which respondents were encouraged to leave.

The root of this frustration is likely twofold, Bilder told __The Scientist__. First, there is such a vast and growing body of scientific literature in existence that authors have an increasingly difficult job of finding and citing all the published work that relates to their own research. With modern indexing and search technologies, publishers and the publishing community may be able to help scientists accomplish the Herculean task of combing the literature. "The joke I tell is that if you can help researchers avoid reading, you're going to make a lot of money," Bilder said.

But there's also a certain "perversion" at play in the citation practices of some authors, Bilder said. "We have this naive notion that a citation is a vote." Because so much of a scientific author's worth is encapsulated in the raw numerical heft of his or her citation record, some researchers purposefully avoid citing colleagues with whose work or viewpoints they disagree. "The hidden motivation here is that [authors] don't want to give it any more prominence or any more of a vote than they have to."

One of the themes to emerge in respondents' comments was simple resentment. "Competitors willfully exclude references to my work and no one, even other colleagues, can do anything about it," one respondent wrote. "Papers published in lower impact factor journals are presumed to be second rate and ignoring/disregarding them is easy," wrote another. One commenter suggested that "several papers in prominent journals including __Cell__ would not have been accepted if the [past] work was cited."

One early-career scientist described his harsh baptism in the dog-eat-dog world of scientific publishing. "I only have 1 first author paper, and it was recently published (Jan 2009, online)," the commenter wrote. "It has already been passed over for citations by the most recent articles, even though it was very much on topic." Another commenter wrote that the main perpetrators of citation amnesia seem to be seasoned researchers, "'big guns' who apparently feel it is safe to appropriate the work of lesser workers because the journal editors will protect them. Only junior people ever get nailed."

Some commenters felt that American scientific authors tend not to recognize the contributions of their colleagues across the pond. "Especially in the US there is a trend to cite only papers of fellow citizens," one wrote. "American clinical researchers tend not to read, or at least not to cite publications in European journals," wrote another.

Other survey respondents pointed to less salacious causes of the problem, noting requests from editors to winnow down citation lists or difficulties slogging through databases.
"We searched hard for papers that were relevant to our novel finding and could not find anything," one respondent wrote. "We missed one that was apparently too new to be found in PubMed when we were writing our paper." Some element of the problem may be unavoidable, simply because of the sheer number of papers out there. "Nobody can cite all relevant papers all the time; there are simply too many of them," one commenter wrote. "I do cite a reasonable selection of relevant papers and I don't try to pass off other peoples' ideas as my own." At least one commenter railed against the importance of citing past work at all. "I think there is too much emphasis on history," the respondent wrote. "To cite the original paper can be a waste of time for the reader who wants a recent relevant summary rather than an 'honour' for the initial scientist." As for curing the problem of improper or missing citations, our survey respondents seemed evenly split between several possibilities suggested in the survey: raising awareness, random checks, removing editorial restrictions on citation numbers, signing a pledge, or including citations in online supplementary information. But several offered their own solutions. "Referees can and should make editors and authors aware of poor citations," a commenter wrote. "This should be cause to refuse acceptance." Another commenter proposed early training. "The practice of good citation etiquette should be taught in college, if not earlier," he or she wrote. "Most students want to cite a review article and call it a day." Still another, meanwhile, suggested this very idea to avoid inadvertent citation omissions: "When there are many relevant papers, I've tried to find a review article that cites them," he or she wrote. Part of the solution, though, may lie in changing how citations are formatted. "I suggest to include a list of references as PubMed IDs, so all citations of a paper can be downloaded (at least as citations and URLs to pdfs) in bulk," one respondent wrote. Bilder agreed that revising citation formatting could go a long way toward easing frustration surrounding the issue in the scientific community. "We could make it more informative and certainly more efficient," he said. Bilder said that by displaying only the minimum essential information -- author names, publication years, and numerical identifiers such as DOIs or PMIDs -- publishers could make room for more citations. "That would give you enough information to recognize what [authors] were talking about if you're familiar with the literature, and if not, to locate [the referenced work]." he said. Bilder also suggested borrowing a formatting trick from citations that appear in the legal literature -- in particular, the parts of those citations called "signals," which indicate why a particular work is being cited. Signals specify whether the citing author includes a particular citation as a comparison, a contrast, or an example of the point being made. Bilder said that scientific publications could adopt signals in their citations to make citing previous work more clear cut than appearing as a simple endorsement. "Then the different kinds of citations could be treated differently," he said. But University of Chicago sociologist linkurl:James Evans;http://sociology.uchicago.edu/people/faculty/evans.shtml noted that some degree of citation amnesia may not be such a bad thing, as it may be indicative of a healthy level of competition for funding and recognition in the field. 
"At some level, people fundamentally think that their work is important," Evans told __The Scientist__. "It needs to be that way." Evans, who studies the relationship between markets and science, said that scientists design research projects anticipating that their findings will form the hub of a larger network of subsequent research, and that this expectation makes for good science. "We want people to be gambling and to try to pick the project that they feel will be at the center of this network," he said. "My guess is that if you looked across scientific areas," he added, "in really crowded research areas lots of people are not going to get cited. And in the most crowded areas, people would feel the most neglected." However the life science and/or publishing communities choose to address problems with citation practice, researchers that publish their work should steel themselves for some disappointment down the road. "I have learned to have a thick skin," one survey respondent wrote.
**__Related stories:__**
*Citation Violations* [May 2009]: http://www.the-scientist.com/article/display/55627/
*Critics rip Cell paper* [25th November 2008]: http://www.the-scientist.com/blog/display/55240/
*Demand Citation Vigilance* [21st January 2002]: http://www.the-scientist.com/article/display/12829/
*The Ethics Of Citation: A Matter Of Science's Family Values* [9th June 1997]: http://www.the-scientist.com/article/display/17598/

Comments

John Quackenbush

Posts: 1

June 25, 2009

One of the problems we have faced is a limitation on the number of citations allowed for specific article types by nearly every journal. Authors are often forced to pick and choose lest they exceed their allotted quota for citations. I can point to a number of instances over the past year where papers I have submitted have been editorially rejected prior to peer review because I have included too many citations, forcing a pruning that some might interpret as either a deliberate omission or a case of "amnesia."
Mitchell Wachtel

Posts: 30

June 25, 2009

Anyone can use PubMed to arrive at over 200 relevant articles. Scientists should use review articles or textbook chapters as much as possible to prove assertions in the introduction or discussion, decreasing the burden upon the reader and limiting the number of references to twenty-five. Review articles and book chapters are increasingly cited more often than the original research; this does not make the original research less worthy. Researchers should not be judged by the number of citations their article produces.
anonymous poster

Posts: 1

June 25, 2009

Perhaps it might be valuable to borrow some rules that courts use. If an author intentionally omits a citation that disagrees or contradicts the author's paper, as determined by a neutral body, the author surrenders his/her tenure position at the university, and is put on a tenure track to compete with younger, more ethical researchers.
anonymous poster

Posts: 20

June 25, 2009

Science shouldn't be about getting your ego stroked daily. If you're upset because the other kids didn't include you, perhaps you should be in a different field. Science is about doing something you love for the purpose of increasing human knowledge. If I see that my work has done that, whether cited or not, I'm happy.

Perhaps what we should do is stop putting names on papers. Then we would see who is in this for the glory.
Jan W. Schoones

Posts: 1

June 25, 2009

A big part of the solution is easy, but apparently missed by all: ask your academic librarian for help. She/he will get the best search results, certainly when a search is performed where both brains work together.

1. Schoones JW. Selective publication of antidepressant trials. N Engl J Med. 2008 May 15;358(20):2181.
anonymous poster

Posts: 2

June 25, 2009

I would argue that some of the problem may stem from whether or not a given institution has access to a given journal. Who wants to reference a research article that costs $35 or more to view? I would also argue that we should vote with our wallets and refuse to reference articles that cost so much to view. (Hence the argument for immediate, public dissemination of all publicly funded research.)
Ellen Hunt

Posts: 199

June 25, 2009

One of the obvious cures is to have the citation-counting engines process reviews differently from original research. All that needs to happen is to pass citations through chains of reviews down to the original research paper. Since these factors matter, we need to make them work properly. Right now, citations are absurdly weighted toward the publishers of nice reviews. I am not saying review articles aren't worthwhile; they most definitely are. But we should modify citation counts so that a reference to a review automatically counts as a reference to the citations in the review. And if a citation in a review is itself a review, then the original research papers it cites should be counted in turn.

Aside from that, I fail to see why any journal in today's world should care about the number of citations. But if they must care, then what they can do is have the author produce a subset list for print publication. The full citation list can be provided for online publication.

For god's sake, people! Virtually unlimited space is the whole point of online journals! There is absolutely no reason why an online journal should not allow an unlimited number of citations.
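To make the pass-through idea concrete, here is a minimal sketch, in Python, of how such a counting rule might work on a toy citation graph. It reflects our own reading of the commenter's proposal, and every paper name in it is invented for illustration.

```python
# A minimal sketch, under our own assumptions about the commenter's proposal:
# when a paper cites a review, credit also passes through to everything the
# review cites, recursively through reviews of reviews. All names are made up.
from collections import defaultdict

# Toy citation graph: paper -> list of works it cites directly.
CITES = {
    "new_paper": ["review1"],
    "review1": ["orig_a", "orig_b", "review2"],
    "review2": ["orig_c"],
    "orig_a": [], "orig_b": [], "orig_c": [],
}
REVIEWS = {"review1", "review2"}  # which entries are review articles

def pass_through_counts(cites, reviews):
    """Count citations, passing credit from cited reviews to the originals."""
    counts = defaultdict(int)
    for citing, cited_list in cites.items():
        stack, seen = list(cited_list), set()
        while stack:
            cited = stack.pop()
            if cited in seen:          # count each work once per citing paper
                continue
            seen.add(cited)
            counts[cited] += 1
            if cited in reviews:       # pass the credit through the review
                stack.extend(cites.get(cited, []))
    return dict(counts)

if __name__ == "__main__":
    print(pass_through_counts(CITES, REVIEWS))
    # Under this rule orig_c is credited by new_paper, review1, and review2.
```

Whether the review itself keeps the credit, as it does in this sketch, is exactly the kind of weighting decision that would have to be made deliberately.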
anonymous poster

Posts: 5

June 25, 2009

Perhaps closer and more rigorous review of the cited literature is actually a responsibility of peer reviewers and editors? It would, for example, be relatively easy for editors to conduct an objective (third-party) bibliographic search on the topic at hand...? Such lists could be easily appended to articles accepted for publication... ("Further reading"? Natural History used to do that for its popular articles?)

An author could annotate the list with "signals" as mentioned in the legal profession -- and augment it where omissions occur?

I hear the groans -- but such efforts could be highly automated in the digital realm and could be rapidly appended to articles...

But more fundamentally, there is a basic question about why citations (or footnotes more generally) are ever included. This issue is made very concrete in considering the issue of citation stacking as data are combined and recombined in meta-analyses.

It's obvious that for many scientists, citations -- as has also been the case with personal requests for "reprints" -- are part of the way that lineages and/or communities of research are construed? One might dare to say that they are grooming behavior?
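As one illustration of the "signals" idea raised in this comment and by Bilder above, here is a rough sketch of how a machine-readable reference entry might carry such an annotation. The structure, field names, signal categories, and PMIDs are entirely hypothetical.

```python
# A purely hypothetical sketch (not any journal's format) of a reference
# entry that records why a work is cited, in the spirit of legal "signals".
from dataclasses import dataclass
from enum import Enum

class Signal(Enum):
    SUPPORTS = "cited as support for the claim"
    CONTRASTS = "cited as a contrasting result or view"
    EXAMPLE = "cited as an example of the point being made"
    BACKGROUND = "cited for general background / further reading"

@dataclass
class Reference:
    pmid: str        # PubMed ID (or DOI) identifying the cited work
    signal: Signal   # why the work is being cited
    note: str = ""   # optional free-text annotation by the author

refs = [
    Reference("00000000", Signal.CONTRASTS, "reports the opposite effect"),
    Reference("11111111", Signal.BACKGROUND),
]
for r in refs:
    print(f"PMID {r.pmid}: {r.signal.value}" + (f" ({r.note})" if r.note else ""))
```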
David Weinberg

Posts: 3

June 26, 2009

....of other underlying topics.

There have been many excellent points made in the previous comments and it seems that they touch on multiple issues, including, but not limited to: citations to inform (on previous observations), citations to acknowledge (other contributors), and citations to promote (one's career).

In turn, I think this reflects the reality that publication itself serves multiple purposes (to inform, to drive discussion, to promote one's career).

And in still another layer on top of the ones cited above is the reality that not all papers are equally significant to the field, although one could generate a heated debate on what qualifies as significant, for as Newton said, "If I have seen further than others, it is by standing upon the shoulders of giants."

If there were limitless jobs and research funds for everyone, and if the body of literature was small enough, then a lot of the emotion surrounding this topic would probably go away, but then we would probably all just debate whether lesser-quality science was being supported and whether it should be published, and who should judge, and... you get the idea.

I would be interested to see a survey of The Scientist readers designed to show what roles of citations are considered most important to them. In the meantime, I think the topic generates a lot of healthy debate and sparks useful creativity.
Steve Simon

Posts: 5

June 26, 2009

While the commentary is interesting, I feel obligated to mention an obvious but unstated reason why scientists might feel that "their own work had been regularly or frequently ignored in the citations list of subsequent publications."

Perhaps scientists, who are human after all, have an inflated perspective on the value and importance of their own work. Could it be that their work is not being cited because it isn't really worth citing?

If there is a problem with citations, asking about it in such a subjective way will not uncover it. Surely there must be an objective way to measure this.

Steve Simon, P.Mean Consulting
Albert Henderson

Posts: 1

June 27, 2009

I saw little in the article or comments about the unwitting duplication of research or related subjects of methodology and design. Our focus should be on the quality of science.

When editors and referees are reduced to bureaucratic lows, putting a quota on the number of citations, the message is they are unable to evaluate an author's understanding of the literature and to ferret out blind spots (as opposed to bias) in his/her work.

Time and resources cannot be saved by a database search. A mechanical bibliography can be valuable, but it cannot replace the intelligent yield of an author's review. Scientists need not cite back to Darwin, Linnaeus, Copernicus, Accademia del Cimento, etc. But they must demonstrate sufficient reading and thinking in order to qualify for research and publication.

Yes, and the same goes for editors.
Abel Schejter

Posts: 1

June 29, 2009

Prof. David Keilin, FRS, discovered the cytochromes in 1924. Then he was told by a colleague that his finding reminded him of an old paper by some McMunn from Birmingham. He found the paper, published in 1884 in the Transactions of the Philosophical Society, and learned that McMunn had died quite some time ago. He took a train to Birmingham and went to visit McMunn's widow, where he learned the whole story: how her husband's discovery was pooh-poohed by the famous German physiologist Hoppe-Seyler, and thereby forgotten. All this is told in detail in Keilin's autobiographic book "The History of Cell Respiration and Cytochrome" and in a short paper by Margoliash and Schejter in Trends in Biochem. Sci. 9, 64-67. Abel Schejter
VETURY SITARAMAM

Posts: 69

June 29, 2009

The most important issue in citation malpractices is the wrong science getting reinforced. This has not been instantiated by anyone. When journals are run like clubs, as they no doubt are, the most important thing to do is not to quote and to claim falsely, since citations are unfortunately linked to careers in many places.

What is appalling about this is that journals run ethics and integrity groups whose purpose seems to be only to divert attention from real misdemeanors. My recent encounter with COPE justifies this view with regard to a low-quality review published in Annals of Botany with actually proven, wrong science and the intransigent attitude of its editor. For details see http://blogs.nature.com/news/thegreatbeyond/2009/03/indian_researcher_charges_jour.html

The COPE to which I complained has now replied. It completely sidestepped the matters of wrong science, as did the editor (wrong science is accepted with amazing grace and comfort), whose interest was apparently to protect the handling editor. COPE took great pains to ensure that Annals of Botany need not even invite a rebuttal, subject to peer review of course. It summarized that, "In the present instance your grievance appears to be about your belief that this journal has engaged in institutionalized discrimination against third world researchers (sic). Clearly you are dissatisfied with the review (of Atkins and Macherel)." The wording was a clear retake of the Yes Minister series: Sir Humphrey Appleby talking about how organizations primarily cater to their relevant constituencies -- the British editors, as in this case. "In this particular instance it is our view that the matter is essentially an editorial dispute and not one which COPE is empowered to consider, even if all the formal opportunities for complaint have been used." COPE has clearly stated on its website that its member journals should adhere to editorial norms. Writing to me that, "However, we recognize and accept that the journal is not under any obligation to publish it: although COPE encourages debate and the correction of the scientific record, we also respect an editor's right to choose what to publish" squarely brings us to an open admission that editors are outside the purview of any public scrutiny. Should we presume that existing structures have neither the teeth nor the inclination to cater to what matters most, correct science and propriety? They serve their own groups, making a mockery of any scrutiny. I must admire the way COPE has given reprieve to Annals of Botany, so efficiently removing any possibility of any scrutiny, now or ever.

Once citation became a marketable commodity thanks to ISI, the impact of market forces on research became tremendous. The current discussions do not reflect either incisive thinking or even naivety. They actually reflect the gullibility of the scientific world in reposing faith where it is most risky: the publishing world. It is a pity that scientists, or G8 nations for that matter, some of which find democracies in the developing world an inconvenience, have not learnt the basic lesson that common people have learnt through millennia of hegemony -- that I can protect my freedom only by helping protect yours. This is the heart of the matter in proper citation practices. The problem is not in the editorial interests, whose priorities differ.
anonymous poster

Posts: 85

July 21, 2009

All too often, I think, relevant citations are not cited because the author(s) are simply unaware of them. The scientific culture has to a large degree shifted from research based on what's already known to research in vacuo. Too many wheels are being reinvented because the reinventers fail to do their subject-area research before they embark on their lab research (and sufficient numbers of their peers are generally equally ignorant), and/or too many false leads are pursued, sometimes in a high-profile manner, because the pursuers (and again their peers!) are unaware of highly relevant prior work. Sometimes these "false leads" get published! Arrgghh!

Yes, some of this is due to the huge volume of published literature out there; and yes, some of this is due to the difficulty of learning about work published prior to 1985 (prior to the "electronic era"). But -- and this is a profound "but" -- some of this is simply and unfortunately a new scientific culture in the US that has developed over the past two or three decades, one that simply fails to respect or care about what has gone before historically.

I think it's not only sad and intellectually shallow, but potentially catastrophic for US science.
richard burton

Posts: 2

June 21, 2010

Here is a point that I think has not been made.
Long ago I did a little survey of literature familiar to me ("Biologists mis-cite," Nature 1980;286:438), quantifying a variety of different kinds of citation error -- with horrifying results.
Particularly relevant here was the finding that many references to my papers were pointless, or else less relevant than omitted references to other papers of mine.
