Rumblings over Science retractions

March 26, 2008

The conversation is not over regarding two recent retractions of papers on enzyme engineering. Two letters linkurl:published;http://www.sciencemag.org/cgi/eletters/319/5863/569b this month in Science say that the explanation for the retraction issued by linkurl:Homme Hellinga's;http://www.biochem.duke.edu/faculty/homme-hellinga group at Duke University does not account for many of the errors in the original publications. The linkurl:Grantsmanship blog;http://writedit.wordpress.com/2008/02/01/science-retraction-community-at-work/ has also collected long-running commentary on the circumstances surrounding the retractions.

On February 1, Hellinga's group issued a statement of retraction for their 2004 Science paper on redesigning ribose-binding protein (RBP) to carry out triose phosphate isomerase (TIM) activity -- a reaction crucial to glycolysis in almost all types of cells. They explained that linkurl:John Richard,;http://www.chem.buffalo.edu/richard.php in the department of biochemistry at the State University of New York, Buffalo, had found, during his own experiments with the redesigned enzyme, that its apparent TIM activity was due to wild-type contamination. A month later the group retracted a second paper on the same redesigned enzyme from the Journal of Molecular Biology.

But the two letters, one written by Richard himself, published electronically on March 10, both noted that key measurements of enzymatic activity were wrong even if there had been wild-type contamination. In particular, the Michaelis constant (Km) -- the substrate concentration at which an enzyme reaches half its maximal rate -- reported in the papers was an order of magnitude lower than that of the wild-type enzyme. Had wild-type contamination truly been the source of the activity, the measured Km values would have matched the wild-type values. "There's no way you can account for that, no explanation," linkurl:Jack Kirsch,;http://mcb.berkeley.edu/labs/kirsch/kirsch.html professor at the University of California, Berkeley, and author of one of the letters, told The Scientist.

The 2004 Science paper "was a very important paper even though it was wrong in some sense," linkurl:David Baker,;http://depts.washington.edu/bakerpg/ from the University of Washington, told The Scientist. In 2006, Baker wrote a linkurl:feature;http://www.the-scientist.com/2006/7/1/26/1/ for The Scientist in which he cited Hellinga's groundbreaking work. Baker's group recently published papers in Science and Nature on enzyme design, citing Hellinga's paper in both. "I felt strange removing the reference because I felt the citation should be in there." He added that Hellinga's papers had formed the groundwork for much of what his group did.

Richard told The Scientist that he discovered the enzyme's inactivity during a routine purification. Given the notoriety that Hellinga has gained from this work, he added, people want to know how he made such a mistake. The Science paper has been cited more than 160 times. In addition, Hellinga's group had reported that their redesigned enzyme could restore the activity of a TIM-knockout strain of E. coli. If the enzyme showed no real activity in vitro, Kirsch asked, how could it have restored the knockout? Shortly after the original paper appeared in Science in 2004, Hellinga gave a seminar at Berkeley to present his new findings. Kirsch said he raised the issue with the Km and asked to see Hellinga's data but never received it. Hellinga did not respond to several calls for comment.
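For readers who don't work with enzyme kinetics, the letters' argument can be illustrated with a short simulation. This is a minimal sketch assuming standard Michaelis-Menten behavior; the kinetic values below are hypothetical and are not taken from the retracted papers. The point is that if the only active species in a preparation is a trace of contaminating wild-type enzyme, a fit to the measured rates recovers the wild-type Km unchanged; contamination scales down only the apparent Vmax, so it cannot explain a Km an order of magnitude below wild type.

# Illustrative sketch (hypothetical numbers, not from the retracted papers):
# if a trace of wild-type enzyme is the only active species, the fitted
# Km equals the wild-type Km; only the apparent Vmax is scaled down.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, vmax, km):
    # Initial rate v = Vmax * [S] / (Km + [S])
    return vmax * s / (km + s)

KM_WT = 1.0            # mM, assumed wild-type Km
KCAT_WT = 1000.0       # 1/s, assumed wild-type turnover number
ENZYME_TOTAL = 1e-6    # M, total protein in the assay
CONTAMINATION = 0.001  # fraction of the prep that is wild-type enzyme

substrate = np.linspace(0.05, 10.0, 25)               # mM
vmax_obs = KCAT_WT * ENZYME_TOTAL * CONTAMINATION     # only the contaminant turns over
rates = michaelis_menten(substrate, vmax_obs, KM_WT)  # simulated initial rates

# Fit the data as if the whole preparation were the "designed" enzyme.
(vmax_fit, km_fit), _ = curve_fit(michaelis_menten, substrate, rates,
                                  p0=(rates.max(), 1.0))

print(f"fitted Km   = {km_fit:.2f} mM (wild-type Km = {KM_WT:.2f} mM)")
print(f"fitted Vmax = {vmax_fit:.2e} M/s (reduced by the contamination fraction)")

Running the sketch returns a fitted Km equal to the assumed wild-type value, which is Kirsch's point: a tenfold-lower measured Km cannot be blamed on contamination alone.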
In a February 13 linkurl:article,;http://www.nature.com/news/2008/080213/full/news.2008.569.html Nature reported that Hellinga told the journal that his lab had been using the wrong purification method during the experiments reported in Science in 2004. "A mistake was made, and nobody caught it -- including myself," Hellinga told Nature. "We were concerned it might not have been innocent, but it was." Duke University began a misconduct investigation of the paper's first author, Mary Dwyer, in September 2007 but cleared her on February 4. Steve Mayo, a prominent protein designer at Caltech, declined to comment for this article. Several other researchers who have cited Hellinga's papers did not return calls for comment.

Comments

bjoern brembs

Posts: 9

March 27, 2008

Whatever the reason for any particular retraction, there is no doubt that we will see more and more retractions in the future.

With decreased funding and the number of postdocs looking for positions on the rise, the pressure to publish heavily in high-profile journals is increasing. Just as in sports, as the stakes rise, so will the cheating.

Maybe we will soon need an international agency conducting unannounced lab visits?
anonymous poster

Posts: 8

March 27, 2008

The recent series of retractions in Nature, and now Science, which appear to result from likely fabrications, is troubling. This trend might result from the "impact factor effect," whereby publishing in "high-impact" journals is now believed to be so important that some scientists may be willing to go to unethical lengths to generate the high-tech, already-anticipated-results articles those journals appear to favor. Fortunately, the meaningfulness of a journal's impact factor will diminish as articles from a wider range of journals become available through PubMed Central.
Gordon Couger

Posts: 23

March 27, 2008

We need better postmortems on publishing errors. In the case of honest errors, the people who made them should explain how they made them and how they would do things differently to avoid them, so the community can learn from their mistakes as much as from their successes. Often there is much more to learn from errors than from incremental progress.

In the case of fraud, someone else should do the postmortem so others see the price paid for it. In the shades of gray in between, we all need to learn how to avoid those areas, and "no comment" isn't the way to do it.

Gordon Couger
Stillwater, OK
anonymous poster

Posts: 34

March 27, 2008

When you have a good idea, you make the data fit it. Simple as that. In this case, it has been proved by other people after the publication, so the idea is right; maybe some of the data are wrong, but as the first to publish it in Science and Nature, it is worthy. People, pay attention to groundbreaking ideas with faith, not evidence. It is ideas that help people, not individual pieces of data. Look at the recent retractions and you will see that.
anonymous poster

Posts: 2

March 27, 2008

The writer of the post below, "hypothesis-driven research," seems to have a fuzzy understanding of the scientific method. While Hellinga's "idea" to computationally design enzymes has proven (in David Baker's hands) to be tractable, Hellinga cannot claim to have invented the concept. Moreover, he strikingly failed to implement it. Hellinga's systematic and disingenuous botching of routine experimental biochemistry has misled his field and his colleagues. If Hellinga is indeed a visionary (which I dispute), then he is one whose ignorance of his own field has managed to obscure whatever "ideas" so inspired him. He certainly had much to gain from making his "idea" work.

Competent, honest scientists do not rearrange their data to fit a pet hypothesis, however valid that hypothesis might be. While not outright fabrication, such behavior is equally, if not more insidiously, fraudulent. The more I read of the analytical commentary regarding Hellinga's papers (especially the mutagenesis "data"), the more worried I become that such fraud is what happened here.
anonymous poster

Posts: 17

March 27, 2008

If US scientists were less polite (to each other), I would expect a lot more high-profile papers to be retracted.

They are too polite, or they are exercising "I don't touch yours, so you don't touch mine."

There was a paper in Science in 2000, and it took seven years for another group to say that what the authors had interpreted was wrong; yet the Science paper was not retracted, and the finding that proved the Science paper wrong was published in PLoS Biology.

The sad fact is that the groups that uncovered this are from the EU.

My question is: does that mean no US scientist saw the problem over those seven years? Or did they see it but fail to bring the paper into debate because of the policy of "I don't touch yours and you don't touch mine"?

When low-profile wrong papers don't catch the community's attention, we understand why [no one may try to replicate or follow the hypothesis], but when a high-profile paper doesn't catch our attention, we should ask why!

We should adopt a system in which, if we know someone is doing wrong or trying to mislead others and we fail to report it to the justice system [in science], we are punished as well.

Hypothesis-driven research is fine, and research must go that way, but being too oriented toward a predefined hypothesis, without staying open to the other options the data provide, often leads to misconduct.

Postdocs are pressure cookers; believe me, sometimes they are pressured to cook [data] in a way that fits the PI's hypothesis.

Why is it that the misconduct cases that surfaced in South Korea were blamed on the PIs, while almost everything we have seen here in the States is mostly blamed on "postdocs"?

Again, this may be the "polite thing between established scientists." Unless we find a way to hold PIs accountable, these cases will keep increasing.
anonymous poster

Posts: 1

March 29, 2008

The previous post implies that there is a bias among US scientists to cover up each other's errors and/or falsifications.

The author of that post gives no evidence of this bias, except a vague and unintelligible reference to some unknown finding by US scientists that was overturned by EU scientists.

I personally have found no evidence of US researchers covering up for one another. Might I remind this anonymous poster that John Richard (an American scientist) discovered and corrected an egregious error of Homme Hellinga (a scientist at a US institution, though trained in Europe). Might I also remind the anonymous poster that it was an American scientist (Gerald Schatten) who was one of the first to allege that the data were fabricated in the infamous Hwang Woo-Suk stem cell case.

If you have evidence of a real cultural bias, state it clearly and cite real evidence. Otherwise, please stop these childish assertions that US scientists are somehow less ethical than their EU counterparts.
anonymous poster

Posts: 17

April 7, 2008

I apologize if my comment implied that EU scientists are more ethical than their US counterparts. I cannot make such a statement, since I know nothing about EU scientists. However, I did mean that it was EU scientists who overturned the finding [by US scientists].

As for the Hwang case, as far as I understand it, it was a former fellow of Hwang's lab who brought the problem to public attention [through the MBC channel], which then led to the retraction request. I believe The Scientist has a good series of articles on this issue.
anonymous poster

Posts: 9

September 11, 2008

I have a fresh example: I have discovered fraud in a Science paper from 2001 (I have the real McCoy, but they faked it). But that won't get me into Science (more than six months after the original paper); it will instead go to some obscure small journal that my tenure committee will sneeze at, and they will hold it against me that I put ATP molecules into that project instead of some "high-impact" work.

OK, that is my problem. But when it is published in this obscure journal, people will still refer to the Science paper (not all of it is wrong; most of the data are OK, so I don't think there will be a retraction, especially not after such a long time).

So the fraud will not be weeded out but will stay there forever.

January 14, 2010

Fraud in scientific research and publication is increasing. Misleading the public and misusing public funds should have consequences for both individuals and institutions. Criminalization of scientific fraud is long overdue. Mandatory institutional training on the ethical conduct of research is necessary but not sufficient. In the end, the crucial element is accountability. Fraud and misconduct in research conducted with public funds should be detected by the ORI and referred to the justice system for criminal and civil prosecution. Both the researchers AND their institutions should be penalized and required to make restitution of public funds. As one measure of a research institution's success is the number of dollars it receives in grants, institutions have a vested interest in denying wrongdoing and protecting researchers. Only when it costs the institutions money will they have an economic incentive to establish effective measures to prevent and detect fraud. Random audits by the funding agencies and/or by the Office of Research Integrity could be an effective tool in detecting fraud and could act as a deterrent.
