Misconduct on the Rise

Retractions of scientific studies due to plagiarism, falsification, and other instances of researchers behaving badly have skyrocketed in the past decade.

By Bob Grant | May 21, 2012


Research misconduct and the fallout from such behavior are increasingly common, according to a new report compiled by a company that makes software to detect plagiarism in submitted scientific manuscripts. The makers of iThenticate—software that combs a database, called CrossCheck, with more than 25 million published articles—published the report, which collates previously published research on misconduct and plagiarism, and sprinkles in a few iThenticate customer testimonials.

A couple of years ago, iThenticate helped determine that plagiarism was a far more common occurrence in the scientific literature than anyone expected, and the new report confirms that finding with some standout figures: retractions have increased tenfold over the past decade, 1 in 3 scientists admits to questionable research practices, and $110 million was spent on misconduct investigations in the United States in 2010.

But beyond the regurgitated factoids, iThenticate's own data is a striking illustration of how common plagiarism may be in the scientific community. The report claims that iThenticate identified more than 10 million content "matches" to already-published work in manuscripts submitted in 2011 and 2012. The folks at iThenticate worked up a little infographic containing most of the information.

Jordan Berg | May 21, 2012

I was recently an Associate Editor for two conferences that require an iThenticate check on all submissions. At first glance the numbers are eye-popping: all manuscripts (out of 20 I handled) showed at least 15% duplication of previously published material, several were above 30%, and one was at 60%. But it turned out that iThenticate (at least as configured for those conferences) flags the entire bibliography as potentially plagiarized, so scores of 15% to 20% meant essentially zero duplicate content in the body of the paper. In the end none of these papers were considered cases of misconduct. Even the 60% was more laziness than anything else. The authors had worked from an earlier paper and left most of the boilerplate text intact. They did, however, have new data and new results to present, and they did not represent any previously published results as being new. Not a great effort on their part, but not scandalous either.

I liked having the reports available, but just citing the raw numbers is very misleading. 

hlarmus | May 21, 2012

A comment on Jordan Berg's comment:
I wouldn't call it laziness. If one has a series of studies using essentially the same equipment and similar procedures, not changing those sections in subsequent articles seems appropriate and even helpful to the reader.

Ken Pimple | May 21, 2012

I've made a quick-and-dirty summary of several reports on research misconduct, available at http://mypage.iu.edu/~pimple/b.... To my eye, the results tell us that we don't actually have any idea how much misconduct there is.

If I'm wrong, I really hope someone will tell me. I'd rather have good numbers than a huge range of estimates. I'd be delighted to learn that I've misinterpreted these studies.


Bill | May 21, 2012

I'm much less worried about plagiarism than outright fabrication.

Derek Kane | May 21, 2012

These seem to be very strong accusations by a company that has a vested interest in having us believe there is a problem.

A quick review of citation [5] (Fanelli) suggests that this infographic chose the most sensational way of presenting the information in the study. Citation [6] shows a level of misconduct much lower than 33%. In both studies the large percentages are the portion of scientists who believe they observed questionable research practices among colleagues.

I don't see the value of publishing a summary of research by a company that benefits from a particular interpretation of that research. You could equally well link to the original research cited in the graphic and decide independently whether scientific misconduct is indeed on the rise and whether this company's approach to dealing with misconduct will be effective.

IkeRoberts | May 21, 2012

Citing raw numbers in a way that misleads is the same as cooking the data to reach conclusions that are not supported by the study. That practice is basic misconduct. I find it ironic that a company purporting to fight scientific misconduct engages in that very practice!

EllenHunt | June 22, 2012

And I also. Self-plagiarism I really don't care about. They did the work. It's typically introductions, etc. I see those in large part as a side effect of the pressure to extract the maximum number of publications from one research project. It's also a consequence of the page limits that grew up in the era of paper publication, when each page cost the publishers a significant amount of money.

I am in favor of going back to the way things were 100 years ago, when you could submit and publish a 100- or 200-page paper and it was simply evaluated on its merits. Lots of things go together that way these days.
My 20 cents.
