
Opinion: Ethics Training in Science

The NIH has required researchers to receive instruction about responsible conduct for more than 20 years, but misconduct is still on the rise.

By James Hicks | May 14, 2013

WIKIMEDIA, SCHAAR HELMUT

In the late 1980s, on the heels of several high-profile scandals involving misconduct in scientific research, US policy makers began instituting formal instruction in the responsible conduct of research (RCR) and research ethics. “Lack of formal discussion about responsible research practice and the ethics of research is a serious flaw in the professional training of young scientists and clinicians,” the Institute of Medicine of the National Academies stated in 1989. The assumption was that formal RCR training would reduce the incidence of fabrication, falsification, and plagiarism in research. Over the next decade, training programs evolved slowly, and by the turn of the century, RCR instruction for graduate students and postdoctoral fellows was firmly established in policies from the National Institutes of Health (NIH) and the National Science Foundation (NSF). Today, RCR training constitutes its own industry, from experts who provide paid consulting on the development of RCR curricula and programs to companies that specialize in online training modules.

But now, 20-plus years later, it is only fair to ask: Does it work? The simple answer is, “No.” Today, 1 in 3 scientists responding anonymously to surveys admits to “questionable” research practices; research misconduct cases handled by the US Office of Research Integrity (ORI) are at an all-time high; and retractions of scientific papers have increased exponentially since 2005. Not all retracted papers involve foul play, but a recent study reported in PNAS, which surveyed 2,047 retracted biomedical and life-science papers, found that 67 percent of retractions were directly attributable to misconduct. The rapidly rising incidence of retractions and misconduct erodes public trust and carries a heavy financial cost. In 2010, researchers estimated that a single misconduct investigation costs an institution $525,000. If all allegations of misconduct reported to ORI in 2011 were investigated, the costs would exceed $125 million.
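As a rough check on those figures (the allegation count below is inferred from the two numbers above, not reported in this article), the $125 million estimate implies on the order of 240 allegations reported to ORI in 2011:

\[
\frac{\$125{,}000{,}000}{\$525{,}000 \ \text{per investigation}} \approx 238 \ \text{allegations}
\]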

So why are we witnessing this alarming trend in retractions and research misconduct? It is unlikely that researchers are unaware that falsification, fabrication, and plagiarism are unethical practices. It is, however, quite possible that ubiquitous RCR programs are themselves contributing to the trend, largely by increasing how often misconduct is detected and reported. Over the past 20 years, technicians, graduate students, postdoctoral fellows, and faculty have become fluent in the language of misconduct. Today, most universities have their own Office of Research Integrity, a full-time Research Integrity Officer (RIO), and mechanisms that encourage anonymous reporting. Consequently, individuals may feel increasingly more comfortable about coming forward to report suspected research misconduct. In addition, plagiarism is being discovered at higher rates thanks to sophisticated detection software; for example, 80 percent of the findings of research misconduct issued by the NSF over the past 10 years involve plagiarism.

As a working scientist and RIO, however, I believe that better identification of questionable research practices is not the only driver of these retraction and misconduct trends; additional factors are at play. First, and most importantly, competition for NIH funding has increased dramatically. From 1997 to 2003, success rates for NIH funding hovered around 30 percent, but by 2011 the overall success rate for research project grants had fallen to 18 percent, and the rate for new investigators stood at just 15 percent. A highly competitive environment no doubt breeds unethical behavior.

[Figure: Retraction data from Nature (2011); NIH success rates from ScienceInsider]

To make matters worse, the assessment system in the biomedical and life sciences is heavily skewed toward quantitative metrics. Individual and institutional research success is judged by the number of graduate students and postdoctoral fellows trained, the number of funded grants, the number of papers published in high-impact journals, and other citation metrics. Such a system puts the focus on hitting these specific measures rather than on simply doing high-quality research. Furthermore, the administrative and regulatory burden on research faculty continues to grow: The Chronicle of Higher Education reports that scientists spend 42 percent of their time on administrative tasks. Despite this time sink, they are still expected to maintain the same high research output, magnifying the pressure to do more research in less time. Administrative duties also pull researchers’ attention away from the bench, allowing errors to creep in. Finally, a majority of biomedical and life-science researchers depend on “soft” money, such as grant funding, to support themselves, creating intense economic incentives for scientists to produce high-quality data and high-impact publications.

These manifold pressures have created a perfect storm: a hypercompetitive and time-stressed research environment. Under these conditions, some fraction of investigators may increasingly rationalize experimental “sloppiness” and questionable practices, or commit outright misconduct, in order to succeed. Plotting retractions of published papers as a function of NIH success rate suggests we have crossed a threshold: when only around 20 percent of grants are funded, the number of retractions skyrockets. (See figure.) The notion that the competitive research environment contributes to the trends in retractions and misconduct is not novel, but it is important to fully appreciate the impact of the current environment, because it suggests that RCR training by itself cannot be effective.
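For readers who want to sketch this kind of comparison themselves, a minimal example follows. It assumes Python with matplotlib, and every numeric value in it is an illustrative placeholder, not the retraction or success-rate data behind the figure above.

import matplotlib.pyplot as plt

# Illustrative placeholder values only; not the data plotted in the article's figure.
years = [2001, 2003, 2005, 2007, 2009, 2011]
success_rate = [32, 30, 25, 22, 20, 18]      # overall NIH success rate, percent (placeholder)
retractions = [40, 60, 100, 180, 280, 400]   # retracted papers per year (placeholder)

fig, ax1 = plt.subplots()
ax1.plot(years, success_rate, marker="o", color="tab:blue")
ax1.set_xlabel("Year")
ax1.set_ylabel("NIH success rate (%)", color="tab:blue")

# A second y-axis lets retraction counts share the same time axis.
ax2 = ax1.twinx()
ax2.plot(years, retractions, marker="s", color="tab:red")
ax2.set_ylabel("Retractions", color="tab:red")

ax1.set_title("Retractions vs. NIH success rate (illustrative data)")
fig.tight_layout()
plt.show()

On such a chart, the stretch where success rates slip toward 20 percent is where the retraction curve turns sharply upward, which is the threshold effect described above.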

Mitigating the storm

So what are the solutions? First, the ever-increasing regulatory and administrative burden on researchers must be reduced. To this end, the National Science Board recently created a task force to address these issues, but universities must also provide resources and adopt strategies, such as electronic database systems, dynamic forms, and protocol-tracking software, to help researchers meet regulatory requirements. Second, mentors need to be actively engaged in RCR training. Some in the bioethics community believe that with the advent of formal RCR programs, many researchers have relinquished their mentoring responsibilities, but science remains an apprenticeship, and mentors retain considerable influence. To help move RCR training back into the lab, the University of California, San Diego, has incorporated a “Train the Trainer” component into its RCR approach. According to its website, the program “facilitates workshops and expands the ability of faculty, at the departmental level, to mentor their trainees directly, using research ethics examples relevant to their specific discipline.”

Finally, and most importantly, we need more money. An increase in the NIH budget would not only ease the hypercompetitive environment; it is also necessary for continued improvement in the health and wellness of the country. This is the “century of biology,” and there is enormous potential for dramatic breakthroughs and innovations in the biomedical and life sciences. Improving the NIH funding environment is simply the right thing to do.

James Hicks is a comparative and evolutionary physiologist in the Department of Ecology and Evolutionary Biology at UC Irvine. His research focuses on the evolution of the cardiopulmonary system. He is also Associate Vice Chancellor for Research and the campus RIO.


Comments

Neurona | Posts: 33 | May 14, 2013

I have been in research science since 1980. It has become so cutthroat and so based on the number of publications that I'm surprised there aren't MORE ethics lapses. I was told by a former department head that he didn't care if I published crap or double-published papers; it was all about numbers, not quality, to him. And he specifically linked that to my job security. We have a system that predisposes its denizens to fraud. Until that changes, it won't get any better.

SaG | Posts: 1 | May 14, 2013

How about making the jobs of researchers less dependent on NIH funding by having their employers (universities) not hire them into soft-money positions in the first place?

When a med school requires you to cover 60 percent or more of your salary with grants, you have to wonder why you were hired in the first place. Certainly not for your teaching or service expertise.

This, not the lack of NIH dollars, is the root problem. When a scientist only has to worry about supporting his or her summer salary, the level of stress drops dramatically.

Paul Stein | Posts: 121 | May 14, 2013

Neurona and SaG are absolutely correct. The entire system of conducting biomedical science, from obtaining funding to publishing to obtaining jobs, is totally broken and unsustainable. Unfortunately, practically every academic institution conducts its affairs in an identical fashion. If you think yours is any different, think again. Until there is a total reworking of the way science is conducted, especially at the individual institutional level, misconduct will only continue at the same level as in the graph above.

Brian Hanley | Posts: 24 | May 14, 2013

The training is horse manure. This "training" is exactly the same as the training against rape and sexual assault in the military. It is nothing but a check box and should be completely eliminated. It makes no difference at all. It's not even a joke.

The victims of it, grad student and postdoc scientists who are good and honest, are being raped by these operators. And universities protect the perpetrators unless they can't get away with doing that. (Yes, Virginia, former JAG officers are perfect for running academic misconduct and whistleblower investigations - they know what is expected.)

Just like the military, the institution has responsibility for investigating the wrongdoing of its own.

Only one thing will make a change: the DOJ must get involved. The responsibility for taking reports and investigating must be taken away from grant-receiving institutions, period. That responsibility must be turned over to, and rulings on cases must be made by, a transparent outside agent or board. Those persons cannot be part of the revolving door of academic administrators either.

Nothing else will make a difference. These people are defrauding the taxpayer out of billions in grants and wrecking the lives of scientists entering the profession. The only way to stop it is with force.

Ken Pimple | Posts: 22 | May 14, 2013

Neurona, SaG, Paul Stein, and BPH are all right (except maybe BPH's idea of having the Department of Justice investigate misconduct - but I don't know).

We should align our incentive system with what we want to happen. Currently, the incentives are to get grants and publish papers. The incentive to do good research and expand the scientific knowledge base is secondary at best. If our first priority (as shown by the incentive system) were honesty and good science, there would be much less misconduct and many fewer retractions.

All of this said, I hate it when people blame RCR instruction for misconduct. I believe that most RCR training is probably useless, but some is very good. I also think that the NIH and NSF mandates are probably counterproductive. But why would anyone think that government mandates to teach RCR would reduce misconduct? Asking whether RCR instruction "works" and concluding that it doesn't because there's still plenty of misconduct out there begs the question of what it means "to work" in this context. There are many ways that RCR instruction could work (and be worthwhile), but none of them would be measured by a nationwide drop in research misconduct.

I agree with Dr. Hicks that "individuals may feel increasingly more comfortable about coming forward to report suspected research misconduct" thanks to RCR instruction. I take that to be one way that it might have "worked."

Richvn | Posts: 1 | May 14, 2013

James Hicks makes excellent points about the burdens on scientists today. It is a shame that he backed up his argument with a graphic that isn't good evidence for what he believes. I've tried to explain why the comparison isn't meaningful in this blog post (although I agree with Hicks about the pernicious influences of very low success rates).

asmith | Posts: 2 | May 15, 2013

Formal instruction on RCR will not solve the problem unless it is reinforced with significant penalties. It is a question of moral deficits in the perpetrators, and even in the institutions that may knowingly coddle them. The mere fact that they willingly conceal the reality of their corrupt practice until faced with the possibility of 'discovery' is evidence that they are aware of their despicable turpitude. And then, and only then, do they scamper to execute a retraction. Flawed hypotheses, sloppiness in research, inadequate technologies, and the like are one matter; unbiased study groups and peer review may pick these up. But misconduct and fraud are a totally different matter: deliberately intended to conceal and deceive for personal gain, and the cost is enormous. As mentioned above, these costs (direct, indirect, and opportunity costs to others) can run into hundreds of thousands, even millions, of dollars.

I recall, during a postdoc stint, mentioning to my mentor and PI some concerns about the quality of work coming out of our lab. His reply to me was "<<myname>>!, the important thing is to get the money; let those who come afterwards clean it up." Of course I was livid, having myself been in prior "clean up" modes, or in situations where a "clean up" was not even possible.

If the average "Joe Blow" in the street writes a bad check for $20, s/he could be facing jail time; these guys have swindled hundreds of thousands, even millions, and are walking away with a slap on the wrist. SHAME! SHAME! SHAME!
 

primativebeliever | Posts: 10 | May 16, 2013

In conclusion, give us more money and we will be more honest? I saw what went on in a research lab. The solution is just like life: be profitable or go out of business. Out in the woods or in the streets, real science, honest science, is conducted every day. You cannot cheat the laws of nature and get away with it. Survival of the fittest, or time and chance? Be dishonest with nature and remember that it has laws, and time waits for no man. Before I plug in 480V, I am very honest about what I am doing, or BOOM. Would you like your chef to be as honest as you are? They are in a hurry too. Mind numbing!

ClevelandKen | Posts: 7 | May 18, 2013

The predisposition you mentioned, Neurona, is absolutely the case. On one hand, it's fortunate that we have technology that can track misconduct such as plagiarism, but on the other, this system is emblematic of many problems in American industry.
