John Labbe

In 2004, media reports and US congressional investigations revealed that dozens of intramural scientists and officials at the National Institutes of Health had received substantial amounts of cash and stock options from consulting with drug and biotech companies, many of which had dealings with the agency. High-profile problems with prescription drugs such as Vioxx and the controversy over antidepressant use in adolescents suggested that clinical trials conducted at the nation's most respected universities and medical centers might not tell the whole story.

It seemed to be a banner year for conflicts of interest and ethical gaffes, and not just in the United States. In Canada and Europe, the line between research and commercial activities became increasingly blurred, which heightened public sensitivity to potential conflicts of interest (COI) in biomedical research.

This heightened awareness can make it seem as if COI problems are increasing, bioethicists say, just as...


Concerns over institutional COI are not limited to the United States. Last November, a panel of medical experts commissioned by the Canadian Association of University Teachers warned that outside pressures were putting the integrity and independence of as many as 20,000 clinical faculty and researchers in that country at risk.2

The report was commissioned after Nancy Olivieri, a professor of medicine at the University of Toronto, had raised concerns about the risks of a new drug she was helping to develop. As a result, her career was nearly ruined. But it later emerged that the University of Toronto had engaged in negotiations for several multimillion-dollar donations with the drug company involved. The issue of institutional COI "is just being grappled with now," says Wendy Baldwin, executive vice president for research at the University of Kentucky, Lexington, and former deputy NIH director for extramural research. "But it will take a fair amount of grappling to come clear on definitions and how to handle it."

Because the federal and provincial governments are offering to match contributions from businesses, "they are driving the institutions into the hands of the private sector," says Dickens. This "mercenary incentive" is steering universities away from their historical role as sources of objective knowledge, he adds. "It's becoming a mounting storm."

In Europe, scientists, universities, and governments are also struggling with biomedical ethics issues, albeit not to the same degree as their counterparts in the United States. Issues in Europe include the use of human tissue3 and the propriety of conducting research on human subjects in developing countries. But commercializing research is causing increased concern.

"Broadly speaking, the issues involve how to prevent scientific misconduct," says Imogen Evans, research strategy manager for clinical trials at the Medical Research Council in London. "Universities are being encouraged to make commercial use of their discoveries," she says. "This is blurring the once clearly drawn dividing lines between universities and commerce."

While cognizant of COI issues, European university researchers are seeking more robust interactions with companies, which is leading to increased ethics concerns. Janez Potocnik, European Commissioner for Science and Research, commented at a January conference that "ethics is fast becoming an integral part of governance, particularly for scientific research in Europe – within the EU research framework and also more generally."


In the United States, the growth of industry-supported biomedical research and clinical trials has contributed to a greater scrutiny of COI issues, says Mildred Cho, associate director of Stanford University's Center for Biomedical Ethics. "There's definitely been a trend developing over the past 25 years affecting universities and research."

Much of this stems from the Bayh-Dole Act of 1980, which facilitates interactions between universities and the private sector and encourages US universities to commercialize federally funded research results.4 Most major US universities have made licensing intellectual property and technology a priority and, in some cases, a significant revenue stream.5

Because of Bayh-Dole, "there is more potential for intersecting financial streams, and the awareness that many things could cloud one's objectivity in research," Baldwin says. But such potential conflicts can be managed. "It's not a big thing; it's life. You've got to deal with it," she adds.

But doing so is complicated by the magnitude of the interactions. In the past, clinical trials were mostly conducted at academic medical centers with government funds. Today, 75% of all funding for clinical trials in the United States comes from corporate sponsors. The controversies over Vioxx and pediatric antidepressants were fueled, in part, by allegations that drug companies did not reveal adverse health effects uncovered in their industry-sponsored clinical trials.

In many cases, academic researchers have been barred from writing about or discussing results from industry-sponsored clinical trials. In December 2004, the American Medical Association passed a resolution opposing "clinical trial gag clauses" and encouraging physicians to present scientific findings "free of corporate interference." This year, the US Congress will consider legislation requiring drug companies and medical device manufacturers to post all clinical trials data in public databases.

"I see this as an issue of conflict at the university level," says Shreefal Mehta, a professor of biotechnology management at Rensselaer Polytechnic Institute, Troy, NY. The US Supreme Court essentially said as much by refusing to hear an appeal of Madey v. Duke,6 in which a federal court determined that Duke University had made a "business of research."

Caplan blames the US government for ignoring warning signs of impending ethical disasters. "There was a lot of hand-wringing and promises made over the past six years" to change the federal ethics oversight infrastructure, he says. But little changed because the administration and Congress have an "antiregulatory and pro-research stance." Caplan doesn't expect much to happen this year, either. "I don't think we'll see rules issued for how data safety-monitoring boards should operate or [for developing] better, standardized informed consent forms for human subjects," he says. "We may get an adverse event registry, but I'm not betting on it."


Some US universities are already making changes, spurred in part by the 1999 death of Jesse Gelsinger in a gene therapy clinical trial at the University of Pennsylvania.7 A number of subsequent government and academic studies have recommended ways to strengthen oversight mechanisms, including those of institutional review boards (IRBs).

"There has been a lot of tightening up of conflict-of-interest policies, especially when it involves research on human beings," says David Korn, senior vice president for biomedical and health sciences research at the Association of American Medical Colleges (AAMC) in Washington. "No university wants to be caught with an allegation of cutting corners because of financial interests if some unfortunate event happens in clinical research," Korn says. "It's just a very, very uncomfortable position for any institution to be in."

A recent AAMC survey8 found that US medical schools have made "significant progress" since 2001 in clarifying and strengthening their COI standards in clinical research. But the survey also disclosed room for improvement: 40% of medical schools do not require researchers to disclose significant financial interests in oral presentations of research results, while 41% of schools with standing COI committees do not evaluate significant financial interests prior to granting final IRB approval.

In the United Kingdom, a "significant number" of university ethics committees do not scrutinize all their research, according to a recent study9 funded by the Nuffield Foundation. In a survey of 78 universities, researchers at King's College London found that one-quarter of the universities did not have a formal policy in place, and slightly less than half had not designated a person to be responsible for research ethics.

In May 2004, Harvard Medical School voted to prohibit full-time faculty from taking previously allowed executive positions with businesses "engaged in commercial or research activities of a biomedical nature."10 In December, Stanford University approved measures to lower thresholds for disclosing financial interests, and it redefined guidelines for research on human subjects. Arthur Bienenstock, vice provost and research dean, told the Stanford Report,11 "I think as an institute it would be really wise for us to ensure that the faculty know the source of funding for their research."

The best way to deal with potential conflicts is to avoid them, Dickens says. "But we're way past that. So instead we need transparency. This means public vigilance, institutional vigilance, government vigilance, and peer scrutiny."
