Several recent, very public cases of scientific fraud have led the public to question the efficacy of manuscript review by biomedical journals. I do not think that the process is fundamentally flawed. I do think, however, that it can be improved.

For centuries, the editors of scientific journals have relied on expert academic peer reviewers to determine whether the conclusions drawn by the authors of a manuscript are supported by the data presented. Now, editors can take manuscript review one step further, by determining whether the digital image data presented are accurate representations of the original data. I believe they should use this ability to help ensure the accuracy of the data they publish.

Electronic workflows

As a result of the revolution in electronic communication, many journals now have completely electronic workflows, whereby authors submit all text and figures as electronic files. This provides for efficient transfer of information and...

Setting and enforcing guidelines

As editors implement electronic workflows, they have a responsibility to set guidelines for their authors on the proper handling of digital image data. It is essential to have clear guidelines, as some level of image manipulation is accepted practice: for example, image cropping or limited adjustment of brightness and contrast. The boundaries between acceptable and unacceptable manipulation, however, must be made clear to authors.

After these guidelines are introduced, editors have a responsibility to enforce them. To do so requires the establishment of definitions of misconduct, procedures for identifying misconduct, and policies for handling misconduct.

At the Journal of Cell Biology (JCB) we examine all digital images in all accepted manuscripts for evidence of manipulation. For black-and-white images this involves simple adjustments of brightness and contrast in Photoshop (see figure, part A). For color images, we use the "Levels" adjustment in Photoshop to compress the tonal range and visualize dim pixels (see figure, part B).
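
To make the idea concrete, here is a minimal sketch of what that Levels-style adjustment does numerically, written in Python with Pillow and NumPy purely for illustration (the screening itself is done interactively in Photoshop; the file name and white point below are hypothetical):

```python
# Illustrative sketch only: compress the tonal range so dim background
# pixels become visible, analogous to lowering the white point in
# Photoshop's "Levels" dialog.
import numpy as np
from PIL import Image

def compress_tonal_range(path, white_point=60):
    """Remap pixel values so everything at or above `white_point` becomes
    pure white, stretching dim background pixels into view."""
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
    # Stretch [0, white_point] to [0, 255]; values above clip to white.
    stretched = np.clip(img * (255.0 / white_point), 0, 255).astype(np.uint8)
    return Image.fromarray(stretched)

# Erased bands or pasted rectangles that are invisible in the original
# often stand out sharply in the stretched image.
compress_tonal_range("figure_panel.png").save("figure_panel_stretched.png")
```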

We have created standards for acceptable manipulation [1], and we perform an initial investigation with the authors if we believe those standards may have been violated. To do so, we request that they submit the original data to the journal for comparison to the prepared figure(s) in question.

Investigating misconduct

Editors of some high-profile biomedical journals have recently voiced the opinion that journals cannot be investigative bodies [2,3]. I disagree. Clearly, journal editors cannot get access to authors' lab notebooks, but they certainly have a right to examine the raw data corresponding to any information presented in a manuscript, especially when it is possible to determine empirically whether the raw data have been manipulated. This adds another layer to the review process beyond traditional peer review.

During our three and a half years of screening experience at the JCB, we have had to revoke the acceptance of 1% of papers that passed peer review because we detected fraudulent image manipulation that affected interpretation of the data. We do not take such cases lightly. Four editors must agree on a determination of fraudulent manipulation: the managing editor and three academic editors, including the monitoring editor, a senior editor, and the editor in chief. We do not consider the element of intent; acceptance is revoked if any conclusion in a paper is called into question by the manipulation.

Twenty-five percent of all our accepted manuscripts have at least one figure that must be remade because we detect "inappropriate" manipulation: the manipulation does not affect the interpretation of the data, but it violates our guidelines for presenting image data. This indicates a widespread misunderstanding of the line between appropriate and inappropriate manipulation, which will have to be addressed during the training of students in the responsible conduct of research. In almost all cases, we have been able to resolve incidents of image manipulation ourselves; only on very rare occasions have we needed to request the help of an institutional investigative body.

The Hwang case

One of the supplemental figures that Hwang and colleagues published in the now infamous stem cell cloning paper [4] contained manipulated images. The image in part B of the figure is from the third row of Supplemental Figure S1B in that paper. It purports to show negative staining for a particular cell-surface marker in four different cell lines. A simple adjustment of tonal range clearly shows that the two middle images are identical. The minor differences in pixel structure are due to image compression.
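
Purely as an illustration (our screen itself is done interactively in Photoshop, not with a script), a near-duplicate check of this kind could be sketched in Python; the Pillow/NumPy dependencies, the file paths, and the correlation threshold are all assumptions of the example:

```python
# Hypothetical sketch: flag two figure panels that are the same image up to
# compression noise. Threshold and file paths are illustrative assumptions.
import numpy as np
from PIL import Image

def panels_near_identical(path_a, path_b, threshold=0.99):
    """Return True if two panels are almost perfectly correlated, i.e. they
    are likely copies of one image differing only by compression artifacts."""
    a = np.asarray(Image.open(path_a).convert("L"), dtype=np.float64).ravel()
    b = np.asarray(Image.open(path_b).convert("L"), dtype=np.float64).ravel()
    if a.size != b.size:
        return False
    # Pearson correlation tolerates the small pixel-level differences that
    # JPEG compression introduces between copies of the same image.
    return np.corrcoef(a, b)[0, 1] > threshold

# e.g., comparing the two middle panels discussed above:
# panels_near_identical("panel_2.png", "panel_3.png")  # True for duplicates
```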

It is likely that we would have identified this duplication with our routine screen. It is important to note, however, that this would only have led us to request the original data from the authors, who could have dishonestly claimed to have made a clerical error and provided different images. This illustrates a potential limitation of our investigative capabilities, but editors will be surprised at how easily a deception can unravel if they start asking questions.

Other types of data

What about other types of data besides image data? It is clearly more difficult to determine whether numerical data have been misrepresented, fabricated, or falsified. There are, however, clues to potential problems with these data that should raise red flags with peer reviewers and editors. These include exceedingly small data sets, exceedingly narrow and/or uniform error bars on a graph, or the presentation of data from a "representative experiment."
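
As one crude illustration of the "uniform error bars" red flag, a reviewer could ask how much the reported error bars actually vary; the 5% cutoff below is an arbitrary assumption for the example, not an established screening standard:

```python
# Illustrative sketch only: flag error bars that are implausibly uniform
# across independent measurements. The cutoff is an arbitrary assumption.
import statistics

def error_bars_suspiciously_uniform(sems, cutoff=0.05):
    """Flag a set of reported SEMs whose relative spread is tiny, i.e. the
    error bars are nearly identical across conditions."""
    mean = statistics.mean(sems)
    if mean == 0:
        return True
    cv = statistics.stdev(sems) / mean  # coefficient of variation
    return cv < cutoff

# Hypothetical example: four conditions whose error bars are all ~0.050
print(error_bars_suspiciously_uniform([0.050, 0.051, 0.050, 0.049]))  # True
```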

SIMPLE IMAGE FORENSICS CAN REVEAL MANIPULATIONS. A) For black-and-white images, brightness and contrast adjustments can reveal background inconsistencies that are clues to manipulation. In the top panel, the intensity of the band in lane 4 has been adjusted. In the bottom panel, lane 5 is a duplicate of lane 2. Note the rectangular boxes around the manipulated bands. B) For color images, tonal-range adjustments can reveal manipulations. The original shows what appears to be negative staining in four separate cell lines. In the adjusted image, the two middle panels are clearly duplicates. The minor differences in pixel structure are due to image compression. (Part B reproduced with permission from Ref. 4) Credit: © 2005 AAAS

One possible way to address problems with numerical data would be to require authors to submit all their raw data for comparison to the prepared figures. However, doing this comparison would be much more time-consuming than image screening. At a minimum, journals should set guidelines for presentation of numerical data that promote accurate graphing practices.
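
A toy version of such a raw-data-to-figure comparison might look like the following sketch; the tolerance and the replicate values are hypothetical, and a real figure would of course involve many such checks:

```python
# Toy sketch (my illustration, not a JCB procedure): check a reported mean
# and SEM against the raw replicate values behind a single graphed bar.
import math
import statistics

def matches_reported(raw, reported_mean, reported_sem, tol=0.01):
    """Recompute mean and SEM from raw replicates and compare them, within
    a tolerance, to the values plotted in the prepared figure."""
    mean = statistics.mean(raw)
    sem = statistics.stdev(raw) / math.sqrt(len(raw))
    return abs(mean - reported_mean) <= tol and abs(sem - reported_sem) <= tol

# Hypothetical example: three replicates behind one bar on a graph
print(matches_reported([0.92, 1.01, 1.07], reported_mean=1.00,
                       reported_sem=0.04))  # True
```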

Protecting the published record

We at the JCB advocate that journal editors take a proactive approach to detect potential misconduct and resolve it before publication. It is not enough to respond to allegations of misconduct made by others, and it is certainly not necessary to pass on all such suspicions to an investigative body, as others have advocated [5].

Despite the additional safeguards we have put in place, if someone is determined to publish fraudulent data, they may well succeed. It is important to keep in mind that our screening can pick up manipulations done to image files after they have been acquired, but it will not pick up adjustments to the data made during acquisition (e.g., using settings on a microscope). Even with these limitations, it is a cop-out for an editor to say that the review process can never be perfect when the publication of fraudulent work comes to light [6]. Editors have a responsibility to do what they can to protect the published record, and they can now do something beyond peer review.

To editors who argue that they do not have the time or funds to do this kind of screening: consider the effort expended in dealing with a high-profile case of fraudulent research. The vast majority of biomedical journals are owned by commercial publishers, who make considerable profits from them. These companies should bear the cost of improving manuscript review as part of the responsibility of participating in scholarly publishing. Such enhanced review has the potential to save the considerable time and public funds now wasted by scientists forced to debunk published fraudulent research, resources that could instead be used to make real progress.

Mike Rossner is managing editor of the Journal of Cell Biology.

References

1. M. Rossner, K. Yamada, "What's in a picture? The temptation of image manipulation," J Cell Biol, 166:11-5, 2004.
2. R. Smith, "Investigating the previous studies of a fraudulent author," Brit Med J, 331:288-91, 2005.
3. L.K. Altman, W.J. Broad, "Global trend: More science, more fraud," The New York Times, Dec. 20, 2005.
4. W.S. Hwang et al., "Patient-specific embryonic stem cells derived from human SCNT blastocysts," Science, 308:1777-83, 2005.
5. U. Savla, "When did everyone become so naughty?" J Clin Invest, 113:1072, 2004.
6. BBC News, "Cancer study patients 'made up'," Jan. 16, 2006.
