Molecular and Cellular Biology has found numerous errors after launching a retrospective sweep of the figures it’s published in recent years.
June 12, 2017
As part of a comprehensive—and uncommon—plan to maintain a squeaky-clean literature, Molecular and Cellular Biology has picked through its archives from the past several years to find troublesome figures, duplications in particular. Last month, the journal began to publish the first retractions and corrections to shake out of this quality-assurance effort.
“These corrections will probably be going on for several more issues to come,” says Roger Davis, the journal’s editor-in-chief and a researcher at the University of Massachusetts Medical School. In May, the journal published one retraction; in June, another retraction and eight corrections. None of the corrected or retracted articles had been flagged on the post-publication peer review website PubPeer before the notices appeared in the journal.
Like The EMBO Journal and the Journal of Cell Biology, which have led the way in dedicating staff to scrutinizing figures, Molecular and Cellular Biology prospectively analyzes submissions for inappropriate image manipulation, and has done so for years. Last year, it stepped up its efforts to hunt for duplications within papers before they get published, and then took the unusual step of applying this process to already-published papers going back to 2010. The work involves a combination of software (the US Office of Research Integrity provides forensic image analysis tools) and visual detection by staff members.
“Frankly, [following up on PubPeer comments] is keeping us more than busy,” says Bernd Pulverer, chief editor of The EMBO Journal. He says he’d love to be able to do a retrospective screen, but with limited resources his staff has focused on catching errors before they make it to print. “I would applaud them,” he says of Davis’s team.
Davis says the retrospective screen started about a year ago and finished last fall. Since then, the journal has been contacting authors to work out the appropriate fixes. For the most part, the problems appear to stem from simple mistakes, Davis says, and “the most general reaction we receive is actually quite positive. First, people are horrified there is a problem. Also, there is an enormous willingness on the part of authors to try to fix it.”
The volume of errors “is significant,” he adds. “It’s more than we anticipated.” Davis says the rate is likely similar to what others reported last year when they combed through 20,000 papers across 40 journals looking for duplications and found that 3.8 percent contained copied-and-pasted images.
Rebecca Alvania, the executive editor of the Journal of Cell Biology, says duplications are some of the most challenging errors to find because she’s not aware of any efficient software that can detect them. “It’s mostly a manual process.” Indeed, the “sweat equity” of manually picking through papers is high, says Davis, “but the results of having a clean journal is worth whatever the cost.”
Davis says he’d like to use the results from the clean-up to see if the journal’s prospective screening process has improved submission quality. In the case of The EMBO Journal, says Pulverer, post-submission image review has done nothing to make authors more careful. “We’d hope people would learn, but it just seems stable,” he says of the error rate his team uncovers. “It’s amazing. There’s a real lack of education.”