Ask a scientist—any scientist—what irks them most about publishing and they are sure to mention peer review. The process has been blamed for everything from slowing down the communication of new discoveries to introducing woeful biases to the literature. Perhaps most troubling is that few believe peer review is capable of accomplishing what it purports to do—ensuring the quality of published science.
Indeed, several studies have shown that peer review does not reliably elevate the quality of published science, and that many published research findings are later shown to be false. In response, a growing number of scientists are working to advance a new vision of the scientific process through post-publication review: critiquing science after it has become part of the literature.
Reviewing published work is, of course, nothing new. Scientists have always been free to publish contradictory findings, contact a paper’s authors directly, or write a letter to the journal’s editor, for example. But because these are all slow processes whose outcomes will likely never be seen by the majority of scientists, most do not participate in such formal reviews.
A small number of scholarly journals have launched online fora where scientists can comment on published materials. Uptake, however, has been slow for a number of reasons, chief among them the inconvenience of commenting journal by journal.
“If you want to comment on a Nature paper, you have to go to the Nature site, find that paper, and comment. If you want to comment on a PLOS paper, you have to go to a different website, and so forth,” said Stanford University’s Rob Tibshirani, professor of health research and policy and statistics. “It’s a major time investment, particularly when people may never see the comments.”
Likewise, social media platforms, blogs, and other websites—such as Zotero, CiteULike, and Mendeley, to name a few—have also seen only scattershot commenting activity, at best.
Frustrated by these inefficiencies, Tibshirani is one of several scientists behind the development of PubMed Commons, a new post-publication peer review system housed on the oft-accessed National Center for Biotechnology Information (NCBI) biomedical research database. The Commons, announced today (October 22), allows users to comment directly on any of PubMed’s 23 million indexed research articles, much in the way people review films on Rotten Tomatoes, evaluate restaurant service on Yelp, or grade purchases made on Amazon.
Tibshirani said an organized post-publication peer review system could help “clarify experiments, suggest avenues for follow-up work and even catch errors.” If used by a critical mass of scientists, he added, “it could strengthen the scientific process.”
David Lipman, who heads NCBI, said the development of the Commons was driven by “consistent and increasing requests from PubMed users.” Approximately 2.5 million to 3 million people access the online resource each day, according to Lipman.
Comments made through PubMed Commons will be covered by a Creative Commons Attribution license and available for anyone to browse. However, only those who have authored PubMed-indexed articles are able to comment on others’ papers. For those who are eligible, the process is simple: after registering, users can type their review into a comment box that appears below each article. The box also allows users to post replies to existing comments and rate whether existing comments are useful.
In addition, PubMed Commons will aggregate comments to create a “hot list” that draws attention to papers that are trending.
PubMed Commons has already been privately beta-tested by approximately 250 users and is currently open for additional testing by National Institutes of Health (NIH) grantees. NIH will evaluate comments before officially green-lighting the system. If the decision is to make the system fully public, NCBI will provide a public API so that publishers and other groups can make these comments useful to a wider community.
For now, scientists eager to comment directly through PubMed can do so via PubPeer, a website funded and run by an anonymous team of scientists that allows first and last authors of published articles to comment on almost any scientific article published with a DOI or preprinted on the arXiv. A free browser plug-in is available for download that shows PubPeer comments directly on PubMed.
Just over a year old, PubPeer has already demonstrated the power of post-publication peer review to improve the research underlying publications. The most noted example to date is a comment, posted by a person known only as “Peer 1,” identifying several figure-related and typographical errors in a highly publicized Cell paper that reported the creation of human embryonic stem cells through cloning. Though the study’s authors maintain their conclusions, they did issue an erratum addressing various formatting errors raised by PubPeer users.
By allowing anonymous comments, PubPeer aims to create an open, debate-friendly environment while maintaining the rigor of the closed review process currently used by most journals. Its creators, who describe themselves as “early-stage scientists,” have also decided to remain anonymous, citing career concerns. “A negative reaction to criticism by somebody reviewing your paper, grant or job application can spell the end of your career,” a representative from PubPeer explained by e-mail. “We don't want to take that risk and we understand that many commenters do not want to either.”
Detractors say that anonymous comments all too often support destructive discourse. Earlier this year, for example, the Huffington Post and The Miami Herald decided to ban anonymous comments while, just last month, Popular Science chose to disable commenting altogether because of excessive trolling. PubPeer hopes to mitigate concerns stemming from anonymity by reviewing each comment to ensure it directly addresses a paper’s data before it is publicly posted.
PubMed Commons chose to circumvent the pitfalls of anonymity by requiring all users to register and identify themselves. Still, Lipman noted, there were “cogent and compelling arguments for both anonymous and identifiable commenting.”
“Those wanting anonymous posts were concerned that many scientists, especially junior researchers, would be reluctant to make critical comments. But those opposed to anonymous comments believed that the quality of interchange would be higher if commenters were required to identify themselves,” he said, adding that the group was “very much split” on the decision.
User feedback will determine whether PubMed Commons continues to require users to identify themselves. “User experience will determine the direction we take [PubMed Commons],” said Lipman. “This is meant to be a community resource.”