Opinion: Reimagining the Paper

Breaking down lengthy, narrative-driven biomedical articles into brief reports on singular observations or experiments could increase reproducibility and accessibility in the literature.

By Ahmed Alkhateeb | May 2, 2016

The scientific journal article—or “paper”—is 351 years old. Papers have had an incalculable impact on science, increasing collective engagement and the rate of knowledge dissemination.

In 1665, the Royal Society commissioned its secretary, Henry Oldenburg, to publish and edit the Philosophical Transactions of the Royal Society—arguably the first peer-reviewed scientific journal. Before then, scientific knowledge was disseminated through two approaches: oral communication or the publication of lengthy, esoteric books.

Journals filled the gap between these two extremes; they were a hybrid medium for scientific communication, offering both the sense of urgency conveyed by oral communication and the public recognition associated with book publication. The restructuring of scientific knowledge into periodical journals also led to the birth of the paper as the foundational unit in reporting new scientific findings.

In its original conception, the paper described a specific observation or experiment. The publication of papers in journals shortened the time between observation and knowledge dissemination, accelerating further discovery and the evolution of scientific concepts and methods. This complemented the peer-review process: swift public challenge made it harder for false scientific beliefs to find early adopters.

But papers have since become more complex, negating many of these benefits—and even creating new challenges. Shifting the focus of a paper from narration to empiricism would lead to more robust communications and higher data quality.


Today’s papers are often products of several years of work carried out by teams of scientists. Instead of detailing an observation or experiment, they are collections of several empirical observations interwoven with deductive or inductive logic. Examination of the number of data figures per paper in top biomedical journals revealed a two- to four-fold increase over the past three decades. Today’s papers seem to be gradually morphing into the books they were intended to replace.

The complexity of scientific papers, especially in biological sciences, is possibly a consequence of the rapid increase in scientific knowledge, which has made novel hypotheses increasingly complex. However, it is also likely that this complexity is due to a culture of competition. Papers are getting longer and longer because a complete “story” is now a requisite for publication in many high-impact journals.

The increasing length of papers has led to two primary problems that hinder scientific progress. First, the time it takes to generate enough data to reach the standards of high-impact journals prevents some scientists from sharing important data in a timely manner. The other problem, which is often overlooked, deals with the structure of scientific knowledge.

Papers are the foundational unit of structure in science. The more complex they become, the more difficult it is to discern the empirical data they contain. It is thus becoming increasingly difficult to identify what experiments have already been performed or whether certain findings have been independently verified.

Consequently, information retrieval is far from precise. Even the most sophisticated scholarly search engines suffer from relatively low precision and recall: many query results are irrelevant, while relevant results are often missed. Scientists therefore struggle to keep up with findings in peripheral fields and cannot easily identify the important literature when they move to a new field. This issue is only amplified by the fact that there are more than 25 million scientific articles indexed in PubMed (with more than 1 million published annually since 2011). The simpler a scientific paper is, the easier it is to quickly archive, retrieve, and utilize.

Over the past decade, there have been several attempts to increase the speed and robustness of scientific communications. These approaches have focused on either publishing data that is often excluded from journal articles on websites like figshare, or skipping the peer-review process through preprint servers like arXiv. In both cases, the quality of peer review is often sacrificed for promptness and convenience, thus discouraging large-scale adoption.


The introduction of a new medium that disseminates peer-reviewed communications of single empirical observations could address many of the shortcomings of the current publication model.

Scientists could, for example, publish their findings through journal-associated online portals where they undergo peer review. More elaborate “hypothesis papers,” in which various pieces of data are logically connected to produce a generalizable conclusion, could take the place of the current research paper. This radical shift from publishing hypothesis-driven narratives to pure empiricism would, in my opinion, provide a durable solution to the problems facing scientific publishing.

The fragmentation of the modern paper into its empirical constituents would increase the speed of scientific exchange, increase collective engagement, and introduce a new level of resolution to scholarly information retrieval. It would encourage scientists to publish more reliable data, since every figure would be evaluated independently. The decoupling of logic and observation would also decrease experimenter bias toward positive results that fit an overarching hypothesis. Finally, by decreasing the time and effort needed to publish, scientists might be more inclined to publish negative or confirmatory results.

One potential criticism of this model is that it might dampen the competitive edge in science by fragmenting rewards and increasing transparency. Scientific competition has long been established as an essential element for maintaining skepticism, quality control, and a high rate of discovery. However, competition within the scientific community has reached pathological levels. It has led to irreproducible data, fraud, and the rampant exploitation of trainees.

Fragmenting the scientific paper would reallocate publication-associated rewards, allowing for a more even distribution. The approach I propose would not eliminate competition; rather, it would beget competition driven by quality rather than reward.

Innovation and discovery are rarely a consequence of individual genius. They are outcomes of collective intelligence constructed through efficient communication. Decreasing the quantum of scientific publishing and shifting its focus from narration to empiricism would increase the speed of scientific dissemination without sacrificing quality.

Ahmed Alkhateeb is a postdoctoral research fellow at Harvard Medical School and Massachusetts General Hospital.

Comments

afsinfo | May 3, 2016

After re-making textbooks from deep sources of knowledge into cribsheets for the tests, pre-highlighted by the publishers to show the main points, and leaving out the (important) subtleties, now comes the call to simplify the presentation of research papers to show the main points cleanly, clearly, and maybe even, when the publisher's typography department gets hold of it, with pre-published highlighting...

I would argue that this fundamentally mis-presents science. Almost every paper makes an incremental, nuanced contribution to science. The subtle points made by the author are essential to its intellectual integrity, and to presenting the reasoning and purpose of the research.

If the scientific paper is to be teased apart into a cleaner presentation, extreme care must be taken to ensure that the essential character, scientific reasoning, and qualifications and limits of interpretation are preserved.  Publishing history does not make one hopeful.

twangcn | May 3, 2016

Good idea! Today, the paper is a very formulaic publication that wastes numerous valuable resources, such as experimental costs, repeated work, reading time, and paper itself.

nslavov | May 4, 2016

My comments are directed to this statement:  

"These approaches have focused on either publishing data that is often excluded from journal articles on websites like figshare, or skipping the peer-review process through preprint servers like arXiv. In both cases, the quality of peer review is often sacrificed for promptness and convenience, thus discouraging large-scale adoption."


-- Preprints are not intended to skip peer review. Thousands of preprints undergo traditional peer review and are ultimately published in traditional journals. Preprints aim for faster communication and, hopefully, more inclusive peer review.


-- Preprints have been widely adopted and are prominent, at least if citations can be construed as a measure of prominence.   
