
I am no fan of anesthesia. The feeling of being rendered unconscious to facilitate the manipulation of my body, only to be reanimated afterward, gives me, like many people (I assume), the heebie-jeebies. But alas, anesthesia is a medical necessity. It has made lifesaving surgeries and once-dreaded dental procedures pain-free and relatively routine for more than 150 years. My own medical care, not to mention that of billions of other people and animals, has benefited greatly from this chemical control of consciousness.

Beyond my personal misgivings, anesthesia’s development into a widely accepted medical protocol illustrates an interesting, if outmoded, avenue of innovation—let’s call it efficacy sans mechanism. As described by scientists Emery Brown and Francisco Flores in their dispatch from the front lines of anesthesia research, in the mid-19th century, dentist William Morton successfully put a patient under general anesthesia (using ether vapor, in this case) without any real understanding of how the drug worked.

As the new application for ether took the global medical community by storm, other anesthetics—many of them ether derivatives—were added to the surgeon’s toolbox. But it wasn’t until the 1980s that scientists began to parse the specific mechanisms of action for a variety of anesthetics, some of which had been part of standard medical practice for more than a century. Even today, more than 170 years after the first successful general anesthetic was administered, science is still uncovering the intricacies of how these drugs work in the brain.

The arc of discovery for anesthesia stands in stark contrast to our current framework for biomedical research. The time that stretches between the identification of potentially therapeutic compounds and their use in the clinic is now measured in decades, not months. Rigorous testing—for safety, efficacy, and dosage—lies between the bench and the bedside. Through this extensive study, a drug’s mechanism of action is typically uncovered and dissected.

Even today, to be sure, understanding a drug’s mechanism is not a prerequisite for approval, and there are established pathways for accelerating the clinical use of biomedical breakthroughs. (See “Picking Up the Pace,” The Scientist, January 2016.) But could we imagine a modern scenario in which a drug was adopted as swiftly after its first successful clinical use as ether was? Likely not. And that’s a good thing. The tale of the medical revolution sparked by general anesthetics via a long-abandoned model of drug development sounds quaint to our ears—even nostalgic. But it was an exception, not the rule. For every ether-soaked success story, history is littered with tales of unproven medical treatments that caused severe and widespread harm.

Even though modern researchers have tools, technologies, and biological insights that would have been utterly fantastical to their 19th-century counterparts, the danger of untested treatments is greater now than it was then. As the steamship has given way to the internet, word of untested medical approaches spreads faster than ever before, amplifying the potential to do harm. One has only to look to recent upticks in antivaccine sentiment or the rise of spurious supplements for illustrations of the corrosive power of unverified scientific claims spread via modern modes of communication.

Even when interventions do work, it’s important to understand the mechanism. In the case of general anesthesia, researchers have been hard at work digging into the nuts and bolts of these revolutionary drugs ever since that first successful application. Over the past several years, the resulting insights have been feeding back into clinical practice, honing the application of modern anesthetics. This heartens me. Even if being put under still gives me the willies.

Bob Grant

Editor-in-Chief
