Opinion: What the History of Blood Transfusion Reveals About Risk

Every medical intervention—even one with a centuries-long history—brings dangers, some of which become clear only later.

Paul A. Offit
Sep 1, 2021


When making decisions about medical procedures, we yearn for complete information about the risks and benefits. Unfortunately, we will never know everything. The question is, when do we know enough to be reasonably assured that undergoing a particular procedure or taking a particular drug is worth the risk? Consider blood transfusions, for example. When would you have deemed blood transfusions safe enough to consent to one?

We’ll start at the beginning. 

Decision Point #1

In 1665, Richard Lower solved a fundamental problem of blood transfusions: clotting. Once exposed to air, blood quickly clots, making it impossible to transfuse. After draining large quantities of blood from a dog and causing it to go into shock, Lower used a series of tubes to connect an artery of a healthy dog directly to a vein of the dying dog, so the blood never met the air. It worked, saving the dog’s life. 


Two years later, four blood transfusions were performed by linking the arteries of lambs and calves to the veins of people suffering from diseases ranging from persistent infections to schizophrenia. Each of these animal-to-human blood transfusions was accompanied by fever, chills, back pain, darkened urine, nosebleeds, and an intense burning sensation at the site of the transfusion. No one knew what caused these symptoms, which came to be called “transfusion reactions.” Nonetheless, all four recipients lived through the procedure, and some felt better afterward. 

Would you have chosen to receive a blood transfusion in 1667?

Actually, you wouldn’t have had the choice. In 1667, Pope Innocent XI signed an order banning the procedure for Catholics, arguing that doctors performing it were playing God. Two years later, the French parliament enacted its own ban, and 11 years after that, the English parliament did the same. More than 200 years passed before anyone dared to try again. 

Decision Point #2

Once the bans faded, doctors began to experiment with human-to-human transfusions, but transfusion reactions were still a problem. Then, in 1901, Karl Landsteiner, a young researcher working in Vienna, Austria, found the cause of transfusion reactions. Landsteiner took serum and red blood cells from colleagues and identified two different proteins (A and B) on the surface of red cells, which could be present alone or in combination, producing the A, B, and AB blood types. Blood without either of these proteins was labeled type O. Landsteiner found that serum from someone with type A blood destroyed red cells from someone with type B blood, and vice versa, causing potentially fatal reactions. Landsteiner’s findings allowed for the first successful human-to-human transfusions.

In 1907, Reuben Ottenberg, a 25-year-old doctor at Mount Sinai Hospital in New York, became the first person to transfuse blood from one person to another using Landsteiner’s blood types as a guide. 

Would you have chosen to receive a blood transfusion in 1907?

Unfortunately, some blood transfusions with the right type of blood still caused serious reactions. As it turned out, Landsteiner wasn’t quite finished. In 1940, he identified yet another protein on the surface of red blood cells: Rh (so called because he first found it in rhesus monkeys). Rh incompatibility was especially a problem during pregnancy, when a mother who was Rh negative carried a baby who was Rh positive, with occasionally fatal results. As a consequence, marriages between Rh-negative women and Rh-positive men were prohibited. But better donor matching brought about by Landsteiner’s discoveries virtually eliminated transfusion reactions.

Decision Point #3

By the 1930s, physicians had syringes, needles, stopcocks, and glass tubes coated with paraffin that eliminated the need for direct artery-to-vein transfusions. Further, adding a 0.2 percent solution of sodium citrate prevented blood from clotting, allowing it to be stored. Blood banks were born, and blood transfusions became more widely available.

Would you have chosen to receive a blood transfusion in 1930?

Around this time, it became clear that the risks of this procedure did not end with transfusion reactions. By the late 1930s, measles, malaria, and syphilis infections had been linked to blood transfusions; many of these cases were fatal. The number of transfusion deaths, however, paled in comparison to the toll from a blood-product-borne outbreak that occurred in the early 1940s. 

In March 1942, the Office of the Surgeon General noted a growing incidence of jaundice (yellowing of the skin caused by liver disease) among US army personnel. All those affected had recently received a yellow fever vaccine that contained human serum as a stabilizing agent. This serum had been obtained from nurses, medical students, and interns at Johns Hopkins Hospital in Baltimore, several of whom had a history of jaundice and one of whom was sick at the time of the donation. When the dust settled, more than 330,000 servicemen had been infected and 1,000 had died from hepatitis. It was one of the worst single-source outbreaks of a fatal infection ever recorded. The incident highlighted the dangers of transferring blood or blood products from one person to another. 

Decision Point #4

It was not until 1964 that Baruch Blumberg discovered the cause of the 1942 outbreak: hepatitis B virus. By 1971, a blood test was available to detect it. In 1972, the FDA mandated that all blood be screened for the presence of hepatitis B as well as for measles, malaria, and syphilis. Blood transfusions were now safer and easier to perform than ever before. 

Would you have chosen to receive a blood transfusion in 1980?

As it turned out, hepatitis B virus wasn’t the only hepatitis virus that could contaminate donated blood. Hepatitis C was later found to cause about 90 percent of transfusion-associated hepatitis. In less than 12 months in the early 1980s, hepatitis C virus infected 180,000 people who had received blood transfusions, killing 1,800. 

Then another virus entered the United States blood supply. This particular virus was so feared, so vilified, and so misunderstood that citizens worried that they could catch it not only from receiving blood, but from donating it: HIV. By March 1983, more than 1,200 cases of the disease caused by the virus, now called Acquired Immune Deficiency Syndrome, or AIDS, had been reported in the US, including 17 transfusion cases. By the end of that year, the national count had risen to 3,000 cases and 1,300 deaths. Between 1978 and 1985, 29,000 Americans who had received tainted blood transfusions developed AIDS. Most would die from the infection. As a consequence, blood donations in the United States plummeted. Today, one-third of Americans still believe that people can catch this virus by donating their blood.   

Decision Point #5

In 1983, French researcher Luc Montagnier isolated HIV. By August 1984, a test for HIV had been developed, and by April 1985, it was routinely used by blood banks across America. Several changes were also made to the handling and processing of blood in the wake of the HIV tragedy. Today, requirements for heat, solvent, and detergent treatment of blood have dramatically reduced the likelihood of contamination with certain viruses. Indeed, no cases of hepatitis B virus, hepatitis C virus, or HIV have been associated with blood products since 1985.

Would you receive a blood transfusion today? 

Whole blood is now routinely tested not only for hepatitis B, hepatitis C, and HIV, but also for bacteria such as Treponema pallidum (which causes syphilis) and viruses including West Nile and Zika. But while this has decreased the possibility of transmitting infections, it hasn’t eliminated it. 

Many potential infectious agents might still be present for which routine testing isn’t performed. For a variety of reasons, pathogens such as prions (which cause mad cow disease), Epstein-Barr virus (the cause of mononucleosis), cytomegalovirus (which causes birth defects), parvovirus B19 (which causes a rash, fever, and anemia), Ebola virus, dengue virus, chikungunya virus, and coronaviruses such as SARS-CoV-1, MERS-CoV, and SARS-CoV-2 are not subject to routine testing. 

Every three seconds, someone in the world is transfused with a stranger’s blood. In the United States alone, 16 million units of blood are transfused into 10 million people every year. Nowadays, for those who require a blood transfusion, the benefits clearly outweigh the risks. But as I detail in my new book, You Bet Your Life: From Blood Transfusions to Mass Vaccinations—The Long, Risky History of Medical Innovations, progress toward safe and effective blood transfusions, as with other therapies, has entailed risks that sometimes cost patients their lives. It can be tempting to try to avoid risk altogether, but the truth is that there are no risk-free choices—only choices to take different risks. 

Paul A. Offit is a professor of pediatrics at the Children’s Hospital of Philadelphia. Read an excerpt from You Bet Your Life.