Although the practice of routine, effective blood transfusion became a reality only in the 20th century, interest in transferring blood from one individual to another has a lengthy history, largely based on the idea that blood must carry the life force of the body, since its excessive loss was often fatal. Initial restorative attempts consisted of drinking blood. Pliny the Elder wrote that during Roman times it was fashionable to rush into the arena to drink the blood of fallen gladiators, while other notables such as Galen advised that drinking the blood of animals would cure certain maladies. The idea of transferring blood intravenously from one person to another took root after Harvey's 1628 description of the circulation of blood driven by the pumping action of the heart. However, sporadic attempts to perform blood transfusions over the next several hundred years were often disastrous for both the blood donor (frequently a dog or a sheep) and the human recipient. In these early days, transfusion was particularly frowned upon in some medical circles because it ran contrary to a favored treatment of the time, bloodletting (often with leeches), and the procedure was banned in France and England for a time in the late 1600s.
Nonetheless, attempts at transfusion persisted. This image from 1828 shows a patient receiving blood directly from a donor through a device called a gravitator, the invention of Dr. James Blundell. Anticoagulants were not available at this time, and as might be expected, the procedure often failed because the blood clotted in the transfusion apparatus. Moreover, even if the blood found its way into the patient, it was more likely to do harm than good due to transfusion reactions caused by donor incompatibility.
The modern era of transfusion medicine began in 1901 with the description of the ABO blood group antigens by Karl Landsteiner. Subsequent technical advances during the early 1900s, including anticoagulation, blood typing and crossmatching, and blood preservation and storage, made blood transfusion a standard of care and saved the lives of many patients and wounded soldiers. However, other problems cropped up during the latter part of the 20th century, particularly infectious complications, which came to the fore during the human immunodeficiency virus (HIV) epidemic of the 1980s and 1990s. Tragically, it is estimated that as many as 25,000 people were infected with HIV through contaminated blood products prior to the development of screening tests. Fortunately, HIV transmission is now rare, but certain transfusion-related complications persist, and the words of Robert Beal, a past leader in transfusion medicine, still hold: "Blood transfusion is like marriage: it should not be entered upon lightly, unadvisedly or wantonly or more often than is absolutely necessary." Thus, the history of transfusion medicine is a dramatic and colorful affair that ultimately bends toward success, but with many missteps along the way.1