4 of the Most Important Health Advances of the 20th Century
Archeological evidence shows that the practice of medicine has been a part of history almost as long as the human race, with ancient cultures like the Egyptians, Romans, Babylonians, and Chinese all experimenting with and developing remedies and procedures to cure illnesses and heal wounds. Academics believe that the system of modern medicine we know today rose from the advances made by scientists and physicians of the 19th century, the era during which cell theory was developed and germ theory began to explain the spread of contagious diseases.
In addition to the development of cell theory, the 19th century also brought with it developments in the science behind physiology, chemistry, and pharmacology. It also served as the incubator of major clinical schools, where doctors could research and hone their craft. Together, these developments created a platform upon which modern medicine could grow.
Listed here are four of the most important health advances developed by medical professionals in the 20th century:
1. The Birth Control Pill
The birth control pill was a revolutionary step in women’s health because it was the first contraceptive to give women real autonomy over their own reproductive capabilities. It took 10 years and over $2 million in research grants to develop the first oral contraceptive, Enovid, which received approval from the Food and Drug Administration in 1960. Unfortunately, single women couldn’t obtain the pill until 1972, when the Supreme Court legalized it for all women regardless of marital status in Eisenstadt v. Baird.
The development of the pill paved the way for further advances in women’s contraceptive options, including the intrauterine device (IUD), emergency contraception, vaginal rings, and hormone patches. Today, more than 100 million women around the world rely on the birth control pill for the power to choose when to start a family. Additionally, giving women more control over their reproductive systems has resulted in smaller families and longer intervals between births, both of which have led to better infant health and lower rates of maternal and infant mortality.
2. Blood Transfusions
Before the medical community fully understood human blood groupings, blood transfusion success rates were low. Many physicians administered the blood of animals to patients in an effort to treat their illnesses, often causing severe reactions or death. While the first successful human-to-human blood transfusion was performed in 1818 by James Blundell, it was 20th-century Austrian physician Karl Landsteiner’s discovery of human blood groupings that allowed the blood transfusion to develop into the relatively simple, life-saving procedure it is today.
Landsteiner’s discovery in 1901 enabled pathologist Ludvig Hektoen to suggest, in 1907, that the blood type of a donor be matched with that of the recipient. By 1914, medical researchers were able to preserve donated blood for longer periods of time by adding anticoagulants, and the first programs soliciting voluntary blood donations were established by the 1940s.
Today, almost 32,000 units of blood are given to patients each day in the United States, saving the lives of around 4.5 million people per year.
3. Organ Transplantation
Physicians in India were completing basic organ transplant procedures as early as 800 BC in the form of skin grafts for flesh wounds and burns. However, the more advanced form of organ transplantation that we know today was first successfully performed in 1905 by Eduard Zirm, who restored the eyesight of a man blinded in an accident through corneal transplantation.
In the early decades of the 1900s, initial organ transplant experiments involved attempting to transfer the kidneys of goats, pigs, and monkeys into individuals suffering from renal failure. The first successful transplant of an internal organ would not occur until 1954, when an identical twin donated a kidney to his brother. In the years that followed, researchers developed anti-rejection drugs that contributed to the successful transplants of a kidney, liver, and lung to three separate patients in 1962 and 1963.
Just 50 years later, medical technology has advanced to the point that contemporary surgeons are now capable of transplanting faces.
4. Antibiotics
Though medical professionals as early as the 1600s had postulated that properties within mold could be used as a pharmacological treatment, the first step in the development of antibiotics was made in 1928 by Alexander Fleming, a British professor. The presence of mold and the absence of staphylococcus on several glass plates in his laboratory led him to conclude that the mold had produced a substance capable of killing the deadly bacteria. He named this substance penicillin, and these findings allowed Oxford professor Dr. Howard Florey and his team of scientists to develop fluid extracts from the penicillium mold, thus creating the world’s first antibiotic in an injectable form.
Penicillin played a major role in the treatment of soldiers during World War II, and its initial discovery in the 20th century has led to the development of a plethora of modern antibiotics, such as Amoxicillin. Although increasing antibiotic resistance remains a major concern, this treatment continues to fight bacterial infections, and new developments are being made in the field each year.
Additionally, recent laws like the GAIN Act are encouraging researchers to focus their efforts on developing new antibiotics to fight resistant strains of bacteria. Newly discovered compounds such as teixobactin show promise against these resistant strains.