The 5 Greatest Medical Breakthroughs That Transformed Humanity

Can you believe there was a time when doctors didn’t wash their hands between performing an autopsy and delivering a baby?

Modern medicine has taken a long time to get to where it is today, with some major discoveries that propelled our understanding forward. Without the following medical breakthroughs, med students might still be learning that disease is caused by bad smells and that vaccinations have no scientific backing.

 

1 | Hand Washing

The first medical breakthrough on our list is hand washing.

Yes, hand washing. While it may sound incredibly strange today, not even 200 years ago, the sanitary benefits of washing your hands were not yet known.

In 1846, a young Hungarian doctor named Ignaz Semmelweis set out to solve the problem of maternal mortality due to puerperal fever, also known as childbed fever. At Vienna General Hospital, he began collecting data from two maternity wards: one staffed by male doctors and med students, and one staffed by female midwives.

Semmelweis discovered that women died at a rate nearly five times higher in the clinic staffed by men. This was after ruling out his first two hypotheses: first, that the women's birthing position was to blame, and second, that a priest walking past the beds and ringing a bell at night frightened the women so badly that they developed a fever.

After a vacation taken out of frustration, he returned to find that his friend and colleague, the pathologist Jakob Kolletschka, had fallen ill and died after pricking his finger while performing an autopsy. Semmelweis realized that his friend had died of what was essentially puerperal fever, meaning women were not the only ones who could be infected.

Semmelweis further realized that the main difference between the two maternity wards was that the doctors performed autopsies while the midwives did not.

Semmelweis then hypothesized that doctors had cadaverous particles on their hands from performing autopsies when they delivered babies, and these particles were getting inside the women who would then develop puerperal fever and die.

He then ordered the medical staff to wash their hands and instruments with a chlorine solution. Mortality from puerperal fever in the doctors' ward fell dramatically, from 18.27% to 1.27%.

Unfortunately, the male doctors and med students didn't appreciate the blame falling on them, and Semmelweis was eventually forced out. He spent the rest of his professional life trying to convince the medical establishment to adopt chlorine washing of hands and instruments, but his rude and abrasive manner won him few converts. The doctors of the time did not appreciate being told, directly and aggressively, that they were responsible for killing their patients.

After his mental health deteriorated, Semmelweis was committed to an asylum by his colleagues at the age of 47. When he protested, he was savagely beaten by the guards, and it's thought he sustained a hand injury that became infected.

In an ironic and cruel twist of fate, he died two weeks later. The autopsy revealed sepsis, an extreme reaction to an infection, which is essentially the same disease Semmelweis fought against throughout his professional life.

 

2 | Germ Theory

The next medical breakthrough is germ theory.

Germs, the microscopic bacteria, viruses, fungi, and protozoa all around us, were not always known to cause disease. As late as the 1800s, some physicians still believed evil spirits were responsible, but the most mainstream explanation of the time was the miasma theory, which held that diseases were caused by “bad air” or “foul odors.” And anyone who disagreed was at best misguided and at worst controlled by evil spirits.

In 1861, Louis Pasteur published his germ theory of disease. Building on Pasteur's work, British surgeon Joseph Lister realized that compound fractures, also known as open fractures, were more likely to become infected because the open wound was exposed to microorganisms in the environment.

In 1865, he began spraying carbolic acid on wounds, dressings, and surgical tools, and mortality from infectious disease in his hospital dropped dramatically. Lister credited his method of antisepsis, but the link had not yet been proven conclusively.

It wasn’t until the late 19th century that Robert Koch firmly established that a particular germ could cause a specific disease.

Koch used a microscope to study the blood of cows that had died of anthrax, finding rod-shaped bacteria he believed were the culprit. He then injected the bacteria into mice, which also became ill with anthrax. Using this method, he also discovered the germs responsible for tuberculosis and cholera.

Combined with Koch’s findings, Lister’s approach to surgery finally took hold, and wound morbidity and patient mortality diminished.

 

3 | Anesthesia

Speaking of surgery, this brings us to our next medical breakthrough—anesthesia, which made complex surgical procedures possible.

Before modern anesthesia, patients were often conscious and endured excruciating pain during surgery. Surgeons couldn’t offer their patients much more than alcohol, opium, or something hard to bite down on, like a bullet, which can’t have been good for teeth.

While early anesthesia can be traced back to the ancient times of the Babylonians, Incas, Chinese, and Greeks, the first official use of modern anesthesia occurred on October 16th, 1846, in the surgical amphitheater of Massachusetts General Hospital in Boston.

A dentist named William T. G. Morton used sulfuric ether, a substance he dubbed Letheon after the river Lethe, whose waters in Greek mythology erased painful memories, to successfully anesthetize a patient named Edward Gilbert Abbott, who needed a vascular tumor removed from his neck.

Morton's demonstration inspired surgeons to develop a wide array of lifesaving procedures that would not have been possible without anesthesia: neurosurgery, major orthopedic surgery, complex abdominal surgery, and many others.

Of course, not all types of surgery could be performed with general anesthesia. Cataract surgery, for example, still had to be performed without general anesthesia since ether and chloroform caused patients to vomit—something that obviously can’t happen during eye surgery.

What was the solution? Cocaine. Not exactly a breakthrough that stuck. And it certainly didn’t help that many doctors were also self-medicating with it, too. Pretty sure that’s not what Jesus meant when he said, “Physician, heal thyself.”

In 1884, Austrian ophthalmologist Carl Koller began applying a cocaine solution to his patients' eyes, which relieved them of all pain. But after the number of patients dying from accidental overdoses climbed, cocaine quickly fell out of favor.

And that was far from the only strange medical treatment from history. Check out our article on The Most Bizarre Medical Treatments From the Past.

 

4 | Antibiotics

Next is arguably the most significant medical breakthrough in the history of medicine: antibiotics. Before antibiotics, common bacterial infections like strep throat or even minor cuts could be life-threatening.

Antibiotics are compounds produced by bacteria and fungi that can kill, or at least inhibit, competing microbial species. The first true antibiotic, penicillin, was discovered in 1928 by Alexander Fleming, but it wasn't until the late 1930s and 1940s that Howard Florey and Ernst Chain turned it into a practical, life-saving drug.

In addition to treating infectious diseases, antibiotics made possible many medical procedures we take for granted today, including organ transplants, open-heart surgery, chemotherapy, and other cancer treatments. They also made previously life-threatening bacterial infections like tuberculosis, sepsis, and pneumonia treatable, adding roughly 20 years to average human life expectancy.

That said, the overuse of antibiotics has led to the rise of antimicrobial resistance, or AMR, which has rendered some infections untreatable. A recent study from the Global Research on Antimicrobial Resistance, or GRAM Project, predicts that 39 million deaths will be directly attributable to bacterial antimicrobial resistance between 2025 and 2050, and annual deaths directly caused by bacterial AMR will rise from 1.14 million in 2021 to an estimated 1.91 million in 2050.

Let’s hope that new microbial natural products, or NPs, compounds unrivalled in their chemical diversity and effectiveness as antibiotics, prove to be among the next major medical breakthroughs.

 

5 | Vaccinations

And now for a controversial medical breakthrough that should not be controversial at all—vaccinations.

Long before modern vaccines, between the 10th and 15th centuries, people in China, India, and parts of Africa practiced variolation: intentionally exposing healthy individuals to smallpox material, in the form of pus or dried scabs, to induce a milder form of the disease and provide subsequent immunity. Material from infected pustules would be inserted into a healthy person's nose or rubbed into a scratch on their skin.

While this did offer some protection from severe smallpox, the success of variolation was incredibly varied—pun intended. This process was obviously not without its risks, and many who underwent it developed severe smallpox, passed the disease on to others, or died.

Nevertheless, variolation’s popularity spread to Europe, particularly England, and even the American colonies.

It wasn’t until Edward Jenner discovered the potential of vaccination in 1796 that variolation fell out of favor. Jenner found that inoculation with cowpox provided protection against smallpox, with a much lower risk of serious side effects. Vaccination soon replaced variolation, leading to the eradication of smallpox and the near-eradication of polio. Vaccinations have also significantly reduced the incidence of other diseases like measles, mumps, rubella, and diphtheria.

That said, for vaccinations to work, people need to be vaccinated. These diseases are still out there, and they’re still dangerous.

Take the measles outbreak that started in late January of this year in Texas, which, at the time of this video, has infected nearly 500 people in Texas alone and is believed to have spread to New Mexico, Oklahoma, Kansas, and Mexico. It has already claimed the lives of three Americans, two of them children, and all of them unvaccinated.

Polio, a highly contagious infection caused by the poliovirus that can cause total paralysis in a matter of hours, was declared eliminated from the Americas in 1994 by the World Health Organization, thanks to vaccines created by Jonas Salk and Albert Sabin in the 1950s and early 1960s, respectively.

Polio is an insidious disease: the virus can survive in the environment for months. About 70% of people infected with polio show no symptoms at all. Some show flu-like symptoms, about 5% develop meningitis, and only 1 in 200 to 2,000 develops permanent paralysis. By the time a single paralytic case appears, more than 1,000 people could already be infected.

In July 2022, the CDC reported that a case of paralytic poliomyelitis was confirmed in an unvaccinated adult in Rockland County, New York. Circulating polioviruses genetically related to this case were identified in 89 wastewater samples from at least five New York counties.

So, how do you stop polio and other diseases from coming back? Vaccinations.

Who knows what medical breakthroughs will be discovered—or disregarded—in the future?

Do you want us to cover cutting-edge medical breakthroughs like gene therapy, regenerative medicine, and artificial intelligence next? Leave a comment below.
