Back in the 18th century, it was a wonder how anyone ever survived a trip to the doctor. Many didn’t. England’s drug stores were stocked with bulls’ penises, frogs’ lungs, and powdered Egyptian mummy, which was evidently used against tuberculosis. Syphilis, known as the “Great Pox,” was treated with mercury. Never mind that it made you slobber and eventually go mad. The Scottish physician John Brown, the author of “Elementa Medicinae,” simply gave his patients roast beef, opium, and booze. Many people thought he was pretty much a genius.
Vaccination, too, arose during this frenzied period of trial and error, which, if you squint a little, looks a lot like the early days of the Covid-19 outbreak, when desperate doctors were willing to prescribe just about anything to save their patients. What makes vaccines different? For one, they actually work. The story of the Covid-19 vaccines, more than anything, is about how medicine has evolved from a spooky art to a rigorous science.
In the 18th century, one of the major diseases doctors were contending with was smallpox. A periodic scourge of humanity, it caused disfiguring pustules in those unlucky enough to suffer from it. The most common form killed around one in three people it infected, and those who survived sometimes ended up blind. It was caused by a virus, but people back then didn’t even know what viruses were. Around the world, the longstanding method of prevention, called variolation, involved grinding up the scabs of a person who had a mild case, then inhaling the powder like snuff or rubbing it into a cut on the skin. Variolation caused high fevers, rashes, and sometimes death. Sure, it could protect a person from smallpox, but it was not something anyone volunteered for if they didn’t have to.
Over in England in 1774, the village of Yetminster was facing a growing smallpox epidemic, and variolation was its only defense. A farmer by the name of Benjamin Jesty was living in a stone house in the village center with his wife, two boys, and a baby girl. Like many people of that era, he was aware that dairymaids often emerged unscathed from smallpox epidemics. Jesty employed two dairymaids himself, and neither had contracted the disease even while caring for stricken relatives. Both women, however, had previously developed pustules on their hands from the cows they milked.
Jesty himself had suffered from this mild infection, known as cowpox, which was caused by a virus closely related to smallpox, but his family hadn’t. He grabbed a stocking needle, a tool used for knitting, and headed out on a mission to find a herd of cattle with cowpox. He found one about two miles away, in the pasture of a Mr. Elford. He bent down underneath one of the docile creatures and poked at a lesion on its udder. Then he turned to his wife and inserted the needle just below her elbow. Her arm swelled up, and she ran a fever for a week before recovering. He repeated the procedure on the two young boys, ages two and three, who fared better. Jesty’s family never suffered from the disease, despite multiple epidemics passing through their village. And when a local surgeon variolated the boys with actual smallpox 15 years later, they showed none of the typical symptoms.
Jesty’s tale remained just another bit of folklore until another man, Edward Jenner, rallied the medical community around the concept more than two decades later. Like Jesty, he had heard stories of milkmaids avoiding smallpox, but he was a physician learning the ropes of the scientific method. “Don’t think, but try,” his mentor told him. “Be patient, be accurate.” In 1796, he started testing the method on several subjects, including the eight-year-old son of a local worker. Rather than obtaining the pus from a cow’s udder, he first took it from the hand of a young woman with cowpox lesions. All of the subjects proved to be protected.
Jenner didn’t know exactly how or why it worked, but he saw its potential as “becoming essentially beneficial to mankind.” For a while, children were simply vaccinated arm-to-arm: Jenner would stick the cowpox pus under the skin of one volunteer, and, a week later, when a new blister erupted, he would harvest pus from it for the next person. It was not the most sanitary method, and it would later lead to outbreaks of syphilis and hepatitis. When Jenner wrote about his successes, he called the method variolae vaccinae, Latin for cowpox. But eventually the term evolved into our word for all vaccines.
For the next two centuries, vaccinology had a reputation for being somewhat unscientific. While the buttoned-up chemists of the mid-20th-century pharmaceutical world were churning out easy-to-synthesize drugs like ibuprofen, vaccinologists were an odd breed, brewing up strange concoctions based on wild hunches. The basic idea of vaccination was to subject the body to a simulation of the wild pathogen, something our immune system would learn to recognize but that was not dangerous in itself. The measles and mumps vaccines were cultured in chicken eggs; some flu vaccines were made by growing viruses at cold temperatures; and the first hepatitis B vaccine was made by purifying and sterilizing the blood of people who injected drugs and gay men who carried the virus.
But that hepatitis vaccine, Heptavax-B, introduced in the 1980s, was actually a huge step forward, because it delivered just a piece of the virus, a single viral protein, to the body. This more targeted approach eliminated many of the risks and downsides of giving people a whole virus, even one that had been weakened, split apart, or otherwise mangled. It also had the benefit of stimulating the body to produce more of the antibodies that neutralize the real virus, while triggering fewer of the immune-system misfires that can lead to dangerous reactions.
A few years after the introduction of Heptavax-B, the key protein would no longer be isolated from human blood but would be manufactured in genetically engineered yeast, a scientific first for vaccines. Indeed, it was the advancement of gene sequencing and splicing techniques throughout the 1970s and 1980s that would finally bring vaccinology into the modern era. This revolution is part of what allowed scientists to safely develop the Covid-19 vaccines in record time. The mRNA and adenovirus-vector vaccines currently authorized in the United States contain the genetic instructions for making just the telltale spike protein of the coronavirus.
That brings us to one last strange innovation, the one that allows those genetic instructions, at least when they come in the form of mRNA, to sneak across our cell membranes. Katalin Karikó, the Hungarian biochemist credited with one of the key mRNA innovations, began her scientific career behind the Iron Curtain, when reagents were scarce. She once had to follow a step-by-step recipe from the 1950s to extract a key ingredient from cow brains in order to make tiny bubbles of fat known as liposomes, which could shuttle drugs across cell membranes. Scientists dreamed of doing the same with RNA, but there was a catch: RNA carries a negative charge, so a lipid would need a positive charge to bind it, yet a lipid with a permanent positive charge would destroy the cell membrane. In the late 1990s and early 2000s, researchers began to find tricks, including lipids that turn positive only under acidic conditions, to coax the RNA and lipids to combine into tiny, solid spheres known as lipid nanoparticles, on which the vaccines from Moderna and Pfizer depend.
We’ve certainly come a long way since Benjamin Jesty’s days. Unlike ivermectin, hydroxychloroquine, or whatever unproven therapies your uncle is touting on Facebook, the Covid-19 vaccines have now, in all likelihood, prevented hospitalization or death in millions of people. They represent the culmination of decades of research, and scientists understand how and why they work better than they understand many widely used drugs. There’s no longer any reason to keep living in the dark ages. You can just get vaccinated.
This article was originally published on Undark. Read the original article.