How Vaccine Technology was Revolutionised by COVID-19

Posted by Phil Heler on April 30, 2024

Muhammad Ali once said, ‘It isn’t the mountains ahead to climb that wear you out; it’s the pebble in your shoe.’ We all have anxieties whether we like it or not. As someone else said, ‘anxiety is like a rocking chair. It gives you something to do, but it doesn’t get you very far.’ Yet we all worry anyway.

According to Ipsos, one of the best-known market research organisations, the things we worry about most in 2024 are inflation (34% of us), healthcare (44%) and the risk of military conflict between nations (35%). To make us even more worried, our risks can even be quantified.

The National Safety Council (NSC) in the U.S. publishes our lifetime odds of being affected by the most common threats. The likelihood of being affected by cardiovascular disease is 1:6, and by cancer 1:7. Then there are random events such as a motor vehicle accident (1:93), a bicycle accident (1:3,546) and a cataclysmic storm (1:20,098). But throughout history our greatest concerns have always been three things: health(care), food security and warfare. Each often occurs as a direct consequence of the others (as in Gaza today).

Had Ipsos or the NSC gone back in time and performed a survey in the Palaeolithic period (2.5 million years ago to 10,000 B.C.), they would have found that we faced familiar dilemmas, except the outcomes were more serious: a 75% risk of dying of infection and a 25% chance of starvation.

Risk of infection was an age-old problem until the discovery of antibiotics in 1928. Meanwhile our genes taught us to gorge on energy-dense and protein-rich food whenever it was available, something we are genetically hardwired to do even now. Then came the age of empires. Warfare in human culture is as old as time itself.

Ancient Rome, a rapidly expanding empire that stretched over 2 million square miles at its peak, waged war constantly. There are roughly 3,028 recorded wars in the history books, and the Romans were very good at them. As a Republic (509 B.C. to 31 B.C.) they fought 160 wars, of which they won 120. As an Empire (31 B.C. to 476 A.D.) their armies chalked up 132 victories and sustained only 35 losses (#kickass).

Unsurprisingly, most Roman males served a period of conscription. The risk of dying in battle was between 5% and 30%, depending on whether you were on the winning side. If you were not a soldier but a person of note, assassination was commonplace (Et tu, Brute?). Yet even by the standards of their bloodiest and most brutal genocidal campaigns, such as the Gallic Wars (58-50 B.C.), your biggest threat was always disease and pandemic.

What created the conditions for the decline of the Roman Empire was not war itself; it was the Antonine Plague, which killed between a quarter and a third of the entire Roman population. The plague devastated the empire’s professional armies, decimated the aristocracy and cut deep swathes through the peasantry. Abandoned farms and depopulated towns dotted the countryside from Egypt to Germany.

Wars raged across different countries and continents throughout the Middle Ages, as did devastating pathogens such as the bubonic plague, which came and went. Then came the Industrial Revolution.

Within just six generations we experienced an unforeseen improvement in the standard of living. Although conflict was never far away, the challenges of healthcare were met head on. A variety of significant public health advances contributed to a doubling of life expectancy between 1850 and the present day.

Traditional vaccination technology allowed us to rid ourselves of age-old adversaries that had haunted us since antiquity. Yellow fever, smallpox and diphtheria became things of the past. Traditional vaccines were based on dead or weakened viruses (or part of their protein coat). Scientists became famous for their discoveries.

In 1951, Max Theiler was awarded the Nobel Prize in Physiology or Medicine for developing the yellow fever vaccine. Yellow fever epidemics struck the United States many times in the 18th and 19th centuries, the disease arriving by ship from the Caribbean. These epidemics caused terror, economic disruption, and some 100,000-150,000 deaths.

Even in the 20th century smallpox killed more people than WW1 or WW2 (or indeed any other pathogen). It is estimated that 300-500 million people died before the disease was eventually eliminated by 1977. In the 1920s there were between 100,000 and 200,000 cases of diphtheria each year in the U.S. alone. The diphtheria vaccine was the first vaccine of the new age of medicine: after its introduction in the UK in 1940, cases fell from 46,000 in 1940 to 962 in 1950.

Overall, infectious disease mortality rates plummeted spectacularly, and by the 1970s it appeared that infectious diseases had been marginalised as a threat to life. Medical research moved on to tackling degenerative diseases, particularly cancer and cardiovascular disease.

However, ominous new diseases began to emerge: HIV in the 1980s, then SARS and MERS. Eventually, if you can bear to think about it, came COVID-19 in 2020. These new diseases were caused by viruses able to mutate so quickly that they outpaced traditional vaccine technologies.

COVID-19 brought about the largest vaccination programme in human history, with over 12.7 billion doses administered. With these rapidly developed vaccines came controversy and suspicion, fuelled by social media and widespread speculation about rare side effects.

Although we did not fully appreciate it at the time, the use of mRNA vaccines represented the greatest advance in molecular biology and vaccine technology since the field began. In fact, the two people who pioneered the technology, Drew Weissman and Katalin Karikó, were awarded the Nobel Prize in Physiology or Medicine in 2023.

Traditional vaccine technology is painstaking and slow. A typical development timeline is 5-10 years, and sometimes longer, to assess whether a vaccine is safe and efficacious in clinical trials. It must then complete the regulatory approval process and be manufactured and distributed. Herein lies the problem.

World War I lasted four years. It ended on November 11th, 1918, and led to a staggering 40 million military and civilian casualties. Spanish flu, which started in February 1918, lasted half the time and killed more people than any other illness in recorded history. That event made COVID-19 look like a small pothole in the road. Mind you, there were 215,787 potholes recorded in Derbyshire between 2020 and 2022.

Industrialisation brought with it global transportation, mass media, mass consumption, urbanisation and mass global warfare: all ideal conditions in which communicable disease can spread. The more people a virus infects, the greater the possibilities for mutation. Every time a virus infects someone and replicates, there is more room for error and a greater chance of a significant mutation.
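
To put a rough number on that intuition, here is a toy calculation (a hedged sketch only, not an epidemiological model; the per-infection mutation probability is an invented placeholder chosen purely for illustration):

```python
# Toy sketch: how the chance of at least one significant mutation grows
# with the number of infections. The probability per infection below is
# an invented placeholder, not a measured value.

def chance_of_at_least_one(p_per_infection: float, infections: int) -> float:
    """Probability that at least one such mutation occurs somewhere."""
    return 1 - (1 - p_per_infection) ** infections

for n in (1_000, 1_000_000, 100_000_000):
    print(f"{n:>11,} infections -> {chance_of_at_least_one(1e-7, n):.3%}")
```

The point is simply that scale does the work: the same tiny per-infection risk becomes a near certainty once enough people are infected.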

About 30 years ago scientists came up with a really good idea: could vaccines be made more simply? The Holy Grail was this: what if, instead of injecting a piece of a virus into the human body, you harnessed the body’s own cellular machinery to make that piece of the virus instead? This would then train the immune system to recognise the virus.

RNA is basically a lesser-known cousin of DNA. Like DNA, RNA is a molecule found in all cells. In our cells, genetic information encoded in DNA is transferred to messenger RNA (mRNA), which is used as a template for protein production. mRNA travels to the cell’s manufacturing machinery (or ribosomes) and takes with it the instructions for proteins that we need. The idea of an mRNA vaccine is to send our cellular machinery the instructions to make viral protein instead.
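
As a rough analogy in code (a toy sketch with an invented four-codon ‘gene’ and a cut-down codon table, not real viral sequence), the two steps look like this:

```python
# Toy sketch of the DNA -> mRNA -> protein flow described above.
# The sequence and the abbreviated codon table are illustrative only.

CODON_TABLE = {  # tiny subset of the standard genetic code
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
}

def transcribe(dna: str) -> str:
    """DNA coding strand -> mRNA (T is read as U)."""
    return dna.replace("T", "U")

def translate(mrna: str) -> list[str]:
    """Read the mRNA three letters (one codon) at a time until a stop codon."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino_acid = CODON_TABLE.get(mrna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

mrna = transcribe("ATGTTTGGCTAA")   # invented 4-codon "gene"
print(mrna)                         # AUGUUUGGCUAA
print(translate(mrna))              # ['Met', 'Phe', 'Gly']
```

An mRNA vaccine simply hands the ribosomes a ready-made message like the one above, so the cell builds the viral protein without the virus ever being present.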

First, we had to make the mRNA. This was the easy bit. The second part, delivering the injected mRNA to the body’s cells so that it could be absorbed, took 30 years.  Scientists had to learn how to encapsulate mRNA in microscopic capsules to protect it from being destroyed in the blood.  Next, they had to learn how to modify mRNA, so it did not generate a massive immune response. Lastly, they had to learn how to encourage immune system cells to absorb the mRNA as it passed by in our bloodstream and manufacture huge quantities of viral protein.

The Human Genome Project, completed in 2003, was also a real watershed moment: it meant we could read the genetic sequence of any living organism. We could in effect take any virus, read its sequence and select the genes we needed to copy. All the pieces were therefore theoretically available, which in turn meant a vaccine could in principle be made for any infectious disease by inserting the right piece of mRNA sequence for that disease.
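
In principle, once a genome has been sequenced and annotated, ‘selecting the genes we need’ is just a matter of slicing out the right coordinates. A minimal sketch, with an invented genome string and made-up coordinates standing in for a real published record:

```python
# Minimal sketch of "read the sequence, select the gene you need".
# The genome string and coordinates are invented placeholders; a real
# workflow would fetch a published record and use its annotation to
# locate the gene encoding the target protein.

genome = "CCGTATGTTTGGCTAAACGT"                 # pretend viral genome
annotation = {"target_protein_gene": (4, 16)}   # invented start/end positions

start, end = annotation["target_protein_gene"]
gene_dna = genome[start:end]             # "ATGTTTGGCTAA"
gene_mrna = gene_dna.replace("T", "U")   # the message an mRNA vaccine would carry

print(gene_dna)    # ATGTTTGGCTAA
print(gene_mrna)   # AUGUUUGGCUAA
```

A real workflow would of course pull the published sequence and its annotation from a public database rather than hard-code them.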

It took COVID-19 to provide the momentum to bring everything together. Necessity pushed mRNA vaccine technology to the threshold of working. Within weeks of COVID-19 being identified, all of its genes, including those encoding the spike protein, had been determined and the results published on the internet. Within hours scientists all around the world started work, and within 11 months the first mRNA vaccines were approved. Previously, no vaccine had ever been developed in less than four years.

The impressive flexibility and speed with which mRNA vaccines can be developed means that we can tackle most infectious agents or new variants. Quite literally, we are now in the age of ‘plug and play’ vaccines: we just need the genetic code of the pathogen.
