With the recent news of the COVID-19 vaccine earning full approval from the Food and Drug Administration (FDA) on August 23, military families and the public alike have responded with a range of emotions. The COVID-19 vaccine will soon be mandatory for active duty and reserve members of the U.S. military. The announcement drew a strong public reaction, with critics citing an infringement of rights, among other objections. However, mandatory vaccines are nothing new for the military. In fact, during basic training and before deployment or redeployment, soldiers receive any number of shots.
Currently, service members are required to receive vaccinations against chickenpox, hepatitis A, hepatitis B, seasonal influenza, measles, mumps, rubella, polio, diphtheria, tetanus, and pertussis. Other vaccines are mandatory based on deployment location.
Let’s take a deeper look at how the military has dealt with disease in the past, and how it has kept its fighting forces healthy.
The Revolutionary War
The earliest reports of using vaccines – or their predecessor – in the U.S. military come from the Revolutionary War, when soldiers were dying in droves from smallpox. George Washington feared it was the sickness, rather than the British themselves, that would defeat his army. Compounding the problem, British troops had greater immunity to the disease and may even have used it as a weapon.
To combat smallpox, Washington ordered “mandatory inoculation,” which was done by placing a small amount of material from a smallpox sore under the skin. At the time this was known as variolation; the smallpox vaccine itself was not invented until 1796. Which units had been inoculated was kept top secret, lest they be attacked while recovering. The practice spread from camp to camp, and the results soon showed: vastly improved health and increased immunity against smallpox.
This made the Continental Army the first military in the world with an organized program to prevent smallpox.
The Spanish-American War
By 1898, yellow fever had become a threat to the U.S. military during the Spanish-American War. To help combat the disease, the Army created the Yellow Fever Commission, which researched the sickness and validated the theory that it was spread by mosquitoes. This research eventually led to a vaccine. Other countries had to withdraw their fighting forces because of the spread of disease, while the U.S.’s new knowledge helped keep its soldiers healthy.
Post-WWI and Onward
During World War I, influenza was the enemy, but it wasn’t until 1945 that the first approved influenza vaccine became available to service members. Civilians were given the green light to receive the flu shot in 1946.
After World War II, an adenovirus vaccine was developed in 1956, after military trainees showed acute respiratory illness caused by the virus. Adenovirus infections can cause severe respiratory distress and can be deadly.
In 1998, President Bill Clinton and his administration required all service members to receive the anthrax vaccine. The anthrax vaccine came with some controversy, however. The Department of Defense later halted anthrax vaccinations due to unapproved changes to the manufacturing process and several lawsuits filed by current and former service members. The vaccine was made mandatory again in 2007, but only for certain deployments to areas where the risk of bioterrorism is high. The same directive applies to the smallpox vaccine.
Today, researchers are working on advances in HIV and malaria treatments. The U.S. Military HIV Research Program (MHRP) at the Walter Reed Army Institute of Research is pursuing not only an effective treatment but prevention as well. Separately, researchers are contributing to the testing of a leading malaria vaccine candidate.
The military’s research has helped create multiple vaccines, and in most cases, recruits are required to receive them to join the military. However, some exemptions are granted based on personal and medical history.