Can Vaccines Cause Peanut and Other Food Allergies?

By: Cynthia Sanchez


No children’s health topic is more hotly debated in the United States right now than mandatory vaccination, along with the side effects, some quite serious or even deadly, that many parents believe are a direct result of the vaccines given to infants and young children.

Greater numbers of children than ever before are developing food allergies in the U.S., and parents are wondering whether heavy vaccine schedules are to blame. This is a difficult subject to write about. Only a few decades ago, children in the U.S. and around the world were getting seriously sick, becoming permanently disabled, or dying from viral and bacterial infections for which no vaccines existed, or, in poorer countries, for which vaccines existed but parents lacked the resources to have their children vaccinated. Either way, lack of vaccination was claiming the health and the lives of many children globally.

Some Relevant Notes from the History of Vaccines

Because many more vaccines were developed in the second half of the 20th century, adding to those already in existence, childhood mortality from infectious disease has been markedly reduced in wealthier countries. As a result, many people have forgotten, or never knew, just how bleak things were before methods of immunization were discovered and vaccines began to be developed.

There was a time in world history when highly contagious diseases like smallpox claimed tens, or even hundreds, of millions of lives. Children were especially at risk of dying when they contracted these infections, and there was nothing anyone could do. A single couple might lose several children to infectious disease; entire families succumbed. The misery and heartache that infectious diseases caused in decades and centuries past were truly immeasurable.

Smallpox, the biggest infectious killer of all time, devastated Europe’s population in the 1700’s, just as it had devastated the native populations of the Americas in the centuries after European contact. Powerful kings and ordinary people alike died of the disease, and most of the infected who survived were left disfigured by pockmarks on the skin and face. Some also developed lesions in their eyes and went blind. Smallpox was still infecting and killing people on the African continent as recently as the 1970’s.

But now we don’t hear about smallpox, also called variola, anymore. Thanks to a vast vaccination campaign carried out by the World Health Organization, the horrible scourge was declared eradicated in 1980. Present-day children are not vaccinated against smallpox, because the last naturally occurring case was recorded in 1977, in Somalia.

People today have the luxury of never even having heard of smallpox. It was a very hard-won luxury.

Not enough can be said about the heroic sacrifices that some medical doctors, the vaccine pioneers of centuries past, made in their personal and professional lives as they tested and developed ways to confer immunity on their patients. Trial and error, the only way vaccines could then be developed, frequently ended in fatal outcomes for patients.

At the birth of this branch of medicine, immunology, the first immunizations against smallpox used matter taken from the smallpox lesions of infected people and, later in history, from cowpox lesions on animals. This material, which contained the live virus, was transferred to a small cut in an uninfected person’s skin, prompting that person’s own immune system to produce protective antibodies against the illness.

People immunized with smallpox matter might never get the illness, might come down with a milder case, or might still develop a full-blown infection and die. Later, when doctors began using cowpox matter, the inoculated often developed cowpox, a less virulent illness that did not kill them and that left them immune to the deadlier smallpox. But people in either group could catch another deadly infectious disease, and die from it, if the immunizing matter they received was contaminated with another pathogen, such as tuberculosis.

So a small percentage of people who received smallpox immunizations two-plus centuries ago died from them, either from the pathogen they had been given or from other germs present in the material. And still many people, including some who were wealthy and powerful, were willing to take those chances, knowing that without vaccination they stood a many-times-greater chance of dying from smallpox itself.

The development of vaccines, and the vast improvements made to them since, were extremely difficult achievements. The vaccines that now save countless lives globally are considered by health authorities to be among the greatest contributions ever made to public health.

Back to the Present

Unfortunately, new viral and bacterial infections, and new strains of existing ones, continue to erupt periodically into epidemics, and so new vaccines keep being developed and administered to children. Some shots are now given to babies younger than ever before, beginning on the day they are born.

The number of vaccines now recommended or mandated for children across the 50 states runs as high as 49 doses of 14 vaccines between birth and age six, and 69 doses of 16 vaccines by age 18. Eight doses are recommended for infants just two months old, including the polio and tetanus shots, which are considered strong vaccines.

As the number of shots given to young children doubled in the last couple of decades of the 20th century, something else began happening with far greater frequency as well: food allergies, and allergies in general. These can be a great nuisance and substantially lessen a child’s quality of life; in some cases, they can be deadly.

The Rise in Food Allergies, as Vaccines Have Multiplied

Some holistic health practitioners, and many parents, believe that the rapid growth in the incidence of food allergies is the result of the many vaccines now given to children.

A large 2011 study designed by allergy experts at respected American universities surveyed randomly selected parents across the country. It found that about 8 percent of U.S. children 18 and younger, or 1 in 12, are allergic to at least one food, a higher share than previous, similar studies had estimated.

The top nine food allergens, in order of prevalence, were found to be:

  • Peanut, the most common, affecting 2 percent of children
  • Cow’s milk
  • Shellfish
  • Tree nuts
  • Egg (in children 3 and under, egg is the second most common food allergy)
  • Fin fish (this category includes dozens of varieties, such as sole, mackerel, flounder, herring, cod, shark)
  • Strawberry
  • Wheat
  • Soy

The researchers also found that severe reactions to these foods are common. Some food allergies are mild and fade over time; many others, however, produce reactions that are dangerous and can be deadly.

Nearly 40 percent of the food-allergic children in the study had experienced severe symptoms, including wheezing, asthma attacks, anaphylaxis (a rapid, body-wide reaction in which the airway muscles constrict and the person cannot breathe), shock (a sudden drop in blood pressure, which reduces the amount of oxygen and nutrients that can reach the organs), and respiratory failure (in which the respiratory system cannot keep oxygen, carbon dioxide, or both within normal ranges). These reactions come on suddenly and are life-threatening; immediate medical attention is required.

The above study, published in Pediatrics, found that food allergies were most common in preschoolers, peaking between ages 3 and 5. Perhaps not coincidentally, this age group receives the most shots in the shortest span of time, counting influenza shots, between ages 3 and 6, in preparation for preschool and kindergarten. Babies, though, can begin developing allergies with the first round of vaccines they receive at 2 months.

Teenagers, and boys in particular, were the most likely to experience severe, life-threatening reactions. One of the researchers noted that this group may be reluctant to ask about the ingredients in a dish when out with friends, in an effort to fit in. As a result, more food-allergy fatalities occur among teens and older children.

The Possible Connection Between Food Allergies and Vaccines

Dr. Tim O’Shea, DC, a San Jose, CA, chiropractor who promotes holistic health on his website, The Doctor Within, is not a fan of vaccines. He notes that peanut allergy was virtually unheard of before 1900, that 1.5 million U.S. children now have the allergy, and that peanuts have emerged as the number-one cause of death from food reactions, being capable of producing anaphylaxis.

Peanut oil, O’Shea writes, is a commonly used “excipient,” a carrier fluid in vaccines that prolongs the effectiveness of the immunizing agent by releasing it gradually. Peanut oil has been preferred, he says, because it works well in conferring immunity and because it has been widely available for many decades.

Before 1900, doctors used a lancet, a short knife-like instrument, to make a small incision in a person’s skin and apply an immunizing agent to the cut. The agent mixed gradually into the person’s blood, and no carrier fluid was required. By 1900, however, doctors had switched to the hypodermic (“under the skin”) needle to inject vaccine compounds into patients, a more direct route to the blood, and one requiring carrier fluids as well as substances to boost effectiveness (the latter are called adjuvants).

Serum Sickness

Dr. O’Shea adds that right at the turn of the last century, when the hypodermic needle replaced the lancet for immunizations, a new disease called “serum sickness” began affecting thousands of children following their injections. Its symptoms included fever, rash, gastrointestinal problems, joint pain, fainting, and shock, and it could be fatal. The connection between serum sickness and injections was acknowledged in the medical literature of the day.

O’Shea calls serum sickness the first mass allergenic event in history, and says that the entire field of modern allergy evolved from the study of this vaccine-caused illness.

In the early 1900’s, the term serum sickness described reactions in some patients who had been injected with antibody proteins (antitoxins) produced in animals such as horses; these antibodies neutralize specific bacterial toxins and were used to treat diphtheria and scarlet fever.

Serum sickness is still sometimes seen in our modern day when foreign proteins are used to protect against a few diseases, including diphtheria, pneumococcal disease and rabies. Symptoms usually occur one to two weeks after the offending agent is administered.

A similar syndrome, “serum sickness-like reaction” (SSLR), presents with fever, rash and joint pain, and occurs several days to weeks after taking certain non-protein drugs, including some antibiotics (penicillins and related drugs, for example). SSLR is seen more often in children treated with cefaclor than with other antibiotics, and children face a greater risk of SSLR from antibiotics, especially cefaclor, than adults do.

What some antibiotics and vaccines have in common is the introduction of whole, foreign proteins into the blood, and intact foreign proteins in the blood are a known trigger of allergic reactions.

Although most cases of serum sickness from immunizations resolve within a few days, subsequent exposure to the offending agent within about ten years may cause a more severe reaction, with anaphylaxis, shock and cardiovascular collapse (loss of pulse and consciousness) among the possibilities.

We see, then, that from the beginning of hypodermic vaccination to the present, serious and life-threatening reactions have been possible, and they are often identical to those produced by severe food allergies.

Peanut Oil as an Excipient

Immunologists started using refined peanut oil as a carrier agent in penicillin shots because it released the antibiotic gradually into the system. According to Dr. O’Shea, allergy to penicillin became common after this, and it was a recognized sensitivity to the carrier oil. Even in refined oils, the Food and Drug Administration concluded, some traces of intact peanut protein would always remain. For this reason, doctors were directed to give such injections into muscle, rather than into a vein, to lessen the chance of a reaction.

By 1953, as many as 12 percent of Americans were allergic to penicillin, which, in O’Shea’s account, really meant allergic to the peanut oil. But because penicillin was so effective against life-threatening bacterial infections, receiving it was worth the risk of developing an allergy.

By the mid-1960’s, O’Shea writes, peanut oil started being used as an excipient in vaccines, and by 1980 it was the preferred carrier oil. Many children started becoming allergic to peanuts.

But it wasn’t until the early 1990’s that peanut allergies in American children began reaching epidemic proportions. What changed between 1980 and the 1990’s? The number of mandated vaccines doubled, from 20 in 1980 to 40 in 1995.

The incidence of childhood food allergies doubled between 1980 and 2000, and it has doubled again since then, just as the number of vaccine doses went from 20 to 40 to the current 69.

Dr. O’Shea adds that just as we wouldn’t feed peanuts to a newborn, whose digestive system cannot yet process most foods, it makes no sense to inject peanut proteins into an infant’s body.

And it’s not just allergy to peanuts that greatly concerns many parents nowadays. As noted above, allergies to tree nuts, soy, fish, eggs and other foods have also risen. These may be explained by the “mixed oils,” soybean oils and fish oils often used in vaccine formulations; egg allergies might be explained by the fact that some vaccines, such as the measles vaccine, are cultured in chick embryo cells.

(Cow’s milk protein sensitivities are often the result of babies being fed milk-based formula, instead of exclusively immunity-strengthening breast milk, for their first six months of life. Children can also develop food sensitivities if they are fed solid foods before they are six months old.)

Conclusion

All those reactive excipients, on top of the weakened or killed microbes, toxins, and microbial surface proteins that actually confer the immunity, administered many times over, are already a great deal for newborns, infants and young children to contend with. And we haven’t even begun to address the other vaccine ingredients: the mercury, aluminum, formaldehyde and ethylene glycol.

Mercury is highly toxic to the nervous system. Mercury ingested from environmentally contaminated foods such as fish may accumulate in the body over time and possibly do harm, but it cannot produce as strong or as quick a reaction as mercury injected directly into a child’s body, which largely bypasses the digestive system’s ability to eliminate it and arrives at a very young age, when the nervous system is still developing rapidly. Many parents of autistic children are convinced that the mercury in vaccines is wholly, or at least partly, responsible for their children’s irreversible neurologic condition.

Other parents blame vaccines for other nervous-system disabilities that their youngsters developed soon after receiving shots. And there are parents who have lost babies to sudden infant death syndrome (SIDS, a sudden death with no apparent cause) and who feel that vaccines were to blame.

The bottom line: do we want to go back to the days when children and adults were dying by the millions from infectious diseases? No, we do not. But is it reasonable for parents to expect, and to demand, that vaccines be made safer, and that no unnecessary ingredients or shots be given to children? Absolutely.

The National Vaccine Information Center reports on its website that, as of 2013, more than $2.5 billion had been paid out to families of children in the U.S. who were injured, or who died, after receiving vaccinations. But the suffering such children and their families have endured cannot be quantified in dollars.