How America Got Allergic To Everything
This is one in a series of articles. For more on this subject visit The Daily Meal Special Report: Is Our Food Killing Us? Diet, Nutrition, and Health in 21st Century America.
According to a study released by the Centers for Disease Control and Prevention (CDC) in 2013, food allergies among children increased approximately 50 percent between 1997 and 2011. Another study, conducted by the Jaffe Food Allergy Institute (JFAI) at Mount Sinai Hospital in New York City, indicated that between 1997 and 2008 the rate of reported peanut allergy in children more than tripled, from one in 260 children to one in 70. It's not just the United States, either. According to the World Allergy Organization Journal, many countries, including Australia, Japan, China, Korea, and Norway, have reported a rise in food allergies in the last decade.
Food sensitivity, from mild intolerance to true food allergy, is on the rise, but why?
There are eight foods considered to be the major food allergens: peanuts, tree nuts, milk, eggs, wheat, soy, fish, and shellfish. According to Food Allergy Research & Education (FARE), they account for 90 percent of all food-allergic reactions in the United States. In addition, there are several other common allergens contained in food, and FARE notes that "a person can be allergic to virtually any food." Some people are particularly sensitive to food additives, like the sulfites used to preserve dried fruit, which can trigger asthma attacks.
A true allergic reaction is an immune response, potentially life-threatening, that occurs when the immune system mistakenly targets a harmless food protein. Symptoms range from a mild rash to anaphylaxis, which has a rapid onset and can be fatal. The first-line treatment for a severe reaction is epinephrine (also known as adrenaline), which halts the reaction.
Again, it's not the food that's hurting you. It's your immune system playing "bad cop" to a protein that's in the wrong place at the wrong time.
Presumably, food allergies have always been with us. But why are they so much more prevalent today? Research is ongoing, but the so-called hygiene hypothesis provides some explanation. According to this theory, the likely culprit is our overly sanitary lifestyle, which has left our immune system less practiced in fighting common bacteria and viruses.
Without developing immune tolerance through early exposure, our bodies overreact to harmless substances like pollen or the proteins in various allergenic foods. Still, according to Dr. Scott Sicherer, professor of pediatrics, allergy, and immunology at the JFAI, the hygiene hypothesis, while perhaps the most popular theory, is not the whole story: allergies, and certainly different levels of allergic reaction, develop out of "many factors that interplay with each other and the genetic background of individuals."
There is no known way to prevent the development of food allergies, although a 2013 study published in the Journal of the American Medical Association suggested that eating nuts during pregnancy might reduce a child's risk for nut allergies.
According to Dr. Sicherer, some food allergies, like those to eggs, milk, wheat, and soy, can be outgrown, but allergies to peanuts, nuts, fish, and shellfish resolve less often.
Yet another form of food allergy, which Dr. Sicherer identified as one of the most common, is called pollen-related food allergy, or oral allergy syndrome, which sometimes affects those with allergies to pollens like birch or ragweed. Symptoms are generally limited to the mouth, lips, and throat, and are usually not severe.
"The reason this happens is that the proteins in the pollen look similar to certain proteins in fruits or vegetables," said Dr. Sicherer. "For example, birch pollen-like proteins are in many pitted fruits, carrots, and strawberries. Ragweed-like proteins are in melons. However, if the food is heated, it breaks down the proteins and there are usually no more symptoms. A raw apple might trigger symptoms but apple juice or apple sauce does not. This allergy, again, is not from pollen on the fruit but rather a similar protein found in the fruit."
Next, it's impossible to address the issue of food intolerance without looking at gluten, the darling of the food-sensitivity world in recent years. According to a report from the National Foundation for Celiac Awareness (NFCA), an estimated one out of 133 Americans, or about one percent of the population, has celiac disease, an autoimmune disorder caused by a reaction to gluten and related proteins. According to some estimates, as much as 83 percent of the American population might have some level of gluten intolerance. The condition is prevalent enough in the United States and around the world that Alice Bast, president and CEO of the NFCA, says a more appropriate term than "gluten intolerance" would be "non-celiac gluten sensitivity," or just "gluten sensitivity."
Since 1950, the prevalence of celiac disease has increased four-fold, according to Bast, and again, there are many theories as to why. One connects sensitivity to the increase of gluten in our diets; another points to the hygiene hypothesis. Research on non-celiac gluten sensitivity is still quite new, and it suggests that several factors may contribute to this less severe condition.
In 2013, physician David Perlmutter wrote a best-selling book called Grain Brain: The Surprising Truth about Wheat, Carbs, and Sugar — Your Brain's Silent Killers, in which he accused gluten and carbohydrates of being at the root of many ailments, including Alzheimer's, depression, and ADHD. Though his research has been highly controversial, Perlmutter told a reporter at The Atlantic that his book has "never not been on the bestseller list, frankly." He added, "It may seem draconian, but the best recommendation I can make is to completely avoid grains."
But how many Americans even know what gluten is? According to a recent survey, many are woefully (and sometimes hilariously) misinformed. Here's one gem:
"It's the crunchy bits on top of pasta, potatoes, or French fries that give them flavor, but it's actually really bad for you."
For clarification, here's Bast:
"Gluten is a protein found in wheat, rye, and barley. It's important to note that in their natural form, oats do not contain gluten. However, commercial oats are typically contaminated with unsafe levels of gluten through the growing and manufacturing processes. The biggest misconception about gluten is that it is bad for everybody. Gluten is simply a type of protein and there is no evidence that gluten itself is harmful to people who do not have celiac disease or another gluten-related disorder. Misinformation about the perceived benefits of the gluten-free diet stimulates thousands of people, if not millions, each year to try the diet without first ruling out celiac disease. This is highly problematic as there is a lack of knowledge about the true signs, symptoms, and risk factors of celiac disease, and an accurate diagnosis cannot be gained while eating gluten-free."
Despite its status as one of the most pervasive food fads in decades, "going gluten-free" is not only entirely unnecessary for the majority of the population, but can also be considerably less healthy than the alternative. Many products marketed as "gluten-free alternatives" are higher in fats and sugars than their counterparts. In other cases, companies whose products have always been gluten-free have simply jumped on this lucrative bandwagon, and adapted their advertising to a gluten-wary audience.
Essentially, gluten-free diets, food products, and even makeup have altered our approach to nutrition not because of a substantially greater health need, but because going gluten-free is a consumer-driven trend.
Dr. Darshak Sanghavi, an associate professor of pediatrics and the former chief of pediatric cardiology at the University of Massachusetts Medical School, noted in the New York Times that while only about one percent of people suffer from celiac disease, a quarter of consumers want gluten-free products. Dr. Sanghavi's article makes a basic point about gluten that bears repeating:
"When teens go gluten-free, they are much more likely to become overweight, and to eat less fiber, calcium, and iron but consume more fat."
Mark Lang, a food marketing professor at Saint Joseph's University in Philadelphia, explored the issue in a February 2014 op-ed in the New York Times.
"Gluten-free products will go through a developmental/introductory stage, a rapid growth stage for three to five years, and then level off, and possibly decline, to their long-term level," wrote Lang. "I think that ultimately this category will stabilize to a level consistent with demand associated with about 10 percent of the population."
The influence of food sensitivity on American public health has grown to such prominence in the last several decades that it affects the way we eat, shop, and see the world. And while half of the allergies we experience during childhood are likely to disappear later in life, the other half are here to stay. How did America become so allergic to all these foods that seem like they should just be, well, food? It seems we've been coddling our immune systems with an excess of hygiene. In turn, our bodies have become overly protective, misguided machines.
As for the whole big, expensive, and oversaturated to-do about gluten, chances are slim that you need to avoid gluten, so don't make the switch without finding out if it's medically necessary.
Karen Lo is an associate editor at The Daily Meal. Follow her on Twitter @appleplexy.