Allergies are the immune system’s response to normally harmless environmental factors. The list of possible allergens is very long, but the most common examples are a wide range of foods (most often milk, wheat, soy, peanuts, tree nuts, and seafood), flower pollen (hay fever), sunlight, metals, insect stings, and animal dander. Medications and chemical substances can also cause allergic reactions, which tend to be more serious.
An allergy is an overreaction (hypersensitivity) of the immune system. When the body detects an allergen, it produces large amounts of IgE antibodies to target it. These, in turn, spur mast cells to release histamine and other inflammatory chemicals, which cause the allergy symptoms. So it’s not the pollen or the cow’s milk that harms your body, but your immune system’s overreaction to them.
Typical allergy symptoms, such as in hay fever or asthma, include red eyes, a runny nose, breathing difficulties, itching, skin rashes, and swelling. In severe cases, these symptoms can be life-threatening (anaphylaxis), though this is rare. In the United States, fatal anaphylaxis caused by insect stings, latex allergy, and food allergies is estimated to claim only around 200 lives annually. Another 400 Americans die every year from penicillin allergy.
Allergies most often occur in children and young adults. Having one type of allergy also makes it more likely that a person is allergic to other things. According to the US CDC, in 2007, 29% of children with one or more food allergies also had asthma, compared to only 12% of children without food allergies. A similar trend applies to the relationship between food and skin allergies.
Given that evolution has had hundreds of thousands of years to perfect the immune system of Homo sapiens, it seems very strange that it would allow the body to attack and even kill itself. You’d think evolution shouldn’t make such clumsy mistakes.
In fact, the question of why humans have allergies is closely tied to the question of why allergies have been on the rise in developed countries over the past century. It increasingly seems that allergies weren’t part of the original plan but are instead acquired through humanity’s ever higher and cleaner living standards.
This “hygiene hypothesis” holds that improved hygienic conditions fail to expose babies and toddlers to bacterial, viral, and worm infections. The immune system thus doesn’t get enough exposure and programming, so that later in life it may overreact to harmless substances like flower pollen or peanut butter.
“Hygiene” here doesn’t mean washing your hands too often or not catching the flu often enough as a child. Instead, the hygiene hypothesis refers to microbes that have lived in our guts and other organs since our earliest hunter-gatherer days. They coevolved with us in symbiotic relationships. Over time, parasites like the hookworm not only became tolerated by the human body but even came to support the human immune system. This theory is therefore also called the “old friends hypothesis”. Cleaner food and water in developed countries got rid of these old friends.
Take the hookworm, for example. Nowadays this parasite is rarely found in American digestive systems, but it still inhabits the guts of 700 million people in the developing world and can be found in almost all surviving hunter-gatherer societies. Studies on mice and humans have found that a hookworm protein blocks the inflammatory pathways that trigger some allergies. For over a decade now, researchers have tried to use the hookworm and other parasitic worms to treat asthma as well as autoimmune conditions. Similar research efforts are looking into bacterial therapy with pre- and probiotics.
The importance of a healthy gut flora extends beyond allergies. Some scientists also suspect a microbial connection with the growing prevalence of multiple sclerosis, leukemia, and a range of autoimmune disorders. Research is still at a nascent stage, but once the relationship between gut microbes and allergies is better understood, more effective allergy treatments may become available.
The hygiene hypothesis isn’t universally accepted among scientists. Another school of thought holds that allergy sensitivity (i.e., the likelihood of developing some form of allergy) is hereditary. If your parents had allergies, you’ll likely become allergic to something too. Study data do support the view that allergy sensitivity is inherited. It also needs to be noted that environmental pollution may play a role in some allergies. A clear-cut case is asthma, which can be triggered or worsened by air pollution.
But even if genetics play a role, the question remains: how did your parents develop allergies in the first place? The hygiene hypothesis has so far offered the most plausible answer, and the scientific consensus around it has been growing.
As you probably guessed from the previous section, allergies are indeed on the rise. A trend like this can’t easily be measured year to year, but long-term data on allergy-related hospital discharges and drug prescriptions provide some evidence. For example, food allergies among American children rose from 3.4% in the 1997–1999 period to 5.1% in the 2009–2011 period. Likewise, during the same timeframe, the prevalence of skin allergies jumped from 7.4% to 12.5%. The only allergies that did not see a significant increase were respiratory allergies (asthma, hay fever), which remained at around 17%.
As allergies most often continue into adulthood, the rising prevalence of childhood allergies suggests that the share of American adults with at least one allergy is also on the rise. But there are also adult-onset allergies, which appear suddenly in adulthood. About 10% of American adults suffer from food allergies, a quarter of whom did not have allergies as children, or at least didn’t notice any. That is roughly 7 million US adults who first developed food allergies in adulthood.
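As a rough sanity check on those figures, the arithmetic can be sketched in a few lines. The US adult population figure used here (about 258 million) is an assumption, not a number from the text:

```python
# Back-of-envelope check of the adult-onset food allergy estimate.
# NOTE: the US adult population (~258 million) is an assumed figure,
# not taken from the article.
us_adults = 258_000_000           # approximate US adult population

food_allergic = 0.10 * us_adults  # "about 10% ... suffer from food allergies"
adult_onset = food_allergic / 4   # "a quarter of whom" developed them as adults

print(f"food-allergic adults: {food_allergic / 1e6:.1f} million")
print(f"adult-onset cases:    {adult_onset / 1e6:.2f} million")
```

This yields roughly 6.5 million adult-onset cases, in the same ballpark as the article’s rounded figure of 7 million.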
Whether adult-onset food allergies are on the rise isn’t yet fully understood, since data collection only began in recent years. Scarce data also makes it difficult to observe the long-term trend in adult hay fever, though doctors report that its prevalence is increasing. In the UK, whose population has one of the highest hay fever rates in the world (20%), data from the National Health Service show a steady increase.
Nowadays there are many pharmaceutical options for controlling and suppressing respiratory and skin allergies. Acute allergic shock (anaphylaxis) in response to food can also be treated. But how can you avoid developing allergy sensitivities in the first place?
Well, by the time you read this, it’s probably too late. As an adult, there’s not much you can do, since your chances of getting allergies were determined during early childhood and, most likely, already in your mother’s womb.
You can, however, take some steps to protect your children from developing allergies. Having more than one child, for example, makes a difference. In support of the hygiene hypothesis, research has shown that an only child is more likely to develop allergies than a child from a home with many siblings, who actively exchange microbes with one another. Starting daycare and kindergarten early has a similar positive effect. The same rationale applies to keeping animals. If you want to lower your children’s risk of developing allergies later in life, get pets and spend as much time as possible outside in nature.
Avoiding excessive use of antibiotics may also be a good idea. There’s strong evidence that too much antibiotic treatment during early childhood makes it more likely that a person develops allergies or asthma later in life. This is because antibiotics have a devastating effect on gut flora, killing both bad and good bacteria.
If you want to learn more about allergies and how to avoid them, talk to an immunologist or your general doctor.