Mankind has long been in search of the perfect diet, believing it to be the key to immortality. Food is health-giving but also dangerous, and for those inclined to worry about their health, what to eat is a constant source of anxiety. Should I eat this, should I avoid that? Is this food the elixir of life or a deadly poison (it is usually something in between the two)?
A new source of concern about food is allergy, the prevalence of which has increased greatly in recent years. It is especially worrying in children. In Britain, for example, 5–8 percent of children now have a proven food allergy, and the number has doubled in the last decade. Not surprisingly, mothers have responded by not giving their infants and young children some of the foods to which allergies often develop, such as peanuts.
While this might seem reasonable, experimental studies have shown that the reverse policy – that of giving children peanuts early in infancy – actually confers protection against the development of allergy. What is not known, however, is how long such protection lasts, and whether it wears off once the children are no longer given peanuts as part of their diet. An experiment reported in a recent edition of the New England Journal of Medicine attempts to answer this question.
Infants and children at high risk of allergy were divided at random into those given peanut products from infancy until the age of 5 and those whose diet carefully avoided such products. After the age of 5, all the children were placed on a peanut-free diet for a year and then re-tested for allergy to peanuts.
Parental compliance with the protocol of the experiment was far from perfect. Ninety percent of children who had previously avoided peanuts continued to do so, but only 69 percent of those who had been given peanut products as part of their diet avoided them for the full twelve months the experimental protocol required. The results of the experiment were nonetheless clear: of those who completed it, 18.6 percent of the peanut-avoidance group demonstrated a peanut allergy on skin test, whereas only 4.8 percent of the peanut-consumption group did so. In other words, the protection against allergy conferred by the early consumption of peanuts lasted for at least 12 months after consumption had ceased. The difference might have been even greater had consumption continued into the sixth year of life. Avoidance is thus the wrong policy.
In another paper in the same edition of the Journal, a similar experiment was conducted with a range of allergenic foods: egg, peanut, cow’s milk, sesame, white fish and wheat. The results varied according to whether they were analyzed statistically for all those enrolled in the study or only for those who completed it as the protocol required. Only 32 percent of the parents who were told to introduce the various foods into their infants’ diet managed to do so. The factors statistically associated with non-compliance were non-white race, parental perception of symptoms caused by the introduced foods, maternal psychological difficulties and the presence of eczema at the start of the experiment.
When the results were analyzed according to all the subjects enrolled in the experiment (an intention-to-treat analysis), there was no statistically significant protective effect of the early introduction of allergenic foods; but a protective effect was found in those who completed the protocol (a per-protocol analysis). The experiment also showed a general protective effect of a high dose of egg-white in the diet.
The two papers offer no explanation as to why food allergies have increased in prevalence in recent years; but the difference between the results for those who started the treatment and those who completed it is an important lesson for doctors in everyday practice.