Over the past several decades, a number of nutrients have gone through popularity phases. Vitamins C and E had their heydays. The current “rock star nutrient” is vitamin D. And if there is a “bad boy” nutrient, it would be iron. Although it hasn’t hit the tabloid press yet, this bad boy has an intimate relationship with vitamin D.
QUESTION: Why is vitamin D so popular?
ANSWER: Study after study indicates that most of us are vitamin D deficient, even in Hawaii. Ample sun exposure is supposed to trigger vitamin D production in the skin, yet many people who get plenty of sun have been found to have low vitamin D status. Because vitamin D deficiency has been linked to a multitude of ailments, from diabetes to depression, physicians are checking blood levels of the vitamin and often recommending supplements.
Q: Why is iron a “bad boy”?
A: The reputation stems primarily from Dr. Jerome Sullivan’s 1981 theory proposing that iron accumulation in the body increases the risk of coronary heart disease. More than 30 years later, Sullivan’s theory remains popular but unproved, with studies both supporting and refuting it.
Nonetheless, the fear factor is in place, and iron is commonly considered a nutrient to be avoided. That concern is well founded for the roughly 1 in every 200 to 300 people who have a genetic iron overload condition such as hemochromatosis. In these people, the intestines absorb too much iron, allowing toxic levels to accumulate.
In contrast, iron deficiency is the most common nutrient deficiency around the world, including in the United States. Its effects are severe and fall primarily on women and children.
Q: What is this secret affair between vitamin D and iron?
A: When vitamin D status is assessed, a specific form of the vitamin is measured in the blood: 25-hydroxyvitamin D. This form is produced primarily in the liver, where a slight chemical alteration to vitamin D takes place. The step is essential for vitamin D to carry out its beneficial functions in the body.
This is where the secret affair happens. As it turns out, the enzyme that carries out this chemical step requires iron. Based on this known biochemistry, if iron status is low, the formation of 25-hydroxyvitamin D is likely to be impaired.
Q: Is there any evidence that low iron status could be contributing to the vitamin D deficiency epidemic?
A: Yes, but the evidence is limited. In 1986, British researchers noted that iron and vitamin D deficiencies frequently coexisted in a group of Asian toddlers. In 1992, Israeli researchers reported that vitamin D supplements given to 6- to 24-month-old infants and toddlers were of no benefit when the children were also iron deficient; vitamin D status improved only when supplemental iron was given as well.
In light of these studies and the fact that iron deficiency is the most common nutrient deficiency in the United States, it seems clear that our bad boy iron is the ideal partner for our rock star nutrient. At the very least, vitamin D researchers and physicians should be measuring indices of iron status along with vitamin D.
———
Joannie Dobbs, Ph.D., C.N.S., and Alan Titchenal, Ph.D., C.N.S., are nutritionists in the Department of Human Nutrition, Food and Animal Sciences, College of Tropical Agriculture and Human Resources, University of Hawaii-Manoa. Dobbs also works with University Health Services.