Micronutrient supplementation in children and adolescents in low- and middle-income countries

Prof. A Prentice

Micronutrients are a group of nutrients that include macrominerals (such as calcium), trace minerals (such as iron and zinc), and vitamins. The ‘micro’ in the name does not refer to their size but to the small amounts required by the body.

Filling in the gaps

Of the 28 traditionally recognized micronutrients, dietary supplements of iron, iodine and vitamin A have been the most studied, followed by zinc and folate (especially in pregnancy). A lot of work has gone into studying populations around the world in which deficiencies in these leading micronutrients are associated with poor health outcomes. And while we can’t logistically hope to ever know everything about the impact of micronutrient supplements on health outcomes, there is still much more that we can learn about the dynamics between the two.

For instance, we know that giving vitamin A supplements to young children reduces mortality, but we don’t fully understand how this protection works in terms of vitamin A’s effects on the immune system. Some might say that it doesn’t matter, as long as supplementation does the job. But what if we could do the job even more effectively by understanding the precise mechanism, or mechanisms, involved? Given that only limited funding is available for micronutrient research, it is necessary to focus on filling the gaps in our knowledge that are most crucial.

Pulling back the curtain on iron and anaemia

There has been some controversy about vitamin A supplements and the importance of zinc as a micronutrient, but the biggest conundrum right now is overcoming anaemia. You would think this issue would be easy to address; after all, iron supplements are cheap to produce and distribute. But as we are discovering, iron supplementation can actually lead to a rise in adverse disease outcomes in children, adolescents and young pregnant women. It is important that we get this right, and UNICEF in India has made great strides towards tackling the very high rates of anaemia in teenage girls by implementing weekly supplementation.1

The danger with iron is that it not only feeds the human body but also the micro-organisms that invade it. This leads to a battle between the microbes and the human host for nutrients, the result of which might favour infection. In pregnancy, this competition can be further complicated by the nutritional needs of the unborn baby, and the consequences of anaemia in pregnancy can be serious for both mother and child. Further research is needed to understand the influence of iron on the risk of infection. From my own experience of treating young people in Africa, it seems likely that there is a higher risk of infection when using cheap, readily available iron supplements (which are very quickly absorbed in the intestines, potentially leading to iron overload) rather than the more advanced, expensive supplements available elsewhere.

Sophisticated thinking for a crude problem

In West and Central Africa, where I do most of my work, around 130 out of every 1000 babies are born to teenage mothers.2 There is clearly a pressing, unmet need among these women, and other young people, for an inexpensive test that measures the level of iron in the body and indicates whether it is safe to give a supplement. Existing measures, such as dietary iron intake, are too crude to be useful and sophisticated thinking is required.


The discovery of hepcidin, a recently identified hormone thought to be a ‘master regulator’ of iron metabolism, is especially exciting because it acts as a sort of gatekeeper. Hepcidin seems to be able to sense not only how much iron the body needs, but also whether giving that iron would put the body at risk of infection. If a person is iron deficient, hepcidin is switched off and iron is freely absorbed through the intestine. If a person is overloaded with iron, hepcidin is switched on and intestinal absorption of iron is essentially cut off.
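The gatekeeper behaviour described above can be pictured as a simple decision rule. The sketch below is purely illustrative (not clinical guidance): the function name and status labels are hypothetical, and the logic just restates the on/off behaviour from the text, with infection risk also keeping the gate closed.

```python
def iron_absorption_open(iron_status: str, infection_risk: bool) -> bool:
    """Illustrative sketch of the hepcidin 'gatekeeper' logic.

    iron_status: 'deficient' or 'overloaded' (hypothetical labels)
    infection_risk: whether the body senses a danger of infection
    Returns True if supplemental iron would be freely absorbed.
    """
    if iron_status == "deficient" and not infection_risk:
        # Hepcidin switched off: iron is freely absorbed in the intestine.
        return True
    # Hepcidin switched on (iron overload, or risk of feeding an
    # infection): intestinal absorption is essentially cut off.
    return False

print(iron_absorption_open("deficient", False))   # True
print(iron_absorption_open("overloaded", False))  # False
```

The point of the sketch is simply that a single signal, by integrating both iron status and infection risk, could tell us whether it is safe and useful to give a supplement.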

There is a lot of work still to do to standardize testing for hepcidin, but early studies suggest that the hormone could be our most important ally in identifying people with iron deficiency. What’s more, it may be possible to develop a diagnostic test that measures hepcidin levels in samples of saliva rather than blood, which would reduce the associated costs and improve access to the test for people in low- and middle-income countries. The question of whether the body is safe and ready to receive iron could be answered by something as simple as licking a diagnostic strip of paper.

Key references:

1. UNICEF Briefing Paper Series: Innovations, Lessons and Good Practice. The Adolescent Girls Anaemia Control Programme: Breaking the inter-generational cycle of undernutrition in India with a focus on adolescent girls. http://www.kcci.org.in/Document%20Repository/14.%20Adolescent%20Anaemia%20Control%20Programme.pdf

2. United Nations Population Fund (UNFPA). Motherhood in Childhood: Facing the challenge of adolescent pregnancy. In: UNFPA State of World Population 2013. http://www.unfpa.org/webdav/site/global/shared/swp2013/EN-SWOP2013-final.pdf
