By Nachiket Mor
Very interesting materials suggesting that while Sickle Cell Anaemia was probably an evolutionary response to the presence of Malaria, Iron Deficiency Anaemia probably arose from the move to a more agrarian lifestyle. It then persisted, and was even reinforced, because mild to moderate Iron Deficiency Anaemia offered some protection against infectious diseases of various types, including Malaria.
Sickle Cell Anaemia and Malaria
Scientists believe the sickle cell gene appeared and disappeared in the population several times, but became permanently established after a particularly vicious form of malaria jumped from animals to humans in Asia, the Middle East, and Africa. In areas where the sickle cell gene is common, the immunity conferred has become a selective advantage. Unfortunately, it is also a disadvantage because the chance of being born with sickle cell anaemia is relatively high. For parents who each carry the sickle cell trait, the chance that their child will also have the trait – and be immune to malaria – is 50 percent. There is a 25 percent chance that the child will have neither sickle cell anaemia nor the trait which enables immunity to malaria. Finally, the chance that their child will have two copies of the gene, and therefore sickle cell anaemia, is also 25 percent. This situation is a stark example of genetic compromise, or an evolutionary “trade-off.”
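The 50/25/25 split above is the standard Mendelian outcome for two carrier parents, and it can be sketched as a simple Punnett-square calculation. This is an illustrative sketch only; the allele labels "A" (normal haemoglobin) and "S" (sickle cell) are conventional shorthand, not from the text above.

```python
from itertools import product
from collections import Counter

# "A" = normal haemoglobin allele, "S" = sickle cell allele.
# Two carrier ("trait") parents each have genotype A/S.
parent1 = ("A", "S")
parent2 = ("A", "S")

# A child inherits one allele from each parent with equal probability,
# giving four equally likely combinations.
counts = Counter("".join(sorted(pair)) for pair in product(parent1, parent2))
total = sum(counts.values())

for genotype, n in sorted(counts.items()):
    print(genotype, n / total)
# AA (neither the disease nor the trait): 0.25
# AS (the trait, with malaria protection): 0.5
# SS (sickle cell anaemia): 0.25
```

The 2:1:1 ratio of heterozygotes to each homozygote is exactly the "trade-off" the passage describes: the protective AS genotype is the most likely outcome, but only at the cost of a one-in-four chance of the SS disease genotype.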
Iron Deficiency Anaemia and Malaria
During the evolution of the genus Homo – in particular the species habilis, erectus, and sapiens – malaria infection played a key biological role and influenced anthropological development as well. From a biological and phylogenetic point of view, the plasmodia that cause malaria followed two evolutionary paths. Plasmodium vivax, Plasmodium malariae, and Plasmodium ovale either coevolved with humankind or reached the human species during the most ancient phases of the evolution of the genus Homo. Plasmodium falciparum, on the other hand, was transmitted to humans by monkeys in a more recent period, probably between the end of the Mesolithic and the beginning of the Neolithic age. The authors present both direct and indirect biomolecular evidence of malaria infection, detected in buried subjects dating to the ancient world and brought to light in the course of archaeological excavations at several important Mediterranean sites. In this literature review the authors organise the available scientific evidence, which confirms the role of malaria in shaping the evolution of populations in Mediterranean countries. The people living in the various regions bordering the Mediterranean Sea, the cradle of western civilisation, were progressively influenced by malaria as this endemic disease spread over the last millennia. In addition, populations affected by endemic malaria developed cultural, dietary, and behavioural adaptations that contributed to decreasing the risk of disease. These habits were probably not fully conscious. Nevertheless, it may be supposed that both these customs and the biological modifications caused by malarial plasmodia favoured the emergence of groups of people with greater resistance to malaria. Together, these factors reduced the demographic impact of the disease, favourably influencing the general development and growth of civilisation.
Iron Deficiency Anaemia and other diseases
Iron deficiency, with or without iron deficiency anaemia, is so ubiquitous that it affects all populations of the world irrespective of race, culture, or ethnic background. Despite all the latest advances in modern medicine, improved nutrition, and the ready availability of cheap oral iron, there is still no good explanation for the widespread persistence of iron deficiency. It is possible that the iron deficiency phenotype is very prevalent because of many factors other than the commonly cited causes, such as a decreased availability or an increased utilisation of iron. Several thousand years ago, human culture changed profoundly with the agrarian revolution, when humans turned to agriculture. Their diet became iron deficient, and new epidemic infections emerged due to crowding and lifestyle changes. There is convincing evidence from diverse medical, historical, and anthropologic studies that iron deficiency protects against many infectious diseases such as malaria, plague, and tuberculosis. Thus, this change of diet increased the frequency of iron deficiency, and epidemic infections exerted a selection pressure under which the iron deficiency phenotype survived better. Multiple evolutionary factors have contributed to making iron deficiency a successful phenotype. We analyse some of the recent findings on iron metabolism, the theories explaining excessive menstruation in human primates, the unexplained relative paucity of hemochromatosis genes, the former medical practice of “blood-letting,” and other relevant historical data to fully understand the phenomenon of iron deficiency. We suggest that, owing to the long evolutionary persistence of iron deficiency, efforts at its prevention will take a long time to be effective.
Iron Deficiency Anaemia and Infectious diseases
An evolutionary perspective suggests that iron deficiency may have opposing effects on infectious disease risk, decreasing susceptibility by restricting iron availability to pathogens, and increasing susceptibility by compromising cellular immunocompetence. In some environments, the trade-off between these effects may result in optimal iron intake that is inadequate to fully meet body iron needs. Thus, it has been suggested that moderate iron deficiency may protect against acute infection, and may represent a nutritional adaptation to endemic infectious disease stress. To test this assertion, we examined the association between infection, reflected by C-reactive protein, a biomarker of inflammation, and iron status, reflected by transferrin receptor (TfR) and zinc protoporphyrin to heme ratio (ZPP:H), among school-age Kenyan children, and evaluated the hypothesis that moderate iron deficiency is associated with lower odds of infectious disease. TfR > 5.0 mg/l, with sensitivity and specificity for iron deficiency (ZPP:H > 80 micromol/mol) of 0.807 and 0.815, was selected as the TfR definition of iron deficiency. Controlling for age and triceps skinfold thickness (TSF), the odds ratio (OR) for acute viral or bacterial infection associated with iron deficiency (compared to normal/replete) was 0.50 (P = 0.11). Controlling for age and TSF, the OR for infection associated with an unequivocally iron replete state (compared to all others) was 2.9 (P = 0.01). We conclude that iron deficiency may protect against acute infection in children.
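The odds ratios reported above (0.50 for infection given iron deficiency, 2.9 for infection given an unequivocally replete state) come from exactly this kind of 2x2 comparison. The sketch below shows how an unadjusted odds ratio is computed from a contingency table; the counts are made up purely for illustration and are not the Kenyan study's actual data (which also adjusted for age and triceps skinfold thickness).

```python
# Hypothetical 2x2 table (invented counts, for illustration only):
# rows = iron status, columns = (infected, not infected).
iron_deficient = (5, 45)
iron_replete = (20, 30)

# Odds of infection within each group: infected / not infected.
odds_deficient = iron_deficient[0] / iron_deficient[1]  # 5/45
odds_replete = iron_replete[0] / iron_replete[1]        # 20/30

# Odds ratio for infection associated with iron deficiency.
# A value below 1 means iron deficiency is associated with
# lower odds of infection, as the study concludes.
odds_ratio = odds_deficient / odds_replete
print(round(odds_ratio, 3))
# prints 0.167
```

Note that a raw odds ratio like this is unadjusted; the study's figures of 0.50 and 2.9 were obtained from models controlling for age and TSF, which a simple table calculation cannot reproduce.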