Conceived and designed the experiments: MMP AEL CM. Performed the experiments: CM FGS DLL RMG. Analyzed the data: MMP AEL CM JS. Contributed reagents/materials/analysis tools: MMP AEL RMG. Wrote the paper: MMP CM.
It is well acknowledged from observations in humans that iron deficiency during pregnancy can be associated with a number of developmental problems in the newborn and developing child. Due to the obvious limitations of human studies, the stage during gestation at which maternal iron deficiency causes an apparent impairment in the offspring remains elusive. In order to begin to understand the time window(s) during pregnancy that is/are especially susceptible to suboptimal iron levels, which may result in negative effects on the development of the fetus, we developed a rat model in which we were able to manipulate and monitor the dietary iron intake during specific stages of pregnancy and analyzed the developing fetuses. We established four different dietary-feeding protocols that were designed to render the fetuses iron deficient at different gestational stages. Based on a functional analysis that employed Auditory Brainstem Response measurements, we found that maternal iron restriction initiated prior to conception and during the first trimester were associated with profound changes in the developing fetus compared to iron restriction initiated later in pregnancy. We also showed that the presence of iron deficiency anemia, low body weight, and changes in core body temperature were not defining factors in the establishment of neural impairment in the rodent offspring.
Our data may have significant relevance for understanding the impact of suboptimal iron levels during pregnancy not only on the mother but also on the developing fetus and hence might lead to a more informed timing of iron supplementation during pregnancy.
The clinical importance and prevalence of iron deficiency (ID) make the understanding of this micronutrient deficiency an important challenge for both the scientific and medical communities. Iron is an essential micronutrient, and ID affects more than 2 billion people around the world. ID occurs in many forms, ranging from marginal tissue iron depletion to the most severe form, iron deficiency anemia (IDA). It has been estimated that, globally, 50% of anemia can be attributed to ID. IDA ranks ninth among 26 mortality risk factors and accounts for over 800,000 deaths and 35 million disability-adjusted life years lost. North America alone bears 1.4% of the global burden of ID and IDA, and it has been estimated that 35–58% of healthy women show some degree of ID, with a higher prevalence during pregnancy. This high prevalence of ID during pregnancy seems at odds with the practice of routine iron supplementation as part of the prenatal care provided in most developed countries. The factors that contribute to this persistently high prevalence are complex and include concerns that early iron supplementation generates oxidative stress, a low compliance rate (50%) with iron supplements even under optimal motivation and guidance, owing to their undesirable adverse effects, and a rise in risk factors such as type-2 diabetes and obesity that can cause ID and IDA despite adequate food intake.
ID and untreated IDA during pregnancy have many negative consequences for the offspring and have been shown to be associated with a higher incidence of low birth weight and prematurity; long-term cognitive abnormalities such as language-learning impairments and behavioral changes; alterations in thermoregulation; changes in lipid metabolism; stroke and seizures; altered motor function and coordination; and, in many cases, alteration of Auditory Brainstem Responses (ABRs, a measure of nerve impulse conduction in the auditory system). Gestational ID has also been shown to change iron homeostasis in the offspring, resulting in an increased risk of developing ID later in life despite adequate nutrition. It is alarming that the reported prevalence of ID for US children under 2 years of age is estimated to be 25%, which might well be an underestimate given the difficulty of diagnosing ID in the absence of anemia.
Despite the recognition that ID during pregnancy can have multiple adverse effects on the developing fetus, it remains unclear to what degree, and during which gestational time window, maternal ID must occur to affect fetal development to an extent that leads to functionally relevant long-term impairments.
We addressed this question using a highly controlled animal model system that allowed us to stage the initiation of ID during pregnancy using a defined feeding regimen and to analyze fetal development as well as neural function in the young adult offspring. Our prior studies on the effects of IDA during pregnancy suggested that IDA affects a very early arising precursor cell pool. The impairment of this cellular population is likely to contribute to an ultimate disruption of proper CNS development during postnatal development. Based on those studies, we now address the fundamental questions of whether (i) the generation of neural impairment in the offspring is limited to cases of maternal IDA as opposed to ID without anemia and whether (ii) the time window during which iron depletion occurs in the dam determines the degree of functional impairment in the offspring.
One of the major challenges in understanding the impact of iron on development is the difficulty of measuring iron concentrations in the developing embryo. Animal models allow us to gain insight into the relationship between maternal iron intake and fetal iron levels. We established a feeding protocol in which rat dams were provided a customized iron-deficient diet that led to maternal iron deficiency (ID) but not to severe iron deficiency anemia (IDA). As shown in Figure 1A, this diet did not generate a discernible difference in hematocrit values in the pregnant dams compared to control dams, although hemoglobin (Hb) and red blood cell counts (RBC) were slightly below the control range by 21 days of gestation. Even though severe anemia was absent in the dams, we found progressively reduced serum iron levels, ranging from a >40% reduction at 15 days of gestation up to an 82% decrease at 21 days of gestation, relative to controls (CTLs; Figure 1B).
To determine the impact of the decrease in serum iron concentration in the dam on the iron status in the developing embryo, we measured the iron concentration of whole embryos using atomic absorption spectroscopy (AAS; Figure 2A). While we did not see a significant reduction in tissue iron concentrations at E13, at E15 embryos from iron deficient dams had only 44% of the levels of iron seen in control embryos. Despite a developmental increase in total iron concentration in the iron deficient embryos from an average of 8 ng Fe/mg tissue at E15 to approximately 14 ng Fe/mg at E19, these values were far below the normal Fe concentrations, which increased from 19 ng Fe/mg at E15 up to 63 ng Fe/mg at E19. This lack of iron accumulation in the embryos was correlated with the respective total body weights. At E13, when the embryos did not yet show a significant difference in Fe concentration compared to controls, there also was no difference in the weight of the embryos. At E15, E17, and E19, however, there was a persistent lack of weight gain in the iron deficient embryos compared to controls, rendering ID fetuses nearly 40% lighter than control fetuses at E15. By E19, the weight had increased, although ID embryos never reached the same weight as controls (Figure 2B).
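The widening relative iron deficit described above can be expressed as the ID concentration as a percentage of the control value. A minimal sketch, using the rounded mean concentrations quoted in the text (because the inputs are rounded, the results differ slightly from percentages computed from the unrounded group means, such as the 44% figure at E15):

```python
# Express embryonic tissue iron in ID animals relative to controls.
# Input values are the rounded means quoted in the text (ng Fe/mg tissue),
# so the resulting percentages are approximations.
def percent_of_control(id_value, ctl_value):
    """Return the ID measurement as a percentage of the control value."""
    return 100.0 * id_value / ctl_value

# E15: ~8 ng Fe/mg in ID embryos vs ~19 ng Fe/mg in controls
e15 = percent_of_control(8, 19)
# E19: ~14 ng Fe/mg in ID embryos vs ~63 ng Fe/mg in controls
e19 = percent_of_control(14, 63)

print(f"E15: {e15:.0f}% of control")  # roughly 42%
print(f"E19: {e19:.0f}% of control")  # roughly 22%
```

Note that although the absolute iron concentration rises in ID embryos between E15 and E19, the deficit relative to controls widens, which is the pattern described above.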
In order to determine whether a continuation of the maternal feeding protocol postpartum has a persistent impact on the developing pups, we maintained the dams on the diet they received throughout gestation and tested the iron status of the pups one week after birth. As expected, the iron status of the pups continued to deteriorate, and after one week serum iron concentrations were reduced by 63% and hematocrit levels showed a 40% decrease (Figure 2C, D). While the reduction of iron concentrations in whole embryos and in the serum of postnatal animals suggested a change in overall iron homeostasis, our functional readout was focused on the impact of iron deficiency on brain development. We were, therefore, particularly interested in the status of brain iron levels in postnatal animals. AAS analysis of two central nervous system (CNS) regions that are particularly easy to harvest and allow for a good volumetric normalization (spinal cord and cerebellum) confirmed that there was a significant decrease in the concentration of iron in these tissues (Figure 2E). The decrease in iron concentration was more pronounced in the cerebellum, which had a much higher baseline level than the spinal cord.
Having shown that early maternal ID severely impairs the iron homeostasis and weight of the offspring, we next asked whether this gestational insult also results in impaired CNS development in the offspring. Studies in humans, monkeys, and rodent animal models have suggested functional impairments in neural signaling as measured by impaired conduction velocity in the auditory system. This measurement of nerve conductivity is particularly useful as it allows for a unified and non-invasive analysis of a stereotypical neural response profile that results in a wave diagram, with each wave representing the neural activity along the ascending pathway. Specific ABR wave parameters, such as the interpeak latency between P2 and P1, are thought to reflect myelination status, while changes in P3 and P4 interpeak latencies are also suggestive of impairments in synapse maturation. Measurements of Distortion-Product Otoacoustic Emissions (DPOAEs) can be used to exclude impairments that are due to a peripheral loss of hair cell function, which would also affect the general neural output.
To determine the impact of the early maternal ID on the CNS development of the offspring, we conducted an ABR analysis of the ID and normal offspring at postnatal day 40–45 (P40–45), a time point at which the auditory system is fully developed. As shown in Figure 3A, ABR latency analysis revealed significantly increased P2-P1 interpeak latencies in ID offspring compared to CTLs at all frequencies examined, ranging from 0.25±0.18 ms to 0.49±0.034 ms (* p<0.0001). When we compared the interpeak latency values of the individual animals in the ID group to control animals, it was apparent that the iron deficient animals showed a more pronounced variability across animals than controls (Figure 3B). We also analyzed the P4-P1 interpeak latency to determine whether the impairment affects other aspects of auditory information processing detectable by ABR analysis. As shown in Figure 3C, P4-P1 interpeak latencies were also significantly increased in the ID group. In order to exclude possible defects of inner and outer hair cells that would influence the ABR measurements, we also conducted a DPOAE assessment. As shown in Figure 3D, the DPOAE amplitude values of the ID group were indistinguishable from the control group at all frequencies, which strongly suggests normal hair cell function. Wave morphology did not differ from CTL, and hearing thresholds were not elevated in ID animals (data not shown). Taken together, our data show that animals exposed to ID during gestation develop an IDA that is associated with a significant defect in neural function as determined by ABR latency analysis.
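As a concrete illustration of the interpeak-latency measure used above: each ABR wave has an absolute latency, and an interpeak latency is simply the difference between two wave latencies. The sketch below uses invented peak latencies (not data from this study) to show the computation:

```python
# Sketch of the interpeak-latency computation used in ABR analysis.
# Peak latencies (ms) for waves P1..P4 are invented for illustration;
# they are NOT measurements from this study.
from statistics import mean, stdev

def interpeak(peaks, later, earlier):
    """Interpeak latency (ms) between two ABR waves, e.g. P2 - P1."""
    return peaks[later] - peaks[earlier]

animals = [
    {"P1": 1.4, "P2": 2.5, "P3": 3.4, "P4": 4.6},
    {"P1": 1.5, "P2": 2.7, "P3": 3.6, "P4": 4.9},
]

p2_p1 = [interpeak(a, "P2", "P1") for a in animals]
p4_p1 = [interpeak(a, "P4", "P1") for a in animals]

print(f"P2-P1: {mean(p2_p1):.2f} +/- {stdev(p2_p1):.2f} ms")
print(f"P4-P1: {mean(p4_p1):.2f} +/- {stdev(p4_p1):.2f} ms")
```

A group comparison (ID vs. CTL) would then test whether the mean interpeak latency is significantly prolonged; a larger P4-P1 than P2-P1 prolongation points at impairments beyond the earliest relay stations of the pathway.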
The dietary regimen used so far in our study (ID -2wks) was selected to model a pre-pregnancy state defined by suboptimal tissue iron concentrations in the dam prior to conception that does not result in severe maternal IDA by the time the pups are born. While this dietary regimen led to neural impairment in the offspring, the development of IDA in the offspring, the lack of weight gain, and the lower core body temperature were all factors that could have contributed to the ABR latency defect. In order to better define which of these factors might play a defining role in the development of neural impairment in the offspring, we varied the onset of iron restriction across specific gestational windows. As outlined in Figure 4, we generated three additional groups of rats in which the iron-deficient diet (2–6 µg Fe/g) was introduced at the beginning of pregnancy (ID E0), or at the beginning of the 2nd (ID E7) or the 3rd (ID E14) trimester. All experimental groups remained on the iron-deficient diet during lactation and after weaning until the pups reached postnatal day 45, when the animals were subjected to hematological analysis along with weight and temperature measurements, followed by an ABR measurement as outlined before. The control group (CTL) consisted of age-matched rats whose dams were maintained on a control diet that contains 240 µg Fe/g and is identical to the iron-deficient diet in all other ingredients.
As shown in Figure 5A, the hematological analysis revealed that the offspring from all the dietary groups had established severe anemia by the time of the ABR measurement at P45. All groups showed a significant reduction in hematocrit values, hemoglobin concentrations, red blood cell counts (RBC), mean corpuscular volumes (MCV) and serum iron concentrations relative to the CTL group. Other parameters that could impact neural function were also affected but to various degrees depending on the time when the ID diet was introduced to the pregnant dam. As shown in Figure 5B, all iron deficient groups showed a significant decrease in body weight, while only the ID-2wks offspring presented with a significant decrease in core body temperature compared to controls. Core body temperatures were not affected when the iron restriction started at embryonic day 0 (ID E0), day 7 (ID E7) or day 14 (ID E14) and were comparable to control values (Figure 5C).
To monitor whether the delayed onset of the maternal iron restriction changes the functional neural impairments in the offspring that we had detected in the ID -2wks dietary group, we performed ABR testing of the P45 offspring as outlined before. In all three groups, DPOAE analysis again revealed that outer hair cell function was comparable to control animals (data not shown). ABR tests in the ID E0 and ID E7 offspring showed prolonged P2-P1 interpeak latencies, ranging from 0.17±0.06 ms up to 0.39±0.05 ms compared to CTL values. ABR latency responses at the highest frequency (24 kHz) did not reach statistically significant levels in the ID E7 group (Figure 6A and 6B). The severity of the ABR defects also decreased with a later-occurring insult. Surprisingly, when pregnant females were exposed to iron restriction at the beginning of the third trimester (ID E14) and the offspring were subjected to ABR testing, we did not find any statistically significant interpeak latency differences in the ID E14 group compared to the control group across all frequencies (Figure 6C).
We also analyzed the P4-P1 interpeak latency differences to determine whether the impairment affects the entire brainstem auditory response in such a manner that changes observed at later peaks are even greater in magnitude than those detected through analysis of the P2-P1 interpeak latencies. As shown in Figures 6 and 7, ID animals showed a relatively larger increase than CTL in the P4-P1 latencies (0.5 ms to >1.5 ms) compared to the P2-P1 latencies (0.25–0.9 ms). While all P4-P1 latencies, except one, were increased by over 1 ms in the ID E0 group (Figure 7A), the increase in latency was less severe in the ID E7 group. Nonetheless, even in this group, the alterations seen in the P4-P1 analysis were greater than in the P2-P1 analysis and were also significantly increased compared to control animals at all frequencies tested (Figure 7B). In contrast, and consistent with the measurements of P2-P1 changes, the ID E14 animals did not show a significant difference in P4-P1 interpeak latencies compared to the control group (Figure 7C).
Taken together, our data show that dietary regimens that are not severe enough to cause a transition of maternal ID to severe IDA still disrupt fetal iron homeostasis as early as E15 and result in severe IDA in the offspring. We provide evidence showing that maternal iron restriction that occurs prior to the onset of pregnancy and during the first trimester has the most severe impact on the neural development of the offspring. Maternal ID during the second trimester still negatively affects the neural conduction velocity (measured by ABR) of the offspring but to a lesser extent. Our data further demonstrate that the development of neural impairments in the offspring is not dictated by the presence of IDA, low serum iron concentration or low birth weight, but is determined by the time window during which the dam received the iron-deficient diet.
The purpose of this study was to determine the impact of dietary maternal iron deficiency (ID) on fetal development, with the main focus on the identification of critical periods of gestation when the developing CNS is most vulnerable to maternal ID. The animal model we established was intended to mimic the extremely prevalent human condition of marginal maternal ID that occurs in an estimated 30 to 50 percent of pregnancies worldwide. After careful titration of the iron concentration in the diet, we were able to establish a model where the pregnant dam was rendered iron deficient but did not develop severe anemia during pregnancy.
The analysis of the most prolonged form of dietary restriction, in which iron levels were already reduced prior to the onset of pregnancy, led to some unexpected findings. For example, despite the fact that the dams received the iron-deficient diet not only prior to conception but throughout the entire pregnancy, they did not develop severe anemia: hematocrit levels remained in the normal range. However, maternal serum iron concentrations decreased continuously throughout pregnancy, confirming iron depletion. Upon more detailed examination of the maternal blood with the HESKA Hema True Veterinary Hematology System, we observed minimal reductions in mean corpuscular volume and hemoglobin compared to control values. Taken together, these results are consistent with ID in the absence of severe anemia.
This dietary model thus mimics a situation that would not cause any level of clinical concern in the human population, as the absence of severe anemia would not prompt any immediate intervention and is likely to remain unnoticed. We believe that this is an important aspect of our study as it underscores the need for monitoring the status of iron beyond the level of anemia.
Iron restriction of the dam starting two weeks prior to conception was associated with lower iron concentrations in embryonic tissues as early as embryonic day 15 (E15). The embryos maintained a very similar low iron level throughout gestation without an appreciable increase in serum iron concentrations, suggesting that the maternal iron stores, which apparently were not severely depleted, did not compensate for the continuously increasing demand for iron by the embryo. Our observation seems to argue against the often-proposed notion that the developing embryo is partly protected from maternal ID through compensatory changes in the iron transport mechanisms of the placenta, which minimize the impact of ID on the fetus. However, our data are consistent with (i) recently published data suggesting that gestational ID might restrict the constitution of adequate fetal iron stores and with (ii) the observation by Naghii et al. indicating that fetal brain tissue can be iron depleted even in the absence of anemia in the pregnant mother. Furthermore, studies from the Georgieff laboratory have shown that a transgenic model of hippocampus-specific iron deficiency without any signs of anemia also resulted in altered neuronal development as well as significant cognitive dysfunctions (reduced spatial recognition memory and procedural memory performance). Wu et al. have also shown that a short period of perinatal iron deficiency results in altered associative behavior in adult rats without anemia. These studies support our notion that anemia is not necessarily a defining predictor of the neural impairments associated with iron deficiency.
Our model of gestational ID was not only associated with persistently lower body weights as early as E15 but also with functional impairments, as determined by Auditory Brainstem Response (ABR) analysis of neural conduction velocity in the offspring. The increase in P2-P1 interpeak latencies without hearing threshold changes suggested that, at least in part, impaired myelination could be one explanation for our findings. While disruptions in myelination have been demonstrated in rodents with prenatal and lactational ID, and hypomyelination is also recognized as a defining feature associated with perinatal ID in prematurely born infants, the greater increase in latency changes also seen in the P4-P1 interpeak latencies (which reflect the entire brainstem) is consistent with findings in the literature suggesting that neuronal components could also be affected by iron deficiency.
Because we established a functional criterion for the “diagnosis” of neural impairment that could clearly be associated with maternal dietary intake, it was possible to design a timed dietary restriction study that cannot be performed in humans but is critical for understanding the window of vulnerability. We generated maternal dietary groups that were defined by the onset of iron restriction based solely on food source. Using the ABR test as a functional readout of the neural CNS integrity of the offspring, we determined that maternal exposure to an iron-deficient diet prior to conception, at the start of the first trimester, or at the onset of the second trimester had a significant negative impact on the ABR latency response in the offspring, placing the window of vulnerability for the fetus in the first two trimesters of gestation. Interestingly, the negative CNS effects on the fetus were more severe when the ID was initiated at the beginning of pregnancy than when it was initiated prior to the onset of pregnancy. An important result of our research was that the offspring of pregnant rats receiving the ID diet at the beginning of the third trimester exhibited normal auditory nerve conduction velocities as reflected by ABR recordings, despite IDA being present in all experimental rats at the time of testing. This result might shed some light on seemingly contradictory results found in humans: some investigators reported no alterations of ABR latencies in ID children with or without anemia, while others found that IDA in 6-month-old infants produced abnormally prolonged ABR latencies that persisted years after the children's iron status was corrected with iron therapy. Our study indicates that there is a precise window of vulnerability during which ID can affect the fetal CNS, leading to postnatal neural impairments.
The lack of defect seen in some studies could be a consequence of the insult occurring outside the window of vulnerability, rather than a perceived general insensitivity of newborns to iron restriction. Without the establishment of the maternal iron status during pregnancy or at least cord blood iron levels, it cannot be resolved whether the initiation of ID and the possible onset of a later apparent pathology happened in utero. Therefore, any conclusions about the importance and/or need of pre- and postnatal iron supplementation have to be made with caution.
The window of vulnerability that we have defined in our studies also offers an important clue to the underlying cellular impairment and seems consistent with our previous work on the cellular targets of ID. As illustrated in Figure 8, ID during embryonic CNS development seems to affect neural precursor cell populations, leading to an imbalance of specific precursor populations that might ultimately be responsible for reduced oligodendrocyte numbers, abnormal myelination, and neural pathologies. We have shown previously that the differentiation and proliferation of embryonic glial precursor cells can be modulated by exogenous iron concentrations. Interestingly, the timing of the generation, expansion, and differentiation of these precursor cells coincides with the window of vulnerability defined by our diet regimens. In the rat embryo, gliogenesis starts around day E13.5 with the generation of embryonic glial restricted precursor cells (GRPs), which shortly before birth give rise to oligodendrocyte precursors (OPCs) that persist postnatally. Even slight alterations in the proliferation and/or differentiation abilities of precursor cells (GRPs and OPCs) occurring before the time of oligodendrocyte generation could translate into a myelination defect. The notion that early embryonic cells are especially responsive to suboptimal iron levels is also supported by studies from Badarocco et al., whose experiments using intracranial apotransferrin injections suggest that iron may affect oligodendrocyte development at early stages of embryogenesis rather than during late development. The defects associated with iron deficiency are, however, unlikely to be restricted to the glial population. As shown in Figure 8, neuronal progenitor cells develop in a similar time window and could equally well be targets of iron deficiency.
The cognitive impairment seen in the offspring is most likely a combination of an impact on both neuronal and glial embryonic progenitor cell pools.
The absence of a functional defect in late onset ID suggests that, if the diet is introduced at embryonic day 14 (red arrow) and maintained, the time point of iron depletion is past the period when the most vulnerable embryonic populations expand and generate their progeny. Under these conditions, the impact of ID on glial cell generation and/or neuronal maturation does not seem to be severe enough to result in detectable auditory dysfunction.
It should be pointed out that despite the lack of an ABR latency defect in our study at the late gestational time point (ID E14), we cannot exclude the presence of other abnormalities. For instance, a number of studies in rodent and primate models of late gestational and postnatal ID reported significant and persistent behavioral abnormalities and neurochemical disturbances. As development occurs along a temporal continuum, it would be expected that deficiencies occurring at different times cause different types of outcomes.
The timed maternal iron restriction, coupled with the persistently deficient diet used in our model, led to another surprising finding that sheds new light on the role of anemia in iron deficiency. Because of the associated fatigue, muscle weakness, and dyspnea during exertion, anemia is considered a confounding factor in various behavioral tests and raises concerns about the accuracy of the results and their interpretation. The ABR electrophysiological test we employed in the evaluation of neurodevelopmental impairments induced by ID seems not to be influenced by anemia. As shown in Figure 5A, all offspring of our different feeding groups developed clinically relevant anemia, yet only three out of four groups showed functional impairment in ABR patterns. This observation strongly suggests that anemia is not a determining factor in the pathological ABR patterns recorded in iron deficient animals and allows the impairment caused by ID to be dissociated from effects that might be primarily due to anemia. As ABR analysis can be conducted in multiple species, including humans, this type of analysis may provide a particularly useful noninvasive tool for examining the effects of micronutrient deficiencies on CNS development.
While IDA was not required to cause ABR defects, its occurrence did severely impact the survival rate of the offspring. The litter survival rate was approximately 75% in the four experimental groups compared to the control groups. This decrease in survival rate may lead to an unintentional sample population bias toward “survivors” at P45, when the functional measurements are conducted. While we cannot completely rule out that the surviving animals are on some level more robust than their littermates, it is important to note that we cannot correlate survival with the absence or presence of ABR defects. The ID -2wk, ID E0, and ID E7 groups had as many animals die during the experiments as the ID E14 group, yet none of the surviving animals in the last group showed an ABR defect, while all the animals in the other three groups showed ABR defects of various severities.
Another novel observation in our study is the time-specific presence of thermoregulatory deficits in the offspring of mothers who were iron restricted two weeks prior to conception. The three other feeding groups did not develop such defects. As ID has sometimes been linked to poor thyroid function, which in turn can result in thermoregulatory dysfunction, the decreased core body temperature in the most severe feeding regimen (ID -2wks) might involve a thyroid component. However, based on our data, we do not think that thyroid involvement is critical for the development of the ABR dysfunction, as the ID E0 and ID E7 groups have significant ABR dysfunctions but normal core body temperatures.
In summary, the analysis of developmental defects associated with exposure to iron restriction during different phases of fetal development emphasizes the influence of the precise window of vulnerability and the associated duration and magnitude of the ID exerted on the developing fetal CNS. The prolonged absolute and interpeak latencies revealed by the comprehensive ABR analysis in ID animals could have important clinical implications. If ID at an early age causes hypomyelination or delayed neural maturation along the auditory pathway, altering the properties of the auditory message and its transmission through the brainstem structures, such a dysfunction could have functional significance for language acquisition and the development of other higher cognitive and emotional functions.
In light of this information, our data reiterate the importance of iron supplementation not only as part of the prenatal and gestational IDA therapeutic regimen but also as a preventive strategy. Identifying the windows of vulnerability, when specifically vulnerable cell populations are generated during embryonic development, will be critical for delineating the restricted window of therapeutic opportunity during which iron supplementation could correct or prevent the functional deficit.
Adult female and male Sprague-Dawley rats (10–12 weeks of age) were purchased from Charles River Laboratories (Wilmington, MA). All animals were housed individually under controlled environmental conditions (0600–1800 h light cycle, 23°C and 32% humidity) and were provided free access to food and water.
For the time-mated pregnancies, the female/male pair was housed in the same cage for 24–48 hours or until the detection of a vaginal plug. There are no data to support a notion of decreased fertility in any of the iron deficient (ID) diet-treated groups. Litter sizes were comparable to controls, and the newborn pups appeared healthy, although their overall size was smaller in the ID -2wk and ID E0 groups. We did, however, observe a decreased survival rate in all iron deficiency groups. By postnatal day 45, on average 10–20% of the offspring had died, with the most severe losses in the ID -2wk group. The lower survival rate was most likely a result of the severe anemia experienced by the offspring.
For the time-mated pregnancies, the first day of gestation was defined by the appearance of a vaginal plug. For exact timing of the gestational age, the crown-to-rump length (CRL) of the embryo was measured from the vertex to the rump with a standard millimeter scale, taking care to avoid pressure that would distort the natural curvature of the embryo. Embryonic size based on CRL, in addition to vaginal plug detection, allowed a more accurate evaluation of gestational age.
The feeding protocols used in our study are depicted in Figure 4. The experimental animals were divided into four groups with specific dietary regimens, all using the same iron-deficient diet containing 2–6 µg Fe/g of food (TD.80396 ID Diet, Harlan Teklad Custom Research Diets).
The first diet regimen was designed to generate embryonic ID in early pregnancy. The dams were started on the iron-deficient diet two weeks before conception and the pups born to these dams represented the experimental group “ID -2wks”.
The next group of experimental animals was fed the iron-deficient diet on the day of conception or appearance of vaginal plug, designated “ID E0”.
The third experimental group, designated “ID E7”, comprised rat pups born to dams exposed to the iron-deficient diet from the seventh gestational day onwards.
Rat pups born to dams fed the iron-deficient diet from gestational day 14 onwards constituted the fourth dietary group, labeled “ID E14”.
As our control (CTL), we used age-matched rats born to dams maintained on a control diet throughout the experiment. The control diet contained approximately 240 µg Fe/g of food (Adjusted Iron Diet (240), TD.05656, Harlan Teklad Custom Research Diets). Apart from the iron concentration, the composition of nutrients and the caloric value were identical to those of the ID diet. All control and experimental animals received deionized-distilled water through glass sipper tubes.
All groups received the assigned dietary treatment (iron-deficient and control diets) during lactation and after weaning until postnatal day 45, when the animals were subjected to ABR and DPOAE testing as well as evaluation of body weight, core body temperature, and iron status. Animal procedures were approved by the University Committee of Animal Resources at the University of Rochester, New York.
The animals were anesthetized with ketamine (100 mg/kg i.p.), and blood samples for hematocrit measurement were collected in heparinized microcapillary tubes (VWR, West Chester, PA) from the right cardiac atrium before the rats were perfused. Tubes were centrifuged for 15 minutes at room temperature in a high-speed hematocrit centrifuge (IEC MB Microhematocrit Centrifuge). The hematocrit value was calculated as the length of the red blood cell column divided by the total length of the blood sample column, expressed as a percentage.
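The hematocrit ratio described above can be sketched as a one-line calculation; the column lengths below are illustrative values, not measured data:

```python
def hematocrit_percent(rbc_column_mm: float, total_column_mm: float) -> float:
    """Hematocrit: packed red blood cell column length divided by the
    total blood sample column length, expressed as a percentage."""
    if total_column_mm <= 0:
        raise ValueError("total column length must be positive")
    return 100.0 * rbc_column_mm / total_column_mm

# e.g. an 18 mm red-cell column in a 40 mm blood column
print(hematocrit_percent(18.0, 40.0))  # → 45.0
```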
For a more detailed hematological analysis including hematocrit (Ht) values, hemoglobin (Hb) concentrations, red blood cell (RBC) counts, and mean corpuscular volume (MCV), we analyzed fresh blood samples, collected in EDTA-coated tubes, using the HESKA HemaTrue Veterinary Hematology System.
Whole-blood aliquots for serum iron determination were allowed to clot and then centrifuged at 10,000 × g at 4°C for 15 minutes to separate cells from sera. Serum samples were isolated, frozen, and stored at −80°C until analysis of iron concentration by atomic absorption spectrophotometry (AAS).
After anesthesia with ketamine (100 mg/kg i.p.) and blood collection for hematocrit and serum iron determination, the animals were transcardially perfused with 11.2 µg/ml heparin in phosphate-buffered saline via the left ventricle to remove blood from the organs. The embryos or CNS tissues were immediately dissected, washed in distilled deionized water, vacuum dried, weighed, and stored in iron-free Teflon vials at −80°C until analyzed as previously described. Frozen embryonic and postnatal CNS tissues were wet-digested in ultra-pure nitric acid and analyzed for iron concentration using graphite furnace atomic absorption spectrophotometry. Standards, blanks, and standard curves were used as previously specified.
All experimental ID and control rats underwent core body temperature measurement using a RET-2 rectal probe and a TH-5 Thermalert Monitoring Thermometer (Physitemp). Core body temperature was recorded consistently for all groups of animals at 1800 h on the day of auditory testing.
Control (CTL) and iron-deficient dietary groups of rats (ID -2wks, ID E0, ID E7, and ID E14) were ABR tested on postnatal day 40–45. For the test, rats were anesthetized with acepromazine (1 mg/kg i.p.) and ketamine (80–100 mg/kg i.p.). Needle electrodes were inserted at the vertex and pinna, with a ground near the tail. ABR potentials were evoked with tone pips at 5 log-spaced frequencies from 5.6 to 24.4 kHz at 70 dB SPL. The response was amplified (10,000×) and 2,048 responses were averaged using software and high-frequency speakers from Intelligent Hearing Systems (Smart EP, Miami, FL). To compare latencies, the ABR peaks and troughs were identified by a semi-automated tool and verified by a trained observer through visual inspection of the recorded waveforms. Latencies were calculated from the onset of the stimulus, and the interpeak latencies of wave II-I (P2-P1), wave III-I (P3-P1), and wave IV-I (P4-P1) were computed.
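The interpeak-latency arithmetic above can be sketched as follows; the peak latencies used here are illustrative placeholders, not recorded values:

```python
def interpeak_latencies(peaks: dict) -> dict:
    """Interpeak intervals referenced to wave I (P1), given absolute
    peak latencies in ms from stimulus onset."""
    p1 = peaks["P1"]
    return {f"{w}-P1": round(peaks[w] - p1, 2) for w in ("P2", "P3", "P4")}

# Hypothetical absolute latencies (ms) for one animal at one frequency:
peaks = {"P1": 1.5, "P2": 2.4, "P3": 3.3, "P4": 4.4}
print(interpeak_latencies(peaks))
# → {'P2-P1': 0.9, 'P3-P1': 1.8, 'P4-P1': 2.9}
```

Group means of these per-animal intervals are what the ANOVA described under the statistics section compares.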
The function of the outer hair cells in the ear was analyzed by measuring DPOAEs with secondary tones (f2) at 5 log-spaced frequencies from 5.6 to 24.4 kHz. DPOAE testing was performed with f2/f1 = 1.2 and unequal primaries (L1 = L2 + 10 dB) at 70 and 60 dB SPL, using high-frequency speakers and software from Intelligent Hearing Systems (SmartOAE) and high-frequency microphones (ER10B+) from Etymotic Research (Elk Grove Village, IL). Cochlear function was assessed for all animals at approximately 40–45 days of age; rats were sedated with ketamine (80–100 mg/kg i.p.) and acepromazine (1 mg/kg i.p.).
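The primary-tone parameters above follow directly from the stated ratios; as a minimal sketch (the helper name and the 2f1 − f2 distortion-product frequency, the conventional DPOAE component, are our additions):

```python
def dpoae_primaries(f2_hz: float, l2_db: float,
                    ratio: float = 1.2, offset_db: float = 10.0) -> dict:
    """Given f2 and its level L2, derive f1 = f2 / (f2/f1 ratio),
    L1 = L2 + offset, and the 2f1 - f2 distortion-product frequency."""
    f1 = f2_hz / ratio
    return {"f1_hz": f1, "l1_db": l2_db + offset_db, "dp_hz": 2 * f1 - f2_hz}

# 5 log-spaced f2 frequencies from 5.6 to 24.4 kHz, as in the protocol:
f2_list = [5600.0 * (24400.0 / 5600.0) ** (i / 4) for i in range(5)]
params = [dpoae_primaries(f2, l2_db=60.0) for f2 in f2_list]
```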
“n” represents the number of animals analyzed in any group and was always ≥ 3. Regardless of sample size, each control and experimental group comprised animals from at least 3 separate litters. Statistical comparisons between nutritionally ID and control rats under the same conditions were performed with the two-tailed Student's t-test; a p-value less than 0.05 was considered statistically significant. Mean ABR interpeak latencies of the different dietary groups and controls were compared by analysis of variance (ANOVA) with Bonferroni post-test analysis, with significance again defined as p < 0.05.
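As a minimal sketch of the pooled-variance (equal-variance) Student's t statistic and the Bonferroni alpha adjustment used above — the hematocrit values here are hypothetical, not study data:

```python
import math
from statistics import mean, variance

def two_sample_t(a: list, b: list) -> float:
    """Two-sample Student's t statistic with pooled variance."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Illustrative hematocrit values (%):
ctl = [45.0, 44.0, 46.0]
id_grp = [30.0, 29.0, 31.0]
t_stat = two_sample_t(ctl, id_grp)

# Bonferroni correction for k pairwise comparisons against control:
k = 4  # ID -2wks, ID E0, ID E7, ID E14
alpha_adjusted = 0.05 / k  # 0.0125
```

The t statistic would then be referred to the t distribution with na + nb − 2 degrees of freedom to obtain the two-tailed p-value.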
We thank Ollivier Hyrien for input regarding the statistical analysis and other laboratory members for valuable assistance with this project. We also thank Mark Noble and Christopher Pröschel for helpful comments on the manuscript.
Competing Interests: The authors have declared that no competing interests exist.
Funding: The work was funded by NIH grants R01HD059739 and R01NS39511 and was supported in part by the Environmental Health Science Center (EHSC) at the University of Rochester. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.