The success of folic acid fortification has prompted consideration of similar fortification with cobalamin, partly for its own sake but more to mitigate possible neurologic risks from increased folate intake by cobalamin-deficient persons. However, the folate model, whose success was foretold by positive clinical trials, the high bioavailability of folic acid, and the infrequency of folate malabsorption, may not apply to cobalamin fortification. Cobalamin bioavailability is more restricted than that of folic acid and is unfortunately poorest in persons deficient in cobalamin. Moreover, clinical trials demonstrating actual health benefits of relevant oral doses have not yet been done in persons with mild subclinical deficiency, who are the only practical targets of cobalamin fortification because >94% of persons with clinically overt cobalamin deficiency have severe malabsorption and therefore cannot respond to normal fortification doses. Yet it is only in the severely malabsorptive disorders, such as pernicious anemia, not in subclinical deficiency, that neurologic deterioration following folic acid therapy has been described to date. It is still unknown whether mild deficiency states, which usually occur despite normal absorption or involve only food-bound cobalamin malabsorption, have real health consequences or how often they progress to overt clinical cobalamin deficiency. Reports of cognitive or other risks in the common subclinical deficiency state, although worrisome, have been inconsistent, and their observational nature has proven neither causation nor any health benefit from intervention. Extensive work, especially randomized clinical trials, must be done before mandatory dietary intervention on a national scale can be justified.
Mandatory fortification of food has had important successes but is never a step to be taken lightly. Typically, such steps have been taken when a nutrient deficiency was widespread and had serious health consequences (not just biochemical changes) and when the nutrient’s properties readily permitted its assimilation by the population of interest without posing health risks to the nondeficient population. Mandatory fortification with cobalamin (Cbl) is now being contemplated for various reasons, many of them related to the folic acid fortification that occurred in the late 1990s in the United States and elsewhere. Therefore, folic acid fortification will be discussed briefly before focusing on the two critical questions whose answers should determine whether fortification with Cbl will improve the public’s health.
Folic acid fortification was unlike most fortifications because it was not initiated to relieve or prevent a deficiency. Rather, its target was neural tube defects, and it was undertaken only after randomized clinical trials proved that folic acid taken orally actually reduced the incidence of the defects. The mechanism of action remains unknown and may involve providing extra folate to overcome genetic or acquired inefficiencies of maternofetal metabolism or transport. Overt maternal folate deficiency was rarely part of the risk profile. Fortification has prevented an estimated 20–50% of new neural tube defects. As a bonus, the combination of fortification and wider self-supplementation in the United States and elsewhere nearly eliminated folate deficiency and reduced the frequency of hyperhomocysteinemia (Pfeiffer et al. 2005). The success was predictable because folic acid has >85% bioavailability over a wide range of intakes and malabsorptive disorders were rarely evident in the mothers of infants with neural tube defects. As will be discussed, Cbl lacks these virtues.
Somewhat less felicitously, the excellent bioavailability allowed folic acid fortification and self-supplementation to more than double plasma and red cell folate levels, often to strikingly supranormal levels (McDowell et al. 2008). Unease has grown about unwanted consequences. Chief among them were revived concerns about adverse neurologic consequences of high folate intake in persons with unrecognized Cbl deficiency. Folate interacts closely with Cbl metabolically (both vitamins intersect as cofactor and coenzyme in the methylation of homocysteine to methionine) as well as clinically (deficiency of either vitamin causes megaloblastic anemia). It also interacts therapeutically: in the 1940s and 1950s, large but uncontrolled series of patients with pernicious anemia (a disease defined by Cbl malabsorption resulting from irreversible loss of gastric intrinsic factor secretion) described partial improvement of the megaloblastic anemia of Cbl deficiency when folic acid, usually in large daily oral doses, was given without Cbl (reviewed in Chanarin 1979).
Although this phenomenon was and remains unclear and inconsistent, the folate-mediated hematologic improvement could sometimes mask the Cbl deficiency, delay Cbl therapy, and eventually lead to neurologic progression that could be irreversible. Some anecdotal reports suggested that folic acid may occasionally even accelerate neurologic deterioration. All these considerations motivated the US Food and Drug Administration to try to limit daily folate intake to <1 mg when it mandated folic acid fortification. However, that limit was soon overtaken by events in the United States, in part because of enthusiasm for self-supplementation.
As metabolic testing improved in the 1990s, surveys found metabolically defined but asymptomatic Cbl deficiency to far outnumber clinically expressed deficiency. Rates were 10–25% among the elderly, largely because minimal biochemical changes became easily recognizable (Lindenbaum et al. 1994; Carmel et al. 1999; Carmel 2000). Concern arose that the very success of folic acid fortification had exposed more Cbl-deficient people than anticipated to folate-related neurologic risk, and that this exposure needed to be addressed, possibly by adding Cbl fortification. However, closer scrutiny indicates that the situation, and thus also its solution, is far from straightforward, because almost all the new cases differed in important ways from clinically recognizable Cbl deficiency typified by pernicious anemia, which affects only 1.9% of the elderly (Carmel 1996). The differences between the two forms of Cbl deficiency, and their implications, especially the important role of malabsorption, have not always been appreciated. Individual authors and even convened panels seemed confident that Cbl fortification would prove as simple and successful as folic acid fortification. That optimism was at odds with older documentation of the complex, restrictive physiology of Cbl absorption and bioavailability, especially in Cbl-deficient individuals (Berlin et al. 1968; Chanarin 1979), and with the dissimilarities that make folic acid a misleading model for Cbl fortification (Carmel 2008). Not least was the difference in goals and intended beneficiaries: folic acid fortification targeted young, nondeficient, and usually healthy women, whereas Cbl fortification would target often-malabsorbing elderly persons with Cbl deficiency of varying severity and health implications.
The remainder of this paper addresses the two main questions posed by Cbl fortification to prevent or reverse Cbl deficiency.
Four goals have been envisioned: two pertaining to concerns about Cbl deficiency and two to concerns related to neural tube defects. The most immediate goal is to prevent neurologic progression of unrecognized Cbl deficiency because of exposure to folic acid fortification. The second, broader, goal is to reduce the high frequency of mild Cbl insufficiency that especially often affects the elderly. Third, some have suggested that Cbl may itself contribute to preventing neural tube defects (Thompson et al. 2009). Finally, fortification may remove the Cbl deficiency-related constraints on further raising folic acid intake and thereby perhaps increase folate’s ability to prevent neural tube defects.
The validity of these goals and expectations requires a clear understanding of Cbl physiology and the pathophysiology of Cbl deficiency. The most important of the concepts are outlined in Table 1. The chief points are that Cbl bioavailability is limited under normal circumstances and becomes even more limited under abnormal ones. Absorptive capacity plays a critical role: the classic Cbl deficiencies, the ones that feature megaloblastic anemia and/or neurologic dysfunction, are almost invariably caused by significant malabsorption (94% of clinically expressed deficiency in the survey by Savage et al. 1994), usually reflecting loss of gastric intrinsic factor (pernicious anemia) or interference with its intestinal uptake. Moreover, because daily losses are so small relative to body stores, years must pass before stores become sufficiently depleted, and such depletion usually occurs only when malabsorption is chronic and unfluctuating (Table 1); often, the underlying causes are irreversible.
This classic and typically malabsorptive, and hence progressive, deficiency state characterized by megaloblastic anemia and/or neurologic dysfunction is too uncommon a condition to justify population-wide intervention. As a serious medical disease, it requires individualized medical intervention instead, including injections or very large oral doses of Cbl. Such deficiency is rarely preventable or curable by dietary fortification. Not only is classic deficiency such as pernicious anemia the only Cbl deficiency state with proven, consistent health consequences, it is also the only state in which neurologic complications after folic acid therapy have been described (Chanarin 1979). The only form of Cbl deficiency common enough to warrant fortification and that might respond to small oral doses is the mild asymptomatic deficiency first identified in the 1980s (Carmel and Karnaze 1985; Carmel et al. 1987). This subclinical Cbl deficiency state usually involves only biochemical changes. As importantly, it rarely arises from intrinsic-factor-related absorption failure (Carmel et al. 1987; Karnaze and Carmel 1990; Carmel 2000). Instead, subclinical deficiency is usually nonmalabsorptive or, in 30–50% of cases, features mild malabsorption confined to food-bound Cbl (FBCM) (Carmel and Karnaze 1985; Carmel et al. 1987; Carmel 1995, 2000). FBCM is a partial malabsorption, occasionally reversible with antibiotics, that results from impaired release of Cbl from food and leads to very slowly progressive and perhaps even fluctuating Cbl deficiency (Carmel 1995).
If subclinical deficiency, currently defined only by biochemical changes, is the only target suitable for Cbl fortification, the appropriateness of fortification depends on proving the health consequences of subclinical deficiency and their responsiveness to small oral doses of Cbl. The concerns are real enough to justify efforts to obtain definitive answers. The most commonly expressed rationales reflexively assume that, left untreated, asymptomatic deficiency inevitably progresses to clinically expressed deficiency with megaloblastic anemia and neurologic dysfunction, just as early preclinical pernicious anemia with its permanent loss of intrinsic factor always does. However, no basis exists for equating the two. Progression of irreversible, malabsorptive conditions is inevitable, but subclinical Cbl deficiency, the causes of which are unknown in >50% of cases (Carmel 1995), has an uncertain and often static natural history. Progression of subclinical deficiency to clinical deficiency seems surprisingly infrequent (see Carmel 2000). Even the mild biochemical changes that define subclinical deficiency remain stable for years, and they reverse spontaneously much more often than they progress (44% vs. 16% of cases), with the rest remaining static over 4 years of study (Hvas et al. 2001). Subclinical deficiency’s occasional origin in very early pernicious anemia may explain much of whatever modest risk of progression exists (Carmel 2000, 2008, 2009). Moreover, when subclinical deficiency caused by FBCM or dietary insufficiency progresses, it is likely to do so very slowly. The natural history of subclinical deficiency and its risk of progression to clinical deficiency remain undefined, but they require definition in order to understand this common condition and to justify Cbl fortification.
The second frequently expressed reason for concern is that hidden health hazards may exist even in the asymptomatic but biochemically deficient state. Indeed, manifestations at or near the threshold of clinical expression, such as mild evoked potential and electroencephalographic abnormalities of unknown clinical relevance but which often reverse with Cbl injections, have been described (Carmel et al. 1987; Karnaze and Carmel 1990; van Asselt et al. 2001). They remain murky, however. To illustrate, a prospective study of 16 patients with dementia found high frequencies of mild neuropathy that often responded to Cbl, along with the electrophysiologic and metabolic abnormalities, whereas the dementia never improved (Carmel et al. 1995). However, the study was uncontrolled, some patients had mild Cbl-responsive macrocytic anemia as well, absorption status could not be tested in the demented individuals, and Cbl therapy was parenteral and intensive; all this suggested that the deficiency may not have been subclinical, and the likelihood that small oral doses could have produced similar improvement is unknown.
Even murkier are the meanings of statistical associations of Cbl levels with mild cognitive deficits or even loss of brain volume in the elderly. Cognitive associations have been inconsistent (Raman et al. 2007), confounders are frequent, other explanations and influences have been suggested, and Cbl levels are often only relatively rather than absolutely low as in Cbl deficiency (see, for example, Vogiatzoglou et al. 2008). Most importantly, nearly all the neurocognitive studies, even when prospective and impressive, have been observational, so the associations may identify markers rather than causes. Proposed nonneurological associations with subclinical deficiency, such as osteoporosis, diabetes in offspring, and tinnitus, are murkier still.
As for the neurocognitive impact of high folic acid intake on Cbl deficiency (Chanarin 1979; Reynolds 2006), it is unknown whether the risks, thus far reported only in patients with pernicious anemia, whose deficiency always progresses relentlessly, extend to subclinical deficiency. The question has been tackled broadly, but thus far inconclusively, by epidemiologic surveys, which do not lend themselves to sufficiently informative clinical, neurologic, and hematologic assessments (Carmel 2009). In individuals with low Cbl levels, high folate levels appear to be associated with worse metabolic Cbl status than normal folate levels are (Selhub et al. 2007; Miller et al. 2009), but only one survey (Morris et al. 2007) found cognitive dysfunction to also be more frequent with high folate status; two others did not (Clarke et al. 2008; Miller et al. 2009). All the cross-sectional findings await clarification (Carmel 2009).
Despite the legitimacy of the concerns, data have been insufficient to provide a basis for informed decisions about mandatory fortification with Cbl. Information that oral cobalamin actually modifies any of the proposed health risks of subclinical deficiency, not just its mild biochemical abnormalities, is badly needed (Table 2). Given the unsettled state of knowledge, the idea that randomized placebo-controlled trials of subclinical deficiency are unethical (Eussen et al. 2005) must be vigorously resisted. We need only recall the compelling associations of homocysteine with cardiovascular disease that failed to survive examination by clinical trials.
Several features peculiar to Cbl bioavailability are relevant to thinking about the feasibility and effectiveness of Cbl fortification (Table 1) and have been reviewed elsewhere (Carmel 2008). Cbl bioavailability is tightly restricted. The strict requirement for intrinsic factor maximizes absorption efficiency (>50% bioavailability from usual meal-sized intakes of 1–2 μg Cbl). However, this saturable mechanism has limited capacity (bioavailability drops sharply when intake exceeds 1–2 μg) and few escape hatches. If intrinsic factor disappears or its ileal receptors fail, bioavailability plummets to the nonspecific diffusion of just 1.2% of an ingested dose on average (Berlin et al. 1968). As a result, the daily 1-μg losses, normally replaced by the 50% bioavailability from recommended dietary intakes of 2.4 μg, can be replaced in persons with malabsorption only when intake is raised to 1,000 μg daily (the seeming excess allowing for unpredictable individual variations and the failure to reabsorb biliary Cbl). Fortification that provides only 1–10 μg daily thus may be ineffective in many deficient elderly persons. All these details contrast sharply with the excellent bioavailability of folic acid over a wide dosage range that made its fortification effective (Food and Nutrition Board, Institute of Medicine 1998). In addition, whatever the bioavailability of Cbl taken by fasted normal or abnormal individuals, it declines by about 40% when taken with a meal (Berlin et al. 1968). This inhibitory effect on Cbl bioavailability further complicates a fortification strategy based on the meal setting.
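The replacement arithmetic above can be checked with a back-of-envelope calculation. The figures (a ~1-μg daily loss, ~50% bioavailability of meal-sized intakes via intrinsic factor, ~1.2% passive diffusion in severe malabsorption) come from the text and the cited Berlin et al. (1968) data; the simple linear model below is purely illustrative, not a physiological simulation:

```python
# Illustrative back-of-envelope check of the Cbl replacement arithmetic.
# All figures are taken from the text; the linear model is a simplification.

DAILY_LOSS_UG = 1.0  # approximate daily Cbl loss that must be replaced

def absorbed_ug(intake_ug: float, severe_malabsorption: bool) -> float:
    """Rough daily Cbl absorption from a given intake.

    Normal absorption: ~50% bioavailability; this holds only for small,
    meal-sized intakes, since the intrinsic factor pathway saturates above
    1-2 ug (so this overestimates absorption of large doses).
    Severe malabsorption: only ~1.2% nonspecific diffusion remains.
    """
    if severe_malabsorption:
        return intake_ug * 0.012
    return intake_ug * 0.5

# Normal absorber at the recommended intake of 2.4 ug/day:
print(round(absorbed_ug(2.4, severe_malabsorption=False), 3))   # 1.2 ug: covers the loss

# Severe malabsorption at a typical fortification dose of 10 ug/day:
print(round(absorbed_ug(10, severe_malabsorption=True), 3))     # 0.12 ug: far short

# Severe malabsorption at a therapeutic oral dose of 1,000 ug/day:
print(round(absorbed_ug(1000, severe_malabsorption=True), 3))   # 12.0 ug: covers the loss
```

The calculation makes the asymmetry concrete: a fortification-sized dose replaces daily losses in normal absorbers but falls an order of magnitude short in severe malabsorption, which only a roughly 1,000-μg intake overcomes.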
Cbl also differs dramatically from folate and other vitamins in the frequent origin of its deficiency in malabsorption and infrequent origin in poor intake (Chanarin 1979; Savage et al. 1994; Howard et al. 1998). As a result, bioavailability is likely to be worst precisely in those who need Cbl treatment the most and to be best in those who do not need Cbl and usually have normal absorption. Commonly recommended 1- to 10-μg fortification dosing may be ineffective in the former. Surveys describing Cbl level responses to small oral doses of Cbl often obscure the problem; they tend to emphasize overall mean rises, which allows the predominance of normal individuals in the study population to hide the small but crucial subsets of individuals with Cbl deficiency who cannot respond because of malabsorption. The latter can be ferreted out only by identifying and focusing on the relevant subsets at risk.
Cbl malabsorption is also diverse. FBCM, a mild, partial, and often reversible malabsorptive disorder wherein only Cbl release from food is impaired, is much more common than the relatively rare loss of intrinsic factor or failure of intrinsic factor in the ileum, in which free Cbl not bound to food as well as food-bound Cbl cannot be absorbed and biliary Cbl may not be reabsorbed (Carmel 1995). Because synthetic, free cyanocobalamin remains absorbable in persons with FBCM, normal responsiveness to small oral doses of free cyanocobalamin seems likely. However, that assumption may be premature (Carmel 2008); it is conceivable that free Cbl taken with a meal could become bound to the accompanying food upon ingestion. Patients with mild biochemical Cbl deficiency following gastric surgery, in which FBCM is frequent, did not improve metabolically until oral Cbl doses reached 50 μg or even 350 μg (Schilling et al. 1984). Moreover, two controlled dosage trials and one sequential study of elderly patients with mild deficiency unrelated to gastric surgery found that biochemical improvement often lagged until oral doses (apparently taken with meals) exceeded 500 μg daily (Seal et al. 2002; Rajan et al. 2002; Eussen et al. 2005); although absorption was not tested, FBCM may explain the poor responsiveness. A conflicting report claiming good responses to small oral doses in patients with FBCM (Blacher et al. 2007) was not credible because defective diagnostic criteria for FBCM, spuriously labeled as “Carmel’s criteria,” substituted for actual absorption testing (Carmel 2007).
The existing information, though limited in many respects, suggests that the fortification dose likely to help may approach 500–1,000 μg in many mildly Cbl-deficient elderly persons. Such dosage is impractical for fortification and could simultaneously overload normal people without malabsorption, in whom bioavailability from 1,000-μg doses is nearly twice that in pernicious anemia (Berlin et al. 1968).
The case for fortification cannot be taken beyond the rigorous scientific evidence of the weakest link in the plan, be it rationale, effectiveness, feasibility, safety, or monitoring. Much work remains to be done to assure scientific rigor. A fuller review of the many issues to be considered and resolved, including possible adverse consequences, can be found elsewhere (Carmel 2008), and some of those issues were also revisited in a recent commentary (Green 2009).
The discussion presented here does not address toxicity issues; none are known, but potential ones were addressed elsewhere (Carmel 2008). Were any toxicities to emerge from widespread, chronically high Cbl intake, their brunt would be borne by normal persons with normal absorption, and perhaps especially by children. Moreover, because <0.1% of body stores turn over daily, inflated stores may take years to return to normal.
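The stated turnover figure gives a sense of that time scale. Assuming, purely for illustration, simple first-order kinetics at the upper bound of 0.1% daily turnover (a kinetic assumption the text does not make), the half-life of any excess stores works out to roughly two years:

```python
import math

TURNOVER_PER_DAY = 0.001  # upper bound: <0.1% of body stores turn over daily

# Illustrative assumption: excess stores decline by first-order kinetics
# at the turnover rate, so the half-life of the excess is ln(2) / rate.
half_life_days = math.log(2) / TURNOVER_PER_DAY
print(round(half_life_days / 365, 1))  # ~1.9 years to shed just half of any excess
```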
There is urgency to address the many important issues because the concerns about widespread mild Cbl deficiency need fact-based resolution (Table 2). Ample evidence that the American public has already embraced liberal vitamin self-supplementation adds to the urgency because it may soon become impossible to marshal adequately sized, unsupplemented populations suitable for clinical trials. Well-planned, randomized clinical trials with well-characterized study subjects with clearly defined normal and abnormal Cbl absorption and realistic, effective doses must be undertaken soon if we are to convert assumptions into compelling evidence.
Much of the work underlying this article was funded by grant DK32640 from the National Institutes of Health.
Competing interest: None declared.
Presented at the 7th International Conference on Homocysteine Metabolism, Prague, 21–25 June 2009