Uric acid has historically been viewed as a purine metabolic waste product excreted by the kidney and gut that is relatively unimportant other than its penchant to crystallize in joints to cause the disease gout. In recent years, however, there has been the realization that uric acid is not biologically inert but may have a wide range of actions, including being both a pro- and anti-oxidant, a neurostimulant, and an inducer of inflammation and activator of the innate immune response. In this paper, we present the hypothesis that uric acid has a key role in the foraging response associated with starvation and fasting. We further suggest that there is a complex interplay between fructose, uric acid and vitamin C, with fructose and uric acid stimulating the foraging response and vitamin C countering this response. Finally, we suggest that the mutations in ascorbate synthesis and uricase that characterized early primate evolution were likely in response to the need to stimulate the foraging “survival” response and might have inadvertently had a role in accelerating the development of bipedal locomotion and intellectual development. Unfortunately, due to marked changes in the diet, resulting in dramatic increases in fructose- and purine-rich foods, these identical genotypic changes may be largely responsible for the epidemic of obesity, diabetes and cardiovascular disease in today’s society.
Gout is one of the best and the oldest known afflictions in humans. Its emergence often corresponds to a period of societal wealth, such as occurred during the Golden Age of Greece, the Roman Empire, and the industrialization of Europe in the eighteenth and nineteenth centuries (Hartung 1957). In the last century, gout has increased in frequency throughout the world, and correlates with the rising rates of obesity and cardiovascular disease. Indeed, gout can be considered another characteristic feature of the great epidemic characterized by increasing rates of hypertension, metabolic syndrome, diabetes, and chronic kidney and cardiovascular disease (Johnson et al. 2005).
Gout is associated with elevated blood levels of uric acid (which circulates in the form of sodium urate) and results when the urate precipitates as crystals in the synovial fluid. For the purposes of this paper, we will use the term “uric acid,” as this is the standard usage in the literature, even though uric acid is most commonly present as monosodium urate in most biological fluids. Originally discovered in the urine by Scheele in 1776, uric acid was extracted from a tophus by Wollaston in 1797, and found to be elevated in the blood of gouty subjects by Garrod in 1848. Faires and McCarty (1962) provided the ultimate proof that gout was caused by uric acid when they injected their own knee joints with monosodium urate crystals and promptly developed the excruciating signs of the disease.
Today uric acid is synonymous with gout and is measured in the serum only when subjects are suspected of having gout or kidney stones. Inherent in this reasoning is the longstanding belief that uric acid, in its soluble urate form, is a biologically inert end-product of purine metabolism that is excreted similarly to other nitrogenous wastes, including urea and ammonia (Keilin 1959). This is not surprising, as uric acid is the primary nitrogen-containing waste product of reptiles and birds. Unlike ammonia-excreting (ammonotelic) amphibians and fish, or urea-excreting (ureotelic) mammals, reptiles and birds metabolize amino acids and proteins into uric acid. Reptiles and birds can excrete the uric acid via the cloaca in almost solid form, allowing for the effective elimination of nitrogen (uric acid has four nitrogens per molecule) with minimal water loss (Gutman 1965).
The observation that uric acid is a waste product that is excreted by the urinary tract and gut has led to the conclusion that the primary problem associated with uric acid occurs only when serum levels increase such that crystalline monosodium urate is precipitated in joints. Indeed, asymptomatic hyperuricemia is generally considered a benign condition (Duffy et al. 1981). However, emerging evidence suggests that uric acid is not biologically inert, but rather is a highly reactive substance that may act as a “physiological alarm signal” with multiple potential biologic functions. In this short review we summarize some of the evidence supporting this newly appreciated role, and the potential deleterious consequences.
One of the major steps in human evolution was the emergence of apes during the early Miocene epoch (18–23 MYA). These early hominoids had larger brains and bigger bodies than monkeys, and were arboreal quadrupeds that lived in lush tropical rain forests, particularly in Eastern Africa. In the early Miocene, there was a marked expansion of apes, with over 100 species documented. However, during the mid Miocene there was an environmental shift to a cooler, drier and more seasonal climate, associated with the development of savannas interspersed with tropical and subtropical rain forests (Andrews and Martin 1991). In Central Europe, the annual mean temperature fell by approximately 7°C, to approximately 15°C (Bohme 2003). In addition to the temperature change, two flaming asteroids (‘bolides’) crashed in western Bavaria (Germany) approximately 14.5–15 million years ago, creating the Ries and Steinheim craters with an estimated impact energy of 1.8 million Hiroshima bombs. It is not known whether the bolide impacts had any significant effects on the local terrestrial fauna. Nevertheless, the evolutionary records of many ape species disappear in the mid Miocene, suggesting that a mass extinction occurred. Only a limited number of species, such as Sivapithecus and Dryopithecus, appear to have survived (Begun 2003; Pilbeam 1996).
It is during the mid Miocene that major changes in uric acid metabolism occurred in our hominoid ancestors. As shown in Fig. 1, uric acid is generated from xanthine by xanthine oxidoreductase (XOR). In most mammals, uric acid is degraded by urate oxidase (uricase) to allantoin. During the mid Miocene, however, there was a stepwise loss of uricase activity due to mutations involving the promoter region (Oda et al. 2002). Eventually, uricase was completely silenced: a nonsense mutation in codon 33 of exon 2 arose in the common ancestor of the Great Apes (gorilla, chimpanzee, orangutan, and human), and a nonsense mutation in codon 18 of exon 2 arose among the ancestors of the Lesser Apes (gibbon, siamang) (Oda et al. 2002; Wu et al. 1992). Based upon mutational rates and molecular evolutionary theory, Oda et al. (2002) have suggested that these mutations occurred approximately 15 and 9 million years ago, respectively. These studies suggest that the Great and Lesser Apes each had their own ancestor that survived the “Miocene disruption”, and both had major mutations in the uricase gene.
The parallel loss of uricase among early primates strongly suggests that there was an evolutionary advantage to higher uric acid levels at that time in history. However, one cannot ascribe this to a need for humans to excrete nitrogen via uric acid as is observed in reptiles, for the vast majority of amino acids are metabolized to urea in humans, and only 1–3% of nitrogen is excreted in the form of uric acid (Gutman 1965). This suggests that uric acid may have some other key, unidentified function of great biological significance. While many hypotheses have been proposed (Ames et al. 1981; Johnson et al. 2008; Orowan 1955; Proctor 1970; Watanabe et al. 2002), we would like to present new insights from the comparative physiology literature that may help to unify all prior hypotheses. Specifically, we suggest that uric acid is a physiologic “alarm signal” that is key for survival under dire circumstances but which, ironically, has now gone awry in modern society.
One of the most important biologic principles is the ability to survive during times of food shortage. In this regard, it has long been appreciated that uric acid is increased under conditions of starvation (Lennox 1924; Ogryzlo 1965). Studies in experimental animals have shown that fasting results in an initial rapid weight loss, followed by a second phase (lipid utilization phase) in which weight loss is low but constant, and then a period of more active weight loss (protein breakdown phase). During the protein breakdown phase serum uric acid may rise markedly (Robin et al. 1998), and this is associated with increased locomotor activity, a decrease in water excretion, and a rise in cortisol levels (Challet et al. 1995; Cherel and Le Maho 1991). In rats that are obese and have good lipid stores, fasting can occur for a longer time before the protein breakdown phase occurs (Cherel et al. 1992). Studies in humans have also confirmed that during prolonged fasting there is a preferential utilization of lipid stores with conservation of total body proteins (Cahill 1970). Studies in the 1920s showed that the sudden starvation of subjects can result in an increase of uric acid from approximately 4.1 mg/dl (244 μM) to 10.7 mg/dl (640 μM) within a week (Lennox 1924).
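The mg/dl and μM figures cited here and in the examples that follow are related by the molar mass of uric acid, approximately 168.1 g/mol. As a quick sanity check on the quoted values, the conversion can be sketched in a few lines of Python (a minimal illustration for the reader, not part of any cited study):

```python
# Convert uric acid concentrations between mg/dl and micromolar (uM),
# using the molar mass of uric acid (C5H4N4O3), ~168.11 g/mol.
URIC_ACID_MOLAR_MASS = 168.11  # g/mol

def mg_dl_to_um(mg_dl: float) -> float:
    """mg/dl -> uM: multiply by 10 to get mg/L, divide by g/mol
    to get mmol/L, then multiply by 1000 to get umol/L."""
    return mg_dl * 10.0 / URIC_ACID_MOLAR_MASS * 1000.0

def um_to_mg_dl(um: float) -> float:
    """uM -> mg/dl (inverse of the conversion above)."""
    return um * URIC_ACID_MOLAR_MASS / 1000.0 / 10.0

if __name__ == "__main__":
    # Fasting values reported by Lennox (1924): ~4.1 and ~10.7 mg/dl
    for mg in (4.1, 10.7):
        print(f"{mg} mg/dl is approximately {mg_dl_to_um(mg):.0f} uM")
```

Running this reproduces, to within rounding, the parenthetical μM values quoted throughout the text (e.g., 4.1 mg/dl converts to roughly 244 μM).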
A particularly well studied example is the male emperor penguin that fasts for up to 4 months as it nests on ice under environmentally challenging conditions in Antarctica (Robin et al. 1998). These birds lose more than half of their weight during the fasting period (from a mean of 38.5 to 18.1 kg). Plasma uric acid levels remain low during the fast until the protein breakdown phase occurs, and then rise markedly, increasing from 3 mg/dl (180 μM) to 26 mg/dl (1530 μM) (Robin et al. 1998). The rise in uric acid occurs concurrently with a marked rise in plasma cortisol and locomotor activity, and is associated with the penguin leaving the nest in search of food (Robin et al. 1998).
Similarly, migratory birds also increase their fat stores prior to long distance flight as a means to prevent protein catabolism (Jenni et al. 2000). In one study, migratory birds originating from North Africa were captured shortly after traversing more than 500 km across the Mediterranean Sea. Serum uric acid in these birds correlated inversely with fat stores and directly with plasma cortisol levels (Jenni et al. 2000).
Hibernation is a distinctive form of fasting associated with significant decreases in body temperature and metabolic rate (Carey et al. 2003; Heldmaier et al. 2004). Two types of hibernating mammals exist: fat-storing and food-storing. Fat-storing mammals increase their tissue fat stores markedly prior to hibernation, often with an increase in body weight that may reach two-fold (Carey et al. 2003). In contrast, food-storing hibernators arouse intermittently and feed on a cache of food that they have brought into their den. Both types of hibernating mammals primarily use fat stores during the period of hibernation. Interestingly, studies in both the arctic ground squirrel (Ma et al. 2004; Toien et al. 2001) and the Syrian hamster (Okamoto et al. 2006) have reported a three- to four-fold increase in serum uric acid following induced arousal from the hibernating state. Another study did not detect a change in uric acid in the Syrian hamster with induced arousal (Osborne and Hashimoto 2007); however, in none of these studies was arousal linked to the state of the body’s fat stores, and absolute uric acid levels are low in these animals since they express uricase. In contrast, studies of hibernating snakes have occasionally documented extremely high uric acid levels posthibernation, with levels reaching as high as 70 mg/dl (4.1 mM), which can be associated with visceral gout and death (Dutton and Taylor 2003). The authors postulated that these marked changes in uric acid levels were due to insufficient fat stores (Dutton and Taylor 2003).
These studies suggest that the onset of the proteolytic phase is associated with a marked change in behavior, including the departure of the penguin to the sea for food, the awakening of the hibernating animal, and the signal for the migratory bird to feed. To date, direct evidence that the increase in uric acid has a functional role in this foraging behavior is lacking, and the rise in uric acid is simply recognized as a component of this syndrome. However, as shown below, evidence from other experimental models suggests that this foraging response may be mediated in part by the rise in uric acid.
In the 1950s, Orowan (1955) proposed that an increased uric acid may have been advantageous to early hominoids due to potential neurostimulant properties based on the fact that uric acid is chemically similar to caffeine (a trimethylated xanthine). Indeed, studies have found relationships of uric acid level with IQ testing (Stetten and Hearon 1959), achievement-oriented behavior (Brooks and Mueller 1966), and school performance (Bloch and Brackenridge 1972), although the strength of the associations is usually weak. In addition, there are studies demonstrating that uric acid can increase locomotor activity in rats (Barrera et al. 1989), that uric acid increases with emotional or physical stress (Rahe et al. 1974), and that it may be linked with hyperactivity in children (Barrera et al. 1988). Uric acid does not alter adenosine binding in the brain such as observed with caffeine (Hunter et al. 1990); however, experimentally raising uric acid increases catecholamines in the striatum and substantia nigra of the guinea pig (Church and Rappolt 1999).
The increased locomotor activity associated with foraging has been shown to be mediated in part by the rise in corticosterone levels (Challet et al. 1995). However, the studies above raise the interesting possibility that uric acid may also have a role in mediating this behavior.
Uric acid may have a role in blood pressure regulation. Uric acid stimulates both the local (Corry et al. 2008) and systemic (Mazzali et al. 2001; Toma et al. 2007) renin angiotensin system and inhibits endothelial cell release of nitric oxide (Kang et al. 2005; Khosla et al. 2005). Raising uric acid in rats causes hypertension (Mazzali et al. 2001). A high serum uric acid also both predicts and is present in a high percentage of subjects with newly diagnosed essential hypertension (Feig and Johnson 2003; Masuo et al. 2003; Sundstrom et al. 2005), and recent clinical studies have documented that lowering uric acid with allopurinol results in a decrease in blood pressure in certain well-defined populations (Feig et al. 2004; Kanbay et al. 2007; Talaat and El-Sheikh 2007). Over time experimental hyperuricemia also causes salt-sensitivity which is mediated in part by the development of renal microvascular disease and interstitial inflammation (Watanabe et al. 2002). This has led to the hypothesis that the uricase mutation may have provided a survival advantage due to its effect to help preserve sodium retention and blood pressure (Watanabe et al. 2002) during this period in which food may have been scarce and sodium intake low (Eaton and Konner 1985).
An elevated serum uric acid is also commonly observed in subjects with obesity or metabolic syndrome, and historically was considered to be elevated secondary to the hyperinsulinemia (Facchini et al. 1991). However, an elevated uric acid often precedes the development of obesity (Masuo et al. 2003), hyperinsulinemia (Carnethon et al. 2003), and diabetes (Boyko et al. 2000; Dehghan et al. 2007, 2008; Nakanishi et al. 2003), documenting that it cannot always be considered a secondary phenomenon. For example, Japanese researchers followed a group of young healthy men for five years and found that those with higher uric acid levels gained four times more weight than those with low uric acid (Masuo et al. 2003). These studies suggest that uric acid may have a causal role in obesity and metabolic syndrome.
A potential insight into this possibility came with the discovery that fructose intake (in the form of table sugar or high-fructose corn syrup) is closely linked with the epidemic of obesity and metabolic syndrome (Bray et al. 2004; Havel 2005). Fructose is unique among sugars in that it rapidly causes features of metabolic syndrome in both experimental animals and humans (Johnson et al. 2007; Segal et al. 2007). Fructose ingestion also leads to fatty liver and elevated triglycerides in humans (Ouyang et al. 2008) and can raise blood pressure (Brown et al. 2008). Intriguingly, fructose also has the unique ability among sugars to raise serum uric acid. Upon entering the cell, fructose is rapidly phosphorylated by fructokinase. Unlike the phosphorylation of glucose, that of fructose is not regulated, so local ATP depletion may occur, and the resulting AMP is degraded to uric acid. Serum uric acid levels rise within minutes of fructose ingestion, often in association with lactate release (Perheentupa and Raivio 1967).
The possibility that fructose-induced metabolic syndrome might be mediated by uric acid was studied in animals. Lowering uric acid was found both to prevent and to treat early features of metabolic syndrome (Nakagawa et al. 2006a; Sanchez-Lozada et al. 2008). The mechanism was shown to be mediated in part by the induction of endothelial dysfunction that impairs insulin action, as well as by direct effects on the adipocyte (Nakagawa et al. 2006a, b; Sautin et al. 2007). Confirmatory evidence for a role for uric acid in obesity was provided by the XOR-knockout mouse, which has a major defect in adipogenesis and fails to accumulate fat (Cheung et al. 2007). Furthermore, there are old reports that the chronic administration of uricase inhibitors (which raise uric acid) to rats results in hypertension, hypertriglyceridemia, fatty liver, hyperglycemia, and elevated aldosterone levels (Wexler 1982; Wexler and Greenberg 1977). However, those studies did not prove that the mechanism was due to uric acid, since no attempt was made to lower uric acid in those animals.
In summary, these studies suggest that the rise in uric acid associated with fasting might induce features of metabolic syndrome that could help favor weight gain and the reaccumulation of fat stores.
It is also known that hominoids during the Miocene could not biosynthesize vitamin C, as a key gene involved in vitamin C production (L-gulono-lactone oxidase) had been mutated 20–40 million years earlier. Hence, the uricase mutation has been proposed to increase uric acid as an antioxidant that could compensate for the decrease in vitamin C availability that may have occurred during this period (Ames et al. 1981; Proctor 1970). Following Proctor’s (1970) proposal that uric acid might function as an antioxidant similar to ascorbic acid, Ames et al. (1981) presented evidence that uric acid could function as an antioxidant in various redox reactions. Uric acid scavenges hydroxyl radicals, singlet oxygen (Ames et al. 1981), and peroxynitrite (Whiteman and Halliwell 1996), and also chelates iron and blocks iron-catalyzed oxidation reactions (Davies et al. 1986). Uric acid also maintains various antioxidant systems by preventing the inactivation of extracellular superoxide dismutase (SOD) by hydrogen peroxide (Hink et al. 2002), the oxidation of ascorbate (Sevanian et al. 1985), and the oxidation of tetrahydrobiopterin in cultured endothelial cells exposed to peroxynitrite (Kuzkaya et al. 2005). Uric acid also exerts negative feedback by inhibiting xanthine oxidase activity (Tan et al. 1993). Given the high concentration of uric acid in the blood, some studies suggest that it accounts for as much as 60% of total plasma antioxidant activity (Nieto et al. 2000; Wayner et al. 1987). Hence, according to the hypothesis of Ames, the uricase mutation may have provided an evolutionary advantage by providing key antioxidant functions that might help combat cancer, prevent vascular disease, and counter the oxidative stress associated with aging.
Interestingly, uric acid can also be prooxidative under various conditions (Corry et al. 2008; Kanellis et al. 2003; Santos et al. 1999). For example, when uric acid reacts with peroxynitrite, it generates several radicals as the urate molecule is degraded to triuret (Gersch et al. 2007; Santos et al. 1999). Uric acid can also stimulate NADPH oxidase with oxidant release (Corry et al. 2008; Sautin et al. 2007). Therefore, whether uric acid acts as an antioxidant or a pro-oxidant might depend on the environmental circumstance. Thus, while an increase in antioxidant activity might provide some protection in animals under stress, such as during starvation, an increased oxidative capacity might also potentiate the stress and survival response.
Uric acid has also recently been recognized to have an important role in innate immunity, and especially in the activation of dendritic cells and antigen-presenting cells in response to endogenous antigens. For example, dying cells release microcrystalline uric acid that acts as an adjuvant, stimulating dendritic cells to aid in their removal (Shi et al. 2003). Dendritic cell activation by uric acid may indirectly influence the development of autoreactive T cells in autoimmune diabetes (Shi et al. 2006), T cell recognition of transplanted cells (Shi et al. 2006) and exogenously administered embryonic stem cells (Kofidis et al. 2006), and the rejection of tumors (Hu et al. 2004).
Theoretically, activation of the innate immune system might be beneficial in fighting infections that are often known to be precipitated by environmental change.
The above studies suggest that uric acid may have a beneficial effect as a physiological alarm signal under conditions of environmental stress and starvation. Indeed, it may have a basic biological role in the fattening–fasting cycle (Fig. 2). Thus, an increase in uric acid might have a number of beneficial effects, including (a) increasing the locomotor activity necessary for foraging, (b) stimulating hypertriglyceridemia, fatty liver and weight gain to help reestablish fat stores, and (c) increasing blood pressure and inducing salt sensitivity to help protect against dehydration. The development of insulin resistance could also be beneficial by reducing glucose uptake into skeletal muscle and adipose tissue, thus preserving glucose for utilization by the brain, where glucose uptake is insulin independent, as suggested by Reaven (1999).
Furthermore, one could argue that fructose becomes an excellent food for an animal that is preparing to hibernate or fast for extended periods. Through its ability to stimulate the synthesis of triglycerides and to cause fatty liver (Ackerman et al. 2005; Ouyang et al. 2008), fructose allows the animal to increase its lipid stores, which should improve survival during the period of fasting. Furthermore, by increasing uric acid levels, fructose may encourage food intake, insulin and leptin resistance, and weight gain. Finally, the fructose that is not absorbed may be broken down by bacteria in the colon to generate ketoacids (Davids et al. 2004), which may also provide an energy source for the brain.
In contrast, vitamin C (ascorbate) may have opposing effects. Thus, vitamin C neutralizes some of the effects of fructose and uric acid that induce metabolic syndrome (Sautin et al. 2007; Vasdev et al. 2002). In addition, low levels of vitamin C are associated with increased oxidative stress and increased blood pressure [reviewed in (Johnson et al. 2008)]. This suggests that low serum levels of ascorbate may aid the foraging response. Consistent with this concept, in animals that synthesize vitamin C, both starvation and fructose inhibit ascorbate synthesis (Banhegyi et al. 1997). Studies in the arctic ground squirrel have also reported that plasma vitamin C levels are elevated during the hibernating phase and then fall following arousal, in association with an increase in oxidative stress (Ma et al. 2004; Toien et al. 2001). Finally, fruits generally increase their fructose content as they ripen towards the end of summer, and during this period vitamin C content decreases as well (Nagy 1980). Thus, the ripening of fruit, with loss of vitamin C towards the end of the harvest, could amplify the impact of fructose in increasing fat stores.
Given these data, it is tempting to propose that the loss of vitamin C synthesis approximately 35 MYA may have represented a survival advantage by augmenting the effects of uric acid (and fructose) in the foraging response. Similarly, the uricase mutation that occurred in the mid Miocene may have had similar benefits. The observation that the uricase mutation favored the larger apes may reflect their greater need for plentiful hepatic fat stores. Possibly as an added benefit, these mutations may also have facilitated improvements in mental performance and reaction time, as proposed by Orowan (1955). Improved survival with foraging may also have helped accelerate the switch from quadrupedal to bipedal locomotion. Indeed, studies suggest that hominoids may have required this adaptation between 15 and 4.2 MYA in order to forage over wider areas, due to the increase in open-country habitats associated with the cooling and drying of the global climate (Pickford 2002). Thus, the uricase and ascorbate mutations could have had key roles in the evolutionary development of the human species.
The relative increase in uric acid in the early hominoids that underwent this mutation was likely not very high by today’s standards. With the loss of the rain forests and the development of seasons, fruit was likely less available, and given the relative lack of meat intake at that time, there were few sources for generating uric acid. We have reported that gorillas and chimpanzees (which lack uricase) have serum uric acid levels of approximately 3 mg/dl, which is only slightly higher than the levels observed in primates that express uricase (Johnson et al. 2005). We have also measured uric acid levels in the primitive Yanomamo of southern Venezuela, who likely live much as our prehistoric ancestors did. The Yanomamo are well known to have low blood pressure, which can be attributed in part to their very low sodium intake (Oliver et al. 1975). As shown in Table 1, their uric acid levels averaged only around 3 mg/dl and were lower than those of the American explorers who led the expedition (Oliver and Johnson, unpublished). While low-salt diets are known to have a mild effect in raising uric acid levels (as well as plasma glucose and other features of metabolic syndrome) (Egan and Lackland 2000), the vegetable-based diet of the Yanomamo is sufficiently low in purines and fructose that their mean uric acid levels are lower than those observed in Western societies. Thus, the mutation of uricase may have raised uric acid only slightly (but significantly) during this period of history.
Given these data, we have hypothesized that the introduction of the Western diet resulted in dramatic increases in serum uric acid due to the availability of purine-rich meats and the introduction of sugar (fructose). In particular, the increase in fructose intake closely parallels the rise in gout, obesity and metabolic syndrome that has occurred over the last two centuries (Bray et al. 2004; Havel 2005; Johnson et al. 2007; Segal et al. 2007). Serum uric acid levels in adult males increased from <3.5 mg/dl in the early twentieth century to over 6 mg/dl today (Johnson et al. 2005). Furthermore, there has also been a marked increase in sodium intake since prehistoric times (Eaton and Konner 1985). Thus, the marked increase in uric acid, due to the intake of fructose- and purine-rich foods, coupled with increased sodium intake, might account for why today’s society is at increased risk for the development of obesity, hypertension, diabetes, and cardiovascular disease.
To test this hypothesis, we propose studies of the function of uric acid in the fasting state, and then under conditions of excessive fructose, purine and salt intake. If the studies are performed in mice, one would like to use mice that mimic the ancestral primate phenotype (lacking vitamin C synthesis but retaining uricase) and similar mice in which uricase has also been deleted.
Finally, the concept that a rise in uric acid may have a beneficial role in starvation, yet cause obesity and insulin resistance when present in excess, has a parallel with cortisol. Cortisol also increases alongside uric acid during the proteolytic phase of starvation and is likewise thought to have a role in the foraging response (Challet et al. 1995; Jenni et al. 2000; Robin et al. 1998). While cortisol levels are typically not elevated in subjects with metabolic syndrome, elevated cortisol levels do occur in Cushing’s syndrome. In this regard, the manifestations of hypercortisolism include abdominal obesity, dyslipidemia (hypertriglyceridemia), hypertension, insulin resistance, and fatty liver (Friedman et al. 1996; Rockall et al. 2003). Thus, one could hypothesize that cortisol and uric acid have similar physiological roles in the survival of the host, which, if present in excessive amounts, may also have similarly deleterious consequences.
Supported by US Public Health Service grants HL-68607 (RJ), DK-52121 (RJ).
Disclaimers: Dr Johnson is listed as an inventor on several patent applications related to the role of uric acid in hypertension and metabolic syndrome; Dr Johnson is also an author of a book on fructose and uric acid (The Sugar Fix) that was published by Rodale in 2008.