The authors examined whether a diet that increases plasma urate level is also related to reduced risk of Parkinson’s disease (PD). The study population comprised 47,406 men in the Health Professionals Follow-up Study. The potential effect of diet on plasma urate level was estimated by regressing plasma urate on intakes of selected foods and nutrients in a subsample of 1,387 men. Coefficients of this regression model were then used to calculate a dietary urate index for all cohort participants. Multivariate relative risks of PD were estimated by means of Cox proportional hazards models. After 14 years of follow-up (1986-2000), the authors documented 248 incident cases of PD. A higher dietary urate index was associated with a lower risk of PD (top quintile vs. bottom: relative risk = 0.47, p-trend = 0.0008), after adjustment for age, smoking, caffeine intake, and other potential confounders. This association remained strong and significant after further adjustment for each component of the index individually (p-trend < 0.02 for each). These data support urate as a potentially protective factor in PD and suggest that dietary changes expected to increase plasma urate level may contribute to lower risk of PD. These potential benefits, however, should be weighed against expected adverse effects on risk of gout and other chronic diseases.
Oxidative stress contributes to the loss of dopaminergic neurons in the substantia nigra of persons with Parkinson’s disease (PD) (1). Urate is a potent antioxidant that effectively scavenges reactive nitrogen and oxygen radicals (2), and persons with a high plasma urate level have a markedly lower risk of developing PD (3-5). Further, among patients with early PD, a higher plasma or cerebrospinal fluid urate concentration predicts slower clinical progression (6, 7). In a prospective study, persons with gout had a significantly lower risk of developing PD than persons without gout (8). The results of these studies suggest that urate could be a novel and tractable target for neuroprotective therapies, because plasma levels of urate can be increased through pharmacologic or dietary interventions. However, it remains unknown whether this association reflects the influence of dietary or endogenous determinants of serum urate level. A possible explanation for the association between higher plasma urate and lower PD risk or progression is that still-unknown genetic factors could affect both. It is therefore important to establish whether variations in plasma urate level that are due to exogenous factors are also related to risk of PD.
In the present study, we prospectively examined the relation between the uricemic potential of diet—assessed by means of a newly established, empirically derived dietary urate index—and risk of PD in a large cohort of US men. The primary hypothesis was that a prouricemic diet predicts a lower risk of PD.
The investigation was conducted among men participating in the Health Professionals Follow-up Study, a prospective cohort study designed to examine the effects of diet on chronic diseases. The Health Professionals Follow-up Study was established in 1986, when 51,529 male US health professionals aged 40-75 years completed a mailed questionnaire about their medical history and lifestyle. Follow-up questionnaires have been mailed to participants every 2 years to obtain updated information on potential risk factors and to ascertain newly diagnosed diseases. Follow-up rates have averaged 94 percent in each 2- or 4-year cycle. As in previous analyses on diet and PD in this cohort (9, 10), we excluded participants who reported implausible energy intakes (<800 kcal/day or >4,200 kcal/day), left 70 or more items blank on the baseline dietary questionnaire, or had been previously diagnosed with PD, stroke, or cancer (other than nonmelanoma skin cancer). A total of 47,406 men were included in the analyses.
Dietary intakes were assessed at baseline (1986) and every 4 years thereafter with a semiquantitative food frequency questionnaire validated for use in this population (11, 12). In brief, participants were asked how often (on average) over the previous year they had consumed a specific amount of each food item, with nine possible responses ranging from “never” to “six or more times per day.” Food composition values for nutrients were obtained from the Harvard University Food Composition Database, which was derived from US Department of Agriculture sources. The correlations between intakes of these foods as measured by a food frequency questionnaire and consumption of the same foods according to diet records were 0.79 for alcoholic beverages, 0.52-0.88 for dairy foods and dairy products, 0.56-0.83 for meats and fish, 0.78 for orange juice, 0.84 for soft drinks (cola), and 0.8 for vitamin C (11, 12).
Information on age, weight, height, smoking status, physical activity, and use of medicines was collected through questionnaires. Body mass index was calculated as weight (kg)/height (m)².
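For concreteness, the body mass index calculation can be expressed as follows (a trivial sketch; the function name and example values are ours):

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight (kg) divided by the square of height (m)."""
    return weight_kg / height_m ** 2

# e.g., an 81-kg man who is 1.80 m tall has a BMI of 25.0
bmi = body_mass_index(81.0, 1.80)
```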
Diet is an important determinant of uricemia, partly because it contains purines (13), the metabolic precursors of urate, and partly through indirect effects on purine metabolism. For example, fructose and alcohol are prouricemic, whereas dairy proteins and vitamin C are antiuricemic (14-18). Although the short-term effects of each nutrient on plasma urate have been demonstrated in experimental studies, for the purpose of our investigation we were interested in the long-term, chronic effects of diet on uricemia, which may differ from those short-term effects. Further, experimental studies are typically conducted in small groups of highly selected subjects, and their results may not generalize to larger, more heterogeneous populations. Therefore, we estimated the long-term effects of diet on uricemia by means of an empirical regression approach, using a large subset of study participants with available data on plasma urate level, as described below.
To obtain an equation that predicted plasma urate level according to habitual diet, we used data from a subsample of 1,387 Health Professionals Follow-up Study participants previously selected for a prospective nested case-control study of uricemia and hypertension. Criteria for inclusion in the case-control study were: 1) blood sample drawn after fasting for ≥8 hours; 2) body mass index <30 in 1994; and 3) no history of hypertension in 1994 (25). Blood samples were collected in 1993 and 1994. A stepwise linear regression model was fitted, with plasma urate as the dependent variable and intakes of foods and nutrients known to affect uricemia as potential predictors (14-17, 19-24). The mean age of men in the subsample was similar to that of the entire cohort (61.1 years (standard deviation (SD), 8.6) vs. 61.2 years (SD, 9.5)); because of the exclusion of obese men, their mean body mass index was slightly lower than the cohort average (25.0 (SD, 2.4) vs. 26.0 (SD, 3.6) in the cohort, both assessed in 1994). Urate concentrations were determined through oxidation with the specific enzyme uricase to form allantoin and hydrogen peroxide (Roche Diagnostics, Indianapolis, Indiana) at Boston Children's Hospital Laboratory. The coefficient of variation for this assay, as determined using blind quality control specimens, was 2.7 percent. The mean plasma urate concentration was 6.0 mg/dl (SD, 1.2) (357 μmol/liter (SD, 71.4)).
In the stepwise model, we included 21 dietary components which may influence plasma urate level. To control for potential confounders, we forced the variables age, smoking, body mass index, caloric intake, and history of hypertension after blood collection into the stepwise regression model (see the Appendix for more details). After controlling for the covariates, the dietary variables selected through the stepwise procedure included dairy protein (regression coefficient (β) = -0.009, p = 0.01), alcohol (β = 0.013, p < 0.0001), fructose (β = 0.007, p = 0.02), and vitamin C (β = -0.0003, p < 0.0001). When we did not include history of hypertension after blood collection in the stepwise regression model, the same dietary items remained in the final model and the coefficients did not change. In addition, results were virtually identical when the stepwise regression was conducted only among the controls from the original case-control study.
Although total meat and fish/seafood intakes were not retained in the stepwise procedure in the present study, they are good sources of purines and were found to be significantly associated with plasma urate level in the Third National Health and Nutrition Examination Survey (20). Therefore, we constructed an additional regression model by adding these two variables to the model described above. The regression coefficients for this alternative model were -0.008 for dairy protein (g/day, p < 0.03), 0.014 for alcohol (g/day, p < 0.0001), -0.009 for fructose (g/day, p = 0.009), -0.0003 for vitamin C (mg/day, p < 0.0001), 0.06 for total meat (servings/day, p = 0.32), and 0.11 for total fish/seafood (servings/day, p = 0.37).
The correlations between predicted and observed plasma urate levels were similar for the main and alternative models (r = 0.23 and r = 0.24, respectively; p < 0.0001 for both). A similar correlation was found among the 103 men with blood samples who developed PD during follow-up (r = 0.19, p = 0.06).
The predictive equations derived above were then applied to the food and nutrient consumption data reported on the baseline (1986) food frequency questionnaire for calculation of the corresponding “dietary urate index” for each cohort participant. We calculated two indices for each participant: 1) the main index based on consumption of dairy protein, alcohol, fructose, and vitamin C, referred to as the “dietary urate index,” and 2) an alternative index based on the same items in addition to total meat and seafood. Because our purpose in this study was to estimate the independent effects of diet on risk of PD, these indices were based on the dietary items only; we adjusted for age, body mass index, and other nondietary predictors of plasma urate level as potential confounders in the analysis. In secondary analyses, we derived similar indices using diet reported in 1986 to predict PD between 1986 and 1994 and the average of the 1986 and 1990 diets to predict PD risk between 1994 and 2000.
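As a sketch of the index calculation, the coefficients from the main regression model can be applied to a participant's reported intakes; the intake values below are invented for illustration, and nondietary terms of the regression are omitted, as in the analysis itself:

```python
# Coefficients from the main stepwise model (units per day as in the text);
# negative values are antiuricemic, positive values prouricemic.
COEFFICIENTS = {
    "dairy_protein_g": -0.009,
    "alcohol_g": 0.013,
    "fructose_g": 0.007,
    "vitamin_c_mg": -0.0003,
}

def dietary_urate_index(intakes: dict) -> float:
    """Weighted sum of dietary intakes; higher values predict a higher
    plasma urate level. Nondietary predictors (age, body mass index, etc.)
    are excluded and treated as confounders in the cohort analysis."""
    return sum(coef * intakes.get(item, 0.0) for item, coef in COEFFICIENTS.items())

# Hypothetical participant (invented intakes)
intakes = {"dairy_protein_g": 20.0, "alcohol_g": 15.0,
           "fructose_g": 40.0, "vitamin_c_mg": 500.0}
index = dietary_urate_index(intakes)  # -0.18 + 0.195 + 0.28 - 0.15 = 0.145
```

In the cohort analysis, each participant's index would then be ranked into quintiles for comparison of PD incidence.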
We further examined the reproducibility of the dietary urate index over time. The Pearson correlations for the indices obtained from four food frequency questionnaires during 1986-1998 ranged from 0.5 to 0.7 (p < 0.0001 for all) (see appendix table).
We identified new cases of PD by means of biennial self-reported questionnaires (10). We then asked the treating neurologist to complete a questionnaire to confirm the diagnosis of PD or to send a copy of the medical records. A case was confirmed if a diagnosis of PD was considered definite or probable by the treating neurologist or internist, or if the medical record included either a final diagnosis of PD made by a neurologist or evidence of at least two of the three cardinal signs (rest tremor, rigidity, bradykinesia) in the absence of features suggesting other diagnoses (26, 27). We conducted the review of medical records while blind to participants’ exposure status. Overall, the diagnosis was confirmed by the treating neurologist in 81 percent of the cases, by review of the medical records in 3 percent, and by the treating internist without further support in the remaining 16 percent. We also requested the death certificates of the deceased study participants and identified PD diagnoses that were not reported during regular follow-up (<2 percent). If PD was listed as a cause of death on the death certificate, we requested permission from the family to contact the treating neurologist or physician and followed the same procedure as for the nonfatal cases. In this analysis, we used only definite and probable cases of PD, as we did previously (9, 28).
We computed person-time of follow-up for each participant from the return date of the baseline questionnaire (1986) to the date of the occurrence of the first symptoms attributed to PD, the date of death, or the end of follow-up (January 31, 2000), whichever came first. We used the date of first symptoms as an outcome because PD patients may change their diet or other relevant behaviors before the disease is diagnosed. Baseline dietary urate indices were categorized into quintiles, and relative risks were calculated by dividing the incidence rate in each quintile by the rate in the lowest quintile. Multivariate-adjusted relative risks were derived from Cox proportional hazards models controlling for age (in months), smoking status (never smoker, past smoker, or current smoker of 1-14 cigarettes/day or ≥15 cigarettes/day), body mass index (<23, 23-24.9, 25-26.9, 27-29.9, or ≥30), physical activity (quintiles), use of diuretics (yes/no), use of nonsteroidal antiinflammatory drugs (yes/no), caloric intake (kcal/day), and caffeine intake (quintiles). All covariates were measured at baseline (1986).
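The crude (unadjusted) version of this incidence comparison can be sketched as follows. The case counts and person-years below are synthetic, and the paper's multivariate estimates come from Cox proportional hazards models rather than this simple rate ratio:

```python
def quintile(value: float, cutpoints: list) -> int:
    """Assign a quintile (1-5) of the dietary urate index,
    given the four quintile cutpoints in ascending order."""
    return 1 + sum(value > c for c in cutpoints)

def rate_ratios(cases: list, person_years: list) -> list:
    """Incidence rate (cases per person-year) in each quintile,
    divided by the rate in the lowest (referent) quintile."""
    rates = [c / py for c, py in zip(cases, person_years)]
    return [r / rates[0] for r in rates]

# Synthetic counts: 248 cases split across quintiles of equal person-time
rrs = rate_ratios([64, 55, 50, 47, 32], [130_000] * 5)
# rrs[0] == 1.0 (referent); rrs[4] == 0.5
```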
We also examined potential interactions between the dietary urate index and baseline age (<60 years vs. ≥60 years), smoking status (never smoker vs. ever smoker), body mass index (<25 vs. ≥25), and caffeine intake (0-149 mg/day vs. ≥150 mg/day, based on median intake) by adding multiplicative terms to the models. To address the possibility that dietary changes caused by early symptoms of PD might affect the results, we conducted lag analyses by excluding the first 4 years of follow-up. Because the dietary urate indices were derived from a subsample of men who were free of hypertension and had a body mass index less than 30, we also conducted a sensitivity analysis restricted to participants who met the applicable criteria of the subsample. Given the possibility that some cases of PD were not reported to us, either because the disease was not diagnosed or because the participant was too sick to complete and return the questionnaire, we censored participants at the age of 75 years; few participants younger than 75 would be expected to be severely ill and to not report the diagnosis of PD.
After an average of 14 years of follow-up, we documented 248 incident cases of PD. Participants with a higher dietary urate index were more likely to smoke and to take diuretics and consumed more caffeine and energy than those with a lower urate index (table 1); they did not differ from persons with a lower urate index with regard to other covariates.
Overall, we found that a higher dietary urate index was associated with a lower risk of PD. Comparing the top quintile of the index with the bottom quintile, the age- and smoking-adjusted relative risk of PD was 0.47 (95 percent confidence interval (CI): 0.30, 0.74; p-trend = 0.001). The results were not changed (relative risk (RR) = 0.47) after further adjustment for caffeine, body mass index, caloric intake, and other potential confounders (figure 1). To examine whether this association could be explained by any of the dietary factors included in the urate index, we further adjusted the analyses for each component of the index, one at a time. The inverse association for the highest quintile of the urate index versus the lowest remained significant after adjustment for vitamin C (RR = 0.39, 95 percent CI: 0.24, 0.64), dairy protein (RR = 0.53, 95 percent CI: 0.31, 0.88), fructose (RR = 0.49, 95 percent CI: 0.30, 0.78), and alcohol (RR = 0.45, 95 percent CI: 0.24, 0.83).
When considered individually, fructose, dairy protein, and alcohol were significantly or marginally significantly associated with PD risk in the direction consistent with their estimated effects on plasma urate level (table 2). Vitamin C was not related to PD risk. After further adjustment for the dietary urate index, greater intake of fructose remained a significant predictor of a lower PD risk, but the associations between dairy protein and alcohol and PD were much attenuated. In contrast, vitamin C intake became significantly associated with a lower risk of PD (for comparison of the two extreme categories, adjusted RR = 0.46, 95 percent CI: 0.21, 0.98; p-trend = 0.04).
We conducted several sensitivity analyses to examine the robustness of our findings and observed similar significant associations. When we used an alternative urate index which further included meat and seafood, the results did not materially change. The multivariate-adjusted relative risk for the highest urate index quintile versus the lowest was 0.44 (95 percent CI: 0.27, 0.70; p-trend = 0.0002). After exclusion of subjects with gout at baseline, the relative risk for the highest urate index quintile versus the lowest was 0.46 (95 percent CI: 0.28, 0.75; p-trend = 0.001). In the 4-year lag analyses, the multivariate relative risks changed only slightly relative to the primary analyses. The relative risks were 1.0 (referent), 0.73, 0.94, 0.58, and 0.42 (95 percent CI: 0.25, 0.71; p-trend = 0.0007) for quintiles 1-5 of the urate index. When we excluded participants with a body mass index ≥30 or with reported hypertension, the multivariate-adjusted relative risks across quintiles of the index were 1.0 (referent), 0.64, 0.85, 0.58, and 0.38 (95 percent CI: 0.21, 0.69; p-trend = 0.001). Similar significant results were obtained when we used the cumulative average dietary urate index (for highest quintile vs. lowest, RR = 0.55, 95 percent CI: 0.35, 0.87; p-trend = 0.01) and when we censored participants at the age of 75 years (for highest quintile vs. lowest, RR = 0.45, 95 percent CI: 0.26, 0.76; p-trend = 0.001). Similar patterns were seen for PD cases diagnosed by neurologists and PD cases diagnosed by nonneurologists. The adjusted relative risks across quintiles of the index were 1.0 (referent), 0.68, 0.84, 0.61, and 0.42 (p-trend = 0.001) for cases diagnosed by neurologists and 1.0 (referent), 0.63, 0.76, 0.55, and 0.55 for cases diagnosed by nonneurologists (p-trend = 0.20).
We did not find significant interactions between the dietary urate index and age, smoking status, body mass index, or caffeine intake at baseline (table 3). The association between a higher urate index and a lower PD risk was evident in subgroup analyses carried out according to age, smoking status, body mass index, and caffeine intake.
In this large prospective study with 14 years of follow-up, we found that men whose diets were expected to increase plasma urate level had a significantly reduced risk of developing PD: the risk in the highest quintile of a dietary urate index summarizing the combined effects of diet on plasma urate level was less than half that in the lowest quintile. Our results are unlikely to have been affected by recall or selection bias because of the prospective study design. The robustness of this association was further supported by a variety of sensitivity analyses.
Our results are consistent with the previous observations in several longitudinal studies that higher plasma urate level predicts a lower risk of PD (3-5) and with the recent finding that higher plasma urate level predicts a slower clinical progression of PD (6, 7). Earlier pathologic studies had demonstrated that PD patients had a lower cerebrospinal fluid urate or xanthine concentration than controls (29). However, neither the serologic studies nor the pathologic studies were able to exclude the possibility that the association between higher urate concentration and a lower risk of PD results from genetic factors causing both a decrease in plasma urate level and an increase in susceptibility to PD. In contrast, the results of the present study, suggesting that diet-dependent changes in plasma urate level modify PD risk, provide support for a protective effect of urate per se or of its metabolic correlates.
Experimental evidence also supports a potential role of urate in the pathogenesis of PD. Urate reduces oxidative stress primarily through its actions as an effective scavenger of peroxynitrite and hydroxyl radicals (2, 30). In culture, urate completely prevented the death of dopaminergic cells induced by homocysteine plus rotenone or iron (31). Possible mechanisms for a neuroprotective action of urate include suppression of oxyradical accumulation and preservation of mitochondrial function (32), inhibition of the cytotoxic activity of lactoperoxidase (33), and protection from dopamine-induced apoptosis (34).
The interpretation that urate protects against PD, however, seems at odds with the observation that the major nondietary determinants of hyperuricemia, such as male gender, obesity, and hypertension, either are not associated with PD risk or are associated with increased risk (35-37). A possible explanation for this discrepancy is that other factors associated with male gender, obesity, and hypertension may increase the risk of PD and thus counteract a protective effect of increased plasma urate level. These factors may include higher occupational exposure to neurotoxic chemicals in men as compared with women (38), a protective effect of estrogen (39), an inverse relation between obesity and the availability of dopamine D2 receptor in the striatum (40), and effects of the renin-angiotensin system on dopamine release (41). Alternatively, the lower risk of PD among persons with high plasma urate levels could reflect confounding by still-unknown genetic or environmental factors. In a recent study, coffee consumption, which is inversely related to PD risk in men (10), was found to be associated with lower plasma urate levels (42). However, neither caffeine intake nor coffee consumption was related to plasma urate in our cohort. More studies are needed to elucidate the association.
The empirically derived urate index is also consistent with previous observations on diet and risk of gout among men in this cohort. We found that intakes of alcohol (24) and fructose were associated with an increased risk of gout, while intakes of dairy food (21) and vitamin C were associated with a lower risk.
Ingestion of both fructose and ethanol, by increasing the degradation of adenine nucleotides, leads to increased synthesis of inosine, hypoxanthine, and xanthine, which are then converted to urate by xanthine oxidase (14, 16, 18). The activation of this purinergic pathway within the brain, rather than the final increase in plasma urate level, could be the relevant determinant of PD risk. This explanation, however, may not account for the strong association between greater consumption of dairy food and higher PD risk observed in this study and previous studies (9, 43), because the antiuricemic effects of dairy protein seem to be mostly due to increased uricosuria (15). Another possible explanation is that these dietary factors could modulate transport of urate through the blood-brain barrier. We also observed that increased vitamin C intake was associated with a lower plasma urate concentration, which is consistent with results from clinical trials (17). Vitamin C could increase urate excretion by competing with urate for renal reabsorption (17). The lack of an association between greater vitamin C intake and higher PD risk may reflect the possibility that the hypouricemic effect of vitamin C is offset by its function as a powerful antioxidant. Indeed, after adjustment for the dietary urate index, a greater intake of vitamin C was significantly associated with a lower risk of PD.
Strengths of this study include the large population, the longitudinal design, and the high rate of follow-up. The main limitation of the present study was that, because of its observational design, we could not exclude the possibility that the association between the urate index and PD risk resulted from some unmeasured underlying common determinant(s). In addition, the clinical diagnosis of PD is not perfect, and thus some degree of diagnostic error is likely to have been present. In a recent large clinicopathologic study, however, the positive predictive value of a clinical diagnosis of PD was found to be 90 percent or higher (44); thus, bias from this source is likely to have been modest.
In this large longitudinal investigation carried out among US men, we found that a diet expected to increase plasma urate level was associated with a reduced risk of PD. This result, together with previous findings that uricemia not only is a strong negative risk factor for PD (4) but also predicts slower clinical progression of PD (6, 7), supports a possible neuroprotective effect of urate or its precursors in PD pathogenesis and suggests that dietary modulation of plasma urate may play a role in the prevention and treatment of PD.
This study was supported by National Institutes of Health/National Institute of Neurological Disorders and Stroke grant R01 NS048517, TAP Pharmaceutical Products, Inc. (Lake Forest, Illinois), and the Alfreda Louise Kwiecien Parkinson’s Research Fund. The study was also supported in part by the Intramural Research Program of the National Institutes of Health, National Institute of Environmental Health Sciences.
None of the sponsors participated in the design of the study or in the collection, analysis, or interpretation of the data.
Conflict of interest: none declared.
For derivation of the dietary urate index, the following foods and nutrients were considered in a stepwise model: total meat (servings/day), seafood (servings/day), dairy protein (g/day), individual dairy foods and dairy products (milk, yogurt, cheese, cream, and ice cream; servings/day), alcohol (g/day), individual alcoholic beverages (liquor, beer, and wine; servings/day), vitamin C (mg/day), fructose (g/day), sucrose (g/day), vegetables (servings/day), legumes and soybean products (servings/day), flavonoids (mg/day), folate (mg/day), coffee (cups/day), and caffeine (mg/day). Dairy protein has been suggested to be a major contributor to the association between dairy food intake and plasma urate concentration (15, 20). However, because of the possibility that nonprotein components could also contribute to the association, we included both dairy protein and individual dairy items as possible predictors of plasma urate level. The same rationale was used for considering as possible predictors both alcohol and individual alcoholic beverages. The significance levels were set a priori at 0.2 for entry into the model and at 0.1 for staying in the model. To control for potential confounders, we forced the following covariates, obtained from the 1994 follow-up questionnaire, into the stepwise regression model: age (in months), smoking status (never smoker, past smoker, or current smoker of 1-14 cigarettes/day or ≥15 cigarettes/day), body mass index (weight (kg)/height (m)²; <23, 23-24.9, 25-26.9, or 27-29.9), caloric intake (kcal/day), and history of hypertension after blood collection (yes/no).
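The selection procedure described above can be sketched roughly as follows. This is a simplified, hypothetical implementation (forward selection with entry at p < 0.2, followed by backward elimination with retention at p ≤ 0.1, forced covariates always kept), run on synthetic data; it uses a normal approximation for p-values and is not necessarily identical to the software procedure actually used:

```python
import math
import numpy as np

def ols_pvalues(X, y):
    """Least-squares fit; two-sided p-values via a normal approximation
    to the t distribution (adequate for this illustration)."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    t = beta / se
    p = np.array([1 - math.erf(abs(v) / math.sqrt(2)) for v in t])
    return beta, p

def stepwise(X_forced, candidates, y, p_enter=0.2, p_stay=0.1):
    """Forward selection among candidate predictors, then backward
    elimination. Forced covariates are always kept in the model."""
    selected = []
    while True:  # forward steps: add the candidate with the smallest p < p_enter
        best, best_p = None, p_enter
        for name, col in candidates.items():
            if name in selected:
                continue
            X = np.column_stack([X_forced] + [candidates[s] for s in selected] + [col])
            _, p = ols_pvalues(X, y)
            if p[-1] < best_p:
                best, best_p = name, p[-1]
        if best is None:
            break
        selected.append(best)
    while selected:  # backward elimination: drop terms with p > p_stay
        X = np.column_stack([X_forced] + [candidates[s] for s in selected])
        _, p = ols_pvalues(X, y)
        p_sel = p[X_forced.shape[1]:]
        worst = int(np.argmax(p_sel))
        if p_sel[worst] <= p_stay:
            break
        selected.pop(worst)
    return selected

# Synthetic demonstration: y truly depends on x1 and x3 only
rng = np.random.default_rng(0)
n = 200
candidates = {f"x{i}": rng.normal(size=n) for i in range(1, 5)}
X_forced = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + covariate
y = 2.0 * candidates["x1"] - 1.5 * candidates["x3"] + rng.normal(size=n)
chosen = stepwise(X_forced, candidates, y)  # should include "x1" and "x3"
```

Note that the normal-approximation p-value, `1 - erf(|t|/sqrt(2))`, is equivalent to the usual `2 * (1 - Phi(|t|))`; with hundreds of observations the difference from exact t-distribution p-values is negligible for this illustration.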