There are increasing concerns that exposure to unmetabolized folic acid, which results from folic acid intakes that overwhelm the liver's metabolic capacity, may be associated with adverse effects. In this paper, we examined the folic acid status of women of reproductive age in relation to dietary intake and the effect of folic acid supplementation (1.1 mg or 5 mg). Plasma unmetabolized folic acid was not significantly correlated with folate intake estimated by food frequency questionnaire or biomarkers. The proportion of women with detectable levels of unmetabolized folic acid increased from 65% to 100% after twelve weeks of supplementation (P < 0.05); however, the increase in concentrations did not reach statistical significance and the effect was not sustained. Moreover, there were no significant differences between the two doses. This suggests that there are mechanisms by which the body adapts to high folic acid intakes to limit exposure to unmetabolized folic acid.
This study examined trends in prescription drug abuse and dependence (sedatives, tranquilizers, opioids, and stimulants), their co-occurrence with other substance use disorders, and substance abuse treatment utilization among those with diagnoses of prescription drug abuse or dependence, in two large, nationally representative, independent samples of adults in the United States in 1991–1992 and 2001–2002.
Two nationally representative cross-sectional samples of civilian noninstitutionalized adults 18 years or older in the United States, of which 52% were women. Data were collected from structured diagnostic interviews using the NIAAA Alcohol Use Disorder and Associated Disabilities Interview Schedule: Diagnostic and Statistical Manual version IV (DSM-IV). National prevalence estimates were derived from the 1991–1992 National Longitudinal Alcohol Epidemiologic Survey (N = 42,862) and the 2001–2002 National Epidemiologic Survey on Alcohol and Related Conditions (N = 43,093).
The past-year prevalence of prescription sedative abuse, sedative dependence, opioid abuse, and opioid dependence increased from 1991–1992 to 2001–2002. The majority of individuals with past-year sedative (56.8%), tranquilizer (89.0%), stimulant (67.9%) and opioid (74.2%) use disorders also met DSM-IV criteria for an additional past-year substance use disorder. The co-occurrence of several forms of prescription drug use disorders and other substance use disorders increased from 1991–1992 to 2001–2002. A minority of individuals with past-year prescription drug abuse and approximately one-half of those with past-year prescription drug dependence utilized substance abuse treatment.
The increases in prescription drug abuse and dependence, their high co-occurrence with other substance use disorders, and the underutilization of substance abuse treatment services reinforce the importance of continued national monitoring.
Epidemiology; DSM-IV; substance use disorders; prescription drugs; substance abuse treatment utilization
During the early 1990s in the U.S., changes to the provision and financing of alcohol treatment services included reductions in inpatient treatment services and in private sector spending for treatment. We investigated trends in alcohol services utilization over the 10-year period from 1991-1992 to 2001-2002 among U.S. Whites, Blacks and Hispanics.
Data come from two household surveys of the U.S. adult population. The 1991-1992 National Longitudinal Alcohol Epidemiologic Survey (NLAES) and the 2001-2002 National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) conducted face-to-face interviews with a multistage cluster sample of individuals 18 years of age and older in the continental United States. Treatment utilization comprised both total utilization and the use of alcohol services. Analyses consisted of prevalence estimates and multivariate logistic regressions of lifetime utilization among drinkers and among individuals with alcohol use disorders (AUD).
From 1991-1992 to 2001-2002, drinking-related emergency room and human services use increased for drinkers, while total utilization and the use of private health professional services and mutual aid decreased for individuals with AUDs. Among both drinkers and individuals with AUDs, Blacks and Hispanics were less likely than Whites to use private health professional care. Hispanics with AUDs were less likely than Whites with AUDs to use alcohol or drug programs. Ethnicity interacted with alcohol severity in predicting alcohol services utilization: at higher levels of alcohol severity, Blacks and Hispanics were less likely than Whites to ever use treatment and to use alcohol services (i.e., human services for Hispanic drinkers, mental health services for Blacks with AUDs, and mutual aid for Hispanics with AUDs).
Our findings showed increases from 1991-1992 to 2001-2002 in alcohol services utilization for drinkers, but reductions in utilization for individuals with AUDs. Blacks and Hispanics, particularly those at higher levels of alcohol severity, underutilized treatment services compared to Whites. These utilization trends for Blacks and Hispanics may reflect underlying disparities in health care access for minority groups, and language and logistical barriers to utilizing services.
Alcohol Services; Treatment Utilization; Ethnicity; National Trends
The present study examined the associations between early onset of non-medical use of prescription drugs (NMUPD) (i.e. sedatives, tranquilizers, opioids, stimulants) and the development of prescription drug abuse and dependence in the United States.
Data were collected from structured diagnostic interviews using the National Institute on Alcohol Abuse and Alcoholism (NIAAA) Alcohol Use Disorder and Associated Disabilities Interview Schedule: Diagnostic and Statistical Manual version IV (DSM-IV).
National prevalence estimates were derived from the 2001–2002 National Epidemiologic Survey on Alcohol and Related Conditions (NESARC, n = 43 093).
A nationally representative cross-sectional sample of civilian non-institutionalized adults aged 18 years or older in the United States, of whom 52% were women, 71% white, 12% Hispanic, 11% African American, 4% Asian and 2% Native American or of other racial background.
A higher percentage of individuals who began using prescription drugs non-medically at or before 13 years of age were found to have developed prescription drug abuse and dependence versus those individuals who began using at or after 21 years of age. Multivariate logistic regression analyses indicated that the odds of developing any life-time prescription drug abuse among non-medical users was reduced by approximately 5% with each year non-medical use was delayed [adjusted odds ratio (AOR) = 0.95, 95% CI = 0.94, 0.97], and that the odds of developing any life-time prescription drug dependence were reduced by about 2% with each year onset was delayed (AOR = 0.98, 95% CI = 0.96, 1.00) when controlling for relevant covariates.
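As a rough illustration (not part of the study's analysis), a per-year adjusted odds ratio can be compounded to gauge the implied effect of a multi-year delay in onset. The sketch below assumes the reported AOR of 0.95 per year of delay multiplies across years, a simplifying assumption.

```python
# Hypothetical illustration of how a per-year adjusted odds ratio (AOR)
# compounds over a multi-year delay in onset of non-medical use.
# AOR = 0.95 per year is the abuse estimate reported above; the
# multiplicative-across-years assumption is ours, not the study's.

def compounded_odds_ratio(per_year_aor: float, years_delayed: int) -> float:
    """Implied odds ratio after delaying onset by `years_delayed` years,
    assuming the per-year effect multiplies across years."""
    return per_year_aor ** years_delayed

# Delaying onset from age 13 to age 21 (8 years) under AOR = 0.95/year:
print(round(compounded_odds_ratio(0.95, 8), 2))  # → 0.66, i.e. ~34% lower odds
```

This is only a back-of-the-envelope reading of the reported coefficient; the model in the study adjusts for covariates that a simple power calculation ignores.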
The results of this study indicate that early onset of NMUPD was a significant predictor of prescription drug abuse and dependence. These findings reinforce the importance of developing prevention efforts to reduce NMUPD and diversion of prescription drugs among children and adolescents.
DSM-IV drug use disorders; non-medical use; prescription drug abuse; prescription drug dependence; prescription drug initiation
In 2001, the U.S. government’s Healthy People 2010 initiative set a goal of reducing contraceptive failure during the first year of use from 13% in 1995 to 7% by 2010. We provide updated estimates of contraceptive failure for the most commonly used reversible methods in the United States, as well as an assessment of changes in failure rates from 1995 to 2002.
Estimates are obtained using the 2002 National Survey of Family Growth (NSFG), a nationally representative sample of U.S. women containing information on their characteristics, pregnancies, and contraceptive use. We also use the 2001 Abortion Patient Survey to correct for underreporting of abortion in the NSFG. We measure trends in contraceptive failure between 1995 and 2002, provide new estimates for several population subgroups, examine changes in subgroup differences since 1995, and identify socioeconomic characteristics associated with elevated risks of failure for three commonly used reversible contraceptive methods in the U.S.: the pill, male condom and withdrawal.
In 2002, 12.4% of all episodes of contraceptive use ended with a failure within 12 months after initiation of use. Injectable and oral contraceptives remain the most effective reversible methods used by women in the U.S., with probabilities of failure during the first 12 months of use of 7% and 9%, respectively. The probabilities of failure for withdrawal (18%) and the condom (17%) are similar. Reliance on fertility-awareness-based methods results in the highest probability of failure (25%). Population subgroups experience different probabilities of failure, but the characteristics of users that may predict elevated risks are not the same for all methods.
There was no clear improvement in contraceptive effectiveness between 1995 and 2002. Failure rates remain high for users of the condom, withdrawal and fertility-awareness methods, but for all methods, the risk of failure is greatly affected by socioeconomic characteristics of the users.
Contraception; Contraceptive Methods; Contraceptive Effectiveness; Contraceptive Failure; National Survey of Family Growth; NSFG
Serum folic acid tests are routinely ordered by physicians for evaluating anemia and sometimes ordered for evaluating dementia and altered mental status.
To determine the utility of routine folic acid testing for patients with anemia or dementia/altered mental status in the era of folic acid fortification.
Retrospective analysis of consecutive folic acid tests performed on adults over a 4-month period; chart review of patients without anemia.
Measurements and Main Results
Serum folic acid level, mean corpuscular volume (MCV), and hematocrit. We reviewed 1,007 folic acid tests performed on 980 patients. The average age was 63.8 years, and 62% of the tests were from outpatient facilities. Only 4 (0.4%) patients had folic acid levels <3 ng/mL, while 10 (1%) patients had levels of 3–4 ng/mL (borderline). Thirty-five percent of the folic acid tests were performed on patients who were not anemic; most of these were ordered to evaluate dementia or altered mental status and folic acid level was normal in all these patients. Only 7% of the patients tested had a macrocytic anemia; these patients were more likely than those without macrocytic anemia to have low folic acid levels (2.8% vs 0.4%, p < .03).
Low serum folic acid levels were rarely detected in a series of patients being evaluated for anemia, dementia, or altered mental status. The test should be reserved for patients with macrocytic anemia and those at high risk for folic acid deficiency.
folic acid, anemia, dementia
The implementation of folic acid fortification in the United States has resulted in unprecedented amounts of this synthetic form of folate in the American diet. Folic acid in circulation may be a useful measure of physiologic exposure to synthetic folic acid, and there is a potential for elevated concentrations after fortification and the possibility of adverse effects.
We assessed the effect of folic acid fortification on circulating concentrations of folic acid and 5-methyltetrahydrofolate in the Framingham Offspring Cohort.
This is a cross-sectional study that used plasma samples from fasting subjects before and after fortification. Samples were measured for folate distribution with the use of an affinity-HPLC method with electrochemical detection.
Among nonsupplement users, the median concentration of folic acid in plasma increased from 0.25 to 0.50 nmol/L (P < 0.001) after fortification, and among supplement users the median increased from 0.54 to 0.68 nmol/L (P = 0.001). Among nonsupplement users, the prevalence of high circulating folic acid (≥85th percentile) increased from 9.4% to 19.1% (P = 0.002) after fortification. Among supplement users, the prevalence of high circulating folic acid increased from 15.9% to 24.3% (P = 0.02). Folic acid intake and total plasma folate were positively and significantly related to high circulating folic acid after adjustment for potential confounding factors (P for trend < 0.001).
Folic acid fortification has resulted in increased exposure to circulating folic acid. The biochemical and physiologic consequences of this are unknown, but these findings highlight the need to understand the effects of chronic exposure to circulating folic acid.
This study investigated the link between physical pain and non-medical prescription analgesic use (NMPAU), as well as the degree to which this association may vary with the presence of psychiatric and substance use disorders. Data were from the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC), a nationally representative, in-person probability sample of adults (n=43,093) aged 18 or older in the United States (2001–2002). Face-to-face interviews were used to gather information on past-year levels of physical pain (i.e., low, medium, high), in addition to DSM-IV classifications for mood, anxiety, substance use problems (i.e., abuse and/or dependence), and personality disorders. Within the analytic sample of those with valid data (n=42,734), the past-year rate of NMPAU was 1.8%, of whom 20% met DSM-IV criteria for abuse/dependence. Among past-year NMPAUs, 53% reported incidental use (e.g., less than monthly), but daily use was substantial (13% of NMPAUs). After accounting for the target confounding factors, pain was positively associated (p<.05) with an increased probability of both non-disordered (i.e., no abuse and/or dependence) and disordered (i.e., abuse and/or dependence) NMPAU in the past year. Within each level of pain, the odds of past-year non-disordered and disordered NMPAU were significantly higher (p<.05) for those with disordered alcohol use compared with non-disordered users. This pattern was similar for illicit drugs, although marginally significant (p=.060) and specific to disordered NMPAU. In contrast, psychiatric disorders increased the probability of both types of NMPAU, but these associations did not differ by level of pain. These findings suggest that pain is an independent risk factor for non-disordered and disordered NMPAU, yet its effects are substantially modified by patterns of substance use.
Prescription drug abuse; opioids; pain reliever; psychiatric disorders; substance use disorders
Background: Folic acid (FA) fortification of food created the need to determine whether fortification elevated concentrations of unmetabolized FA in plasma and whether this form of the vitamin in blood is associated with adverse health outcomes.
Objective: The objective of this research was to devise a simple, rapid method for the measurement of unmetabolized plasma FA in epidemiologic studies.
Design: We previously used the affinity/HPLC with electrochemical detection method to measure folate distribution in human plasma and red blood cells (RBCs). We modified this method with the inclusion of synthetic ethyltetrahydrofolate as an internal standard and with the use of 2 affinity columns connected in parallel to the analytic column through a switching valve to allow one column to be loaded while the other column was eluted into the analytic column.
Results: We identified FA and 5-methyltetrahydrofolate (5-mTHF) by retention time and characteristic response across the channels of the electrochemical detector. Limits of detection were 0.034 pmol for 5-mTHF and 0.027 pmol for FA per injection, and the recovery was 92.2% (5-mTHF) and 98.9% (FA). CVs for samples were 8.1% (within day) and 6.8% (between day) for 5-mTHF and 3.2% (within day) and 5.9% (between day) for FA. Total folate with the use of this method correlated highly (r2 = 0.98, P < 0.001) with values from the microbial assay. The run time for the method was 30 min per sample. Researchers can use this method with longer run times to measure the distribution of folate forms in RBCs.
Conclusion: This updated method allows efficient analysis of folate forms in human plasma and tissues without the loss of sensitivity or precision.
Iron supplementation programs have been a major strategy for reducing iron deficiency anemia in pregnancy. However, only a few countries have reported improvements in anemia rates at a national level. Strategies for controlling nutrition problems need regular review to maintain and improve their effectiveness. The objective of this study was to analyze factors affecting compliance with iron tablet intake in rural Vietnamese communes where daily doses of iron (60 mg) and folic acid (400 μg) were distributed.
A cross-sectional survey was conducted in Nghe An province, Vietnam in January 2003. The study population was adult women aged less than 35 years who delivered babies between August 1st 2001 and December 1st 2002 (n = 205), of whom 159 took part in the study. Data for the study were collected from a series of workshops with community leaders, focus group discussions with community members, and a questionnaire survey.
Improvement in the anemia rate was not given high priority among the commune's needs, but the participants still made efforts to continue taking iron tablets. Two major factors motivated the participants to continue: their experience of fewer spells of dizziness (50%), and their concern for the health of their newborn baby (54%). When examining the reasons for taking iron tablets for at least 5–9 months, the most important factor was 'a frequent supply of iron tablets' (OR = 11.93, 95% CI: 4.33–32.85).
The study found that multiple poor environmental risk factors discouraged women from taking iron tablets continuously. The availability (frequent supply) of iron tablets was the most effective way to help adult women to continue taking iron tablets.
All flour in the USA is fortified with folic acid at a level of 140 μg/100 g, which is estimated to supply an extra 100 μg daily to the average diet. Some researchers have advocated increasing this to two or even four times that amount. Based on previous research, these higher levels are likely to lead to the appearance of unmetabolised vitamin in the circulation, which may have safety implications for sub-groups of the population. The UK and the Republic of Ireland are also likely to introduce mandatory fortification in the next year or so.
The aim of this study was to capture the short-term effect of folic acid fortification on unmetabolised folic acid in serum after chronic consumption of folic acid.
After pre-saturation with 400 μg folic acid supplements daily for 14 weeks, healthy folate-replete adults (n = 20) consumed folic acid-fortified bread at three different levels (400 μg, 200 μg, 100 μg) over a period of one week each. The dose was administered in two equal-sized slices consumed at 09.00 hrs and 13.00 hrs. Serum samples for total folate and folic acid were collected at baseline, after 14 weeks of supplementation, and before and after (at 1, 2, 3 and 4 hours) each dose tested.
Unmetabolised folic acid was detected after the 14-week supplementation period. Folic acid was not detected at either the 200 μg or the 100 μg (current US regime) dose tested, but was present at the highest dose (400 μg) tested.
Our findings suggest that persons exposed to the current US fortification programme, which supplies an average of 100 μg per day or less, are unlikely to have unmetabolised folic acid in serum. Daily consumption at the higher level of 200 μg or less also seems unlikely to be problematic. Increasing the level to 400 μg, however, is likely to lead to the appearance of unmetabolised folic acid.
To assess the prevalence and clinical impact of comorbid Social Anxiety Disorder (SAD) and Alcohol Use Disorders (AUD, i.e., alcohol abuse and alcohol dependence) in a nationally representative sample of adults in the United States.
Data came from a large representative sample of the United States population. Face-to-face interviews of 43,093 adults residing in households were conducted during 2001–2002. Diagnoses of mood, anxiety, alcohol and drug use disorders, and personality disorders were based on the Alcohol Use Disorder and Associated Disabilities Interview Schedule—DSM-IV Version.
Lifetime prevalence of comorbid AUD and SAD in the general population was 2.4%. SAD was associated with significantly increased rates of alcohol dependence (OR=2.8) and alcohol abuse (OR=1.2). Among respondents with alcohol dependence, SAD was associated with significantly more mood, anxiety, psychotic, and personality disorders. Among respondents with SAD, alcohol dependence and abuse were most strongly associated with more substance use disorders, pathological gambling, and antisocial personality disorders. SAD occurred before alcohol dependence in 79.7% of comorbid cases, but comorbidity status did not influence age of onset for either disorder. Comorbid SAD was associated with increased severity of alcohol dependence and abuse. Respondents with comorbid SAD and alcohol dependence or abuse reported low rates of treatment-seeking.
Comorbid lifetime AUD and SAD is a prevalent dual diagnosis, associated with substantial rates of additional comorbidity, but remaining largely untreated. Future research should clarify the etiology of this comorbid presentation to better identify effective means of intervention.
Rates of neural tube defects have decreased since folic acid fortification of the food supply in the United States. The authors’ objective was to evaluate the associations between neural tube defects and maternal folic acid intake among pregnancies conceived after fortification. This is a multicenter, case-control study that uses data from the National Birth Defects Prevention Study, 1998–2003. Logistic regression was used to compute crude and adjusted odds ratios between cases and controls assessing maternal periconceptional use of folic acid and intake of dietary folic acid. Among 180 anencephalic cases, 385 spina bifida cases, and 3,963 controls, 21.1%, 25.2%, and 26.1%, respectively, reported periconceptional use of folic acid supplements. Periconceptional supplement use did not reduce the risk of having a pregnancy affected by a neural tube defect. Maternal intake of dietary folate was not significantly associated with neural tube defects. In this study conducted among pregnancies conceived after mandatory folic acid fortification, the authors found little evidence of an association between neural tube defects and maternal folic acid intake. A possible explanation is that folic acid fortification reduced the occurrence of folic acid-sensitive neural tube defects. Further investigation is warranted to possibly identify women who remain at increased risk of preventable neural tube defects.
folic acid; neural tube defects
Objective: To summarize changes in folic acid awareness, knowledge, and behavior among women of childbearing age in the United States since the U.S. Public Health Service (USPHS) 1992 folic acid recommendation and subsequent fortification. Methods: Random-digit-dialed telephone surveys of approximately 2000 women (per survey year) aged 18–45 years were conducted from 1995 to 2005 in the United States. Results: The percentage of women reporting having heard or read about folic acid steadily increased from 52% in 1995 to 84% in 2005. Of all women surveyed in 2005, 19% knew folic acid prevented birth defects, an increase from 4% in 1995. The proportion of women who reported learning about folic acid from health care providers increased from 13% in 1995 to 26% in 2005. The proportion of all women who reported taking a vitamin supplement containing folic acid increased slightly from 28% in 1995 to 33% in 2005. Among women who were not pregnant at the time of the survey in 2005, 31% reported taking a vitamin containing folic acid daily compared with 25% in 1995. Conclusions: The percentage of women taking folic acid daily has increased modestly since 1995. Despite this increase, the data show that the majority of women of childbearing age still do not take a vitamin containing folic acid daily. Health care providers and maternal child health professionals must continue to promote preconceptional health among all women of childbearing age, and encourage them to take a vitamin containing folic acid daily.
Multivitamin use; Folic acid consumption; Childbearing age women
This study examined effects of the type and cumulative burden of childhood adversities on bullying and cruelty to animals in the United States. Data were derived from Waves I and II of the National Epidemiologic Survey on Alcohol and Related Conditions, a nationally representative sample of U.S. adults. Structured psychiatric interviews were completed by trained lay interviewers between 2001–2002 and 2003–2004. Although the effects of childhood adversity diminished with the inclusion of confounding variables, several adversities remained significant. For bullying, these included being made to do chores that were too difficult or dangerous; threats to hit or throw something; pushing, shoving, slapping, or hitting; and hitting that left bruises, marks, or injuries. With respect to cruelty to animals, swearing and saying hurtful things, having a parent or other adult living within the home who went to jail or prison, and being fondled or touched in a sexual way by an adult or other person were significant. The final models indicated that the cumulative burden of childhood adversities had strong effects on the increased likelihood of bullying behavior but not cruelty to animals.
aggression; bullying; animal cruelty; child abuse and neglect; childhood risk; violence
We examined the association between substance use disorders and migration to the United States in a nationally representative sample of the Mexican population.
We used the World Mental Health version of the Composite International Diagnostic Interview to conduct structured, computer-assisted, face-to-face interviews with a cross-sectional sample of household residents aged 18 to 65 years who lived in Mexico in cities with a population of at least 2500 people in 2001 and 2002. The response rate was 76.6%, with 5826 respondents interviewed.
Respondents who had migrated to the United States and respondents who had family members in the United States were more likely than other Mexicans to have used alcohol, marijuana, or cocaine at least once in their lifetime; to have developed a substance use disorder; and to have a current (in the past 12 months) substance use disorder.
International migration appears to play a large role in transforming substance use norms and pathology in Mexico. Future studies should examine how networks extending over international boundaries influence substance use.
The overall prevalence of complementary medicine (CM) use among adults in the United States with diabetes has been examined both in representative national samples and in more restricted populations. However, none of these earlier studies attempted to identify predictors of CM use to treat diabetes among the populations sampled, nor did they examine the relationship between CM use and diabetes severity.
Combining data from the 2002 and 2007 National Health Interview Survey (NHIS), we constructed a nationally representative sample of 3,978 U.S. adults aged ≥18 years with self-reported diabetes. Both the 2002 and 2007 NHIS contained extensive questions on the use of CM. We used logistic regression to examine the association between diabetes severity and overall CM use, as well as the use of specific categories of CM.
In adults with type-2 diabetes, 30.9% used CM for any reason, but only 3.4% used CM to treat or manage their type-2 diabetes versus 7.1% of those with type-1 diabetes. Among those using CM to treat/manage their type-2 diabetes, 77% used both CM and conventional prescription medicine for their diabetes. The most prevalent types of CM therapies used were diet-based interventions (35.19%, S.E. 5.11) and non-vitamin/non-mineral dietary supplements (33.74%, S.E. 5.07). After controlling for sociodemographic factors, we found that, based on a count of measures of diabetes severity, persons with the most severe diabetes had nearly twice the odds of using CM as those with less severe disease (OR=1.9, 95%CI 1.2-3.01). Persons who had had diabetes for 10 years or more (OR=1.66, 95%CI 1.04-3.66) and those who had a functional limitation resulting from their diabetes (OR=1.74, 95%CI 1.09-2.8) had greater odds of using CM than those not reporting these measures. No significant associations were observed between overall CM use and other individual measures of diabetes severity: use of diabetic medications, weak or failing kidneys, coronary heart disease, or severe vision problems.
Our results demonstrate that individuals with more severe diabetes are more likely to use CM independent of sociodemographic factors. Further studies are essential to determine if CM therapies actually improve clinical outcomes when used to treat/manage diabetes.
Complementary medicine; Diabetes; Disease severity; Logistic regression; Survey
The purpose of this study was to determine if six weeks of folic acid supplementation would improve brachial artery endothelial-dependent flow-mediated dilation in eumenorrheic female runners with previously normal serum folate levels. This was a prospective, double-blinded, randomized pilot study with convenience sampling. Sixteen eumenorrheic subjects who were not taking birth control pills and who ran at least 20 miles/week were randomly assigned to 10 mg/day of folic acid supplementation or placebo for at least 6 weeks. Serum folate levels and brachial artery measurements were made during the early follicular phase of the menstrual cycle, in a sedentary state, following an 8-hour fast; a standard ultrasound technique was used. The brachial artery vasodilator response to reactive hyperemia was similar at baseline between the folic acid (6.6% ± 0.8%, mean ± SE) and placebo groups (6.5% ± 0.7%). After six weeks, the change in flow-mediated dilation was significantly greater in the folic acid group (3.5% ± 0.6%) than in the placebo group (0.1% ± 0.2%) (p = 0.01). Serum folate levels also increased significantly in the folic acid group following six weeks of folic acid supplementation. This study demonstrates that brachial artery flow-mediated dilation improves significantly in eumenorrheic female runners with previously normal serum folate levels after 6 weeks of supplementation with folic acid.
Folic acid improves FMD in eumenorrheic runners.
Premenopause; regular menstruation; endothelial function; folate; flow-mediated vasodilation
Isolated renal tubule fragments prepared from adult Sprague-Dawley rats were used to study the cellular uptake of hypoxanthine. This uptake was rapid, reaching a steady state after 30 min of incubation. Analysis of the intracellular pool during the initial uptake and at the steady state revealed a concentration gradient of hypoxanthine consistent with active transport, although only one-third of the transported hypoxanthine remained unmetabolized. The remainder of the transported hypoxanthine was converted to inosine and inosinic acid, but detectable conversion to uric acid was not noted. A kinetic analysis of uptake revealed that two systems for cellular entry of hypoxanthine existed with Km1 = 0.005 and Km2 = 0.80 mM. Hypoxanthine uptake at physiologic concentrations was oxygen, sodium, and temperature dependent, but the addition of metabolic fuels and alteration of the medium pH over the range of from 6.1 to 7.4 had no effect. Adenine, guanine, and inosine inhibited the uptake of hypoxanthine via the low-Km system which mediates the majority of uptake at physiologic levels. Xanthine, uric acid, and probenecid inhibited uptake via the high-Km system, but did not affect uptake via the low-Km system. The data indicate that hypoxanthine at physiologic levels is transported into the renal tubule cell via a system different from that for other oxypurines.
This study aimed to assess the contribution of postnatal services to the risk of neonatal mortality, and the relative contributions of antenatal iron/folic acid supplements and postnatal care in preventing neonatal mortality in Indonesia.
Retrospective cohort study.
Setting and participants
Data used in this study were from the 2002–2007 Indonesia Demographic and Health Surveys, which are nationally representative surveys. The pooled data provided survival information for the 26 591 most recent live-born infants born in the 5 years prior to each interview.
Primary outcomes were early neonatal mortality, that is, deaths in the first week, and all neonatal mortality, that is, deaths in the first month of life. Exposures were antenatal iron/folic acid supplementation and postnatal care from days 1 to 7. Potential confounders were community factors, socio-economic status, birthing characteristics, and perinatal healthcare. Cox regression was used to assess the association between study factors and neonatal mortality.
Postnatal care services were not associated with newborn survival. Postnatal care on days 1–7 after birth did not reduce neonatal death (HR=1.00, 95% CI 0.55 to 1.83, p=1.00), and early postnatal care on day 1 was associated with a non-significant increase in the risk of early neonatal death (HR=1.27, 95% CI 0.69 to 2.32, p=0.44), possibly reflecting referral of ill newborns. Early postnatal care on day 1 was not protective against neonatal death on days 2–7, whether provided by doctors (HR=3.61, 95% CI 1.54 to 8.45, p<0.01) or by midwives or nurses (HR=1.38, 95% CI 0.53 to 3.57, p=0.51). In mothers who took iron/folic acid supplements during pregnancy, the risk of early neonatal death was reduced by 51% (HR=0.49, 95% CI 0.30 to 0.79, p<0.01).
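The pattern of significant and non-significant hazard ratios above can be read off the confidence intervals directly: by convention, a 95% CI that spans 1.0 indicates no significant association at the 5% level, and an HR below 1.0 converts to a percent risk reduction as 1 minus the HR. A minimal sketch of that reading, using the HRs and CIs reported in this study:

```python
# Interpreting the reported hazard ratios: a 95% CI that excludes the
# null value of 1.0 is conventionally significant at the 5% level.

def ci_excludes_null(ci_low, ci_high, null=1.0):
    """True when the confidence interval excludes the null hazard ratio."""
    return not (ci_low <= null <= ci_high)

# Early postnatal care on day 1 vs. early neonatal death: HR=1.27 (0.69-2.32)
print(ci_excludes_null(0.69, 2.32))   # False: CI spans 1.0, not significant
# Antenatal iron/folic acid vs. early neonatal death: HR=0.49 (0.30-0.79)
print(ci_excludes_null(0.30, 0.79))   # True: significant
# An HR of 0.49 corresponds to the quoted 51% reduction in risk:
print(f"risk reduction: {1 - 0.49:.0%}")
```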
We found no protective effect of postnatal care against neonatal deaths in Indonesia. However, important reductions in the risk of neonatal death were found for women who reported use of antenatal iron/folic acid supplements during pregnancy.
Epidemiology; Perinatology; Public Health
Objectives To compare diabetes management in adults between England and the United States, particularly focusing on the impact of a universal access health insurance system.
Design Analysis of the nationally representative surveys Health Survey for England, 2003 (unweighted n = 14 057) and the National Health and Nutrition Examination Survey, 2001–2002 (unweighted n = 5411).
Setting and participants Adults 20–64 years of age; individuals aged >65 years.
Main outcome measures Glycaemic, lipid and blood pressure control and medication use among individuals with previously diagnosed diabetes.
Results Among those aged 20–64, the prevalence of diagnosed diabetes was lower in England (2.7%) than in the USA (5.0%). The proportion with diabetes receiving treatment was similar in the two countries. However, mean HbA1c in England was 7.6%; in the USA it was 7.5% for those with insurance and 8.6% for those without insurance. The proportion of individuals on ACE inhibitors in England was 39%; in the USA it was 39% for those with insurance and 14% for those without.
Conclusions Individuals in a healthcare system providing universal access have better-managed diabetes than those in a market-based system, once insurance status is taken into account.
The Institute of Medicine set a tolerable upper intake level (UL) for usual daily total folic acid intake (1,000 µg). Less than 3% of US adults currently exceed the UL.
The objective of this study was to determine if folic acid fortification of corn masa flour would increase the percentage of the US population who exceed the UL.
We used dietary intake data from NHANES 2001–2008 to estimate the percentage of adults and children who would exceed the UL if corn masa flour were fortified at 140 µg of folic acid/100 g.
In 2001–2008, 2.5% of the US adult population (aged ≥19 years) exceeded the UL, which could increase to 2.6% if fortification of corn masa flour occurred. With corn masa flour fortification, percentage point increases were small and not statistically significant for US adults exceeding the UL regardless of supplement use, sex, race/ethnicity, or age. Children aged 1–8 years, specifically supplement users, were the most likely to exceed their age-specific UL. With fortification of corn masa flour, there were no statistically significant increases in the percentage of US children exceeding their age-specific UL, and the percentage point increases were small.
Our results suggest that fortification of corn masa flour would not significantly increase the percentage of individuals who would exceed the UL. Supplement use was the main factor related to exceeding the UL with or without fortification of corn masa flour and within all strata of sex, race/ethnicity, and age group.
Folic acid; fortification; corn masa flour; tolerable upper intake level
To examine a potential association between biologically confirmed secondhand smoke exposure and symptoms of Diagnostic and Statistical Manual of Mental Disorders (Fourth Edition) (DSM-IV) major depressive disorder, generalized anxiety disorder, panic disorder, attention-deficit/hyperactivity disorder, and conduct disorder using a nationally representative sample of US children and adolescents.
Nationally representative cross-sectional survey of the United States.
Continental United States.
Children and adolescents aged 8 to 15 years who participated in the National Health and Nutrition Examination Survey from 2001 to 2004.
Measurement of serum cotinine level to assess secondhand smoke exposure among nonsmokers.
Main Outcome Measures
The DSM-IV symptoms were derived from selected modules of the National Institute of Mental Health's Diagnostic Interview Schedule for Children Version IV, a structured diagnostic interview administered by trained lay interviewers.
Among nonsmokers, serum cotinine level was positively associated with symptoms of DSM-IV major depressive disorder, generalized anxiety disorder, attention-deficit/hyperactivity disorder, and conduct disorder after adjusting for survey design, age, sex, race/ethnicity, poverty, migraine, asthma, hay fever, maternal smoking during pregnancy, and allostatic load. Associations with serum cotinine level were more apparent for boys and for participants of non-Hispanic white race/ethnicity.
Our results are consistent with a growing body of research documenting an association between secondhand smoke exposure and mental health outcomes. Future research is warranted to establish the biological or psychological mechanisms of association.
Whether lifetime abstainers are maladjusted or well-adjusted is unresolved. The aim of this study was to compare abstainers (defined as persons with no lifetime use of alcohol or other drugs and no engagement in antisocial or delinquent behavior) with non-abstainers across a range of sociodemographic and mental health characteristics in the United States.
Data were derived from the National Epidemiologic Survey on Alcohol and Related Conditions, a nationally representative sample of U.S. adults. Structured psychiatric interviews (N = 43,093) using the Alcohol Use Disorder and Associated Disabilities Interview Schedule – DSM-IV version (AUDADIS-IV) were completed by trained lay interviewers between 2001 and 2002.
The prevalence of abstaining was 11%. Abstainers were significantly more likely to be female, Asian or African-American, and born outside the U.S., and less likely to be unemployed. Multivariate logistic regression analyses revealed that abstainers were significantly less likely than non-abstainers to evidence a lifetime mood, anxiety, or personality disorder.
Findings indicate that abstainers are not maladjusted and are comparatively more functional than non-abstainers.
Abstainers; delinquency; alcohol; drug use; externalizing behaviors
Growing evidence supports the validity of distinguishing major depressive disorder (MDD) plus a lifetime history of subthreshold hypomania (D(m)) from pure MDD in psychiatric classifications. The present study sought to estimate the proportion of individuals with D(m) who would have been included in randomized controlled trials (RCTs) for MDD using typical eligibility criteria, and to examine the potential impact of including these participants on internal validity.
Data were derived from the 2001–2002 National Epidemiologic Survey on Alcohol and Related Conditions (NESARC), a nationally representative sample of 43,093 adults in the United States. We examined the proportion of participants with a current diagnosis of pure MDD or D(m) who would have been eligible for clinical trials of MDD under a traditional set of eligibility criteria, and compared it with that of participants with bipolar II disorder if the same set of eligibility criteria were applied. We considered four models with different definitions of subthreshold hypomania.
We found that more than seven out of ten participants with pure MDD or D(m) would have been excluded by at least one classical eligibility criterion. The prevalence of individuals with D(m) in RCTs for MDD applying traditional eligibility criteria would have ranged from 7.98% to 22.59%. The overall exclusion rate of individuals with MDD plus at least 4 lifetime concomitant hypomanic probes differed significantly from that of individuals with pure MDD, whereas in those with at least 2 lifetime concomitant hypomanic probes it did not differ significantly from that of individuals with bipolar II disorder.
The current design of clinical trials for MDD may suffer from impaired external validity and potentially impaired internal validity, owing to the inclusion of a substantial proportion of individuals with subthreshold hypomania whose pattern of exclusion rates resembles that of individuals with bipolar II disorder, possibly resulting in a selection bias.