To assess the utility of the Cogstate self-administered computerized neuropsychological battery in a large population of older men.
We invited 7,167 men (mean age: 75 years) from the Health Professionals Follow-up Study, a prospective cohort of male health professionals. We considered individual Cogstate scores and composite scores measuring psychomotor speed and attention, learning and working memory, and overall cognition. Multivariate linear regression was used to assess the association between risk factors measured 4 and 28 years prior to cognitive testing and each outcome.
The 1,866 men who agreed to complete Cogstate testing were similar to the 5,301 non-responders. Many expected risk factors were associated with Cogstate scores in multivariate-adjusted models. Increasing age was significantly associated with worse performance on all outcomes (p < 0.001). For risk factors measured four years prior to testing and overall cognition, a history of hypertension was significantly associated with worse performance (mean difference=−0.08 standard units [95% CI −0.16, 0.00]) and higher nut consumption was significantly associated with better performance (>2 servings/week vs. <1 serving/month: 0.15 [0.03, 0.27]).
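Results here are reported in "standard units", i.e., z-scores, with composite outcomes formed by averaging standardized test scores. A minimal sketch of that construction (test names and numbers are hypothetical, not from the study):

```python
from statistics import mean, pstdev

def zscores(values):
    """Standardize raw test scores to mean 0, SD 1 (population SD)."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s for v in values]

# Hypothetical raw scores for 4 participants on two tests
speed  = [410, 450, 430, 470]   # reaction times in ms (lower = better)
memory = [22, 18, 25, 19]       # items recalled (higher = better)

z_speed  = zscores([-v for v in speed])  # negate so higher = better
z_memory = zscores(memory)

# Composite = per-participant average of z-scores across tests
composite = [(a + b) / 2 for a, b in zip(z_speed, z_memory)]
```

By construction the composite has mean 0 in the sample, so a mean difference of, say, −0.08 standard units is interpreted relative to the sample's spread.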
The self-administered Cogstate battery showed significant associations with several risk factors known to be associated with cognitive function. Future studies of cognitive aging may benefit from the numerous advantages of self-administered computerized testing.
epidemiology; dementia; Alzheimer’s; cognitive function; neuropsychological assessment
To evaluate and compare the associations of microvascular and macrovascular abnormalities with cognitive and physical function.
Cross-sectional analysis of the Cardiovascular Health Study (1998–1999)
2452 participants (mean age 79.5 years) with available data on ≥3 of 5 microvascular abnormalities (brain, retina, kidney) and ≥3 of 6 macrovascular abnormalities (brain, carotid artery, heart, peripheral artery)
Standardized composite scores derived from 3 cognitive tests (Modified Mini-Mental State Examination, digit-symbol substitution test, trail making test) and 3 physical tests (gait speed, grip strength, 5-times sit-to-stands)
Compared with individuals with low microvascular and macrovascular burden, those with high microvascular and macrovascular burden had the worst cognitive function (mean score difference [95% confidence interval]: −0.30 [−0.37, −0.24]) and physical function (−0.32 [−0.38, −0.26]). Individuals with high microvascular burden alone had similarly lower scores as those with high macrovascular burden alone (cognitive function: −0.16 [−0.24, −0.08] versus −0.13 [−0.20, −0.06], respectively; physical function: −0.15 [−0.22, −0.08] versus −0.12 [−0.18, −0.06], respectively). Psychomotor speed and working memory, assessed by trail making test, were only impaired in the presence of high microvascular burden. Of the 11 vascular abnormalities considered, white matter hyperintensity, cystatin C-based glomerular filtration rate, large brain infarct, and ankle-arm index were independently associated with both cognitive and physical function.
Microvascular and macrovascular abnormalities assessed from non-invasive tests of the brain, kidney, and peripheral artery were independently associated with poor cognitive and physical function in older adults. Future research should evaluate the usefulness of these tests in prognostication.
Vascular disease; microvascular disease; cognitive function; physical function
To evaluate the association of infertility and fertility treatments with subsequent risk of hypertension.
Nurses’ Health Study II
116,430 female nurses followed from 1993 to June 2011 as part of the Nurses' Health Study II cohort.
Main Outcome Measures
Self-reported, physician diagnosed hypertension
Compared with women who never reported infertility, infertile women were at no greater risk of hypertension (multivariable-adjusted relative risk [RR] = 1.01; 95% confidence interval [CI] 0.94–1.07). Infertility due to tubal disease was associated with a higher risk of hypertension (RR = 1.15 [1.01–1.31]), but all other diagnoses were not associated with hypertension risk compared with women who did not report infertility (ovulatory disorder: RR = 1.03 [0.94–1.13]; cervical: RR = 0.88 [0.70–1.10]; male factor: RR = 1.05 [0.95–1.15]; other reason: RR = 1.02 [0.94–1.11]; reason not found: RR = 1.02 [0.95–1.10]). Among infertile women, there were 5,070 cases of hypertension. No clear pattern between use of fertility treatment and hypertension was found among infertile women (clomiphene: RR = 0.97 [0.90–1.04]; gonadotropins alone: RR = 0.97 [0.87–1.08]; IUI: RR = 0.86 [0.71–1.03]; IVF: RR = 0.86 [0.73–1.01]).
Among this relatively young cohort of women, there was no apparent increase in hypertension risk among infertile women or among women who underwent fertility treatment in the past.
Assisted Reproduction; Epidemiology; Infertility; IUI; IVF/ICSI Outcome
To investigate whether a positive transition into retirement may be associated with later cognitive ageing, we included a subset of 4,926 Nurses’ Health Study participants who retired from work at ages 60–69, then provided a subjective assessment of the change in overall quality of life (QOL) with retirement. Subsequently (range: 1 month to 4.7 years later), when all were aged 70+ years, they completed a baseline telephone cognitive battery evaluating global cognition, episodic memory and executive function. They had up to three follow-up cognitive assessments. Controlling for various occupational factors before retirement and socioeconomic, lifestyle, and health-related factors as of the baseline cognitive assessment, we used generalized linear models for repeated measures to estimate mean differences in rates of cognitive decline across categories of QOL transition at retirement: “worse”, “same” or “better”. Over a median 6 years of follow-up, the global cognitive score change was −0.123 on average. Compared with women who reported no change in QOL at retirement (31%), women who reported improvement (61%) showed a significantly slower rate of cognitive decline (difference = +0.011; 95% CI 0.004, 0.019). This mean difference was equivalent to that observed between women who were 2 years apart in age. No significant differences in cognitive decline rates were observed for the women who reported worsened QOL (8%). Secondary analyses to address possible reverse causation showed robust associations. A positive transition into retirement was associated with better maintenance of cognitive function over time in aging women. These findings need to be replicated in other populations.
Cognition; Aging; Quality of life; Retirement; Cohort studies; Epidemiology
Memory performance in older persons can reflect genetic influences on cognitive function and dementing processes. We aimed to identify genetic contributions to verbal declarative memory in a community setting.
We conducted genome-wide association studies for paragraph or word list delayed recall in 19 cohorts from the Cohorts for Heart and Aging Research in Genomic Epidemiology consortium, comprising 29,076 dementia- and stroke-free individuals of European descent, aged ≥45 years. Replication of suggestive associations (p < 5 × 10⁻⁶) was sought in 10,617 participants of European descent, 3,811 African-Americans, and 1,561 young adults.
rs4420638, near APOE, was associated with poorer delayed recall performance in discovery (p = 5.57 × 10⁻¹⁰) and replication cohorts (p = 5.65 × 10⁻⁸). This association was stronger for paragraph than word list delayed recall and in the oldest persons. Two associations with specific tests, in subsets of the total sample, reached genome-wide significance in combined analyses of discovery and replication (rs11074779 [HS3ST4], p = 3.11 × 10⁻⁸, and rs6813517 [SPOCK3], p = 2.58 × 10⁻⁸) near genes involved in immune response. A genetic score combining 58 independent suggestive memory risk variants was associated with increasing Alzheimer disease pathology in 725 autopsy samples. Association of memory risk loci with gene expression in 138 human hippocampus samples showed cis-associations with WDR48 and CLDN5, both related to ubiquitin metabolism.
This study, the largest to date exploring the genetics of memory function in ~40,000 older individuals, revealed genome-wide associations and suggested an involvement of immune and ubiquitin pathways.
Alzheimer disease; Dementia; Epidemiology; Genetics; Population-based; Verbal declarative memory
Homozygosity has long been associated with rare, often devastating, Mendelian disorders [1], and Darwin was one of the first to recognise that inbreeding reduces evolutionary fitness [2]. However, the effect of the more distant parental relatedness common in modern human populations is less well understood. Genomic data now allow us to investigate the effects of homozygosity on traits of public health importance by observing contiguous homozygous segments (runs of homozygosity, ROH), which are inferred to be homozygous along their complete length. Given the low levels of genome-wide homozygosity prevalent in most human populations, information is required on very large numbers of people to provide sufficient power [3,4]. Here we use ROH to study 16 health-related quantitative traits in 354,224 individuals from 102 cohorts and find statistically significant associations between summed runs of homozygosity (SROH) and four complex traits: height, forced expiratory lung volume in 1 second (FEV1), general cognitive ability (g) and educational attainment (nominal p < 1 × 10⁻³⁰⁰, 2.1 × 10⁻⁶, 2.5 × 10⁻¹⁰, and 1.8 × 10⁻¹⁰, respectively). In each case increased homozygosity was associated with decreased trait value, equivalent to the offspring of first cousins being 1.2 cm shorter and having 10 months less education. Similar effect sizes were found across four continental groups and populations with different degrees of genome-wide homozygosity, providing convincing evidence for the first time that homozygosity, rather than confounding, directly contributes to phenotypic variance. Contrary to earlier reports in substantially smaller samples [5,6], no evidence was seen of an influence of genome-wide homozygosity on blood pressure and low density lipoprotein (LDL) cholesterol, or ten other cardio-metabolic traits.
Since directional dominance is predicted for traits under directional evolutionary selection [7], this study provides evidence that increased stature and cognitive function have been positively selected in human evolution, whereas many important risk factors for late-onset complex diseases may not have been.
Microvascular and macrovascular abnormalities are frequently found on noninvasive tests performed in older adults. Their prognostic implications on disability and life expectancy have not been collectively assessed.
This prospective study included 2,452 adults (mean age: 79.5 years) with available measures of microvascular (brain, retina, kidney) and macrovascular abnormalities (brain, carotid, coronary, peripheral artery) in the Cardiovascular Health Study. The burden of microvascular and macrovascular abnormalities was examined in relation to total, activity-of-daily-living disability-free, and severe disability-free life expectancies in the next 10 years (1999–2009).
At 75 years, individuals with low burden of both abnormalities lived, on average, 8.71 years (95% confidence interval: 8.29, 9.12) of which 7.67 years (7.16, 8.17) were without disability. In comparison, individuals with high burden of both abnormalities had shortest total life expectancy (6.95 years [6.52, 7.37]; p < .001) and disability-free life expectancy (5.60 years [5.10, 6.11]; p < .001). Although total life expectancy was similarly reduced for those with high burden of either type of abnormalities (microvascular: 7.96 years [7.50, 8.42] vs macrovascular: 8.25 years [7.80, 8.70]; p = .10), microvascular abnormalities seemed to have larger impact than macrovascular abnormalities on disability-free life expectancy (6.45 years [5.90, 6.99] vs 6.96 years [6.43, 7.48]; p = .016). These results were consistent for severe disability-free life expectancy and in individuals without clinical cardiovascular disease.
Considering both microvascular and macrovascular abnormalities from multiple noninvasive tests may provide additional prognostic information on how older adults spend their remaining life. Optimal clinical use of this information remains to be determined.
Cardiovascular; Epidemiology; Longevity; Disablement process.
Subjective cognitive concerns may represent a simple method to assess likelihood of memory decline among APOE ε4 carriers.
We examined the relationship of self-reported subjective cognitive concerns, using 7 specific cognitive concerns, with memory and memory decline over 6 years among APOE ε4 carriers and non-carriers from the Nurses’ Health Study.
In both groups, an increasing subjective cognitive concern score predicted worse baseline memory and faster rates of subsequent memory decline, after adjustment for age, education and depression. The relation with baseline memory appeared statistically stronger in APOE ε4 carriers (p-interaction = 0.03). For memory decline, the mean difference in slopes of episodic memory (95% CI) for 4–7 versus no concerns was −0.05 (−0.10, 0.01) standard units in APOE ε4 carriers and −0.04 (−0.08, −0.01) standard units in non-carriers.
APOE ε4 carriers with self-assessed cognitive concerns appear to have worse memory, and possibly accelerated memory decline.
memory; Apolipoprotein E4; subjective cognitive concerns
Worldwide, over 35 million people suffer from Alzheimer’s disease and related dementias. This number is expected to triple over the next 40 years. How can we improve the evidence supporting strategies to reduce the rate of dementia in future generations? The risk of dementia is likely influenced by modifiable factors such as exercise, cognitive activity, and the clinical management of diabetes and hypertension. However, the quality of evidence is limited and it remains unclear whether specific interventions to reduce these modifiable risk factors can, in turn, reduce the risk of dementia. Although randomized controlled trials are the gold standard for causality, the majority of evidence for long-term dementia prevention derives from, and will likely continue to derive from, observational studies. Although observational research has some unavoidable limitations, its utility for dementia prevention might be improved by, for example, better distinction between confirmatory and exploratory research, higher reporting standards, investment in effectiveness research enabled by increased data-pooling, and standardized exposure and outcome measures. Informed decision-making by the general public on low-risk health choices that could have broad potential benefits could be enabled by internet-based tools and decision-aids to communicate the evidence, its quality, and the estimated magnitude of effect.
Alzheimer’s; primary prevention; non-randomized studies; low-risk; communication
To evaluate use of fertility treatments among a large cohort of women in the United States.
Nurses’ Health Study II
10,036 women who reported having utilized fertility treatment on biennial questionnaires from 1993–2009
Main Outcome Measure(s)
Data on patterns of treatment modality were collected via self-report from validated mailed questionnaires. Information on clomiphene, gonadotropin injections alone, and gonadotropin injections as part of intrauterine insemination (IUI) and in-vitro fertilization (IVF) was queried.
Most women who reported fertility treatment utilized clomiphene (94%), with a large majority reporting clomiphene as their only form of treatment (73%). Of women who reported treatment more advanced than clomiphene, 13% had used gonadotropin injections alone, 11% IUI, and 11% IVF. Several subgroups were more likely to use multiple treatment modalities and to initiate treatment with gonadotropins rather than clomiphene, including women living in states with insurance coverage of fertility procedures, those with higher household income, younger women, those who remained nulliparous at study close, and those treated after 2000.
Results should be interpreted cautiously, but to our knowledge, this represents the first study of fertility treatment patterns in the US, and could inform public health planning.
fertility treatment; IVF; IUI; Clomiphene; Gonadotropins
Introduction and Hypothesis
Previous studies have reported higher prevalence of depression among women with urgency or mixed urinary incontinence (UI) than stress UI. Urgency UI is the dominant UI type among black women, while stress UI is the predominant type among white women. Thus, UI-related mental health issues could be a key consideration among black women. We hypothesized that the association between UI and depression might be stronger in black versus white women.
These cross-sectional analyses included 934 black and 71,161 white women aged 58–83 years in the Nurses’ Health Study, which was established among women living in the USA. Depressive symptoms were assessed using the ten-item Center for Epidemiologic Studies Depression scale (CESD-10). Multivariable-adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for high depressive symptoms (CESD-10 score ≥10) according to self-reported UI frequency, severity, and type were calculated using logistic regression models.
Although point estimates for associations of UI frequency, severity, and type with high depressive symptoms were higher in black women, differences in ORs between black and white women were not statistically significant. For example, the OR for at least weekly UI compared with no UI was 2.29 (95% CI 1.30–4.01) in black women and 1.58 (95% CI 1.49–1.68) in white women (p-interaction = 0.4).
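The ORs above come from multivariable logistic models; for intuition, a crude odds ratio with a Woolf 95% CI can be computed directly from a 2×2 table. A minimal sketch with hypothetical counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical: at-least-weekly UI vs no UI, high depressive symptoms
or_, lo, hi = odds_ratio_ci(a=40, b=160, c=50, d=450)
```

Note the small-group CI is wide even when the point estimate is large, which is why a higher OR in the smaller group can still be statistically indistinguishable from the OR in the larger group.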
We did not find statistically significant differences in associations of UI frequency, severity, and type with high depressive symptoms between black and white women. However, small numbers of black women with high depressive symptoms limited statistical power to detect significant interactions. Thus, these results should be interpreted with caution.
depression; epidemiology; urinary incontinence; women
Sleep might influence brain health in older adults; however, epidemiologic research in this area is limited. We evaluated associations of sleep duration at midlife and later life, and change in sleep duration over time, with cognitive function in older women.
Participants reported sleep duration in 1986 and 2000, and a subgroup of older participants began cognitive testing in 1995–2001; follow-up testing was conducted three times, at two-year intervals.
Prospective Nurses’ Health Study cohort.
15,385 female nurses, aged ≥70 years, free of stroke and depression at the initial cognitive assessment.
Validated, telephone-based cognitive battery to measure cognitive function; we averaged the four repeated assessments over a six-year period to estimate overall cognition at older ages, and also evaluated trajectories of cognitive change over follow up.
Extreme sleep durations in later life were associated with worse average cognition (p-value for the quadratic term<0.001 for a global score averaging all six cognitive tests). For example, women sleeping ≤5 hours/day had worse global cognition than those sleeping 7 hours/day, as did women sleeping ≥9 hours/day; differences were equivalent to nearly two additional years of age. Associations were similar, though slightly attenuated, for sleep duration in midlife. Also, women whose sleep duration changed by ≥2 hours/day over time had worse cognition than women with no change in sleep duration (e.g., for the global score, p-value for the quadratic term<0.001). Sleep duration was not associated with women’s trajectories of cognitive function over six years (i.e., cognitive decline), which might be attributable to relatively short follow up for detecting cognitive decline.
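The significant quadratic term reported above corresponds to an inverted-U dose–response: predicted cognition peaks near 7 hours/day and declines toward both extremes. A toy illustration with hypothetical coefficients (chosen only so the peak falls at 7 hours):

```python
def predicted_cognition(hours, b1=0.14, b2=-0.01):
    """Quadratic model: score = b1*hours + b2*hours**2.
    With these hypothetical coefficients the curve peaks at
    hours = -b1 / (2 * b2) = 7."""
    return b1 * hours + b2 * hours ** 2

# Predicted score is highest at 7 h and symmetric around it
peak = max(range(4, 11), key=predicted_cognition)
```

Under such a model, sleeping 5 h/day and 9 h/day yield equally reduced predicted scores, matching the pattern of worse cognition at both short and long durations.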
Extreme sleep durations at both midlife and later life, and extreme changes in sleep duration over time, appear to be associated with worse cognitive status in older women.
sleep; cognition; epidemiology; cohort study
Vitamin D may play a role in preserving cognitive function. However, there is a paucity of prospective studies on the relationship between vitamin D and cognition with aging. The aim of this study was to examine the association between plasma levels of vitamin D and subsequent cognitive function.
This is a prospective study including 1,185 women aged 60–70 years from the Nurses’ Health Study, who had plasma 25-hydroxyvitamin D levels measured in 1989–1990 and completed an initial Telephone Interview of Cognitive Status approximately 9 years later. Subsequently, three follow-up cognitive assessments were conducted at 1.5–2.0-year intervals. We used multivariable-adjusted linear regression to model initial cognitive function, and mixed linear regression to model change in cognitive function over time.
Lower vitamin D levels were associated with significantly worse cognitive function 9 years later. For example, the mean global composite score averaging all the cognitive tests was 0.20 lower (95% confidence interval [CI]: −0.33, −0.08; p-trend = 0.009) in women in the lowest quintile (median = 14.1 ng/mL) compared with women in the highest quintile of vitamin D (median = 38.4 ng/mL). The observed differences were equivalent to the effect estimates we found for women who were approximately 4–6 years apart in age. However, vitamin D levels were not significantly associated with subsequent cognitive decline during 6 years of follow-up.
Higher levels of plasma vitamin D in women aged 60–70 years were associated with better cognitive function about a decade later but were not associated with cognitive decline during 6 years of follow-up.
Vitamin D; Cognitive Function; Women’s Health
The relationship of postmenopausal hormone therapy with all-cause dementia and Alzheimer's disease dementia has been controversial. Given continued interest in the role of hormone therapy in chronic disease prevention and the emergence of more prospective studies, we conducted a systematic review to identify all epidemiologic studies meeting prespecified criteria reporting on postmenopausal hormone therapy use and risk of Alzheimer's disease or dementia. A systematic search of Medline and Embase through December 31, 2012, returned 15 articles meeting our criteria. Our meta-analysis of any versus never use did not support the hypothesis that hormone therapy reduces risk of Alzheimer's disease (summary estimate = 0.88, 95% confidence interval: 0.66, 1.16). Exclusion of trial findings did not change this estimate. There were not enough all-cause dementia results for a separate meta-analysis, but when we combined all-cause dementia results (n = 3) with Alzheimer's disease results (n = 7), the summary estimate remained null (summary estimate = 0.94, 95% confidence interval: 0.71, 1.26). The limited explorations of timing of use—both duration and early initiation—did not yield consistent findings. Our findings support current recommendations that hormone therapy should not be used for dementia prevention. We discuss trends in hormone therapy research that could explain our novel findings and highlight areas where additional data are needed.
Alzheimer disease; clinical trial; cognition; cohort studies; dementia; estrogen replacement therapy; hormone replacement therapy; systematic review
Despite widespread use of multivitamin supplements, evidence on their effect on cognitive health – a critical issue with aging – remains inconclusive. To date, there have been no long-term clinical trials studying multivitamin use and cognitive decline in older persons.
To evaluate whether long-term multivitamin supplementation affects cognitive health in later-life.
Randomized, double-blind, placebo-controlled trial of a multivitamin from 1997 to June 1, 2011. The cognitive function sub-study began in 1998; we completed up to four repeated cognitive assessments by telephone interview over 12 years.
The Physicians’ Health Study II.
5,947 male physicians aged ≥ 65 years.
Daily multivitamin, or placebo.
A global composite score averaging 5 tests of global cognition, verbal memory, and category fluency. The secondary endpoint was a verbal memory score combining 4 tests of verbal memory, a strong predictor of Alzheimer disease.
There was no difference in the mean cognitive change over time between the multivitamin and placebo groups, or in the mean level of cognition at any of the four assessments. Specifically, for the global composite score, the mean difference in cognitive change over follow-up was −0.01 (95% confidence interval [CI] −0.04, 0.02) standard units, comparing treatment versus placebo. Similarly, there was no difference in cognitive performance between the treated and placebo groups on the secondary outcome, verbal memory (e.g., mean difference in cognitive change over follow-up=−0.005, 95% CI −0.04, 0.03).
Doses of vitamins may have been too low, or the population may have been too well nourished, to benefit from a multivitamin.
In male physicians aged ≥ 65 years, long-term use of a daily multivitamin did not provide cognitive benefits.
http://www.clinicaltrials.gov identifier: NCT00270647
multivitamin; cognitive function; randomized clinical trial; men
To examine the relation of phobic anxiety to late-life cognitive trajectory.
Nurses’ Health Study – U.S. registered nurses.
16,351 women among whom phobic anxiety symptoms were assessed in 1988 (mean age=63 years).
Beginning a decade after phobic anxiety ascertainment (mean age=74 years), three assessments of general cognition, word and paragraph immediate and delayed recall, category fluency, and attention/working memory were administered over an average of 4.4 years; global cognitive and verbal memory composite scores were generated from the component tests. General linear models of response profiles were used to evaluate relations of phobic anxiety to initial cognitive performance and subsequent change.
Higher phobic anxiety was associated with poorer initial performance: e.g., comparing women with the highest anxiety to those with no/minimal symptoms, the multivariate-adjusted mean difference (95% confidence interval) in scores was −0.10 (−0.13, −0.06) standard units for the global score summarizing all tests, and −0.08 (−0.11, −0.04) standard units for verbal memory (summarizing 4 word- and paragraph-recall tasks). Mean differences between extreme categories of phobic anxiety were equal to those for participants 1.5–2 years apart in age: i.e., cognitively equivalent to being about two years older. There were no relations of phobic anxiety to subsequent cognitive change.
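The "cognitively equivalent to being about two years older" translation used here (and in several of the abstracts above) divides the exposure-associated mean difference by the estimated per-year effect of age on the score. A minimal sketch (the age slope is hypothetical, chosen only for illustration):

```python
def age_equivalent_years(mean_diff, age_slope_per_year):
    """Express a mean score difference as 'years of aging' by
    dividing it by the cross-sectional age slope; both arguments
    are in the same standard units."""
    return mean_diff / age_slope_per_year

# -0.10 SU for highest vs no/minimal anxiety (from the abstract);
# assume age costs -0.05 SU/year (hypothetical illustration)
years = age_equivalent_years(-0.10, -0.05)
```

This framing makes a small standardized difference interpretable: an effect of the same size as two extra years of age.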
Higher mid-life phobic anxiety was related to worse later-life overall cognition and verbal memory. Yet, profiles of poorer cognition with higher anxiety remained parallel over time, suggesting that phobic anxiety may affect cognition earlier in life rather than exerting an ongoing impact in later life.
Rotating night-shift work, which can disrupt circadian rhythm, may adversely affect long-term health. Experimental studies indicate that circadian rhythm disruption might specifically accelerate brain aging; thus, we prospectively examined shift-work history at midlife as associated with cognitive function among older women in the Nurses' Health Study. Women reported their history of rotating night-shift work in 1988 and participated in telephone-based cognitive interviews between 1995 and 2001; interviews included 6 cognitive tests that were subsequently repeated 3 times, at 2-year intervals. We focused on shift work through midlife (here, ages 58–68 years) because cognitive decline is thought to begin during this period. Using multivariable-adjusted linear regression, we evaluated mean differences in both “average cognitive status” at older age (averaging cognitive scores from all 4 interviews) and rates of cognitive decline over time across categories of shift-work duration at midlife (none, 1–9, 10–19, or ≥20 years). There was little association between shift work and average cognition in later life or between shift work and cognitive decline. Overall, this study does not clearly support the hypothesis that shift-work history in midlife has long-term effects on cognition in older adults.
circadian rhythm; cognition; nurses; shift work
Understanding how to maintain health and well-being in aging populations is critical.
To examine the relation of dietary patterns in midlife to the prevalence of healthy aging.
Cross-sectional observational study.
Nurses’ Health Study.
10,670 women with dietary data and no major chronic diseases in 1984–1986, when they were in their late 50s and early 60s (median age = 59 years); all women provided information on multiple aspects of aging an average of 15 years later.
Diet quality in midlife was ascertained using the Alternative Healthy Eating Index-2010 (AHEI-2010) and Alternate Mediterranean diet (A-MeDi) scores, averaged from two food frequency questionnaires (1984–1986). We defined “healthy” vs “usual” aging as of age 70 years; healthy aging was based on survival to 70+ years with maintenance of four health domains: no major chronic diseases, and no major impairment of cognitive function, physical function, or mental health.
After multivariable adjustment, greater adherence to the AHEI-2010 (upper vs. lower quintile) in midlife was related to 34% (95% CI=9% to 66%, P-trend<0.001) greater odds of healthy versus usual aging. Greater adherence to A-MeDi was related to 46% (95% CI=17% to 83%, P-trend=0.002) greater odds of healthy aging. When the 4 components of healthy aging were analyzed separately, AHEI-2010 and A-MeDi were significantly associated with higher likelihood of no major limitations in physical function and mental health.
Possibility of residual confounding, although we controlled for many confounding factors; bias due to complex patterns of measurement error within diet scores cannot be excluded.
Better diet quality at midlife appears strongly linked to greater health and well-being among those surviving to older ages.
Seafood consumption may prevent age-related cognitive decline. However, benefits may vary by nutrient contents in different seafood types. We examined associations between total seafood consumption and cognitive decline and whether these associations differ by seafood types.
We conducted a prospective cohort study of 5,988 women (mean age, 72 years) from the Women’s Health Study who self-reported seafood intake at Women’s Health Study baseline and also participated in telephone assessments of general cognition, verbal memory, and category fluency administered 5.6 years after Women’s Health Study baseline and 2 and 4 years thereafter. Primary outcomes were standardized composite scores of global cognition and verbal memory.
After adjusting for potential confounders, different amounts of total seafood consumption were not associated with changes in global cognition (p = .56) or verbal memory (p = .29). Considering seafood types, however, compared with women consuming less than once-weekly tuna or dark-meat finfish, those with once-weekly or higher consumption had significantly better verbal memory (0.079 standard units; p < .01) after 4 years—a difference comparable to that for women 2.1 years apart in age. There was also a statistically nonsignificant suggestion of better global cognition (p = .13) with once-weekly or higher tuna or dark-meat fish consumption. No significant associations were observed for light-meat finfish or shellfish.
The relation of seafood to cognition may depend on the types consumed. Total consumption levels of seafood were unrelated to cognitive change. However, consumption of tuna and dark-meat fish once weekly or more was associated with less verbal memory decline over 4 years.
Cognition; Epidemiology; Nutrition.
We examined whether midlife body mass index (BMI) and waist circumference (WC) predict successful ageing.
Design and Methods
BMI and WC were assessed in 4869 persons (mean age 51.2 years, range 42–63, in 1991/93). Survival and successful ageing (alive and free of chronic disease at age >60 years, and not in the worst age- and sex-standardized quintile of cognitive, physical, respiratory, cardiovascular, or mental health) were ascertained over a 16-year follow-up and analysed using logistic regression adjusted for socio-demographic factors and health behaviours.
In total, 507 participants died and 1008 met the criteria for successful ageing. Those with BMI ≥30 kg/m2 had lower odds of successful ageing (Odds Ratio (OR)=0.37; 95% Confidence Interval (CI): 0.27, 0.50) and survival (OR=0.55; 95% CI: 0.41, 0.74) compared with those with BMI of 18.5–25 kg/m2. Those with a large waist circumference (≥102/88 cm in men/women) had lower odds of successful ageing (OR=0.41; 95% CI: 0.31, 0.54) and survival (OR=0.57; 95% CI: 0.44, 0.73) compared with those with a small waist (<94/80 cm in men/women). Analyses with finer categories showed lower odds of successful ageing beginning at BMI ≥23.5 kg/m2 and waist circumference ≥82/68 cm in men/women.
Optimal midlife BMI and waist circumference for successful ageing might be substantially below the current thresholds used to define obesity.
obesity; body mass index; waist circumference; ageing
To estimate the prevalence of urinary incontinence, fecal incontinence, and dual incontinence in a large cohort of older women and compare risk factors across the three conditions.
These cross-sectional analyses utilized data from the Nurses’ Health Study. The 2008 questionnaire, mailed to 96,480 surviving participants aged 62–87 years, included two separate items on prevalence of urinary and fecal incontinence. A response of leakage at least once per month defined incontinence for both urine and stool. Dual incontinence was defined by responses at this frequency for both conditions. Using a polytomous logistic regression model, we assessed the association of each risk factor with the prevalence of urinary, fecal, and dual incontinence.
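A polytomous (multinomial) logistic model fits separate log-odds for each outcome category against a common referent (here, no incontinence), so one model yields distinct odds ratios for urinary-only, fecal-only, and dual incontinence. A minimal sketch on simulated data, assuming statsmodels is available (variable names and coefficients are illustrative, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
# Simulated binary risk factors (illustrative only)
age80 = rng.integers(0, 2, n)   # age >= 80 vs. younger
obese = rng.integers(0, 2, n)   # BMI >= 30 vs. lower

# Outcome categories: 0 = none (referent), 1 = urinary only,
# 2 = fecal only, 3 = dual
logits = np.column_stack([
    np.zeros(n),                 # referent category
    0.7 * obese + 0.2 * age80,   # urinary only
    -1.5 + 0.3 * age80,          # fecal only
    -1.0 + 0.9 * age80,          # dual
])
p = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(4, p=row) for row in p])

X = sm.add_constant(pd.DataFrame({"age80": age80, "obese": obese}))
res = sm.MNLogit(y, X).fit(disp=0)
# Exponentiated coefficients: odds ratios vs. the "none" referent,
# one column per non-referent outcome category
print(np.exp(res.params))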
The survey was completed by 64,396 women. Thirty-eight percent had urinary incontinence alone, 4% had fecal incontinence alone, and 7% had dual incontinence. Age older than 80 years compared with age younger than 70 years was associated most strongly with dual incontinence (odds ratio [OR] 2.49, 95% confidence interval [CI] 2.28–2.73), followed by depression (OR 2.28, 95% CI 2.13–2.43), neurologic disease (OR 1.84, 95% CI 1.65–2.07), functional limitations (OR 1.86, 95% CI 1.71–2.02), multiparity (OR 1.66, 95% CI 1.41–1.94), and heavier fetal birth weight (OR 1.24, 95% CI 1.10–1.41). Obesity was associated only with urinary incontinence (OR 1.99, 95% CI 1.90–2.08) and type 2 diabetes was a stronger risk factor for fecal than urinary incontinence (OR 1.43, 95% CI 1.28–1.59). Black race was associated with a reduced risk of all types of incontinence, especially dual incontinence (OR 0.30, 95% CI 0.21–0.44).
In this large cohort, dual incontinence was primarily associated with advanced age, decompensating medical conditions, depression, and multiparity.
Many women with urinary incontinence (UI) have symptoms that continue over many years; however, virtually nothing is known about factors that are associated with persistent UI.
We studied 36,843 participants of the Nurses’ Health Study, aged 54–79 years at baseline for the UI study, who provided UI information on biennial questionnaires from 2000 to 2008; follow-up in the Nurses’ Health Study is 90%. In total, 18,347 women had “persistent UI,” defined as urine leakage ≥1/month reported on all five biennial questionnaires during this eight-year period; 18,496 women had no UI during this period. Using multivariable-adjusted logistic regression, we estimated odds ratios (OR) of persistent UI versus no UI across various demographic, lifestyle, and health-related factors, which were derived from reports in 2000.
Increasing age group, white race, greater parity, greater BMI, and lower physical activity levels were each associated with greater odds of persistent UI, as were several health-related factors (i.e., stroke, type 2 diabetes, and hysterectomy). Associations with persistent UI were particularly strong for increasing age group (p-trend<0.0001; OR=2.75, 95% CI=2.54–2.98 comparing women aged ≥75 vs. <60 years) and greater BMI (p-trend<0.0001; OR=3.14, 95% CI=2.95–3.33 comparing women with BMI ≥30 vs. <25 kg/m2); moreover, black women had much lower odds of persistent UI compared to white women (OR=0.27, 95% CI=0.21–0.34).
Factors associated with persistent UI were generally consistent with those identified in previous studies of UI over shorter time periods; however, older age, white race, and obesity were particularly strongly related to persistent UI.
epidemiology; risk factors; urinary incontinence; women
Adherence to a Mediterranean diet may help prevent cognitive decline in older age, but studies are limited. We examined the association of adherence to the Mediterranean diet with cognitive function and decline.
We included 6,174 participants, aged 65+ years, from the cognitive sub-study of the Women’s Health Study. Women provided dietary information in 1998 and completed a cognitive battery 5 years later, followed by two assessments at 2-year intervals. The primary outcomes were composite scores of global cognition and verbal memory. The 9-point alternate Mediterranean diet score was constructed from intakes of vegetables, fruits, legumes, whole grains, nuts, fish, red and processed meats, moderate alcohol, and the ratio of monounsaturated to saturated fats.
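In the usual alternate Mediterranean diet convention, each component contributes 1 point: above the cohort median for beneficial components, below the median for red/processed meat, and within a moderate range for alcohol. A rough sketch of the scoring logic; the cutoffs and component names below are illustrative assumptions, not taken from the Women's Health Study:

```python
def amed_score(intake, medians):
    """Illustrative 9-point alternate Mediterranean diet score.

    `intake` and `medians` are dicts of daily servings (alcohol in
    g/day).  Scoring follows the common aMed convention; the moderate
    alcohol range used here is an assumption, not the study's.
    """
    beneficial = ["vegetables", "fruits", "legumes", "whole_grains",
                  "nuts", "fish", "mufa_sfa_ratio"]
    # 1 point per beneficial component above the cohort median
    score = sum(intake[k] > medians[k] for k in beneficial)
    # 1 point for red/processed meat below the median
    score += intake["red_processed_meat"] < medians["red_processed_meat"]
    # 1 point for moderate alcohol intake (assumed 5-15 g/day)
    score += 5 <= intake["alcohol_g_day"] <= 15
    return score

# Example: a participant above the median on every beneficial item,
# below it on meat, with moderate alcohol, scores the maximum of 9
medians = {k: 1.0 for k in ["vegetables", "fruits", "legumes",
                            "whole_grains", "nuts", "fish",
                            "mufa_sfa_ratio", "red_processed_meat"]}
intake = {k: 2.0 for k in medians} | {"red_processed_meat": 0.5,
                                      "alcohol_g_day": 10.0}
print(amed_score(intake, medians))  # → 9
```

Because cutoffs are cohort medians, the score ranks participants within the study sample rather than against absolute intake targets.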
After multivariable adjustment, the alternate Mediterranean diet score was not associated with trajectories of repeated cognitive scores (P-trend across quintiles=0.26 and 0.40 for global cognition and verbal memory, respectively), nor with overall global cognition and verbal memory at older ages, assessed by averaging the three cognitive measures (P-trend=0.63 and 0.44, respectively). Among alternate Mediterranean diet components, higher monounsaturated-to-saturated fats ratio was associated with more favorable cognitive trajectories (P-trend=0.03 and 0.05 for global cognition and verbal memory, respectively). Greater whole grain intake was not associated with cognitive trajectories, but was related to better average global cognition (P-trend=0.02).
In this large study of older women, we observed no association of the Mediterranean diet with cognitive decline. Relations between individual Mediterranean diet components, particularly whole grains, and cognitive function merit further study.
Background: Elevated plasma homocysteine is a risk factor for Alzheimer disease, but the relevance of homocysteine lowering to slow the rate of cognitive aging is uncertain.
Objective: The aim was to assess the effects of treatment with B vitamins compared with placebo, when administered for several years, on composite domains of cognitive function, global cognitive function, and cognitive aging.
Design: A meta-analysis was conducted by using data combined from 11 large trials in 22,000 participants. Domain-based z scores (for memory, speed, and executive function and a domain-composite score for global cognitive function) were available before and after treatment (mean duration: 2.3 y) in the 4 cognitive-domain trials (1340 individuals); Mini-Mental State Examination (MMSE)–type tests were available at the end of treatment (mean duration: 5 y) in the 7 global cognition trials (20,431 individuals).
Results: The domain-composite and MMSE-type global cognitive function z scores both decreased with age (mean ± SE: −0.054 ± 0.004 and −0.036 ± 0.001/y, respectively). Allocation to B vitamins lowered homocysteine concentrations by 28% in the cognitive-domain trials but had no significant effects on the z score differences from baseline for individual domains or for global cognitive function (z score difference: 0.00; 95% CI: −0.05, 0.06). Likewise, allocation to B vitamins lowered homocysteine by 26% in the global cognition trials but also had no significant effect on end-treatment MMSE-type global cognitive function (z score difference: −0.01; 95% CI: −0.03, 0.02). Overall, the effect of a 25% reduction in homocysteine equated to 0.02 y (95% CI: −0.10, 0.13 y) of cognitive aging per year and excluded reductions of >1 mo per year of treatment.
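The "years of cognitive aging" metric rescales a treatment z-score difference by the annual rate of age-related decline. A sketch of the generic conversion, not the trial-specific pooled computation; the example plugs in the global-cognition figures from the abstract for illustration only:

```python
def years_of_aging_per_year(z_diff, trial_years, annual_decline):
    """Express a treatment effect as years of cognitive aging per
    year of treatment.

    z_diff:         end-of-treatment z difference (treatment - placebo)
    trial_years:    mean treatment duration in years
    annual_decline: z-score decline per year of age (positive number)
    """
    per_year_effect = z_diff / trial_years
    return per_year_effect / annual_decline

# Illustrative: z difference -0.01 over 5 y, decline 0.036 z/y
print(round(years_of_aging_per_year(-0.01, 5.0, 0.036), 3))  # → -0.056
```

A value near zero means the treatment neither slowed nor accelerated the normal age-related trajectory; the published 0.02 y estimate pools across trials and so differs from this single-input illustration.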
Conclusion: Homocysteine lowering by using B vitamins had no significant effect on individual cognitive domains or global cognitive function or on cognitive aging.