1.  Additive Effects of Cognitive Function and Depressive Symptoms on Mortality in Elderly Community-Living Adults 
Poor cognitive function and depressive symptoms are common in the elderly, frequently coexist, and are interrelated. Both risk factors are independently associated with mortality. Few studies have comprehensively described how the combination of poor cognitive function and depressive symptoms affects the risk of mortality. Our aim was to examine whether the combination of varying levels of cognitive function and depressive symptoms affects the risk of mortality in community-living elderly adults.
We studied 6301 elderly adults (mean age, 77 years; 62% women; 81% white) enrolled in the Asset and Health Dynamics Among the Oldest Old (AHEAD) study, a prospective study of community-living participants conducted from 1993 to 1995. Cognitive function and depressive symptoms were measured using two validated measures developed for the AHEAD study. On each measure, participants were divided into tertiles representing the best, middle, and worst scores, and then placed into one of nine mutually exclusive groups ranging from best functioning on both measures to worst functioning on both measures. Mortality rates were assessed in each of the nine groups. Cox proportional hazards models were used to control for potentially confounding characteristics such as demographics, education, income, smoking, alcohol consumption, comorbidity, and baseline functional impairment.
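The nine-group cross-classification described above can be sketched in a few lines (a toy example with synthetic data; the variable names, sample size, and scoring direction are assumptions, not the AHEAD coding):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two AHEAD measures; here higher = worse.
df = pd.DataFrame({
    "cognition": rng.normal(size=300),
    "depression": rng.normal(size=300),
})

# Split each measure into tertiles: 0 = best, 1 = middle, 2 = worst.
for col in ("cognition", "depression"):
    df[col + "_tertile"] = pd.qcut(df[col], 3, labels=False)

# Nine mutually exclusive groups, from best on both (0) to worst on both (8).
df["group"] = df["cognition_tertile"] * 3 + df["depression_tertile"]

# Mortality rates (or any outcome) can then be tabulated per group.
print(df["group"].value_counts().sort_index())
```

In the study itself, group-specific mortality rates were then compared and modelled with Cox proportional hazards regression; a survival library such as lifelines could play that role in Python.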
During 2 years of follow-up, 9% (548) of the participants died. Together, cognitive function and depressive symptoms differentiated between elderly adults at markedly different risk for mortality, ranging from 3% in those with the best function on both measures to 16% in those with the worst function on both measures (p < .001). Furthermore, for each level of cognitive function, more depressive symptoms were associated with higher mortality rates, and for each level of depressive symptoms, worse cognitive function was associated with higher mortality rates. In participants with the best cognitive function, mortality rates were 3%, 5%, and 9% in participants with low, middle, and high depressive symptoms, respectively (p < .001 for trend). The corresponding rates were 6%, 7%, and 12% in participants with the middle level of cognitive function (p < .001 for trend), and 10%, 13%, and 16% in participants with the worst level of cognitive function (p < .001 for trend). After adjustment for confounders, participants with the worst function on both measures remained at considerably higher risk for death than participants with the best function on both measures (adjusted hazard ratio, 3.1; 95% confidence interval, 2.0–4.7).
Cognitive function and depressive symptoms can be used together to stratify elderly adults into groups that have significantly different rates of death. These two risk factors are associated with an increased risk of mortality in a progressive, additive manner.
PMCID: PMC2939722  PMID: 12730257
2.  Advanced Paternal Age Is Associated with Impaired Neurocognitive Outcomes during Infancy and Childhood 
PLoS Medicine  2009;6(3):e1000040.
Advanced paternal age (APA) is associated with an increased risk of neurodevelopmental disorders such as autism and schizophrenia, as well as with dyslexia and reduced intelligence. The aim of this study was to examine the relationship between paternal age and performance on neurocognitive measures during infancy and childhood.
Methods and Findings
A sample of singleton children (n = 33,437) was drawn from the US Collaborative Perinatal Project. The outcome measures were assessed at 8 mo, 4 y, and 7 y (Bayley scales, Stanford Binet Intelligence Scale, Graham-Ernhart Block Sort Test, Wechsler Intelligence Scale for Children, Wide Range Achievement Test). The main analyses examined the relationship between neurocognitive measures and paternal or maternal age when adjusted for potential confounding factors. Advanced paternal age showed significant associations with poorer scores on all of the neurocognitive measures apart from the Bayley Motor score. The findings were broadly consistent in direction and effect size at all three ages. In contrast, advanced maternal age was generally associated with better scores on these same measures.
The offspring of older fathers show subtle impairments on tests of neurocognitive ability during infancy and childhood. In light of secular trends related to delayed fatherhood, the clinical implications and the mechanisms underlying these findings warrant closer scrutiny.
Using a sample of children from the US Collaborative Perinatal Project, John McGrath and colleagues show that the offspring of older fathers exhibit subtle impairments on tests of neurocognitive ability during infancy and childhood.
Editors' Summary
Over the last few decades, changes in society in the developed world have made it increasingly common for couples to wait until their late thirties to have children. In 1993, 25% of live births within marriage in England and Wales were to fathers aged 35–54 years, but by 2003 it was 40%. It is well known that women's fertility declines with age and that older mothers are more likely to have children with disabilities such as Down's syndrome. In contrast, many men can father children throughout their lives, and little attention has been paid to the effects of older fatherhood.
More recent evidence shows that a man's age does affect both fertility and the child's health. “Advanced paternal age” has been linked to miscarriages, birth deformities, cancer, and specific behavioral problems such as autism or schizophrenia.
Rates of autism have increased in recent decades, but the cause is unknown. Studies of twins and families have suggested there may be a complex genetic basis, and it is suspected that damage to sperm, which can accumulate over a man's lifetime, may be responsible. A woman's eggs are formed largely while she is herself in the womb, but sperm-making cells divide throughout a man's lifetime, increasing the chance of mutations in sperm.
Why Was This Study Done?
There is good evidence linking specific disorders with older fathers, but the link between a father's age and a child's more general intelligence is not as clear. A recent study suggested a link between reduced intelligence and both very young and older fathers. The authors wanted to use this large dataset to test the idea that older fathers have children who do worse on tests of intelligence. They also wanted to re-examine earlier findings, based on this same dataset, that older mothers have more intelligent children.
What Did the Researchers Do and Find?
The researchers gathered no new data but reanalyzed data on children from the US Collaborative Perinatal Project (CPP), which had used a variety of tests given to children at ages 8 months, 4 years, and 7 years, to measure cognitive ability—the ability to think and reason, including concentration, memory, learning, understanding, speaking, and reading. Some tests included assessments of “motor skills”—physical co-ordination.
The CPP dataset holds information on children of 55,908 expectant mothers who attended 12 university-affiliated hospital clinics in the United States from 1959 to 1965. The researchers excluded premature babies and multiple births and chose one pregnancy at random for each eligible woman, to keep their analysis simpler. This approach reduced the number of children in their analysis to 33,437.
The researchers analyzed the data using two models. In one, they took into account physical factors such as the parents' ages. In the other, they also took into account social factors such as the parents' level of education and income, which are linked to intelligence. In addition, the authors grouped the children by their mother's age and, within each group, looked for a link between the lowest-scoring children and the age of their father.
The researchers found that children with older fathers had lower scores on all of the measures except one measure of motor skills. In contrast, children with older mothers had higher scores. They also found that the older the father, the lower the children's scores tended to be.
What Do These Findings Mean?
This study is the first to show that children of older fathers perform less well in a range of tests when young, but cannot say whether those children catch up with their peers after the age of 7 years. Results may also be biased because other information was more likely to be missing for children whose father's age was not recorded.
Previous researchers had proposed that children of older mothers may perform better in tests because they experience a more nurturing home environment. If this is the case, children of older fathers do not experience the same benefit.
However, further work needs to be done to confirm these findings, especially in newer datasets. Current trends to delay parenthood mean these findings have implications for individuals, couples, and policymakers. Individuals and couples need to be aware that the ages of both partners can affect their ability to have healthy children, though the risks for individual children are small. Policymakers should consider promoting awareness of the risks of delaying parenthood or introducing policies to encourage childbearing at an optimal age.
Additional Information
Please access these Web sites via the online version of this summary.
Mothers 35+ is a UK Web site with resources and information for older mothers, mothers-to-be, and would-be mothers, including information on the health implications of fathering a child late in life
The American Society for Reproductive Medicine published a Patient Information Booklet on Age and Fertility in 2003, which is available online; it contains a small section called “Fertility in the Aging Male,” but otherwise focuses on women
The online encyclopedia Wikipedia has a short article on the “Paternal age effect” (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
In 2005, the UK Office of National Statistics published a booklet entitled “Perpetual postponers? Women's, men's and couple's fertility intentions and subsequent fertility behaviour” looking at data from the British Household Panel Survey
PMCID: PMC2653549  PMID: 19278291
3.  Prevalence, Distribution, and Impact of Mild Cognitive Impairment in Latin America, China, and India: A 10/66 Population-Based Study 
PLoS Medicine  2012;9(2):e1001170.
A set of cross-sectional surveys carried out in Cuba, Dominican Republic, Peru, Mexico, Venezuela, Puerto Rico, China, and India reveal the prevalence and between-country variation in mild cognitive impairment at a population level.
Rapid demographic ageing is a growing public health issue in many low- and middle-income countries (LAMICs). Mild cognitive impairment (MCI) is a construct frequently used to define groups of people who may be at risk of developing dementia, crucial for targeting preventative interventions. However, little is known about the prevalence or impact of MCI in LAMIC settings.
Methods and Findings
Data were analysed from cross-sectional surveys established by the 10/66 Dementia Research Group and carried out in Cuba, Dominican Republic, Peru, Mexico, Venezuela, Puerto Rico, China, and India on 15,376 individuals aged 65+ without dementia. Standardised assessments of mental and physical health and of cognitive function were carried out, including informant interviews. An algorithm was developed to define Mayo Clinic amnestic MCI (aMCI). Disability (12-item World Health Organization disability assessment schedule [WHODAS]) and informant-reported neuropsychiatric symptoms (neuropsychiatric inventory [NPI-Q]) were measured. After adjustment, aMCI was associated with disability, anxiety, apathy, and irritability (but not depression); between-country heterogeneity in these associations was only significant for disability. The crude prevalence of aMCI ranged from 0.8% in China to 4.3% in India. Country differences changed little (range 0.6%–4.6%) after standardization for age, gender, and education level. In pooled estimates, aMCI was modestly associated with male gender and fewer assets but was not associated with age or education. There was no significant between-country variation in these demographic associations.
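The standardization step mentioned above is, in essence, direct standardization: stratum-specific prevalences are re-weighted by a common standard population so that country comparisons are not driven by differences in age, gender, or education composition. A minimal sketch with invented numbers (not the 10/66 data):

```python
import numpy as np

# Stratum-specific aMCI prevalence for one hypothetical country
# (strata could be age bands such as 65-74, 75-84, 85+).
prevalence = np.array([0.01, 0.03, 0.06])

country_weights = np.array([0.60, 0.30, 0.10])   # this country's composition
standard_weights = np.array([0.50, 0.35, 0.15])  # pooled standard population

crude = float(prevalence @ country_weights)          # composition-dependent
standardised = float(prevalence @ standard_weights)  # composition-adjusted

print(f"crude {crude:.4f}, standardised {standardised:.4f}")
```

Two countries with identical stratum-specific prevalences but different age structures would have different crude prevalences yet identical standardised ones, which is why the 10/66 country comparison uses the adjusted figures.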
An algorithm-derived diagnosis of aMCI showed few sociodemographic associations but was consistently associated with higher disability and neuropsychiatric symptoms in addition to showing substantial variation in prevalence across LAMIC populations. Longitudinal data are needed to confirm findings—in particular, to investigate the predictive validity of aMCI in these settings and risk/protective factors for progression to dementia; however, the large number affected has important implications in these rapidly ageing settings.
Please see later in the article for the Editors' Summary
Editors' Summary
Currently, more than 35 million people worldwide have dementia, a group of brain disorders characterized by an irreversible decline in memory, problem solving, communication, and other “cognitive” functions. Dementia, the commonest form of which is Alzheimer's disease, mainly affects older people and, because more people than ever are living to a ripe old age, experts estimate that, by 2050, more than 115 million people will have dementia. At present, there is no cure for dementia although drugs can be used to manage some of the symptoms. Risk factors for dementia include physical inactivity, infrequent participation in mentally or socially stimulating activities, and common vascular risk factors such as high blood pressure, diabetes, and smoking. In addition, some studies have reported that mild cognitive impairment (MCI) is associated with an increased risk of dementia. MCI can be seen as an intermediate state between normal cognitive aging (becoming increasingly forgetful) and dementia although many people with MCI never develop dementia, and some types of MCI can be static or self-limiting. Individuals with MCI have cognitive problems that are more severe than those normally seen in people of a similar age but they have no other symptoms of dementia and are able to look after themselves. The best studied form of MCI—amnestic MCI (aMCI)—is characterized by memory problems such as misplacing things and forgetting appointments.
Why Was This Study Done?
Much of the expected increase in dementia will occur in low and middle income countries (LAMICs) because these countries have rapidly aging populations. Given that aMCI is frequently used to define groups of people who may be at risk of developing dementia, it would be useful to know what proportion of community-dwelling older adults in LAMICs have aMCI (the prevalence of aMCI). Such information might help governments plan their future health care and social support needs. In this cross-sectional, population-based study, the researchers estimate the prevalence of aMCI in eight LAMICs using data collected by the 10/66 Dementia Research Group. They also investigate the association of aMCI with sociodemographic factors (for example, age, gender, and education), disability, and neuropsychiatric symptoms such as anxiety, apathy, irritability, and depression. A cross-sectional study collects data on a population at a single time point; the 10/66 Dementia Research Group is building an evidence base to inform the development and implementation of policies for improving the health and social welfare of older people in LAMICs, particularly people with dementia.
What Did the Researchers Do and Find?
In cross-sectional surveys carried out in six Latin American LAMICs, China, and India, more than 15,000 elderly individuals without dementia completed standardized assessments of their mental and physical health and their cognitive function. Interviews with relatives and carers provided further details about the participant's cognitive decline and about neuropsychiatric symptoms. The researchers developed an algorithm (set of formulae) that used the data collected in these surveys to diagnose aMCI in the study participants. Finally, they used statistical methods to analyze the prevalence, distribution, and impact of aMCI in the eight LAMICs. The researchers report that aMCI was associated with disability, anxiety, apathy, and irritability but not with depression and that the prevalence of aMCI ranged from 0.8% in China to 4.3% in India. Other analyses show that, considered across all eight countries, aMCI was modestly associated with being male (men had a slightly higher prevalence of aMCI than women) and with having fewer assets but was not associated with age or education.
What Do These Findings Mean?
These findings suggest that aMCI, as diagnosed using the algorithm developed by the researchers, is consistently associated with higher disability and with neuropsychiatric symptoms in the LAMICs studied but not with most sociodemographic factors. Because prevalidated and standardized measurements were applied consistently in all the countries and a common algorithm was used to define aMCI, these findings also suggest that the prevalence of aMCI varies markedly among LAMIC populations and is similar to or slightly lower than the prevalence most often reported for European and North American populations. Although longitudinal studies are now needed to investigate the extent to which aMCI can be used as a risk marker for further cognitive decline and dementia in these settings, the large absolute numbers of older people with aMCI in LAMICs revealed here potentially have important implications for health care and social service planning in these rapidly aging and populous regions of the world.
Additional Information
Please access these Web sites via the online version of this summary.
Alzheimer's Disease International is the international federation of Alzheimer associations around the world; it provides links to individual associations, information about dementia, and links to three World Alzheimer Reports; information about the 10/66 Dementia Research Group is also available on this web site
The Alzheimer's Society provides information for patients and carers about dementia, including information on MCI and personal stories about living with dementia
The Alzheimer's Association also provides information for patients and carers about dementia and about MCI, and personal stories about dementia
A BBC radio program that includes an interview with a man with MCI is available
MedlinePlus provides links to further resources about MCI and dementia (in English and Spanish)
PMCID: PMC3274506  PMID: 22346736
4.  Cognitive impairment, decline and fluctuations in older community-dwelling subjects with Lewy bodies 
Brain  2012;135(10):3005-3014.
Lewy bodies are common in the ageing brain and often co-occur with Alzheimer’s disease pathology. Little is known regarding the independent role of Lewy body pathology in cognitive impairment, decline, and fluctuations in community-dwelling older persons. We examined the contribution of Lewy body pathology to dementia, global cognition, cognitive domains, cognitive decline and fluctuations in 872 autopsied subjects (mean age = 87.9 years) from the Rush Religious Order Study (n = 491) and Memory and Aging Project (n = 381) longitudinal community-based clinical–pathological studies. Dementia was based on a clinical evaluation; annual cognitive performance tests were used to create a measure of global cognition and five cognitive domains. Lewy body type was determined by using α-synuclein immunostained sections of substantia nigra, limbic and neocortical regions. Statistical models included multiple regression models for dementia and cognition and mixed effects models for decline. Cognitive fluctuations were estimated by comparing standard deviations of individual residuals from mean trajectories of decline in those with and without Lewy bodies. All models controlled for age, sex, education, Alzheimer’s disease pathology and infarcts. One hundred and fifty-seven subjects (18%) exhibited Lewy body pathology (76 neocortical-type, 54 limbic-type and 27 nigra-predominant). One hundred and three (66%) subjects with Lewy body pathology had a pathologic diagnosis of Alzheimer’s disease. Neocortical-type, but not nigra-predominant or limbic-type, Lewy body pathology was related to an increased odds of dementia (odds ratio = 3.21; 95% confidence interval = 1.78–5.81) and lower cognition (P < 0.001), including episodic memory function (P < 0.001), proximate to death.
Neocortical-type Lewy body pathology was also related to a faster decline in global cognition (P < 0.001), decline in all five specific cognitive domains (all P-values < 0.001), and to fluctuations in decline of working and semantic memory (P-values < 0.001). Limbic-type Lewy body pathology was related to lower and faster decline in visuospatial skills (P = 0.042). The relationship of Lewy body pathology to cognition and dementia was not modified by Alzheimer’s disease pathology. Neocortical-type Lewy body pathology is associated with increased odds of dementia; with lower performance and more rapid decline in all cognitive domains, including episodic memory; and with fluctuations in decline in semantic and working memory. Limbic-type Lewy body pathology is specifically associated with lower performance and more rapid decline in visuospatial skills. The effect of Lewy body pathology on cognition appears to be independent of Alzheimer’s disease pathology.
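The fluctuation measure used above (the standard deviation of an individual's residuals around the mean trajectory of decline) can be illustrated with synthetic data; the trajectory parameters and noise levels here are invented for the sketch, not taken from the Rush cohorts:

```python
import numpy as np

rng = np.random.default_rng(1)

# Sketch of the fluctuation measure: each subject's year-to-year scatter
# around the population-average trajectory of decline, summarised as the
# standard deviation of that subject's residuals.
years = np.arange(8)                  # annual assessments
mean_trajectory = 0.0 - 0.1 * years   # population-average decline
noise = rng.normal(0, 1, years.size)  # shared noise shape, scaled per subject

def fluctuation(noise_sd):
    """SD of one subject's residuals from the mean trajectory."""
    scores = mean_trajectory + noise_sd * noise
    residuals = scores - mean_trajectory
    return residuals.std(ddof=1)

stable, fluctuating = fluctuation(0.05), fluctuation(0.5)
print(stable < fluctuating)  # ten-fold noise gives a ten-fold residual SD
```

Comparing the distribution of such residual SDs between subjects with and without Lewy bodies is the comparison the abstract describes.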
PMCID: PMC3470712  PMID: 23065790
Lewy body pathology; cognition; dementia; cognitive decline; fluctuations
5.  Cognitive Performance in Late Adolescence and the Subsequent Risk of Subdural Hematoma: An Observational Study of a Prospective Nationwide Cohort 
PLoS Medicine  2011;8(12):e1001151.
Anna and Peter Nordström analyzed a prospective nationwide cohort of 440,742 Swedish men and found that reduced cognitive function in young adulthood was associated with increased risk of subdural hematoma later in life, whereas a higher level of education and physical fitness were associated with a decreased risk.
There are few identified risk factors for traumatic brain injuries such as subdural hematoma (SDH). The aim of the present study was to investigate whether low cognitive performance in young adulthood is associated with SDH later in life. A second aim was to investigate whether this risk factor was associated with education and physical fitness.
Methods and Findings
Word recollection, logical, visuospatial, and technical performances were tested at a mean age of 18.5 years in a prospective nation-wide cohort of 440,742 men. An estimate of global intelligence was calculated from these four tests. Associations between cognitive performance, education, physical fitness, and SDH during follow-up were explored using Cox regression analyses. During a median follow-up of 35 years, 863 SDHs were diagnosed in the cohort. Low global intelligence was associated with an increased risk of SDH during follow-up (hazard ratio [HR]: 1.33 per standard deviation decrease, 95% CI = 1.25–1.43). Similar results were obtained for the other measures of cognitive performance (HR: 1.24–1.33, p<0.001 for all). In contrast, a high level of education (HR: 0.27, comparing more than 2 years of high school with 8 years of elementary school, 95% CI = 0.19–0.39) and a high level of physical fitness (HR: 0.76 per standard deviation increase, 95% CI = 0.70–0.83) were associated with a decreased risk of suffering an SDH.
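A hazard ratio reported per standard deviation compounds multiplicatively under the proportional-hazards assumption, so a conscript several SDs below the mean has a correspondingly larger relative hazard. Illustrative arithmetic only, not the study's fitted model:

```python
# A hazard ratio reported "per standard deviation decrease" compounds
# multiplicatively: a subject k SDs below the mean has HR**k times the
# reference hazard, under the proportional-hazards assumption.
hr_per_sd = 1.33  # SDH hazard ratio per SD decrease in global intelligence

for k in (1, 2, 3):
    print(f"{k} SD below mean: relative hazard {hr_per_sd ** k:.2f}")
```

This is one reason per-SD hazard ratios that look modest can still separate the extreme quintiles of a cohort quite sharply.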
The present findings suggest that reduced cognitive function in young adulthood is strongly associated with an increased risk of SDH later in life. In contrast, a higher level of education and higher physical fitness were associated with a decreased risk of SDH.
Please see later in the article for the Editors' Summary
Editors' Summary
Every year, about 10 million people worldwide sustain a traumatic brain injury that needs medical attention or that proves fatal. Such injuries occur when the head is suddenly hit or jolted or when an object such as a bullet pierces the skull and enters the brain. Motor vehicle accidents are responsible for many traumatic brain injuries, but falls, assaults, and military action can also cause these serious injuries. The symptoms of a traumatic brain injury, which may not appear until many days after the injury, include loss of consciousness, headaches, dizziness, and nausea. Affected individuals can experience changes in their memory, concentration, or ability to think clearly (“cognitive” changes) and can have behavioral or emotional problems. Although the initial brain damage caused by trauma cannot be reversed, immediate medical treatment is essential to prevent further injury occurring. In particular, patients need to be monitored for “subdural hematoma,” a common outcome of traumatic brain injury in which blood from ruptured vessels collects between the brain and the skull. Subdural hematoma puts pressure on the brain and has to be removed surgically to prevent further brain damage.
Why Was This Study Done?
Not everyone who has a traumatic brain injury develops subdural hematoma. If the factors that increase a person's risk of developing subdural hematoma could be identified, it might be possible to devise public-health interventions that would reduce the incidence of subdural hematomas. In this prospective population-based analysis, the researchers investigate whether low cognitive performance in early adulthood is associated with subdural hematoma later in life. Impaired cognitive functioning is sometimes recorded as a symptom of subdural hematoma but the researchers hypothesize that these cognitive deficits might have been present before the traumatic head injury that led to subdural hematoma. Low cognitive performance is associated with a reduced ability to compare objects and patterns (perceptual speed) and with impaired judgment, planning, and risk behavior (executive functions), so low cognitive performance might increase a person's risk of having an accident that results in a head injury and subdural hematoma.
What Did the Researchers Do and Find?
The researchers calculated a global intelligence score for 440,742 male Swedish military conscripts (average age 18.5 years) from cognitive tests completed by the men between 1969 and 1978. They obtained information about diagnoses of subdural hematoma up to 40 years later among these men from medical records, and then used several statistical approaches to look for associations between cognitive performance, education (recorded during conscription assignment), physical fitness (measured during conscription assignment), and subsequent subdural hematoma. During the follow-up period, 863 subdural hematomas were diagnosed among the men. Conscripts with a low global intelligence score in early adulthood were more likely to develop subdural hematoma during later life than those with a high score. Specifically, when the men were divided into five groups (quintiles) on the basis of their global intelligence score, men with a score in the lowest quintile were more than twice as likely to develop subdural hematoma as those with a score in the highest quintile. By contrast, men who had had more than 2 years of high school education were much less likely to develop subdural hematoma than those who had only had 8 years of elementary school education. A high level of physical fitness in early adulthood also reduced the risk of subdural hematoma.
What Do These Findings Mean?
These findings suggest that low cognitive function in early adulthood is associated with subdural hematoma later in life, whereas high levels of education and physical fitness are associated with a decreased risk of subdural hematoma. Because this study was observational, these findings do not prove that low cognitive performance, low education level, or low physical fitness is causally linked to subdural hematoma. Other unidentified factors (confounders) shared by people with these characteristics might actually be responsible for the observed association between these factors and subdural hematoma. For example, poorly educated people might work in more hazardous environments than those who attended high school. However, if these findings can be confirmed in other large studies, an exploration of the mechanistic basis of the associations reported here might eventually inform the development of public-health interventions designed to reduce the occurrence of subdural hematoma.
Additional Information
Please access these Web sites via the online version of this summary.
The US National Institute of Neurological Disorders and Stroke provides detailed information about traumatic brain injury (in English and Spanish)
The US Centers for Disease Control and Prevention also provides detailed information about traumatic brain injury
The UK National Health Service Choices website has an article about severe head injury that includes a personal story about a head injury sustained in a motor vehicle accident, and an article about subdural hematoma
MedlinePlus provides links to further resources on traumatic brain injury and information on subdural hematoma; it also provides an interactive tutorial on traumatic brain injury (available in English and Spanish)
The UK charity Headway, which works to improve life after brain injury, has a collection of personal stories about brain injury
PMCID: PMC3246434  PMID: 22215989
6.  Plasma c-peptide levels and rates of cognitive decline in older, community-dwelling women without diabetes 
Psychoneuroendocrinology  2008;33(4):455-461.
Both type 2 diabetes and hyperinsulinemia have been related to diminished cognition. To address independent effects of increasing mid-life insulin secretion on late-life cognition, we prospectively examined the relation of plasma c-peptide levels to cognitive decline in a large sample of older women without diabetes or stroke.
Plasma c-peptide levels were measured in 1,187 “young-old” women (mean age=64 years) without diabetes in the Nurses’ Health Study. Cognitive decline was assessed approximately 10 years later. Three repeated cognitive batteries were administered over an average of 4.4 years using telephone-based tests of general cognition, verbal memory, category fluency, and attention. Primary outcomes were general cognition (measured by the Telephone Interview for Cognitive Status [TICS], as well as a global score averaging all tests) and a verbal memory score averaging 4 tests of word-list and paragraph recall. Linear mixed effects models were used to compute associations between c-peptide levels and rates of cognitive decline.
Higher c-peptide levels were associated with faster decline in global cognition and verbal memory. Compared to those in the lowest c-peptide quartile, multivariable-adjusted mean differences (95% CI) in rates of decline for women in the highest quartile were −0.03 (−0.06, −0.00) units/year for the global score, and −0.05 (−0.09, −0.02) units/year for verbal memory. Each one standard-deviation increase in c-peptide was associated with significantly faster decline on the TICS (p-trend=0.05), global score (p-trend=0.04) and verbal memory (p-trend=0.006).
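A rate of decline in units/year is simply the slope of score against time. As a rough stand-in for the linear mixed-effects models used in the study, per-subject slopes can be illustrated with ordinary least squares on synthetic data (the assessment times and noise level are assumptions for the sketch):

```python
import numpy as np

rng = np.random.default_rng(2)

# Rough stand-in for a mixed-effects rate-of-decline estimate:
# a per-subject least-squares slope of cognitive score against time.
times = np.array([0.0, 2.2, 4.4])        # years of the three batteries
noise = rng.normal(0, 0.01, times.size)  # shared measurement noise

def decline_rate(true_slope):
    """Fitted slope (units/year) for one subject's repeated scores."""
    scores = 1.0 + true_slope * times + noise
    slope, _intercept = np.polyfit(times, scores, 1)
    return slope

fast = decline_rate(-0.05)  # e.g. highest c-peptide quartile
slow = decline_rate(-0.02)  # e.g. lowest quartile
print(fast < slow)  # faster decline means a more negative slope
```

Mixed-effects models improve on this per-subject fit by pooling information across women and modelling individual slopes as random deviations from a population-average slope.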
Higher levels of insulin secretion in those without diabetes may be related to decline in general cognition and verbal memory.
PMCID: PMC2396343  PMID: 18261857
insulin; c-peptide; diabetes; cognitive decline; aging
7.  Treatment for Schistosoma japonicum, Reduction of Intestinal Parasite Load, and Cognitive Test Score Improvements in School-Aged Children 
To determine whether treatment of intestinal parasitic infections improves cognitive function in school-aged children, we examined changes in cognitive test scores over 18 months in relation to: (i) treatment-related Schistosoma japonicum intensity decline, (ii) spontaneous reduction of single soil-transmitted helminth (STH) species, and (iii) ≥2 STH infections among 253 S. japonicum-infected children.
Helminth infections were assessed at baseline and quarterly by the Kato-Katz method. S. japonicum infection was treated at baseline using praziquantel. An intensity-based indicator of lower vs. no change/higher infection was defined separately for each helminth species and for joint intensity declines of ≥2 STH species. In addition, S. japonicum infection-free duration was defined in four categories based on time of schistosome re-infection: >18 (i.e., cured), >12 to ≤18, >6 to ≤12, and ≤6 (persistently infected) months. There was no baseline treatment for STHs, but their intensity varied, possibly due to spontaneous infection clearance/acquisition. Four cognitive tests were administered at baseline and at 6, 12, and 18 months following S. japonicum treatment: the learning and memory domains of the Wide Range Assessment of Memory and Learning (WRAML), verbal fluency (VF), and the Philippine nonverbal intelligence test (PNIT). Linear regression models were used to relate changes in the respective infections to test performance, with adjustment for sociodemographic confounders and coincident helminth infections.
Principal Findings
Children cured (β = 5.8; P = 0.02) and those schistosome-free for >12 months (β = 1.5; P = 0.03) scored higher in WRAML memory and VF tests compared to persistently infected children independent of STH infections. A decline vs. no change/increase of any individual STH species (β:11.5–14.5; all P<0.01) and the joint decline of ≥2 STH (β = 13.1; P = 0.01) species were associated with higher scores in WRAML learning test independent of schistosome infection. Hookworm and Trichuris trichiura declines were independently associated with improvements in WRAML memory scores as was the joint decline in ≥2 STH species. Baseline coinfection by ≥2 STH species was associated with low PNIT scores (β = −1.9; P = 0.04).
Children cured/S. japonicum-free for >12 months post-treatment and those who experienced declines of ≥2 STH species scored higher in three of four cognitive tests. Our results suggest that sustained deworming and simultaneous control of schistosome and STH infections could improve children's ability to take advantage of educational opportunities in helminth-endemic regions.
Author Summary
Parasitic worm infections are associated with cognitive impairment and lower academic achievement for infected relative to uninfected children. However, it is unclear whether curing or reducing worm infection intensity improves child cognitive function. We examined the independent associations between: (i) Schistosoma japonicum infection-free duration, (ii) declines in single helminth species, and (iii) joint declines of ≥2 soil-transmitted helminth (STH) infections and improvements in four cognitive tests during 18 months of follow-up. We enrolled schistosome-infected school-aged children, among whom coinfection with STH was common. All children were treated for schistosome infection only at enrolment with praziquantel. Children cured or schistosome-free for >12 months scored higher in memory and verbal fluency tests compared to persistently infected children. Likewise, declines of single and polyparasitic STH infections predicted higher scores in three of four tests. We conclude that reducing the intensity of certain helminth species and the frequency of multi-species STH infections may have long-term benefits for affected children's cognitive performance. The rapidity of schistosome re-infection and the ubiquity of concurrent multi-species infection highlight the importance of sustained deworming for both schistosome and STH infections to enhance the learning and educational attainment of children in helminth-endemic settings.
PMCID: PMC3341324  PMID: 22563514
8.  Estimates of Outcomes Up to Ten Years after Stroke: Analysis from the Prospective South London Stroke Register 
PLoS Medicine  2011;8(5):e1001033.
Charles Wolfe and colleagues collected data from the South London Stroke Register on 3,373 first strokes registered between 1995 and 2006 and showed that between 20% and 30% of survivors have poor outcomes up to 10 years after stroke.
Although stroke is acknowledged as a long-term condition, longer-term population estimates of outcomes are lacking. Such estimates would be useful for planning health services and developing research that might ultimately improve outcomes. This burden of disease study provides population-based estimates of outcomes, with a focus on disability, cognition, and psychological outcomes up to 10 y after the initial stroke event, in a multi-ethnic European population.
Methods and Findings
Data were collected from the South London Stroke Register, a prospective population-based register documenting all first-in-a-lifetime strokes since 1 January 1995 in a multi-ethnic inner-city population. The outcomes assessed are reported as estimates of need and included disability (Barthel Index <15), inactivity (Frenchay Activities Index <15), cognitive impairment (Abbreviated Mental Test <8 or Mini-Mental State Exam <24), anxiety and depression (Hospital Anxiety and Depression Scale >10), and mental and physical domain scores of the Medical Outcomes Study 12-item short form (SF-12) health survey. Estimates were stratified by age, gender, and ethnicity, and age-adjusted using the standard European population. Plots of outcome estimates over time were constructed to examine temporal trends and sociodemographic differences. Between 1995 and 2006, 3,373 first-ever strokes were registered: 20%–30% of survivors had a poor outcome over 10 y of follow-up. The highest rate of disability was observed 7 d after stroke and remained at around 110 per 1,000 stroke survivors from 3 mo to 10 y. Rates of inactivity and cognitive impairment both declined up to 1 y (280/1,000 and 180/1,000 survivors, respectively); thereafter rates of inactivity remained stable until year eight, then increased, whereas rates of cognitive impairment fluctuated until year eight, then increased. Anxiety and depression showed some fluctuation over time, with rates of 350 and 310 per 1,000 stroke survivors, respectively. SF-12 scores showed little variation from 3 mo to 10 y after stroke. Inactivity was higher in males at all time points, and in white compared to black stroke survivors, although black survivors reported better outcomes in the SF-12 physical domain. No other major differences were observed by gender or ethnicity. Increased age was associated with higher rates of disability, inactivity, and cognitive impairment.
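Age adjustment to the standard European population, as described above, is a direct standardization: stratum-specific rates are weighted by the standard population's age distribution so that groups with different age structures become comparable. A minimal sketch with made-up age bands, weights, and rates (not the SLSR's actual figures):

```python
# Direct age standardization with illustrative numbers only.
std_weights = {          # share of a hypothetical standard European
    "65-74": 0.55,       # population in each age band (sums to 1.0)
    "75-84": 0.35,
    "85+":   0.10,
}
crude_rates = {          # disability per 1,000 survivors, invented
    "65-74": 80.0,
    "75-84": 130.0,
    "85+":   220.0,
}

# Age-adjusted rate = sum over strata of (standard weight x crude rate).
age_adjusted = sum(std_weights[a] * crude_rates[a] for a in std_weights)
print(age_adjusted)   # 0.55*80 + 0.35*130 + 0.10*220 = 111.5
```

Because the weights come from a fixed external population, two groups' adjusted rates differ only through their stratum-specific rates, not their age mix.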
Between 20% and 30% of stroke survivors have a poor outcome on a range of measures up to 10 y after stroke. Such epidemiological data identify the sociodemographic groups that are most affected in the longer term and should be used to develop longer-term management strategies that reduce these poor outcomes, for which effective interventions are currently elusive.
Please see later in the article for the Editors' Summary
Editors' Summary
Every year, 15 million people have a stroke. About 5 million of these people die within a few days, and another 5 million are left disabled. Stroke occurs when the brain's blood supply is suddenly interrupted by a blood clot blocking a blood vessel in the brain (ischemic stroke, the commonest type of stroke) or by a blood vessel in the brain bursting (hemorrhagic stroke). Deprived of the oxygen normally carried to them by the blood, the brain cells near the blockage die. The symptoms of stroke depend on which part of the brain is damaged but include sudden weakness or paralysis along one side of the body, vision loss in one or both eyes, and confusion or trouble speaking or understanding speech. Anyone experiencing these symptoms should seek immediate medical attention because prompt treatment can limit the damage to the brain. Risk factors for stroke include age (three-quarters of strokes occur in people over 65 years old), high blood pressure, and heart disease.
Why Was This Study Done?
Post-stroke rehabilitation can help individuals overcome the physical disabilities caused by stroke, and drugs and behavioral counseling can reduce the risk of a second stroke. However, people can also have problems with cognition (thinking, awareness, attention, learning, judgment, and memory) after a stroke, and they can become depressed or anxious. These “outcomes” can persist for many years, but although stroke is acknowledged as a long-term condition, most existing data on stroke outcomes are limited to a year after the stroke and often focus on disability alone. Longer term, more extensive information is needed to help plan services and to help develop research to improve outcomes. In this burden of disease analysis, the researchers use follow-up data collected by the prospective South London Stroke Register (SLSR) to provide long-term population-based estimates of disability, cognition, and psychological outcomes after a first stroke. The SLSR has recorded and followed all patients of all ages in an inner area of South London after their first-ever stroke since 1995.
What Did the Researchers Do and Find?
Between 1995 and 2006, the SLSR recorded 3,373 first-ever strokes. Patients were examined within 48 hours of referral to SLSR, their stroke diagnosis was verified, and their sociodemographic characteristics (including age, gender, and ethnic origin) were recorded. Study nurses and fieldworkers then assessed the patients at three months and annually after the stroke for disability (using the Barthel Index, which measures the ability to, for example, eat unaided), inactivity (using the Frenchay Activities Index, which measures participation in social activities), and cognitive impairment (using the Abbreviated Mental Test or the Mini-Mental State Exam). Anxiety and depression and the patients' perceptions of their mental and physical capabilities were also assessed. Using preset cut-offs for each outcome, 20%–30% of stroke survivors had a poor outcome over ten years of follow-up. So, for example, 110 individuals per 1,000 population were judged disabled from three months to ten years, rates of inactivity remained constant from year one to year eight, at 280 affected individuals per 1,000 survivors, and rates of anxiety and depression fluctuated over time but affected about a third of the population. Notably, levels of inactivity were higher among men than women at all time points and were higher in white than in black stroke survivors. Finally, increased age was associated with higher rates of disability, inactivity, and cognitive impairment.
What Do These Findings Mean?
Although the accuracy of these findings may be affected by the loss of some patients to follow-up, these population-based estimates of outcome measures for survivors of a first-ever stroke for up to ten years after the event provide concrete evidence that stroke is a lifelong condition with ongoing poor outcomes. They also identify the sociodemographic groups of patients that are most affected in the longer term. Importantly, most of the measured outcomes remain relatively constant (and worse than outcomes in an age-matched non-stroke-affected population) after 3–12 months, a result that needs to be considered when planning services for stroke survivors. In other words, these findings highlight the need for health and social services to provide long-term, ongoing assessment and rehabilitation for patients for many years after a stroke.
Additional Information
Please access these Web sites via the online version of this summary at
The US National Institute of Neurological Disorders and Stroke provides information about all aspects of stroke (in English and Spanish); the US National Institutes of Health SeniorHealth Web site has additional information about stroke
The Internet Stroke Center provides detailed information about stroke for patients, families, and health professionals (in English and Spanish)
The UK National Health Service Choices Web site also provides information about stroke for patients and their families
MedlinePlus has links to additional resources about stroke (in English and Spanish)
More information about the South London Stroke Register is available
PMCID: PMC3096613  PMID: 21610863
9.  Mediterranean diet and cognitive decline in women with cardiovascular disease or risk factors 
Cardiovascular disease and vascular risk factors increase rates of cognitive impairment, but very little is known regarding prevention in this high-risk group. The heart-healthy Mediterranean-type dietary pattern may beneficially influence both vascular and cognitive outcomes.
We examined the association between Mediterranean-style diet and cognitive decline in women with prevalent vascular disease or ≥3 coronary risk factors.
Design / Participants / Setting
Prospective cohort study among 2504 women participants of the Women’s Antioxidant Cardiovascular Study (WACS), a cohort of female health professionals. Adherence to the Mediterranean diet was determined at WACS baseline (1995–1996) using a zero-to-nine-point scale, with higher scores indicating higher adherence. In 1998–2000, participants aged ≥65 years underwent a telephone cognitive battery including five tests of global cognition, verbal memory, and category fluency. Tests were administered three additional times over 5.4 years.
Statistical analyses performed
We used multivariable-adjusted generalized linear models for repeated measures to compare the annual rates of cognitive score changes across tertiles of Mediterranean diet score, as assessed at WACS baseline.
In both basic- and multivariable-adjusted models, Mediterranean diet was not related to cognitive decline. No effect modification was detected by age, education, depression, cardiovascular disease severity at WACS baseline, or level of cognition at initial assessment.
In women at higher risk of cognitive decline due to vascular disease or risk factors, adherence to the Mediterranean diet was not associated with subsequent 5-year cognitive change.
PMCID: PMC3378990  PMID: 22709809
cognitive decline; vascular disease; hypertension; Mediterranean diet; longitudinal study
10.  Dementia before Death in Ageing Societies — The Promise of Prevention and the Reality 
PLoS Medicine  2006;3(10):e397.
Dementia and severe cognitive impairment are very closely linked to ageing. The longer we live the more likely we are to suffer from these conditions. Given population increases in longevity it is important to understand not only risk and protective factors for dementia and severe cognitive impairment at given ages but also whether protection affects cumulative risk. This can be explored by examining the effect on cumulative risk by time of death of factors found consistently to reduce risk at particular ages, such as education and social status.
Methods and Findings
In this analysis, we report the prevalence of dementia and severe cognitive impairment in the year before death in a large population sample. In the Medical Research Council Cognitive Function and Ageing Study (a 10-y population-based cohort study of individuals 65 and over in England and Wales), these prevalences have been estimated by age, sex, social class, and education. Differences have been explored using logistic regression. The overall prevalence of dementia at death was 30%. There was a strong increasing trend for dementia with age, from 6% for those aged 65–69 y at time of death to 58% for those aged 95 y and above. Higher prevalences were seen for severe cognitive impairment, with similar patterns. People with higher education and higher social class had a significantly reduced prevalence of dementia and severe cognitive impairment before death, but the absolute difference was small (under 10%).
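In the simplest two-group case, the logistic regression comparisons described above reduce to an odds ratio from a 2×2 table, with a Wald confidence interval on the log scale. A minimal sketch with hypothetical counts (not the study's data):

```python
import math

# Hypothetical 2x2 table; counts are illustrative, not the study's:
# rows: higher vs lower education, columns: dementia vs no dementia.
a, b = 120, 880    # higher education: dementia / no dementia
c, d = 300, 700    # lower education:  dementia / no dementia

# Odds ratio and 95% CI via the log-odds-ratio standard error.
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(round(odds_ratio, 2), round(lo, 2), round(hi, 2))
```

An odds ratio below 1 with a CI excluding 1 would correspond to the abstract's finding of reduced dementia prevalence with higher education; the full regression additionally adjusts for age, sex, and social class.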
Reducing risk for dementia at a given age will lead to further extension of life, thus cumulative risk (even in populations at lower risk for given ages) remains high. Ageing of populations is likely to result in an increase in the number of people dying with dementia and severe cognitive impairment even in the presence of preventative programmes. Policy development and research for dementia must address the needs of individuals who will continue to experience these conditions before death.
The overall prevalence of dementia at death in this large study was 30%. Ageing of populations is likely to result in an increase in the number of people dying with dementia even in the presence of preventative programmes.
Editors' Summary
Severe cognitive impairment and its advanced form, dementia, are among the most difficult problems associated with aging in industrialized countries. Age-associated decline in mental functioning is also expected to become more common in developing countries as improvement of conditions that affect health leads to longer life expectancies. Although the risk of cognitive impairment is known to increase with age, the number of people who suffer from loss of mental abilities in the last years of their lives has not been well studied, as such persons are usually reported to have died from other causes. Further, because the very elderly are seldom included in prevention studies, it is not known whether factors found to reduce the risk of developing dementia by a given age will provide protection until the end of life.
Why Was This Study Done?
This study was designed to follow a representative population of aged people over several years to estimate the risk of developing cognitive impairment or dementia near the end of life and to determine whether factors such as education and social class, which may be protective earlier in life, can ultimately prevent decline in mental functioning.
What Did the Researchers Do and Find?
Using standardized assessments of cognitive status, the researchers interviewed people age 65 and over at six sites representing rural and urban areas in the United Kingdom. Interviews were conducted at regular intervals over ten years. Of approximately 12,000 study participants who had died by the time of this report, just over 2,500 had an assessment for dementia within one year before dying. Of this group, those who died between ages 65 and 69 had a 6% chance of dying with dementia, and those who died above age 95 had a 58% chance of dying with dementia. When moderate and severe cognitive impairment were considered together, the rate in people above age 95 reached almost 80%. Women were more likely to develop dementia than men, even after taking into account the fact that women tend to live longer than men. A higher level of education was associated with only a slightly lower risk of dementia before death.
What Do These Findings Mean?
According to these results, as the number of aged persons increases (with improved health care, preventive medicine, and healthier lifestyles), the chances of developing dementia in the last years of life will continue to increase. Factors believed to protect against dementia at earlier times may be of little effect at the end of life. Planning for aging societies must therefore include not only research into treatments and preventive efforts to reduce the impact of dementia at the end of life, but also realistic allocation of resources to support individuals and their caregivers who must deal with the difficulties of cognitive decline.
Additional Information
Please access these Web sites via the online version of this summary at
Web site of the Medical Research Council Cognitive Function and Ageing Study
Web site of the Alzheimer's Association
Wikipedia entry on dementia (note: Wikipedia is a free Internet encyclopedia that anyone can edit)
PMCID: PMC1626550  PMID: 17076551
11.  Physician Emigration from Sub-Saharan Africa to the United States: Analysis of the 2011 AMA Physician Masterfile 
PLoS Medicine  2013;10(9):e1001513.
Siankam Tankwanchi and colleagues used the AMA Physician Masterfile and the WHO Global Health Workforce Statistics on physicians in sub-Saharan Africa to determine trends in physician emigration to the United States.
Please see later in the article for the Editors' Summary
The large-scale emigration of physicians from sub-Saharan Africa (SSA) to high-income nations is a serious development concern. Our objective was to determine current emigration trends of SSA physicians found in the physician workforce of the United States.
Methods and Findings
We analyzed physician data from the World Health Organization (WHO) Global Health Workforce Statistics along with graduation and residency data from the 2011 American Medical Association Physician Masterfile (AMA-PM) on physicians trained or born in SSA countries who currently practice in the US. We estimated emigration proportions, year of US entry, years of practice before emigration, and length of time in the US. According to the 2011 AMA-PM, 10,819 physicians were born or trained in 28 SSA countries. Sixty-eight percent (n = 7,370) were SSA-trained, 20% (n = 2,126) were US-trained, and 12% (n = 1,323) were trained outside both SSA and the US. We estimated active physicians (age ≤70 years) to represent 96% (n = 10,377) of the total. Migration trends among SSA-trained physicians increased from 2002 to 2011 for all but one principal source country; the exception was South Africa, whose physician migration to the US decreased by 8% (−156). The increase in last-decade migration was >50% in Nigeria (+1,113) and Ghana (+243), >100% in Ethiopia (+274), and >200% (+244) in Sudan. Liberia was the most affected by migration to the US, with 77% (n = 175) of its estimated physicians in the 2011 AMA-PM. On average, SSA-trained physicians have been in the US for 18 years. They practiced for 6.5 years before US entry, and nearly half emigrated during the implementation years (1984–1999) of the structural adjustment programs.
Physician emigration from SSA to the US is increasing for most SSA source countries. Unless far-reaching policies are implemented by the US and SSA countries, the current emigration trends will persist, and the US will remain a leading destination for SSA physicians emigrating from the continent of greatest need.
Editors' Summary
Population growth and aging and increasingly complex health care interventions, as well as existing policies and market forces, mean that many countries are facing a shortage of health care professionals. High-income countries are addressing this problem in part by encouraging the immigration of foreign health care professionals from low- and middle-income countries. In the US, for example, international medical graduates (IMGs) can secure visas and permanent residency by passing examinations provided by the Educational Commission for Foreign Medical Graduates and by agreeing to provide care in areas that are underserved by US physicians. Inevitably, the emigration of physicians from low- and middle-income countries undermines health service delivery in the emigrating physicians' countries of origin because physician supply is already inadequate in those countries. Physician emigration from sub-Saharan Africa, which has only 2% of the global physician workforce but a quarter of the global burden of disease, is particularly worrying. Since 1970, as a result of large-scale emigration and limited medical education, there has been negligible or negative growth in the density of physicians in many countries in sub-Saharan Africa. In Liberia, for example, in 1973, there were 7.76 physicians per 100,000 people but by 2008 there were only 1.37 physicians per 100,000 people; in the US, there are 250 physicians per 100,000 people.
Why Was This Study Done?
Before policy proposals can be formulated to address global inequities in physician distribution, a clear picture of the patterns of physician emigration from resource-limited countries is needed. In this study, the researchers use data from the 2011 American Medical Association Physician Masterfile (AMA-PM) to investigate the “brain drain” of physicians from sub-Saharan Africa to the US. The AMA-PM collects annual demographic, academic, and professional data on all residents (physicians undergoing training in a medical specialty) and licensed physicians who practice in the US.
What Did the Researchers Do and Find?
The researchers used data from the World Health Organization (WHO) Global Health Workforce Statistics and graduation and residency data from the 2011 AMA-PM to estimate physician emigration rates from sub-Saharan African countries, year of US entry, years of service provided before emigration to the US, and length of time in the US. There were 10,819 physicians who were born or trained in 28 sub-Saharan African countries in the 2011 AMA-PM. By using a published analysis of the 2002 AMA-PM, the researchers estimated that US immigration among sub-Saharan African-trained physicians had increased over the past decade for all the countries examined except South Africa, where physician emigration had decreased by 8%. Overall, the number of sub-Saharan African IMGs in the US had increased by 38% since 2002. More than half of this increase was accounted for by Nigerian IMGs. Liberia was the country most affected by migration of its physicians to the US: 77% of its estimated 226 physicians were in the 2011 AMA-PM. On average, sub-Saharan African IMGs had been in the US for 18 years and had practiced for 6.5 years before emigration. Finally, nearly half of the sub-Saharan African IMGs had migrated to the US between 1984 and 1995, years during which structural adjustment programs, which resulted in deep cuts to public health care services, were implemented in developing countries by international financial institutions as conditions for refinancing.
What Do These Findings Mean?
Although the sub-Saharan African IMGs in the 2011 AMA-PM only represent about 1% of all the physicians and less than 5% of the IMGs in the AMA-PM, these findings reveal a major loss of physicians from sub-Saharan Africa. They also suggest that emigration of physicians from sub-Saharan Africa is a growing problem and is likely to continue unless job satisfaction for physicians is improved in their country of origin. Moreover, because the AMA-PM only lists physicians who qualify for a US residency position, more physicians may have moved from sub-Saharan Africa to the US than reported here and may be working in other jobs incommensurate with their medical degrees (“brain waste”). The researchers suggest that physician emigration from sub-Saharan Africa to the US reflects the complexities in the labor markets for health care professionals in both Africa and the US and can be seen as low- and middle-income nations subsidizing the education of physicians in high-income countries. Policy proposals to address global inequities in physician distribution will therefore need both to encourage the recruitment, training, and retention of health care professionals in resource-limited countries and to persuade high-income countries to train more home-grown physicians to meet the needs of their own populations.
Additional Information
Please access these websites via the online version of this summary at
The Foundation for Advancement of International Medical Education and Research is a non-profit foundation committed to improving world health through education that was established in 2000 by the Educational Commission for Foreign Medical Graduates
The Global Health Workforce Alliance is a partnership of national governments, civil society, international agencies, finance institutions, researchers, educators, and professional associations dedicated to identifying, implementing and advocating for solutions to the chronic global shortage of health care professionals (available in several languages)
Information on the American Medical Association Physician Masterfile and the providers of physician data lists is available via the American Medical Association's website
The World Health Organization (WHO) annual World Health Statistics reports present the most recent health statistics for the WHO Member States
The Medical Education Partnership Initiative is a US-sponsored initiative that supports medical education and research in sub-Saharan African institutions, aiming to increase the quantity, quality, and retention of graduates with specific skills addressing the health needs of their national populations
CapacityPlus is the USAID-funded global project uniquely focused on the health workforce needed to achieve the Millennium Development Goals
Seed Global Health cultivates the next generation of health professionals by allying medical and nursing volunteers with their peers in resource-limited settings
"America Is Stealing the World's Doctors", a 2012 New York Times article by Matt McAllester, describes the personal experience of a young doctor who emigrated from Zambia to the US
Path to United States Practice Is Long Slog to Foreign Doctors, a 2013 New York Times article by Catherine Rampell, describes the hurdles that immigrant physicians face in practicing in the US
PMCID: PMC3775724  PMID: 24068894
12.  Computerized Cognitive Training in Cognitively Healthy Older Adults: A Systematic Review and Meta-Analysis of Effect Modifiers 
PLoS Medicine  2014;11(11):e1001756.
Michael Valenzuela and colleagues systematically review and meta-analyze the evidence that computerized cognitive training improves cognitive skills in older adults with normal cognition.
Please see later in the article for the Editors' Summary
New effective interventions to attenuate age-related cognitive decline are a global priority. Computerized cognitive training (CCT) is believed to be safe and can be inexpensive, but neither its efficacy in enhancing cognitive performance in healthy older adults nor the impact of design factors on such efficacy has been systematically analyzed. Our aim therefore was to quantitatively assess whether CCT programs can enhance cognition in healthy older adults, discriminate responsive from nonresponsive cognitive domains, and identify the most salient design factors.
Methods and Findings
We systematically searched Medline, Embase, and PsycINFO for relevant studies from the databases' inception to 9 July 2014. Eligible studies were randomized controlled trials investigating the effects of ≥4 h of CCT on performance in neuropsychological tests in older adults without dementia or other cognitive impairment. Fifty-two studies encompassing 4,885 participants were eligible. Intervention designs varied considerably, but after removal of one outlier, heterogeneity across studies was small (I2 = 29.92%). There was no systematic evidence of publication bias. The overall effect size (Hedges' g, random effects model) for CCT versus control was small and statistically significant, g = 0.22 (95% CI 0.15 to 0.29). Small to moderate effect sizes were found for nonverbal memory, g = 0.24 (95% CI 0.09 to 0.38); verbal memory, g = 0.08 (95% CI 0.01 to 0.15); working memory (WM), g = 0.22 (95% CI 0.09 to 0.35); processing speed, g = 0.31 (95% CI 0.11 to 0.50); and visuospatial skills, g = 0.30 (95% CI 0.07 to 0.54). No significant effects were found for executive functions and attention. Moderator analyses revealed that home-based administration was ineffective compared to group-based training, and that more than three training sessions per week was ineffective versus three or fewer. There was no evidence for the effectiveness of WM training, and only weak evidence for sessions less than 30 min. These results are limited to healthy older adults, and do not address the durability of training effects.
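The pooled effect sizes and heterogeneity reported above can be sketched as follows: compute Hedges' g (a small-sample-corrected standardized mean difference) per study, then pool under a random-effects model. The three trial summaries below are invented for illustration, not the 52 reviewed studies, and the pooling uses the common DerSimonian-Laird estimator, which the review may or may not have used.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # correction factor
    g = j * d
    var = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var

def random_effects(gs, vs):
    """DerSimonian-Laird pooled estimate and I^2 heterogeneity statistic."""
    w = [1 / v for v in vs]
    fixed = sum(wi * gi for wi, gi in zip(w, gs)) / sum(w)
    q = sum(wi * (gi - fixed)**2 for wi, gi in zip(w, gs))
    df = len(gs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    w_re = [1 / (v + tau2) for v in vs]
    pooled = sum(wi * gi for wi, gi in zip(w_re, gs)) / sum(w_re)
    return pooled, i2

# Three made-up trials: (mean, sd, n) for the CCT arm, then control.
trials = [(0.30, 1.0, 60, 0.10, 1.0, 60),
          (0.25, 1.2, 80, 0.05, 1.2, 80),
          (0.15, 0.9, 50, 0.02, 0.9, 50)]
gs, vs = zip(*(hedges_g(*t) for t in trials))
pooled, i2 = random_effects(list(gs), list(vs))
print(pooled, i2)   # small positive pooled effect
```

When between-study heterogeneity (tau²) is zero, the random-effects estimate collapses to the fixed-effect inverse-variance average; larger heterogeneity pulls the study weights toward equality.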
CCT is modestly effective at improving cognitive performance in healthy older adults, but efficacy varies across cognitive domains and is largely determined by design choices. Unsupervised at-home training and training more than three times per week are specifically ineffective. Further research is required to enhance efficacy of the intervention.
Editors' Summary
As we get older, we notice many bodily changes. Our hair goes grey, we develop new aches and pains, and getting out of bed in the morning takes longer than it did when we were young. Our brain may also show signs of aging. It may take us longer to learn new information, we may lose our keys more frequently, and we may forget people's names. Cognitive decline—developing worsened thinking, language, memory, understanding, and judgment—can be a normal part of aging, but it can also be an early sign of dementia, a group of brain disorders characterized by a severe, irreversible decline in cognitive functions. We know that age-related physical decline can be attenuated by keeping physically active; similarly, engaging in activities that stimulate the brain throughout life is thought to enhance cognition in later life and reduce the risk of age-related cognitive decline and dementia. Thus, having an active social life and doing challenging activities that stimulate both the brain and the body may help to stave off cognitive decline.
Why Was This Study Done?
“Brain training” may be another way of keeping mentally fit. The sale of computerized cognitive training (CCT) packages, which provide standardized, cognitively challenging tasks designed to “exercise” various cognitive functions, is a lucrative and expanding business. But does CCT work? Given the rising global incidence of dementia, effective interventions that attenuate age-related cognitive decline are urgently needed. However, the impact of CCT on cognitive performance in older adults is unclear, and little is known about what makes a good CCT package. In this systematic review and meta-analysis, the researchers assess whether CCT programs improve cognitive test performance in cognitively healthy older adults and identify the aspects of cognition (cognitive domains) that are responsive to CCT, and the CCT design features that are most important in improving cognitive performance. A systematic review uses pre-defined criteria to identify all the research on a given topic; meta-analysis uses statistical methods to combine the results of several studies.
What Did the Researchers Do and Find?
The researchers identified 51 trials that investigated the effects of more than four hours of CCT on nearly 5,000 cognitively healthy older adults by measuring several cognitive functions before and after CCT. Meta-analysis of these studies indicated that the overall effect size for CCT (compared to control individuals who did not participate in CCT) was small but statistically significant. An effect size quantifies the difference between two groups; a statistically significant result is a result that is unlikely to have occurred by chance. So, the meta-analysis suggests that CCT slightly increased overall cognitive function. Notably, CCT also had small to moderate significant effects on individual cognitive functions. For example, some CCT slightly improved nonverbal memory (the ability to remember visual images) and working memory (the ability to remember recent events; short-term memory). However, CCT had no significant effect on executive functions (cognitive processes involved in planning and judgment) or attention (selective concentration on one aspect of the environment). The design of CCT used in the different studies varied considerably, and “moderator” analyses revealed that home-based CCT was not effective, whereas center-based CCT was effective, and that training sessions undertaken more than three times a week were not effective. There was also some weak evidence suggesting that CCT sessions lasting less than 30 minutes may be ineffective. Finally, there was no evidence for the effectiveness of working memory training by itself (for example, programs that ask individuals to recall series of letters).
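The effect sizes quoted here are Hedges' g values: standardized mean differences with a small-sample bias correction. As a rough illustration of how one such value is computed — a sketch with invented group statistics, not the authors' actual analysis code:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges' small-sample correction.
    All inputs here are hypothetical summary statistics, not study data."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp          # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)   # approximate correction factor J
    return d * j

# Hypothetical trial: training group scores 2 points higher than controls
g = hedges_g(mean_t=52.0, mean_c=50.0, sd_t=10.0, sd_c=10.0, n_t=50, n_c=50)
print(round(g, 3))  # → 0.198
```

A random-effects meta-analysis then pools one such g per comparison, weighting each by its precision; a value around 0.2, as in this toy example, falls in the same "small effect" range as the overall g = 0.22 reported above.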
What Do These Findings Mean?
These findings suggest that CCT produces small improvements in cognitive performance in cognitively healthy older adults but that the efficacy of CCT varies across cognitive domains and is largely determined by design aspects of CCT. The most important result was that “do-it-yourself” CCT at home did not produce improvements. Rather, the small improvements seen were in individuals supervised by a trainer in a center and undergoing sessions 1–3 times a week. Because only cognitively healthy older adults were enrolled in the studies considered in this systematic review and meta-analysis, these findings do not necessarily apply to cognitively impaired individuals. Moreover, because all the included studies measured cognitive function immediately after CCT, these findings provide no information about the durability of the effects of CCT or about how the effects of CCT on cognitive function translate into real-life outcomes for individuals such as independence and the long-term risk of dementia. The researchers call, therefore, for additional research into CCT, an intervention that might help to attenuate age-related cognitive decline and improve the quality of life for older individuals.
Additional Information
Please access these websites via the online version of this summary at
This study is further discussed in a PLOS Medicine Perspective by Druin Burch
The US National Institute on Aging provides information for patients and carers about age-related forgetfulness, about memory and cognitive health, and about dementia (in English and Spanish)
The UK National Health Service Choices website also provides information about dementia and about memory loss
MedlinePlus provides links to additional resources about memory, mild cognitive impairment, and dementia (in English and Spanish)
PMCID: PMC4236015  PMID: 25405755
13.  Depressive Symptoms in Oldest-Old Women: Risk of Mild Cognitive Impairment and Dementia 
Increasing evidence suggests that depression is a risk factor for cognitive impairment, but it is unclear if this is true among the oldest old. We determined whether elevated depressive symptoms predicted five-year incident mild cognitive impairment (MCI) or dementia, and neuropsychological test performance among oldest-old women.
Three study sites
302 women ≥85 years (mean, 87 ±2)
Depressive symptoms were measured with the 15-item Geriatric Depression Scale (GDS); scores ≥6 indicated elevated symptoms. Five years later, participants completed neuropsychological testing and clinical cognitive status was adjudicated.
In analyses of MCI vs. normal cognition, 70% of women with GDS ≥6 at baseline developed MCI vs. 37% with GDS <6. After adjustment for age, education, alcohol and benzodiazepine use, and study site, GDS ≥6 remained independently associated with much greater likelihood of developing MCI (multivariable odds ratio (MOR) = 3.71, 95% confidence interval (CI) 1.30, 10.59). In analyses of dementia vs. normal cognition, 65% of women with GDS ≥6 developed dementia compared to 37% of those with GDS <6 (MOR = 3.15, 95% CI 1.03, 9.65). Only 19% of women with GDS ≥6 had normal cognitive status five years later, compared to 46% of those with GDS <6 (MOR = 0.28, 95% CI 0.11, 0.73). Women with elevated depressive symptoms had worse scores on tests of global cognition and working memory.
Elevated depressive symptoms are an important risk factor for cognitive disorders and lower cognitive performance among women living to their ninth and tenth decades.
PMCID: PMC3326212  PMID: 22015706
oldest-old; women; depression; mild cognitive impairment; dementia
14.  A prospective cohort study of long-term cognitive changes in older Medicare beneficiaries 
BMC Public Health  2011;11:710.
Promoting cognitive health and preventing its decline are longstanding public health goals, but long-term changes in cognitive function are not well-documented. Therefore, we first examined long-term changes in cognitive function among older Medicare beneficiaries in the Survey on Assets and Health Dynamics among the Oldest Old (AHEAD), and then we identified the risk factors associated with those changes in cognitive function.
We conducted a secondary analysis of a prospective, population-based cohort using baseline (1993-1994) interview data linked to 1993-2007 Medicare claims to examine cognitive function at the final follow-up interview which occurred between 1995-1996 and 2006-2007. Besides traditional risk factors (i.e., aging, age, race, and education) and adjustment for baseline cognitive function, we considered the reason for censoring (entrance into managed care or death), and post-baseline continuity of care and major health shocks (hospital episodes). Residual change score multiple linear regression analysis was used to predict cognitive function at the final follow-up using data from telephone interviews among 3,021 to 4,251 (sample size varied by cognitive outcome) baseline community-dwelling self-respondents that were ≥ 70 years old, not in managed Medicare, and had at least one follow-up interview as self-respondents. Cognitive function was assessed using the 7-item Telephone Interview for Cognitive Status (TICS-7; general mental status), and the 10-item immediate and delayed (episodic memory) word recall tests.
Mean changes in the number of correct responses on the TICS-7, and 10-item immediate and delayed word recall tests were -0.33, -0.75, and -0.78, with 43.6%, 54.9%, and 52.3% declining and 25.4%, 20.8%, and 22.9% unchanged. The main and most consistent risks for declining cognitive function were the baseline values of cognitive function (reflecting substantial regression to the mean), aging (a strong linear pattern of increased decline associated with greater aging, but with diminishing marginal returns), older age at baseline, dying before the end of the study period, lower education, and minority status.
In addition to aging, age, minority status, and low education, substantial and differential risks for cognitive change were associated with sooner vs. later subsequent death that help to clarify the terminal drop hypothesis. No readily modifiable protective factors were identified.
PMCID: PMC3190354  PMID: 21933430
15.  Frequent Cognitive Activity Compensates for Education Differences in Episodic Memory 
To test the hypothesis that frequent participation in cognitive activities can moderate the effects of limited education on cognitive functioning.
A national study of adult development and aging, Midlife in the United States (MIDUS), with assessments conducted at the second wave of measurement in 2004-2006.
Assessments were made over the telephone (cognitive measures) and in a mail questionnaire (demographic variables, measures of cognitive and physical activity, and self-rated health).
A total of 3343 men and women between the ages of 32 and 84 with a mean age of 55.99.
The dependent variables were Episodic Memory (Immediate and Delayed Word List Recall) and Executive Functioning (Category Fluency, Backward Digit Span, Backward Counting Speed, Reasoning, and Attention Switching Speed). The independent variables were years of education and frequency of cognitive activity (reading, writing, doing word games or puzzles, and attending lectures). The covariates were age, sex, self-rated health, income, and frequency of physical activity.
The two cognitive measures were regressed on education, cognitive activity frequency, and their interaction, while controlling for the covariates. Education and cognitive activity were significantly correlated with both cognitive abilities. The interaction of education and cognitive activity was significant for episodic memory, but not for executive functioning.
Those with lower education had lower cognitive functioning, but this was qualified by level of cognitive activity. For those with lower education, engaging frequently in cognitive activities showed significant compensatory benefits for episodic memory, which has promise for reducing social disparities in cognitive aging.
PMCID: PMC2855891  PMID: 20094014
cognitive activity; education; memory; executive function; cognitive aging
16.  Association of Lifetime Intellectual Enrichment with Cognitive Decline in the Older Population 
JAMA neurology  2014;71(8):1017-1024.
Intellectual lifestyle enrichment throughout life is increasingly viewed as a protective strategy against commonly observed cognitive decline in the elderly.
To investigate the association of lifetime intellectual enrichment with baseline cognitive performance and rate of cognitive decline in a non-demented elderly population and to estimate difference (in years) associated with lifetime intellectual enrichment to the onset of cognitive impairment.
Prospective analysis of subjects enrolled in the Mayo Clinic Study of Aging (MCSA), a longitudinal population-based study of cognitive aging in Olmsted County, Minnesota. We studied 1995 non-demented (1718 cognitively normal, 277 MCI) participants in MCSA who completed intellectual lifestyle measures at baseline and underwent at least one follow-up visit.
We studied the effect of lifetime intellectual enrichment by separating the variables into two non-overlapping principal components: education/occupation-score and mid/late-life cognitive activity measure based on self-report questionnaires. A global cognitive Z-score served as our summary cognition measure. We used linear mixed-effects models to investigate the associations of demographic and intellectual enrichment measures with global cognitive Z-score trajectories.
Baseline cognitive performance was lower in older subjects, in men, and in those with lower education/occupation, lower mid/late-life cognitive activity, and the apolipoprotein E ε4 (APOE4) genotype. The interaction between the two intellectual enrichment measures was significant such that the beneficial effect of mid/late-life cognitive activity on baseline cognitive performance was reduced with increasing education/occupation. Only baseline age, mid/late-life cognitive activity, and APOE4 genotype were significantly associated with longitudinal change in cognitive performance from baseline. For APOE4 carriers with high lifetime intellectual enrichment (75th percentile of both education/occupation and mid/late-life cognitive activity), the onset of cognitive impairment was about 8.7 years later compared with low lifetime intellectual enrichment (25th percentile of both education/occupation and mid/late-life cognitive activity) in an 80-year-old subject.
Higher levels of education/occupation were associated with higher levels of cognition. Higher levels of mid/late-life leisure activity were also associated with higher levels of cognition, but the slope of this relationship slightly increased over time. Lifetime intellectual enrichment might delay the onset of cognitive impairment and be used as a successful preventive intervention to reduce the impending dementia epidemic.
PMCID: PMC4266551  PMID: 25054282
17.  The Fall and Rise of US Inequities in Premature Mortality: 1960–2002 
PLoS Medicine  2008;5(2):e46.
Debates exist as to whether, as overall population health improves, the absolute and relative magnitude of income- and race/ethnicity-related health disparities necessarily increase—or decrease. We accordingly decided to test the hypothesis that health inequities widen—or shrink—in a context of declining mortality rates, by examining annual US mortality data over a 42-year period.
Methods and Findings
Using US county mortality data from 1960–2002 and county median family income data from the 1960–2000 decennial censuses, we analyzed the rates of premature mortality (deaths among persons under age 65) and infant death (deaths among persons under age 1) by quintiles of county median family income weighted by county population size. Between 1960 and 2002, as US premature mortality and infant death rates declined in all county income quintiles, socioeconomic and racial/ethnic inequities in premature mortality and infant death (both relative and absolute) shrank between 1966 and 1980, especially for US populations of color; thereafter, the relative health inequities widened and the absolute differences barely changed in magnitude. Had all persons experienced the same yearly age-specific premature mortality rates as the white population living in the highest income quintile, between 1960 and 2002, 14% of the white premature deaths and 30% of the premature deaths among populations of color would not have occurred.
The observed trends refute arguments that health inequities inevitably widen—or shrink—as population health improves. Instead, the magnitude of health inequalities can fall or rise; it is our job to understand why.
Nancy Krieger and colleagues found evidence of decreasing, and then increasing or stagnating, socioeconomic and racial inequities in US premature mortality and infant death from 1960 to 2002.
Editors' Summary
One of the biggest aims of public health advocates and governments is to improve the health of the population. Improving health increases people's quality of life and helps the population be more economically productive. But within populations are often persistent differences (usually called “disparities” or “inequities”) in the health of different subgroups—between women and men, different income groups, and people of different races/ethnicities, for example. Researchers study these differences so that policy makers and the broader public can be informed about what to do to intervene. For example, if we know that the health of certain subgroups of the population—such as the poor—is staying the same or even worsening as the overall health of the population is improving, policy makers could design programs and devote resources to specifically target the poor.
To study health disparities, researchers use both relative and absolute measures. Relative inequities refer to ratios, while absolute inequities refer to differences. For example, if one group's average income level increases from $1,000 to $10,000 and another group's from $2,000 to $20,000, the relative inequality between the groups stays the same (i.e., the ratio of incomes between the two groups is still 2) but the absolute difference between the two groups has increased from $1,000 to $10,000.
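The distinction can be made concrete in a few lines, using the summary's own income example:

```python
def inequity_measures(low_group, high_group):
    """Relative inequity is a ratio; absolute inequity is a difference."""
    return high_group / low_group, high_group - low_group

# The summary's example: both groups' incomes grow tenfold
print(inequity_measures(1_000, 2_000))    # → (2.0, 1000)
print(inequity_measures(10_000, 20_000))  # → (2.0, 10000)
```

The ratio is unchanged while the gap grows tenfold — which is why studies of disparities, including this one, report both measures.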
Examining the US population, Nancy Krieger and colleagues looked at trends over time in both relative and absolute differences in mortality between people in different income groups and between whites and people of color.
Why Was This Study Done?
There has been a lot of debate about whether disparities have been widening or narrowing as overall population health improves. Some research has found that both total health and health disparities are getting better with time. Other research has shown that overall health gains mask worsening disparities—such that the rich get healthier while the poor get sicker.
Having access to more data over a longer time frame meant that Krieger and colleagues could provide a more complete picture of this sometimes contradictory story. It also meant they could test their hypothesis about whether health inequities necessarily widen or shrink as population health improves, across a period (the 1960s through the 1990s) during which specific events and policies would likely have affected US mortality trends.
What Did the Researchers Do and Find?
In order to investigate health inequities, the authors chose to look at two common measures of population health: rates of premature mortality (dying before the age of 65 years) and rates of infant mortality (death before the age of 1).
To determine mortality rates, the authors used death statistics data from different counties, which are routinely collected by state and national governments. To be able to rank mortality rates for different income groups, they used data on the median family incomes of people living within those counties (meaning half the families had incomes above, and half had incomes below, the median value). They calculated mortality rates for the total population and for whites versus people of color. They used data from 1960 through 2002. They compared rates for 1966–1980 with two other time periods: 1960–1965 and 1981–2002. They also examined trends in the annual mortality rates and in the annual relative and absolute disparities in these rates by county income level.
Over the whole period 1960–2002, the authors found that premature mortality (death before the age of 65) and infant mortality (death before the age of 1) decreased for all income groups. But they also found that disparities between income groups and between whites and people of color were not the same over this time period. In fact, the economic disparities narrowed then widened. First, they shrank between 1966 and 1980, especially for Americans of color. After 1980, however, the relative health inequities widened and the absolute differences did not change. The authors conclude that if all people in the US population experienced the same health gains as the most advantaged did during these 42 years (i.e., as the whites in the highest income groups), 14% of the premature deaths among whites and 30% of the premature deaths among people of color would have been prevented.
What Do These Findings Mean?
The findings provide an overview of the trends in inequities in premature and infant mortality over a long period of time. Different explanations for these trends can now be tested. The authors discuss several potential reasons for these trends, including generally rising incomes across America and changes related to specific diseases, such as the advent of HIV/AIDS, changes in smoking habits, and better management of cancer and cardiovascular disease. But they find that these do not explain the fall then rise of inequities. Instead, the authors suggest that explanations lie in the social programs of the 1960s and the subsequent roll-back of some of these programs in the 1980s. The mid-1960s saw the US "War on Poverty," civil rights legislation, and the establishment of Medicare, all of which were intended to reduce socioeconomic and racial/ethnic inequalities and improve access to health care. In the 1980s there was a general cutting back of welfare state provisions in America, which included cuts to public health and antipoverty programs, tax relief for the wealthy, and worsening inequity in the access to and quality of health care. Together, these wider events could explain the fall then rise trends in mortality disparities.
The authors say their findings are important to inform and help monitor the progress of various policies and programmes, including those such as the Healthy People 2010 initiative in America, which aims to increase the quality and years of healthy life and decrease health disparities by the end of this decade.
Additional Information
Please access these Web sites via the online version of this summary at
Healthy People 2010 was created by the US Department of Health and Human Services along with scientists inside and outside of government and includes a comprehensive set of disease prevention and health promotion objectives for the US to achieve by 2010, with two overarching goals: to increase quality and years of healthy life and to eliminate health disparities
Johan Mackenbach and colleagues provide an overview of mortality inequalities in six Western European countries—Finland, Sweden, Norway, Denmark, England/Wales, and Italy—and conclude that eliminating mortality inequalities requires that more cardiovascular deaths among lower socioeconomic groups be prevented, as well as more attention be paid to rising death rates of lung cancer, breast cancer, respiratory disease, gastrointestinal disease, and injuries among women and men in the lower income groups.
The WHO Health for All program promotes health equity
A primer on absolute versus relative differences is provided by the American College of Physicians
PMCID: PMC2253609  PMID: 18303941
18.  Late Life Leisure Activities and Risk of Cognitive Decline 
Studies concerning the effect of different types of leisure activities on various cognitive domains are limited. This study tests the hypothesis that mental, physical, and social activities have a domain-specific protection against cognitive decline.
A cohort of a geographically defined population in China was examined in 2003–2005 and followed for an average of 2.4 years. Leisure activities were assessed in 1,463 adults aged 65 years and older without cognitive or physical impairment at baseline, and their cognitive performances were tested at baseline and follow-up examinations.
High level of mental activity was related to less decline in global cognition (β = −.23, p < .01), language (β = −.11, p < .05), and executive function (β = −.13, p < .05) in ANCOVA models adjusting for age, gender, education, history of stroke, body mass index, Apolipoprotein E genotype, and baseline cognition. High level of physical activity was related to less decline in episodic memory (β = −.08, p < .05) and language (β = −.15, p < .01). High level of social activity was associated with less decline in global cognition (β = −.11, p < .05). Further, a dose-response pattern was observed: although participants who did not engage in any of the three activities experienced a significant global cognitive decline, those who engaged in any one of the activities maintained their cognition, and those who engaged in two or three activities improved their cognition. The same pattern was observed in men and in women.
Leisure activities in old age may protect against cognitive decline for both women and men, and different types of activities seem to benefit different cognitive domains.
PMCID: PMC3598354  PMID: 22879456
Cognitive function; Leisure activities; Mental activity; Physical activity; Social activity
19.  Cognitive Impairment in the Age-Related Eye Disease Study 
Archives of ophthalmology  2006;124(4):537-543.
To investigate potential associations between cognitive function and/or impairment and age-related macular degeneration (AMD) and visual impairment in the Age-Related Eye Disease Study (AREDS).
The AREDS is an 11-center natural history study of AMD and age-related cataract. The AREDS Cognitive Function Battery was administered to 2946 participants. The battery consists of 6 neuropsychological tests measuring performance in several cognitive domains. The Dunnett multiple comparison test was used to identify differences by AMD and visual acuity severity. The relationship with cognitive impairment was also assessed using logistic regression.
Mean scores of instruments in the AREDS Cognitive Function Battery declined with increased macular abnormalities and reduced visual acuity. After adjustment for age, sex, race, education, smoking status, diabetes mellitus, hypertension, and depression, increased macular abnormalities (trend P value <.05) were associated with reduced mean cognitive function scores as measured by the Modified Mini-Mental State Examination and the Wechsler Logical Memory Scale. Reduced vision was found to be associated with reduced mean cognitive function scores as measured by the Modified Mini-Mental State Examination and letter and verbal fluency tasks. Persons with vision worse than 20/40 OU were more likely to be cognitively impaired (Modified Mini-Mental State Examination score <80) (odds ratio, 2.88 [95% confidence interval, 1.75–4.76]) compared with persons with visual acuity of 20/40 or better OU.
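For readers unfamiliar with the odds ratio used here, it compares the odds of an outcome between two groups. A minimal sketch with invented 2×2 counts (not AREDS data):

```python
def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table:
    a = exposed with outcome,   b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    return (a / b) / (c / d)

# Invented illustration: 30 of 100 people with poor vision cognitively
# impaired, versus 13 of 100 with good vision
print(round(odds_ratio(30, 70, 13, 87), 2))  # → 2.87
```

In the study itself the ratio was estimated by logistic regression, which additionally adjusts for covariates such as age and education rather than using raw counts.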
These data suggest a possible association of advanced AMD and visual acuity with cognitive impairment in older persons.
PMCID: PMC1472655  PMID: 16606880
20.  Morning Cortisol Levels and Cognitive Abilities in People With Type 2 Diabetes 
Diabetes Care  2010;33(4):714-720.
People with type 2 diabetes are at increased risk of cognitive impairment but the mechanism is uncertain. Elevated glucocorticoid levels in rodents and humans are associated with cognitive impairment. We aimed to determine whether fasting cortisol levels are associated with cognitive ability and estimated lifetime cognitive change in an elderly population with type 2 diabetes.
This was a cross-sectional study of 1,066 men and women aged 60–75 years with type 2 diabetes, living in Lothian, Scotland (the Edinburgh Type 2 Diabetes Study). Cognitive abilities in memory, nonverbal reasoning, information processing speed, executive function, and mental flexibility were tested, and a general cognitive ability factor, g, was derived. Prior intelligence was estimated from vocabulary testing, and adjustment for scores on this test was used to estimate lifetime cognitive change. Relationships between fasting morning plasma cortisol levels and cognitive ability and estimated cognitive change were tested. Models were adjusted for potential confounding and/or mediating variables including metabolic and cardiovascular variables.
In age-adjusted analyses, higher fasting cortisol levels were not associated with current g or with performance in individual cognitive domains. However, higher fasting cortisol levels were associated with greater estimated cognitive decline in g and in tests of working memory and processing speed, independent of mood, education, metabolic variables, and cardiovascular disease (P < 0.05).
High morning cortisol levels in elderly people with type 2 diabetes are associated with estimated age-related cognitive change. Strategies targeted at lowering cortisol action may be useful in ameliorating cognitive decline in individuals with type 2 diabetes.
PMCID: PMC2845011  PMID: 20097784
21.  Prospective study of type 2 diabetes and cognitive decline in women aged 70-81 years 
BMJ : British Medical Journal  2004;328(7439):548.
Objective To examine the association of type 2 diabetes with baseline cognitive function and cognitive decline over two years of follow up, focusing on women living in the community and on the effects of treatments for diabetes.
Design Nurses' health study in the United States. Two cognitive interviews were carried out by telephone during 1995-2003.
Participants 18 999 women aged 70-81 years who had been registered nurses completed the baseline interview; to date, 16 596 participants have completed follow up interviews after two years.
Main outcome measures Cognitive assessments included telephone interview of cognitive status, immediate and delayed recalls of the East Boston memory test, test of verbal fluency, delayed recall of 10 word list, and digit span backwards. Global scores were calculated by averaging the results of all tests with z scores.
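Averaging z scores across tests into a global composite, as described in the outcome measures, can be sketched as follows (toy numbers, not study data):

```python
from statistics import mean, stdev

def z_scores(values):
    """Standardize one test's scores to mean 0, sample SD 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

def global_scores(test_results):
    """Average each participant's z scores across tests into one composite.
    test_results: one inner list of scores per test, same participant order."""
    per_test_z = [z_scores(t) for t in test_results]
    return [mean(zs) for zs in zip(*per_test_z)]

# Three hypothetical participants on two tests (invented numbers)
status = [30, 35, 40]  # e.g. mental status interview scores
recall = [4, 6, 8]     # e.g. word-recall counts
print(global_scores([status, recall]))  # → [-1.0, 0.0, 1.0]
```

Standardizing first puts tests with different scales (a 41-point interview versus a 10-word recall list) on equal footing before averaging.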
Results After multivariate adjustment, women with type 2 diabetes performed worse on all cognitive tests than women without diabetes at baseline. For example, women with diabetes were at 25-35% increased odds of poor baseline score (defined as bottom 10% of the distribution) compared with women without diabetes on the telephone interview of cognitive status and the global composite score (odds ratios 1.34, 95% confidence interval 1.14 to 1.57, and 1.26, 1.06 to 1.51, respectively). Odds of poor cognition were particularly high for women who had had diabetes for a long time (1.52, 1.15 to 1.99, and 1.49, 1.11 to 2.00, respectively, for ≥ 15 years' duration). In contrast, women with diabetes who were on oral hypoglycaemic agents performed similarly to women without diabetes (1.06 and 0.99), while women not using any medication had the greatest odds of poor performance (1.71, 1.28 to 2.28, and 1.45, 1.04 to 2.02) compared with women without diabetes. There was also a modest increase in odds of poor cognition among women using insulin treatment. All findings were similar when cognitive decline was examined over time.
Conclusions Women with type 2 diabetes had increased odds of poor cognitive function and substantial cognitive decline. Use of oral hypoglycaemic therapy, however, may ameliorate risk.
PMCID: PMC381043  PMID: 14980984
22.  Income non-reporting: implications for health inequalities research 
OBJECTIVES—To determine whether, in the context of a face to face interview, socioeconomic groups differ in their propensity to provide details about the amount of their personal income, and to discuss the likely consequences of any differences for studies that use income based measures of socioeconomic position.
DESIGN AND SETTING—The study used data from the 1995 Australian Health Survey. The sample was selected using a stratified multi-stage area design that covered urban and rural areas across all States and Territories and included non-institutionalised residents of private and non-private dwellings. The response rate was 91.5% for selected dwellings and 97.0% for persons within dwellings. Data were collected using face to face interviews. Income response, the dependent measure, was binary coded (0 if income was reported and 1 for refusals, "don't knows" and insufficient information). Socioeconomic position was measured using employment status, occupation, education and main income source. The socioeconomic characteristics of income non-reporters were initially examined using sex specific age adjusted proportions with 95% confidence intervals. Multivariate analysis was performed using logistic regression.
PARTICIPANTS—Persons aged 15-64 (n=33 434) who were reportedly in receipt of an income from one or more sources during the data collection reference period.
RESULTS—The overall rate of income non-response was 9.8%. The propensity not to report income increased with age (15-29 years 5.8%, 30-49 years 10.6%, 50-64 years 13.8%). No gender differences were found (men 10.2%, women 9.3%). Income non-response was neither strongly nor consistently related to education or occupation for men, although an association was suggested for women, with highly educated women and those in professional occupations being less likely to report their income. Strong associations were evident between income non-response, labour force status and main income source. Rates were highest among the employed and those in receipt of an income from their own business or partnership, and lowest among the unemployed and those in receipt of a government pension or benefit (which excluded the unemployed).
CONCLUSION—Given that differences in income non-reporting were small to moderate across levels of the education and occupation variables, and that the propensity not to report income was greater among higher socioeconomic groups, estimates of the relation between income and health are unlikely to be affected by socioeconomic variability in income non-response. Probability estimates from a logistic regression suggested that the higher rates of income non-reporting among employed persons who received their income from a business or partnership were not attributable to socioeconomic factors. Rather, it is proposed that these higher rates were attributable to recall effects, or to concerns about income information being disclosed to taxation authorities. Future studies need to replicate this analysis to determine whether the results generalise to other survey and data collection contexts. The analysis should also be extended to include an examination of the relation between socioeconomic position and the accuracy of income reporting. Little is known about this issue, yet it represents a potential source of bias that may have important implications for studies investigating the association between income and health.

Keywords: socioeconomic position; income non-response; data quality
PMCID: PMC1731636  PMID: 10746115
23.  Monounsaturated, trans & saturated fatty acids and cognitive decline in women 
Objective To prospectively assess the effects of select dietary fats on cognitive decline.
Design Prospective observational study; 3-year follow-up.
Setting Subjects recruited at Northwestern University who had participated in the Women's Health Initiative Observational Study or the control group of its Diet Modification arm.
Participants 482 women aged ≥ 60 years.
Methods We averaged dietary intake from a validated food frequency questionnaire (FFQ) administered twice (mean = 2.7 years apart) before baseline cognitive assessment (mean = 2.9 years after the second FFQ). Testing of memory, vision, executive function, language, and attention was performed at two time points, 3 years apart. We created a global Z-score for both time points by averaging all Z-scores for each participant and defined global cognitive change as the difference between follow-up and baseline Z-scores.
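The global Z-score construction described above can be sketched as follows, assuming each test is standardised against the cohort's own mean and standard deviation; the test names and raw scores are illustrative placeholders, not the study's data:

```python
from statistics import mean, stdev

def z_scores(raw):
    """Standardise one test's raw scores across all participants."""
    mu, sd = mean(raw), stdev(raw)
    return [(x - mu) / sd for x in raw]

def global_scores(tests):
    """Average each participant's z scores over all tests."""
    standardised = [z_scores(raw) for raw in tests.values()]
    return [mean(per_test) for per_test in zip(*standardised)]

# Hypothetical battery: one raw-score list per test, one entry per participant.
tests = {
    "memory":             [30, 25, 35, 28],
    "verbal_fluency":     [14, 10, 18, 12],
    "delayed_word_recall": [6, 4, 9, 5],
}
print(global_scores(tests))
```

Global cognitive change would then be the difference between a participant's follow-up and baseline global scores computed this way.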
Median intakes of saturated fats (SFA), trans fats (TFA), dietary cholesterol (DC) and monounsaturated fats (MUFA) were 18.53 g/d, 3.45 g/d, 0.201 g/d and 19.39 g/d, respectively. There were no associations between degree of cognitive decline and intakes of SFA (p=0.69), TFA (p=0.54) or DC (p=0.64) after adjusting for baseline cognition, total energy, age, education, reading ability, Apolipoprotein E (ε4) allele, BMI, estrogen and beta-blocker use, and intake of caffeine and other fatty acids. In contrast, higher MUFA intake was associated with less cognitive decline in fully adjusted linear regression models: decline of 0.21 ± 0.05 (SE) in the lowest quartile versus 0.05 ± 0.05 (SE) in the highest (p=0.02). This effect of MUFA intake was seen primarily in the visual and memory domains (p=0.03 for both).
Higher intakes of SFA, TFA and DC in these women were not associated with cognitive decline, whereas higher MUFA intake was associated with less cognitive decline.
PMCID: PMC3098039  PMID: 21568955
Keywords: fatty acids; cognitive decline; monounsaturated fat intake
24.  Cognitive function is associated with risk aversion in community-based older persons 
BMC Geriatrics  2011;11:53.
Emerging data from younger and middle-aged persons suggest that cognitive ability is negatively associated with risk aversion, but this association has not been studied among older persons who are at high risk of experiencing loss of cognitive function.
Using data from 369 community-dwelling older persons without dementia from the Rush Memory and Aging Project, an ongoing longitudinal epidemiologic study of aging, we examined the correlates of risk aversion and tested the hypothesis that cognition is negatively associated with risk aversion. Global cognition and five specific cognitive abilities were measured via detailed cognitive testing. Risk aversion was measured using standard behavioral economics questions in which participants chose between a certain monetary payment ($15) and a gamble in which they could gain more than $15 or gain nothing; potential gamble gains ranged from $21.79 to $151.19, with gain amounts varied randomly across questions. We first examined the bivariate associations of age, education, sex, income and cognition with risk aversion. Next, we examined the associations between cognition and risk aversion via mixed models adjusted for age, sex, education, and income. Finally, we conducted sensitivity analyses to ensure that our results were not driven by persons with preclinical cognitive impairment.
In bivariate analyses, sex, education, income and global cognition were associated with risk aversion. However, in a mixed effects model, only sex (estimate = -1.49, standard error (SE) = 0.39, p < 0.001) and global cognitive function (estimate = -1.05, SE = 0.34, p < 0.003) remained significantly inversely associated with risk aversion. Thus, a lower level of global cognitive function and female sex were associated with greater risk aversion. Moreover, performance on four of the five cognitive domains (semantic memory, episodic memory, working memory, and perceptual speed) was negatively related to risk aversion; performance on visuospatial abilities was not.
A lower level of cognitive ability and female sex are associated with greater risk aversion in advanced age.
PMCID: PMC3182132  PMID: 21906402
25.  Cognitive Impairment in Persons With Rheumatoid Arthritis 
Arthritis care & research  2012;64(8):1144-1150.
To explore the prevalence and possible predictors of cognitive impairment in persons with rheumatoid arthritis (RA).
Individuals from a longitudinal cohort study of RA participated in a study visit that included a range of physical, psychosocial, and biologic measures. Cognitive function was assessed using a battery of 12 standardized neuropsychological measures yielding 16 indices. Subjects were classified as “impaired” if they performed at least 1 SD below age-based population norms on at least 4 of the 16 indices. Logistic regression analyses were conducted to identify which of the following were significant predictors of cognitive impairment: sex, race, income, education, depression, disease duration, disease severity, C-reactive protein (CRP) level, glucocorticoid use, and cardiovascular disease (CVD) risk factors.
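The classification rule above can be sketched as a small function; the norm values, raw scores, and the helper name `is_impaired` are hypothetical illustrations, not the study's data:

```python
# Flag a subject as cognitively impaired when at least 4 of the 16
# indices fall 1 SD or more below the age-based population norm.
def is_impaired(scores, norms, min_indices=4):
    """scores: one value per index; norms: (mean, sd) per index."""
    below = sum(
        1 for score, (mu, sd) in zip(scores, norms)
        if score <= mu - sd
    )
    return below >= min_indices

norms = [(100.0, 15.0)] * 16        # same norm for every index, for brevity
scores = [84.0] * 5 + [100.0] * 11  # five indices a full SD below the mean
print(is_impaired(scores, norms))   # → True
```

With only three indices below the threshold the same subject would be classified as unimpaired, since the rule requires at least four.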
A total of 115 subjects with a mean ± SD age of 58.6 ± 10.8 years were included; 64% were women and 81% were white. The proportion of persons who were classified as cognitively impaired was 31%. Education, income, glucocorticoid use, and CVD risk factors independently predicted cognitive impairment, controlling for sex, race, disease duration, disease severity, CRP level, and depression. Individuals with cognitive impairment were more likely to have low education (odds ratio [OR] 6.18, 95% confidence interval [95% CI] 1.6–23.87), have low income (OR 7.12, 95% CI 1.35–37.51), use oral glucocorticoids (OR 2.92, 95% CI 1.05–8.12), and have increased CVD risk factors (OR 1.61, 95% CI 1.19–2.17 per risk factor).
The findings of this study suggest that the burden of cognitive impairment in RA is significant, and future studies identifying specific etiologic contributors to cognitive impairment are warranted.
PMCID: PMC3744877  PMID: 22505279
