Increasing awareness that minimal or mild cognitive impairment (MCI) in the elderly may be a precursor of dementia has led to an increase in the number of people attending memory clinics. We aimed to develop a way of predicting the period of time before cognitive impairment occurs in community-dwelling elderly. The method is illustrated by the use of simple tests of different cognitive domains.
A cohort of 241 normal elderly volunteers was followed for up to 20 years with regular assessments of cognitive abilities using the Cambridge Cognitive Examination (CAMCOG); 91 participants developed MCI. We used interval-censored survival analysis statistical methods to model which baseline cognitive tests best predicted the time to convert to MCI.
Out of several baseline variables, only age and CAMCOG subscores for expression and learning/memory were predictors of the time to conversion. The time to conversion was 14% shorter for each 5 years of age, 17% shorter for each point lower in the expression score, and 15% shorter for each point lower in the learning score. We present in tabular form the probability of converting to MCI over intervals between 2 and 10 years for different combinations of expression and learning scores.
In apparently normal elderly people, subtle measurable cognitive deficits that occur within the normal range on standard testing protocols reliably predict the time to clinically relevant cognitive impairment long before clinical symptoms are reported.
Mild cognitive impairment (MCI), as operationally defined, often represents the predementia stage of a neurodegenerative disorder in elderly subjects, including Alzheimer disease (AD), vascular dementia, or other dementia.1,2 Amnestic MCI (aMCI), with a predominant impairment in episodic memory, may be a precursor of AD.3–5
Several studies have shown that deficits in memory retention and abstract reasoning are the most important cognitive predictors of AD, problems being evident more than 10 years before diagnosis.6–9 However, little is known about the factors that predict when normal subjects, or those with minimally impaired cognition, will develop aMCI.10–12 Sensitive neuropsychological tests have been shown to detect cognitive dysfunction in elderly who show no signs of dementia.13,14 The Cambridge Neuropsychological Test Automated Battery Paired Associates Learning Test was shown to have improved specificity and positive predictive value for conversion to MCI in 4 years when used at 2 consecutive time points in a cohort considered cognitively healthy at baseline.11 Impairments in other cognitive domains are also associated with conversion to MCI,15 as well as early life linguistic ability.16
The focus of this study was the probability of developing future cognitive impairment in the cognitively healthy elderly. Instead of simply identifying risk factors for cognitive decline, we aimed to develop a method of estimating the duration of time before conversion to aMCI, based on an initial cognitive score or a combination of test scores, in subjects initially classified as cognitively healthy.
The Oxford Project to Investigate Memory and Ageing (OPTIMA) study began in 1988 as a case-control study of subjects with probable and possible AD (http://www.medsci.ox.ac.uk/optima). At study entry, participants were classified as being control, probable/possible AD,17 or other dementia syndrome according to clinical diagnostic procedures current at the time. For this study, all participants enrolled in OPTIMA as cognitively healthy controls from 1988 until the end of 2008 with a record of at least 2 visits were included in the selection procedure. At their first episode, all participants were given the Cambridge Examination for Disorders of the Elderly (CAMDEX),18 which consists of an informant interview and a patient interview that includes a cognitive assessment, the Cambridge Cognitive Examination (CAMCOG). The CAMCOG comprises subtests including orientation, comprehension, expression, recent memory, remote memory, learning, abstract thinking, perception, praxis, attention, and calculation as well as a derived Mini-Mental State Examination (MMSE) score (appendix e-1 on the Neurology® Web site at www.neurology.org).
Control subjects (n = 241) were defined as those fulfilling all of the following criteria: “National Institute of Neurological and Communicative Disorders and Stroke (NINCDS) negative,” no dementia present, MMSE score ≥24, and CAMCOG learning subscore ≥13 out of 17 points.
The study was approved by the local ethics committee (COREC C1656), and all subjects gave written consent for their participation.
Diagnosis (control or MCI) was checked at each visit for each participant to determine the date when conversion to aMCI (if it occurred) was first identified. Conversion to aMCI was defined when a participant was found to have fulfilled all of the following criteria1 by a neuropsychologist (C.A.d.J.) who examined the clinical notes recorded by physicians:
The important property that characterizes conversion data is that the conversion to MCI (or to any other state), if it happens, always occurs between visits. The exact date of conversion is usually not known, only the interval during which the conversion has occurred. This property defines the interval-censored data framework.20 We defined the main outcome measure as the duration of time (in years) between the first visit and the date conversion to MCI was first identified, if the subject converted, or until the last visit recorded, if the subject had not converted. For those subjects who converted to MCI, the date of conversion is interval-censored because the conversion took place between visits. Subjects who did not convert were right-censored; these included 21 who withdrew and 49 who died. The nature of the outcome variable suggested that survival analysis would be the best method for data analysis. The standard Cox proportional hazard model, in which the interval is replaced by its midpoint, could have been used. However, to avoid the approximations and potential biases of using the midpoint of the interval, we adopted different approaches that accommodate the interval-censored survival data (appendix e-1).
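The encoding described above can be sketched in a few lines of code. This is an illustrative example only, with hypothetical subjects and helper names not taken from the study: a converter's conversion time is known only to lie between the visit at which conversion was first identified and the previous visit, while a non-converter is right-censored at the last recorded visit (right endpoint set to infinity).

```python
import math

# Hypothetical visit records (years since first visit) for three subjects.
# Conversion to MCI, if observed, is only known to lie between two visits;
# subjects still classified as controls at their last visit are right-censored.
subjects = {
    "A": {"visits": [0, 2, 4, 6], "converted_at_visit": 6},   # converted between yr 4 and 6
    "B": {"visits": [0, 3, 5], "converted_at_visit": None},   # right-censored at yr 5
    "C": {"visits": [0, 2], "converted_at_visit": 2},         # converted between yr 0 and 2
}

def censoring_interval(record):
    """Return the (left, right) interval known to contain the conversion time.
    Right-censored subjects get right = infinity."""
    visits = record["visits"]
    conv = record["converted_at_visit"]
    if conv is None:
        return (visits[-1], math.inf)   # no conversion observed by the last visit
    i = visits.index(conv)
    return (visits[i - 1], visits[i])   # conversion occurred between these two visits

for name, rec in subjects.items():
    print(name, censoring_interval(rec))
```

Interval-censored survival routines (e.g., parametric fitters in the lifelines Python package or `Surv(left, right, type = "interval2")` in R's survival package) consume exactly this pair of endpoints per subject.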
The smoothing accelerated failure time (AFT) procedure, using G-splines,21 was selected. This method is a trade-off between parametric and semiparametric AFT models. It is naturally adapted to interval-censored data, can handle many predictors (in contrast to its semiparametric version), and retains the advantages of parametric methods while leaving the baseline distribution, in practice, largely unspecified. In this statistical model, all CAMCOG subscores were tested as potential predictors of the duration of time to conversion to MCI. The following covariates were also included in the model: age, years of school education, further education, gender, and ApoE4 status.
An extended version of the Cox model22, adapted to interval-censored data, was also used. The p values and confidence intervals (CIs) were calculated using the bootstrap methodology, but no estimated curve could be plotted. This method was used only to confirm the results of the primary method.
We used logistic regression to investigate which cognitive subscores significantly predict the conversion to MCI when the study period was fixed, and also to identify those subscores that are always significant whatever the duration of the study period (details in appendix e-1).
Two hundred forty-one control subjects (at first visit) from the OPTIMA database were eligible for inclusion in this study. During follow-up between 1988 and 2008, 91 controls (37.8%) converted to MCI and their conversion durations were thus interval-censored, whereas the remaining 150 were still control subjects at their last recorded episode and their conversion durations were thus right-censored (table 1). At baseline, the mean (SD) age was 71.85 (8.83) years, years of school education was 11.46 (1.49), years of further education was 2.31 (2.29), plasma total homocysteine was 12.6 (3.9) μmol/L, blood pressure was 153.5 (22.5) mm Hg systolic and 83.8 (11.2) mm Hg diastolic, 53.1% of subjects were male, and 27.4% were ApoE4 positive.
The cognitive scales used in the analysis as potential predictors of the duration of time to convert to MCI were the subscores from the CAMCOG. All subscores at episode 1 (when all subjects were controls) were entered and tested in the model. Table e-1 (appendix e-1) shows the scores at baseline for the whole study cohort. We show below that the only significant predictors of the duration of time to conversion to MCI with all the statistical models used were the CAMCOG expression and learning subscores, and age. The normal range of scores was between 15 and 21 for expression and between 13 and 17 for learning.
The AFT model showed that among all CAMCOG subscores, only expression and learning were significant predictors of the time to conversion to MCI. Among the demographic data, only age at first visit was significant. From the model, the duration of time to convert to MCI was 0.83 (p = 0.0001) times shorter (−17%) when the expression score decreased by 1 point, 0.85 (p = 0.001) times shorter (−15%) when the learning score decreased by 1 point, and 0.86 (p = 0.0001) times shorter (−14%) when age increased by 5 years. To depict these significant effects graphically, we have plotted the model-estimated cumulative distributions (figure 1A). The figure shows the cumulative probabilities of conversion according to 3 combinations of expression and learning scores at baseline. The age at baseline was fixed at 70 years. From the curves, it can be seen that the probability of converting to MCI in less than 5 years was only 7% for the group with the highest scores, whereas it was 4 times greater (28%) for those with the lowest scores. CI curves for these cumulative probabilities of conversion according to 2 combinations of expression and learning are given in figure 1B.
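Because AFT effects act multiplicatively on the time scale, the reported time ratios for several covariates combine by multiplication. The sketch below shows this arithmetic using the point estimates reported above; the function name and the example covariate differences are illustrative, not from the paper.

```python
# Point estimates of the AFT time ratios reported in the text.
TR_EXPRESSION = 0.83   # per 1-point lower expression score
TR_LEARNING = 0.85     # per 1-point lower learning score
TR_AGE = 0.86          # per 5 years older

def time_ratio(expr_points_lower=0, learn_points_lower=0, extra_age_years=0):
    """Combined acceleration factor relative to a reference subject.
    In an AFT model, each covariate effect multiplies the time to the event."""
    return (TR_EXPRESSION ** expr_points_lower
            * TR_LEARNING ** learn_points_lower
            * TR_AGE ** (extra_age_years / 5))

# Hypothetical example: 3 points lower in expression and 2 points lower in
# learning shortens the expected time to conversion to roughly 41% of the
# reference subject's time (0.83**3 * 0.85**2).
print(round(time_ratio(expr_points_lower=3, learn_points_lower=2), 3))
```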
Table 2 gives the estimated cumulative probabilities of conversion in these 3 groups at 2, 4, 6, 8, and 10 years after the initial assessment along with 95% CIs. From this table, a subject aged 70 years with an expression score of 18 and a learning score of 13 has a 38.2% risk of converting to MCI during the next 6 years. In comparison, another subject of the same age but with an expression score of 21 and a learning score of 16 has only an 8.2% risk of converting to MCI during this period.
Furthermore, from the fitted model, the probability of conversion can be estimated during any period of time for any combination of age, expression, and learning at baseline. Table 3 shows these predictions when age was fixed at 70 years and also shows a comparison with the average risk for subjects without dementia. The average probability is estimated from the model using average scores for expression and learning at baseline, but leaving age fixed at 70 years.
An example from table 3, based on the 4-year conversion column, is as follows. On average, control subjects aged 70 years have a 10% risk of converting to MCI during the first 4 years. A subject of the same age but with an expression score of 15 and a learning score of 13 (first row) instead has a 49% risk of converting to MCI during this period; thus, the odds ratio (OR) relative to the average population is 8.65. In contrast, a subject of the same age but with an expression score of 21 and a learning score of 17 (last row) has a 5% risk of converting to MCI within 4 years; thus, the OR relative to the average population is 0.47.
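The odds ratios above follow directly from the tabulated probabilities. A minimal check of that arithmetic, using the 4-year risks quoted in the text:

```python
def odds_ratio(p_subject, p_average):
    """OR = odds of converting for the subject / odds for the average control."""
    return (p_subject / (1 - p_subject)) / (p_average / (1 - p_average))

# 4-year risks from table 3: 49% and 5% vs the 10% average at age 70.
print(round(odds_ratio(0.49, 0.10), 2))  # → 8.65
print(round(odds_ratio(0.05, 0.10), 2))  # → 0.47
```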
To illustrate graphically how well the AFT model fits the observed data, a plot is shown (figure 2) that depicts the observed cumulative proportions of subjects converting to MCI for 2 different groups of subjects: those with expression scores between 20 and 21 (n = 114) vs those with scores between 15 and 19 (n = 127); these represent approximate median splits. The empirical curves in figure 2 are descriptive in the sense that no statistical model has been used to generate them. The curves were calculated using an extended version of the Kaplan-Meier nonparametric estimator to take into account interval-censored data.23 Figure 2 confirms our finding, first, by showing a significant difference in the observed cumulative proportions of subjects converting to MCI between the 2 groups, and second, by showing how well the curves estimated from the AFT model (red lines) fitted the data.
The extended Cox model using bootstrap (10,000 replications) confirmed the results of the AFT model given above. Indeed, the effects of expression, learning, and age on the risk of conversion (or on the hazard) at any time were highly significant. From the Cox model, the instantaneous risk (or the hazard) of conversion to MCI is 1.37-fold higher (p = 0.001) when age increases by 5 years, 1.52-fold higher (p = 0.001) when expression decreases by 1 point, and 1.47-fold higher (p = 0.001) when learning decreases by 1 point. If both the expression and learning scores each decrease by 1 point, the risk of conversion is 2.2-fold higher, i.e., the risks are multiplicative.
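The multiplicative-hazards statement can be checked with the hazard ratios quoted above: under a proportional hazards model, hazard ratios for simultaneous covariate changes combine by multiplication. A minimal sketch using the paper's point estimates:

```python
# Cox model hazard-ratio point estimates reported in the text.
HR_AGE_5Y = 1.37       # per 5 years older
HR_EXPRESSION = 1.52   # per 1-point lower expression score
HR_LEARNING = 1.47     # per 1-point lower learning score

# Both scores 1 point lower: 1.52 * 1.47 ≈ 2.23, reported as 2.2-fold.
combined = HR_EXPRESSION * HR_LEARNING
print(round(combined, 2))
```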
Table e-2 (appendix e-1) shows that for all 9 study periods, between 2 and 12 years, the best logistic model always included expression and learning. Age was also a good predictor once the study period exceeded 4 years, but other covariates, such as perception, ApoE4, and gender, were less consistent predictors. Using the logistic regression approach allowed us to draw receiver operating characteristic curves for the prediction of conversion (figure e-1, appendix e-1); these had areas under the curve ranging from 85% to 90%, showing a high degree of accuracy in the predictions.
Overall, the results for age, expression, and learning were not significantly changed when additional covariates (systolic blood pressure, education, stroke, homocysteine, ApoE4) were included in the models, except that now the presence of ApoE4 exerted a negative effect, i.e., carriers of this allele converted more rapidly than noncarriers (table e-3, appendix e-1).
Thus, the 3 different statistical approaches each show that the 2 subscores, expression and learning, are robust for predicting the conversion to MCI. We prefer to use survival analysis rather than logistic regression as used previously,6 first, because the study period does not need to be fixed; second, because it allows the use of the right- and interval-censored data, which is not the case for logistic regression; and finally, because survival analysis answers some additional questions and most importantly directly models the duration of the period before conversion.
This study showed significant effects of a combination of the CAMCOG expression and learning subscores and age on the duration of time to convert to MCI in elderly subjects who seem to have normal cognitive function at the time of assessment. The period before conversion shortened with increasing age as well as with lower baseline cognitive scores, singly or combined. These effects were independent of other factors known to increase the risk of developing cognitive impairment, such as low education, stroke, blood pressure, and homocysteine, none of which influenced the duration of time to cognitive impairment.
In survival analysis, 2 families of models are broadly used in the statistical literature. The first family is the accelerated failure time models. In this family, the duration of time to conversion to MCI (the “event” in our study) is directly modeled as a function of the potential predictors or covariates. The effect of a given predictor, if found significant, either shortens or lengthens the duration of time to conversion. The second family is that of proportional hazard models usually known as the Cox proportional hazard model when the baseline hazard function is nonparametrically specified. In this family, the hazard function (or the risk of conversion) at any time is modeled as a function of the potential predictors or covariates. The effect of a given predictor, if found significant, then increases or decreases the risk of conversion. Usually in practice, these 2 models need not confirm each other because they are based on different assumptions. In our study, we used both families of models by adapting the second one (the Cox model) to interval-censored data. We found that these 2 independent models both produced the same result, showing significant effects of the same predictors on the period of conversion from control to MCI status in our cohort. This confirmation makes our results robust and consistent.
Although memory impairment was the hallmark of the first criteria for aMCI24 and is also the recommended domain to measure for assessing predementia AD,25 we found that “expression” or language ability was a stronger predictor of duration to conversion to aMCI than “learning” or memory. The importance of expression as a predictor was shown not only in the survival analyses, but also using logistic regression to study fixed time periods.
The CAMCOG expression subtest comprises tasks of verbal fluency, comprehension, spoken language descriptions, and definitions (appendix e-1). It has been recognized previously that the early stages of dementia are associated with linguistic problems, such as word-finding difficulties. Evidence from the Nun Study16,26 on archived autobiographical essays at the time of entry into the order showed that nuns who developed MCI or AD decades later composed simpler narratives (in terms of both content and grammar) than those who died with normal brains. It has also been reported that linguistic changes analogous to those seen in established AD emerged in the later writings of a novelist some time before the clinical signs of AD.27 These studies document some form of change in spontaneously produced language either early in or before the diagnosis of a neurodegenerative disorder and, as in the Nun Study, long before any awareness of cognitive decline.
The present findings have implications in the clinical context. Increasing awareness of the significance of memory impairment has resulted in an increase in the number of referrals of elderly subjects for assessment in memory clinics, which will probably increase with the emphasis on early diagnosis that is a major part of the National Dementia Strategy currently being implemented in England. In many instances, a diagnosis of MCI, AD, or another dementia will be made, but many others referred will seem to have normal cognitive function on existing routine testing paradigms, despite their or their families’ perception of a change. Our approach could have a valuable role in guiding those conducting the assessment, e.g., about relative risk for medical management planning, especially in conjunction with levels of biomarkers if these become established, and what information to give their patient and his or her carers. It may also have potential as an outcome measure in clinical trials of treatment to reduce the rate of disease progression, when therapeutic strategies designed for use early in the disease become available, or to enrich trial cohorts with subjects who have a higher risk of conversion in a shorter time span.
Address correspondence and reprint requests to Prof. A. David Smith, OPTIMA, Level 4, John Radcliffe Hospital, Oxford OX3 9DU, UK firstname.lastname@example.org
Supplemental data at www.neurology.org
Editorial, page 1432
e-Pub ahead of print at www.neurology.org.
During the course of the study, the principal grant support for OPTIMA came from Bristol-Myers Squibb, Merck & Co. Inc., Medical Research Council, Charles Wolfson Charitable Trust, Alzheimer's Research Trust, and Norman Collisson Foundation. Dr. Wilcock was partly supported by the NIHR Biomedical Research Centre, Oxford.
Disclosure: The authors report no disclosures.
Received April 23, 2009. Accepted in final form August 4, 2009.