The ε4 allele of the apolipoprotein E (ApoE) gene is associated with alterations in brain function and is a risk factor for Alzheimer’s disease (AD). Changes in components of visuospatial attention with ApoE-4, aging, and AD are described. Healthy middle-aged adults without dementia who have the ApoE-4 gene show deficits in spatial attention and working memory that are qualitatively similar to those seen in clinically diagnosed AD patients. The findings support an association between ApoE polymorphism and specific components of visuospatial attention. Molecular mechanisms that may mediate the ApoE–attention link by modulating cholinergic neurotransmission to the posterior parietal cortex are discussed. Studies of attention and brain function in ApoE-4 carriers without dementia can advance knowledge of the genetics of visual attention, may enhance understanding of the preclinical phase of AD, and may lead to better methods for early AD detection.
Apolipoprotein E (ApoE) is a glycoprotein that is important in lipid storage, transport, and metabolism (Mahley, 1988). Although synthesized mainly in the liver, ApoE is also found in the peripheral and central nervous systems, including the brain. ApoE has long been of interest in medicine, but its importance in neuroscience increased dramatically with the identification of the ε4 allele of the ApoE gene on chromosome 19 as a major risk factor for the development of late-onset Alzheimer’s disease (AD) in older adults (Saunders et al., 1993; Strittmatter et al., 1993). This discovery led to a growing number of studies examining the role of the ApoE gene in normal brain function and cognition, as well as in disorders such as AD, brain injury, and stroke (Higgins, Large, Rupniak, & Barnes, 1997; Horsburgh, McCarron, White, & Nicol, 2000; J. D. Smith, 2000).
Polymorphisms of the ApoE gene are associated with significant alterations in brain morphology (Plassman et al., 1997) and cognitive functioning, including attention (Greenwood, Sunderland, Friz, & Parasuraman, 2000) and memory (Bondi et al., 1995). Studies of ApoE may thus reveal information relevant to the genetics of attention and memory in normal individuals. At the same time, such studies may identify cognitive and neural changes characteristic of preclinical stages of AD. In this article, we review the role of the ApoE gene in normal cognition and in the development of deficits indicative of early AD.
Currently, no reliable methods exist for the early detection and treatment of AD. New techniques for preventing, slowing the progression of, and treating AD are being urgently sought. Such efforts would be aided considerably if AD could be detected prior to clinical diagnosis and before irreversible brain changes occur (Daffner & Scinto, 2000). Postmortem studies show that neuropathological changes occur decades before the onset of clinical symptoms of AD (Braak & Braak, 1991). Studies using neuroimaging and neuropsychological tests in AD patients with mild dementia have also described the functional changes found in the early stages of AD (R. G. M. Morris, 1996; Nebes, 1992; Parasuraman & Nestor, 1993; Perry & Hodges, 1999; Schwartz, 1990). Nonetheless, studies conducted with clinically diagnosed, mild AD patients, although extremely informative, face a fundamental problem with respect to the issue of early diagnosis: The criteria for the clinical diagnosis of AD, first proposed in 1984 and still used today, require a deficit in at least one (“possible” AD) or two (“probable” AD) areas of cognition (McKhann, Drachman, & Folstein, 1984). Therefore, the precursors of cognitive impairment in AD cannot be examined by using such participants, even those having only mild dementia.
An alternative approach is to examine cognition and brain function in individuals who do not have dementia but are at risk for developing AD. Functional changes in such at-risk individuals, if found, might be indicative of the development of AD. Several genetic risk factors for AD have been identified. Three genes with autosomal dominant inheritance are associated with early-onset AD with almost complete penetrance: presenilin 1 on chromosome 14 (Schellenberg et al., 1992), presenilin 2 on chromosome 1 (Levy-Lahad et al., 1995), and amyloid-β precursor protein (APP) on chromosome 21 (Tanzi et al., 1987). However, these forms of AD are rare, accounting for only about 2% to 5% of cases, in comparison with the more common late-onset form. Polymorphisms of various other candidate genes have been examined as risk factors for late-onset AD (Bertram et al., 2000; Blacker et al., 1998; Ertekin-Taner et al., 2000; Myers et al., 2000). The strongest evidence to date involves the ε4 allele of the ApoE gene (Saunders et al., 1993; Strittmatter et al., 1993). Consequently, one strategy to examine the precursors of AD is to investigate changes in cognition and brain function in individuals without dementia who have the ApoE-4 genotype.
Memory impairment is thought to be a hallmark of the cognitive decline seen in AD (Albert, 1998; Becker, 1988; J. C. Morris, 1996; Nebes, 1989, 1992; Parasuraman & Martin, 1994). Yet it is now well established that significant attentional deficits also occur in the early phases of AD (Parasuraman & Haxby, 1993). Although the existence of attentional deficits in AD, even in the early stages of the disease, has been known for more than 15 years (Baddeley, Della Sala, Logie, & Spinnler, 1986; Nebes, Martin, & Horn, 1984; Parasuraman & Nestor, 1986), AD is still primarily viewed as an amnestic disease. As we discuss further in this article, however, there are close interrelationships between attention and working memory (e.g., Awh & Jonides, 1998; Cowan, 1995), and AD can also justifiably be viewed as an attentional disorder. Attentional changes might therefore be detectable in the preclinical phase of AD with appropriately sensitive tests.
We review studies of attention in adults without dementia who have the ApoE-4 gene and in AD patients, examining specific aspects of visual selective attention that provide sensitive assays of early dysfunction in AD. A brief account of changes in brain morphology, brain metabolism, and general cognition in healthy individuals genotyped for ApoE is presented first to set the stage for the subsequent description of changes in visual attention. We then discuss these attentional changes in relation to the ApoE gene and brain function and examine the role of cholinergic neurotransmission in mediating the link between ApoE genotype and attention.
On autopsy, the brains of AD patients show extensive neuronal loss, amyloid deposits (plaques), and neurofibrillary tangles (NFTs) in the entorhinal cortex and hippocampus and in the temporal and parietal cortices (Arriagada, Marzloff, & Hyman, 1992; Braak & Braak, 1991; Hyman, Van Hoesen, & Damasio, 1984; Kemper, 1994; Mann, 1997). These pathological changes occur early in the development of the disease. On the basis of an extensive analysis of a large sample (N > 3,500) of postmortem brains, Braak and Braak showed that NFTs appear in an orderly manner, being seen first in the entorhinal cortex in nondemented individuals and, subsequently, in the hippocampus before spreading to the superior temporal and orbitofrontal cortex and eventually to all of the association cortex. Do such changes also occur, even if diminished in magnitude, in the presymptomatic and preclinical phases of AD? Antemortem studies indicate that neuropathological signs of early AD can occur in individuals with no observable cognitive deficits at the time of testing (Crystal et al., 1988; Katzman et al., 1988; J. C. Morris et al., 1996). Middle-aged adults (and even some individuals in their 30s) without any symptoms of dementia may have NFTs and plaques (Braak & Braak, 1991). Prospective studies indicate that subtle cognitive symptoms may occur years before AD can be clinically diagnosed (G. W. Small et al., 1997). A recent study also reported that neuropsychological measures of memory showed longitudinal changes that predicted the subsequent clinical diagnosis of AD 1.5 years before symptom onset and diagnosis (Chen et al., 2001). Figure 1 shows a theoretical time line for the development of AD, starting with the presymptomatic phase, progressing through the preclinical stage, and, when the “AD pathological burden” exceeds a threshold, continuing to the clinical stages of dementia (Daffner & Scinto, 2000).
The challenge is to identify the specific biological and cognitive features associated with each of these stages. Studies of cognitive and brain function in individuals without dementia whose ApoE genotype is known may be of considerable help in this regard.
The ApoE gene is inherited as one of three alleles—ε2, ε3, and ε4—with mean frequencies in the general population of about 8%, 78%, and 14%, respectively (Wilson et al., 1991). Inheritance of one ε4 allele is associated with increased risk of late-onset AD in older adults. If two ε4 alleles are inherited, the risk is increased further, pointing to a gene dose effect of ApoE ε4 (Corder et al., 1993; see also Henderson et al., 1995). Possession of the ε4 allele also lowers the age of onset of AD (Meyer et al., 1998). In contrast, the ε2 allele appears to confer a protective effect with regard to AD risk (Corder et al., 1994; Farrer et al., 1997; Lippa et al., 1997). These findings have converged to indicate that ApoE genotype is a powerful risk factor for AD. However, ApoE-4 is just that, a risk factor: It is neither necessary nor sufficient for the development of AD (Henderson et al., 1995; Hyman et al., 1996). Only about half of ApoE-4 homozygotes develop AD by age 90 (Henderson et al., 1995), and only about 60% of AD patients are ε4 carriers (Mayeux et al., 1998). When pitted against the “gold standard” of autopsy-based neuropathological diagnosis, therefore, clinical diagnosis has higher sensitivity (93%) than does ApoE-4 carrier status alone in detecting AD (65%; Mayeux et al., 1998). Thus, any cognitive and neural changes associated with ApoE-4 in individuals without dementia may be less reliably linked to AD than are similarly measured changes in clinically diagnosed AD patients. Nevertheless, a meta-analysis of more than 15,000 cases confirmed the importance of ApoE as a major susceptibility gene for AD at all ages (including as young as 40) and in all ethnic groups (Farrer et al., 1997). Studies of cognitive function in individuals without dementia who have the ApoE-4 gene may therefore provide important clues to the precursors of AD.
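As an illustration of why ε4 homozygotes are comparatively rare in such studies, the allele frequencies cited above can be converted into expected genotype frequencies. The sketch below assumes Hardy-Weinberg equilibrium, an assumption introduced here purely for illustration and not made in the text:

```python
# Illustrative only: expected ApoE genotype frequencies under an
# assumed Hardy-Weinberg equilibrium (an assumption added here, not
# made in the article), using the allele frequencies from Wilson
# et al. (1991): e2 = 8%, e3 = 78%, e4 = 14%.
from itertools import combinations_with_replacement

allele_freq = {"e2": 0.08, "e3": 0.78, "e4": 0.14}

def genotype_frequencies(freqs):
    """Expected genotype frequencies: p^2 for homozygotes, 2pq for heterozygotes."""
    genotypes = {}
    for a, b in combinations_with_replacement(sorted(freqs), 2):
        p, q = freqs[a], freqs[b]
        genotypes[(a, b)] = p * p if a == b else 2 * p * q
    return genotypes

g = genotype_frequencies(allele_freq)
e4_homozygotes = g[("e4", "e4")]                           # ~2% of the population
e4_carriers = sum(f for gt, f in g.items() if "e4" in gt)  # ~26% carry at least one e4

print(f"e4/e4: {e4_homozygotes:.3f}, any e4: {e4_carriers:.3f}")
```

Under this assumption only about 2% of the population would be ε4 homozygotes, which is consistent with the small homozygote samples (e.g., n = 11) in the imaging studies discussed below.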
Disease progression in preclinical AD eventually leads to neuronal and synaptic degradation that is manifest as brain morphological loss. Structural imaging studies using magnetic resonance imaging (MRI) have shown that hippocampal volume reduction is consistently found in clinically diagnosed AD patients (Convit et al., 1997; Kesslak, Nalcioglu, & Cotman, 1991; Killiany, Moss, Albert, & Tamas, 1993). Hippocampal atrophy in older persons with mild cognitive impairment (MCI) is associated with subsequent development of clinical AD (de Leon, Golomb, & George, 1993; Killiany et al., 2000). Similar but less marked changes occur in healthy adults with the ApoE ε4 allele. A study of a small sample of twins concordant for ApoE genotype (mean age = 63 years) found that individuals possessing one ε4 allele (n = 6) had smaller hippocampi than those lacking an ε4 allele (n = 14), despite similar performance levels on standardized neuropsychological tests (Plassman et al., 1997). In a slightly larger and somewhat younger (50–62 years) sample of ε4 homozygotes (n = 11) and non-ε4 carriers (n = 22), Reiman et al. (1998) found a nonsignificant trend toward smaller hippocampal volume in the ε4 group.
Positron emission tomography (PET) studies of clinically diagnosed AD patients have shown that there are reductions in cerebral metabolism and blood flow in temporal and parietal cortices (Alexander et al., 1997; Foster, Chase, & Fedio, 1983; Haxby, Duara, Grady, Cutler, & Rapoport, 1985; Ibanez et al., 1998; Rossor, Kennedy, & Frackowiak, 1996). Similar PET metabolic changes are seen in individuals without dementia who have the ApoE-4 gene (see Rapoport, 2000). G. W. Small et al. (1995) tested middle-aged (mean age = 56 years) individuals with MCI and with a positive family history of AD. The overall level of parietal metabolism was lower and hemispheric asymmetry was greater in ApoE-4 carriers than in non-ε4 carriers. In a follow-up study, G. W. Small et al. (2000) confirmed these findings in another sample of participants who had MCI but did not qualify for a diagnosis of dementia. In another study of individuals without dementia who were cognitively normal on neuropsychological evaluation, Reiman et al. (1996) found that ApoE-4 homozygotes showed reduced rates of glucose metabolism in parietal, temporal, prefrontal, and posterior cingulate regions.
Two studies using functional MRI (fMRI) have been reported. C. D. Smith et al. (2000) carried out an fMRI study of object and face recognition in normal individuals. They reported reduced activation in ApoE-4 carriers in bilateral inferotemporal areas that typically subserve visual object processing. In contrast, in another fMRI study, Bookheimer et al. (2000) found greater and more widespread activation of the left hippocampal, prefrontal, and parietal regions during a memory task in individuals with the ε4 allele than in those with the ε3 allele. A pattern of reduced activation in object processing areas and increased activation in prefrontal cortex is also seen in AD patients and could be indicative of a compensatory response (Grady, Haxby, Horwitz, & Rapoport, 1993; Grady & Parasuraman, 1995). Bookheimer et al. interpreted the increased level and volume of activation in the ApoE-4 group as a compensatory response in which additional brain regions are recruited to perform a cognitive operation.
These neuroimaging studies have yielded generally consistent results. Individuals without dementia who have the ApoE-4 gene show reductions in the level and pattern of regional cerebral glucose metabolism that are qualitatively similar to those in clinically diagnosed AD patients. It is noteworthy that these results have been obtained in relatively young samples (50–65 years), which increases confidence that they are not confounded with age-related changes. It would be preferable if the changes could be demonstrated in individuals who were completely cognitively normal (i.e., did not have MCI, as in the study by G. W. Small et al., 1995); Reiman et al.’s (1996) sample, however, was reportedly free of any neuropsychological deficits. Furthermore, the apparently greater sensitivity of PET and fMRI to the ApoE-4 genotype compared with structural MRI suggests that functional evaluation of brain activity may be particularly useful. In terms of the stage model of Daffner and Scinto (2000) shown in Figure 1, PET and fMRI may provide assessment of the preclinical stage, whereas structural MRI of the hippocampus and standard neuropsychological testing of memory may index later stages.
Clinical neuropsychological studies of individuals genotyped for ApoE began fairly soon after the identification of the ApoE-4 gene as a risk factor for AD (Saunders et al., 1993). Reed, Carmelli, and Swan (1994) examined performance on several standard neuropsychological tests in 20 dizygotic twins without dementia who were discordant for ApoE-4 (mean age = 63 years) and found lower scores in ApoE-4 carriers than in those without an ε4 allele. Bondi et al. (1995) later examined more specifically whether episodic memory, as measured by a verbal learning test, was linked to ApoE genotype in a sample of older adults without dementia. They found that memory scores were lower in the presence of the ε4 allele than in its absence. A number of other studies have found altered neuropsychological test performance in nondemented carriers of the ApoE-4 gene (Berr et al., 1996; Bondi, Salmon, Galasko, & Thomas, 1999; Caselli et al., 1999; Feskens, Havekes, & Kalmijn, 1994; Helkala et al., 1995; Jonker, Schmand, Lindeboom, Havekes, & Launer, 1998). Flory, Manuck, Ferrell, Ryan, and Muldoon (2000) extended these findings to middle-aged adults by showing that ApoE-4 carriers with a mean age of 46 had lower verbal memory performance than those with the ApoE-2 or ApoE-3 alleles. Also consistent with these results linking ApoE and cognition is the finding that the absence of the ApoE ε4 allele is associated with higher levels of cognitive functioning in very old (75–98 years) adults (Riley et al., 2000).
However, some studies have reported no differences in neuropsychological test performance as a function of ApoE genotype (Plassman et al., 1997; Reiman et al., 1996; B. J. Small et al., 2000; G. E. Smith et al., 1998). In a large community sample (N = 1,750), Yaffe, Cauley, Sands, and Browner (1997) found that, of an extensive neuropsychological battery, only a single test of attention, the Trails B test, distinguished between ApoE-4 carriers and non-ε4 carriers (see also Chen et al., 2001).
These neuropsychological studies of ApoE-4 are less consistent than the neuroimaging studies reviewed earlier. The studies have differed in a number of ways (e.g., in sample age, sample size, and the tests administered), which may explain the variability in the results. Another possibility is that examination of the specific information-processing components underlying attention and memory may provide more sensitive assessment of the effects of the ApoE gene on cognition. We now turn to these studies.
Studies of attention in AD patients were extensively reviewed by Parasuraman and Haxby (1993). Research reported since this time has also been the subject of periodic reviews (R. G. M. Morris, 1996; Parasuraman, in press; Parasuraman & Greenwood, 1998; Perry & Hodges, 1999), as have studies of attention in healthy aging (Greenwood & Parasuraman, 1997; McDowd & Shaw, 2000). Accordingly, we do not revisit this burgeoning literature but briefly survey the major results before focusing on two aspects of attentional function that have been shown to be sensitive to the ApoE gene and AD.
Although there is no completely agreed-upon taxonomy of attention, a good case can be made for the relative independence of at least three components: selection, vigilance, and executive control (Parasuraman, 1998; Parasuraman & Davies, 1984; Posner & Boies, 1971). Selection refers to the preferential processing of particular stimuli that are relevant to an organism’s current goal; vigilance ensures that processing is maintained over time so that the goal can be achieved; and executive control allows for the time sharing and coordination of these processing activities with other goal-directed activities (Parasuraman, 1998). Attentional functioning in early AD can be viewed from the perspective of these three categories. Selective attention is markedly impaired in mild AD, as reflected in deficits in covert attention (Parasuraman, Greenwood, Haxby, & Grady, 1992) and visual search tasks (Foster, Behrmann, & Stuss, 1999; Parasuraman, Greenwood, & Alexander, 1995). Executive control is also impaired, as revealed by deficits in divided attention (Baddeley et al., 1986; Nestor, Parasuraman, Haxby, & Grady, 1991), the Stroop task (Spieler, Balota, & Faust, 1996), and other tasks requiring changes in attentional set (Albert, 1998; Collette, van der Linden, & Salmon, 1999; R. G. M. Morris, 1996). In contrast, arousal and vigilance decrement—the decrease in target detection performance with time on task—are minimally affected in mild AD (Johannsen, Jakobsen, Bruhn, & Gjedde, 1999). (The exception is when the target for vigilance involves a memory load [Parasuraman, 1979], in which case AD patients do show impairment [Baddeley, Cocchini, Della Sala, Logie, & Spinnler, 1999].) Furthermore, attentional functions are not uniformly impaired within these domains of attention. Rather, Parasuraman and Haxby (1993) concluded that in early AD, some component attentional operations are impaired in efficiency while others are preserved.
Moreover, this profile is qualitatively and quantitatively different from that associated with healthy aging (Greenwood & Parasuraman, 1997; McDowd & Shaw, 2000).
Component operations underlying selective attention, and in particular spatial attention, provide sensitive markers of early attentional dysfunction in AD (Parasuraman & Greenwood, 1998). We now review the evidence indicating that two specific aspects of spatial selective attention provide sensitive behavioral assays of attentional dysfunction in early AD: covert attention shifting and dynamic changes in the spatial scale of attention.
Selective attention acts on multiple representations in the brain (Pashler, 1998; Posner & Petersen, 1990; Treisman, 1996). Selection may be based on location, stimulus features such as color or spatial frequency, or groupings of stimulus features that form an object—so-called object-based selection (Duncan, 1984). Nevertheless, there is strong evidence for the primacy of location-based, or spatial, selection (Cave & Pashler, 1995). Typically, people move their eyes to a particular location to select a particular object that is of current interest. This allows an object at that location to be foveated and thus accurately perceived. Eye movements provide the principal means of spatial selection in many everyday tasks such as reading, driving, navigation, and search. The type, number, frequency, order, and randomness (or entropy) of the areas fixated in sequential eye movements can indicate selection efficiency (e.g., Hilburn, Jorna, Byrne, & Parasuraman, 1997; Zelinsky, Rao, Hayhoe, & Ballard, 1997).
Saccadic eye movements are a somewhat inefficient method of selection: It takes about 200 ms to move one’s eyes, during which time vision is suppressed. If one had to search for an object among 10 distractors by using only saccades, at least 1 s would elapse if, on average, five locations were fixated in turn before the object was found (5 × 200 ms = 1 s). People can generally search for targets among distractors at a much quicker rate than this. Studies of visual search also indicate that when free to move their eyes, people typically make few and sometimes no saccades, even when the search array may be exposed for as long as 3 s (Previc, 1996). Therefore, another spatial selection mechanism must function when the eyes are fixated (Briand & Klein, 1987; Koch & Ullman, 1985; Treisman & Gelade, 1980).
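The arithmetic in this paragraph can be made explicit. In the sketch below, the 200-ms saccade duration and the five fixations come from the text; the 50-ms covert scan rate is a hypothetical value chosen only to illustrate why covert shifts would permit faster search:

```python
# Serial search timing: overt saccades vs. covert attention shifts.
# SACCADE_MS is the text's figure; COVERT_MS is a hypothetical
# illustrative rate, not a value from this article.
SACCADE_MS = 200  # approximate cost of one saccade, vision suppressed meanwhile
COVERT_MS = 50    # assumed per-item covert scan time (illustration only)

def search_time_ms(locations_inspected, per_shift_ms):
    """Total time if locations are inspected serially at a fixed rate per shift."""
    return locations_inspected * per_shift_ms

overt_ms = search_time_ms(5, SACCADE_MS)   # the text's 5 x 200 ms = 1 s
covert_ms = search_time_ms(5, COVERT_MS)   # 4x faster under the assumed rate
print(overt_ms, covert_ms)
```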
It has been known for more than a century that attention can be allocated to a location other than where the eyes are fixated. Posner (1980) developed a location-cuing task to study this mechanism of covert attention. In the covert attention task, participants maintain their gaze at the center of a display while a cue directs them to attend to a given location in the periphery. The cue speeds reaction time (RT) to a target at that location, compared with an uninformative (neutral) or incorrect (invalid) cue. Several studies have shown that cues enhance sensory processing at the attended location, as reflected in benefits in accuracy or in enhancement of early latency event-related brain potential (ERP) components elicited by a stimulus at the attended location (Hawkins et al., 1990; Luck et al., 1994; Mangun, 1995). In contrast to such valid location cues, invalid cues that direct attention to another location result in costs in RT (selective slowing) or accuracy (reduced d′), presumably because of the need to “disengage” or shift attention away from the incorrect to the correct location.
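The validity effects described above are conventionally summarized as an RT benefit (neutral minus valid) and an RT cost (invalid minus neutral), with the cost taken as the index of disengagement. A minimal sketch of this computation, using fabricated RTs purely for illustration (none of the values below come from the studies discussed):

```python
# Computing cue-validity effects in a Posner location-cuing task.
# The per-trial RTs below are fabricated example data, not results
# from any study cited in this article.
from statistics import mean

rts_ms = {
    "valid":   [410, 425, 398, 432],   # cue correctly indicated target location
    "neutral": [455, 470, 448, 462],   # uninformative cue
    "invalid": [520, 545, 505, 530],   # cue pointed to the wrong location
}

means = {cond: mean(vals) for cond, vals in rts_ms.items()}
benefit = means["neutral"] - means["valid"]   # speeding from a correct cue
cost = means["invalid"] - means["neutral"]    # slowing from an incorrect cue;
                                              # the measure elevated in early AD

print(f"benefit = {benefit:.1f} ms, cost = {cost:.1f} ms")
```

Separating benefit from cost matters here because, as described below, early AD patients show largely normal benefits but disproportionately large costs.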
The covert attention-shifting task has been widely used in studies with clinical populations, including AD (Buck, Black, Behrmann, Caldwell, & Bronskill, 1997; Oken, Kishiyama, Kaye, & Howieson, 1994). Spatial attention is associated with activation of a distributed network of brain regions including the parietal cortex, pulvinar, and superior colliculus. This network is involved in shifts of covert (Corbetta, Kincade, Ollinger, McAvoy, & Shulman, 2000; Corbetta, Miezin, Shulman, & Petersen, 1993; LaBar, Gitelman, Parrish, & Mesulam, 1999; Mesulam, 1981; Nobre et al., 1997; Posner, Walker, Friedrich, & Rafal, 1984) and overt (Anderson et al., 1994; Corbetta et al., 1998) attention (see also Posner & Dehaene, 1994). PET studies have shown that major components of this network (e.g., the posterior parietal lobe) are hypometabolic in the early stages of AD (Haxby et al., 1985, 1986). This suggests that spatial attention shifting should be impaired in early AD. This hypothesis was tested in an early study of covert attention in AD by Parasuraman et al. (1992).
AD participants with mild dementia and age-matched controls were tested on a cued letter discrimination task. Participants were required to discriminate between different letters presented in either the left or the right visual field while fixating on a central point. A discrimination task was used to examine the influence of spatial attention at a level higher than simple energy detection. However, to facilitate comparisons with other neuropsychological studies, which have generally used only the luminance detection task originally designed by Posner (1980), a letter detection task requiring a simple RT response was also used. The cue (an arrow) was correct (valid cue), incorrect (invalid cue), or uninformative (neutral cue) regarding the location of the target and was presented either centrally or peripherally. For both the detection and the discrimination tasks, the AD group, like the controls, was faster to respond to a target when the cue was valid (RT benefit) than when it was neutral or invalid, indicating that the ability to focus attention on the target is not substantially compromised in AD. In contrast, for both peripheral and central cues, the AD group had longer RTs to targets when the location cue was invalid (RT cost), pointing to an attention-shifting, or disengagement (Posner et al., 1984), deficit in early AD (see Figure 2). The deficit was significant only for the discrimination task and not for the detection task, suggesting that the greater focal attention demands of discrimination exert a top-down effect on attention shifting that is particularly sensitive to a dementing disease.
Following the original report by Parasuraman et al. (1992), several studies have confirmed that spatial attention shifting from an incorrectly cued location is deficient in early AD. Oken et al. (1994) used a location-cuing task in which AD and control participants had to discriminate between a circle and a square presented to the left or right visual field. The AD group had disproportionately longer invalid cue RTs compared with controls. A number of other studies have also found evidence in AD participants for increased RT costs associated with invalid cues in covert attention-shifting tasks (Buck et al., 1997; Danckert, Maruff, Crowe, & Currie, 1998; Johnson, Mapstone, Hays, & Weintraub, 1999; Maruff & Currie, 1995; Parasuraman, Greenwood, & Alexander, 2000a; see also Faust & Balota, 1997). The attention-shifting deficit therefore appears to be a reliable indicator of early attentional dysfunction and is consistent with the effects of AD on the metabolic integrity of the parietal lobe. Using PET, Parasuraman et al. (1992) found that the attention-shifting deficit in these AD patients was correlated with the degree of hypometabolism of the right posterior parietal lobe, a finding replicated using single photon emission computed tomography (SPECT) by Buck et al. (1997). A recent longitudinal study also confirmed that the deficit not only persists on repeat testing but also increases with the progression of dementia over time (Parasuraman, Greenwood, & Alexander, 2000b). Figure 3 shows the RT costs for shifting from an invalid location for different values of stimulus onset asynchrony (SOA) for two testing periods separated by about 1 year. As Figure 3 indicates, RT costs increased longitudinally, but especially so at long SOAs.
At the same time, some studies have reported that RTs in covert attention tasks do not differ significantly between AD patients and controls (Caffarra, Riggio, Malvezzi, Scaglioni, & Freedman, 1997; Maruff, Malone, & Currie, 1995; Wright, Geffen, & Geffen, 1997), although Maruff et al. (1995) reported that cue-validity effects in their AD group were greater for right but not for left visual field targets. The discrepancy may reflect such factors as task sensitivity, small sample sizes, and dementia severity. Each of these studies used a simple detection task instead of the discrimination task used by Parasuraman et al. (1992), who also found that AD patients did not differ from age-matched controls in a detection task. Simple detection of a target in an otherwise empty field imposes only minimal demands on focal attention, compared with a discrimination or search task in which distractors are present (Pashler, 1998). That sample size and task sensitivity are probable contributory factors is supported by a closer examination of the study by Caffarra et al. Despite their AD group showing mean RT costs (80 ms) that were twice as high as those of controls (38 ms), the difference was not significant for the small sample of patients Caffarra et al. tested (N = 7). Visual discrimination tasks may therefore provide more sensitive assessment of attentional shifting in AD than do detection tasks.
Slowing of attentional shifting from an invalid location is not specific to AD but can occur with any disorder that affects the integrity of the posterior parietal lobe (Posner et al., 1984). Nevertheless, this attentional-shifting deficit in AD can be distinguished from other conditions. For example, the changes in spatial attention shifting in AD differ both qualitatively and quantitatively from those associated with healthy adult aging (Greenwood, Parasuraman, & Haxby, 1993). There is only modest slowing of voluntary attention shifting (driven by central, symbolic location cues) with healthy aging up to about 75 years, whereas reflexive shifting (with peripheral location cues) is unaffected (Greenwood et al., 1993; Hartley, Kieley, & Slabach, 1990). However, both forms of attention shifting are impaired in old–old individuals 75 years and older (Greenwood & Parasuraman, 1994). Thus, advanced age (over 75) and early-stage AD have qualitatively (but not quantitatively) similar effects on attention shifting. This may reflect the greater likelihood that some adults of advanced age are in a preclinical stage of AD compared with young–old persons (Sliwinski, Lipton, Buschke, & Stewart, 1996).
The attentional disengagement deficit in AD can also be qualitatively distinguished from spatial attention deficits in other neurodegenerative disorders, such as Huntington’s disease (HD) and Parkinson’s disease (PD). In contrast to the increased RT to invalid cues of AD patients, PD participants show reduced RTs to invalid cues (Wright, Burns, Geffen, & Geffen, 1990). This suggests that whereas AD patients exhibit an attention-shifting deficit, PD patients have a deficit in the maintenance of attention, leading to abnormally fast disengagement. This differential pattern of attention-shifting deficits between AD and PD patients is also found for attention shifts between different levels of a compound stimulus, as opposed to shifts between spatial locations. Filoteo et al. (1992) administered a version of the global–local task (Navon, 1977), in which a large object (global level) is made up of smaller objects (local level) and attention has to be focused on either the global or the local level. AD patients were abnormally slowed when attention had to be switched from the global to the local level, or vice versa. This finding is consistent with increased slowing in shifting from an invalidly cued location in AD patients. In contrast, Filoteo et al. (1994) found that PD patients were abnormally fast in shifting between levels in the global–local task, which is also consistent with the abnormally fast disengagement on the covert attention task found by Wright et al. There is thus a remarkable consistency in the pattern of results for two different tasks involving shifts of attention: the location-cued covert attention task and the global–local task. Whereas AD patients show an attentional-shifting deficit in both tasks, PD patients show a deficit in the maintenance of attention across trials in both tasks (see Filoteo et al., 1995, for a review). This consistent behavioral dissociation between AD and PD parallels the disparate pathologies and neurochemical deficits characterizing the two diseases.
Finally, the deficit in covert attention shifting in AD patients is also reflected in deficits in overt shifts of attention (Daffner, Scinto, Weintraub, & Mesulam, 1992; Rosler et al., 2000; Scinto, Daffner, Castro, & Mesulam, 1994). This is not surprising given that the parietal cortical areas mediating shifts of covert attention and eye movements overlap (Anderson et al., 1994; Corbetta et al., 1998) and that cognitive studies have also shown links between covert and overt attention (Hoffman, 1998; Klein, Kingstone, & Pontefract, 1992). Scinto et al. found that individuals with AD were less accurate and slower in shifting their gaze between a central fixation point and a target dot presented in sequence at peripheral locations. A major contributor to error was perseverative fixation of the center point or one of the peripheral targets. Scinto et al. suggested that perseveration of gaze may be associated with slowed disengagement of covert attention in AD. Using fMRI with a visually guided saccade task, Thulborn, Martin, and Voyvodic (2000) found reduced right parietal activation in AD patients, consistent with the PET and SPECT findings of right parietal hypometabolism associated with covert attention (Buck et al., 1997; Parasuraman et al., 1992). Individuals with AD are also impaired in making antisaccades, that is, eye movements in a direction opposite to that of a peripheral stimulus with sudden onset (Fletcher & Sharpe, 1988). This represents an inhibitory failure of overt spatial attention that is also found with covert attention (Maruff & Currie, 1995).
In summary, attentional shifting in AD, as reflected in increased RT to an invalid spatial location cue, (a) is deficient in early AD patients compared with age-matched controls; (b) increases with progression of dementia; (c) is correlated with hypometabolism of the right parietal lobe; (d) differs qualitatively from spatial attention changes associated with healthy aging up to 75 years, as well as other neurodegenerative disorders such as PD; (e) differs quantitatively but not qualitatively from spatial attention changes in the old–old (over 75 years); and (f) is accompanied by abnormal patterns of overt attention shifts (eye movements).
In location-cuing studies, participants attend to a single target at a cued location in an otherwise empty visual field. This simple covert attention task has the advantage that it can be performed by monkeys and hence related to neurophysiological (Robinson, Goldberg, & Kertzman, 1995) and pharmacological studies (Witte, Davidson, & Marrocco, 1997). However, most natural visual scenes are not as impoverished as this task. The visual search task provides a better analog to such everyday visual tasks.
Behavioral (Treisman, 1996) and neuroimaging studies (Corbetta, Shulman, Miezin, & Petersen, 1995) suggest that the covert attention mechanism also operates in visual search tasks. However, the area containing a target in a search array may be large or small and attention may need to be distributed broadly or narrowly. Consider searching for a hair on a dinner plate versus trying to locate the face of a friend at a crowded bar. Efficient visual search thus requires a third mechanism in addition to overt eye movements and covert shifting of attention: changes in the spatial scale of attention. A relatively small scale may be optimal when searching for a small object. For larger objects or composite objects made up of smaller parts, however, a wider attentional focus may be more efficient (Castiello & Umilta, 1990; Eriksen & Yeh, 1985; Navon, 1977). People can voluntarily adjust the effective area of the attentional focus from large to small, or vice versa, but, as with a “zoom lens,” resolving power must be traded off against the size of the attended area (Eriksen & St. James, 1986). An alternative conceptualization is that observers distribute their spatial attention along a “gradient” that peaks at the attended location, with the falloff from the peak being relatively sharp or diffuse (LaBerge & Brown, 1989). Either of these views predicts that spatial cues that vary in their precision of localization should affect search efficiency. In particular, a small, target-sized cue should facilitate search compared with a larger sized cue because of its greater precision.
The dynamic scaling of spatial attention can be examined by trial-to-trial variations in the precision of location cues. To examine this aspect of spatial attention, we developed a visual search task using location cues that varied in size and hence in precision of target localization (Parasuraman, Greenwood, & Alexander, 1995). Participants were required to identify a target presented in an array of objects such as letters (Greenwood, Parasuraman, & Alexander, 1997). The search array was preceded by a cue that varied in size across trials (see Figure 4). When a range of cues from small to large was provided within a block of trials, target RT showed continuous modulation with such trial-to-trial changes in cue size. Target RT increased monotonically with cue size, pointing to a mechanism of dynamic adjustment of the spatial scale of attention (Greenwood & Parasuraman, 1999).
The slope of the RT/cue size function (reflecting the additional response time accompanying each increase in cue size) indicates the efficiency and dynamic range of the attention scaling mechanism. Figure 5 schematically illustrates variation in the scaling mechanism across conditions. Four theoretical response patterns are shown: normal, enhanced, reduced, and abolished. Compared with young adults, healthy older individuals under the age of 75 years show the enhanced pattern (Greenwood & Parasuraman, 1999). AD patients, on the other hand, show a reduced slope compared with age-matched controls (Parasuraman et al., 1995), indicating that the dynamic range of attentional scaling is reduced.
PET, fMRI, and ERP studies have identified the brain networks that are involved in the control and execution of covert shifts of attention (Corbetta et al., 2000; Mangun, 1995; Posner & Petersen, 1990). In contrast, the networks mediating the spatial scale of attention as assessed by the cued visual search task are less well understood. To the extent that the global–local task invokes a similar mechanism as the cued visual search task, activation of the temporoparietal cortex has been reported (Fink et al., 1997). In addition, because RT reflects both early and postperceptual changes in processing, the temporal locus of effects of the spatial scale of attention is uncertain. In the covert attention-shifting paradigm discussed previously, ERP studies have provided strong evidence that attention shifting modulates neural activity in the early visual processing (extrastriate) cortex (Luck & Girelli, 1998; Mangun, 1995). Effects on attention scaling have not been as extensively studied. However, recently, Luo, Greenwood, and Parasuraman (2001) found that shifts in the spatial scale of attention, as elicited by variations in cue size in a search task, modulated early-latency ERP components (P1 and N1) recorded from scalp regions overlying posterior cortical areas. These results indicate that the spatial scale of attention also acts as a sensory gain control mechanism, as does attention shifting.
Using PET in an uncued search task, Corbetta et al. (1995) found that in comparison to feature search, conjunction search was associated with activation of the parietal lobe in a region closely overlapping the same region they had previously shown to be involved in covert shifts of attention. Furthermore, brain-damaged individuals with deficits in covert orienting are slower to search for targets defined by a conjunction of color and orientation, but are unimpaired for detection of either feature in isolation (Arguin, Joanette, & Cavanagh, 1993).
In general, these results suggest that individuals with AD, who have prominent parietal lobe hypometabolism and show an attentional-shifting deficit (Buck et al., 1997; Parasuraman et al., 1992), should be impaired when asked to perform a visual search task in which repeated shifts of spatial attention are required. Greenwood et al. (1997) tested this hypothesis by using a cued search task. The slope of the RT–cue size function was reduced in AD patients compared with that of a young–old (65–75) group, whereas an old–old (75–85) group had an intermediate slope. These findings indicate that AD patients with mild dementia exhibit an overall benefit of cuing in the cued visual search task but that the benefit is markedly reduced. This, in turn, suggests an impairment in AD in the ability to adjust the spatial scale of attention during visual search. Greenwood et al. also found that healthy older adults showed a qualitatively different pattern of results. Older adults under the age of 75 had higher slopes than the young, suggesting that they relied more on the cues in identifying targets. Thus, whereas AD reduced the spatial scaling effect, normal aging up to the age of 75 was associated with an enhanced scaling effect. This finding is consistent with the view that for complex tasks, older adults require greater “environmental support,” as provided by the cue (Craik & McDowd, 1987). The reduction in the cue size effect in AD was replicated in a subsequent study using a greater range of cue sizes (Parasuraman et al., 2000a). AD patients showed a benefit only for the most precise cue and not for intermediate and large cues. Healthy older adults, on the other hand, showed a continuous and greater modulation of search efficiency with changes in cue size (see Figure 6). These results suggest that AD constricts the spatial scale of attention to a narrow range (see also Coslett, Stark, Rajaram, & Saffran, 1995).
In summary, dynamic spatial scaling of attention, as reflected in the slope of the RT/cue size function, is (a) reduced overall in AD patients compared with age-matched controls; (b) restricted in AD to small, relatively precise spatial cues; (c) increased with normal aging, until the age of 75; and (d) reduced in older adults without dementia over the age of 75.
Attention shifting and the dynamic scaling of spatial attention provide two sensitive behavioral assays for tracking cognitive changes in early AD. Three features are noteworthy: (a) specific component operations (rather than global cognitive function) are targeted, (b) changes are seen in early AD, and (c) changes are distinguishable from healthy aging. The effects of AD on these tasks also differ qualitatively from those associated with other conditions, including healthy aging (up to age 75), PD, and HD (Filoteo et al., 1995; Parasuraman & Greenwood, 1998). The findings corroborate the view that tasks of covert attentional shifting and attentional scaling provide reliable behavioral assays of early attention dysfunction in AD. Converging evidence from PET, fMRI, and ERP studies indicates that these tasks modulate early sensory-perceptual activity that is mediated by neural networks in the posterior parietal and temporal lobes, areas that are the first neocortical sites of dysfunction in early AD. These results, therefore, suggest that it would be fruitful to examine these aspects of visuospatial attention in nondemented individuals at genetic risk for developing AD.
Greenwood et al. (2000) recently reported a study investigating the effects of the ApoE gene on spatial attention. This study is part of a large-scale longitudinal investigation of changes in putative AD biomarkers, structural and functional neuroimaging patterns, and measures of cognitive function in persons without dementia who are at genetic risk for AD (Sunderland et al., 1999, 2000). Greenwood et al. examined 97 middle-aged (mean age = 58 years) adults genotyped for ApoE on tests of attention shifting, attention scaling, and vigilance. These participants did not have dementia and showed no deficits on an extensive battery of standard neuropsychological tests such as the Wechsler Adult Intelligence Scale—Revised, the Buschke Selective Reminding Test, the Boston Naming Test, and the Trail Making Test. ApoE genotypes were classified into the following groups: an ε2 group (including ε2/ε2 and ε2/ε3), an ε3 group (including ε3/ε3), and an ε4 group (including ε2/ε4, ε3/ε4, and ε4/ε4). The tasks used were a covert attention task, a cued visual search task (described previously), and a 20-min vigilance task in which participants had to discriminate a target letter (present on 20% of trials) from distractors against a patterned background. This task was used to examine whether attentional changes associated with ApoE genotype, if found, were linked to more general changes in arousal and vigilance.
For the covert attention task, median RTs were fastest for valid cues, slowest for invalid cues, and intermediate for neutral cues. Effects of cue validity developed as cue–target SOA increased from 200 to 2,000 ms (see Figure 7A, B, and C). There was no main effect of ApoE group on overall RT. However, the effect of cue validity on RT was greatest in the ε4 group (Figure 7D). RT to invalid cues was slowed in the ε4 group compared with the ε2 and ε3 groups. RT costs, but not benefits, were significantly greater in the ε4 group. Thus, attentional shifting from an invalid location was impaired in the ApoE-4 group.
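The cost and benefit indices reported here follow the standard decomposition of cue-validity effects (benefit = neutral − valid RT; cost = invalid − neutral RT). A minimal sketch in Python; the median RTs below are hypothetical values chosen only to illustrate an ε4-like pattern of inflated costs, not data from the study:

```python
# Standard decomposition of cue-validity effects in a location-cuing task.
def validity_effects(rt_valid, rt_neutral, rt_invalid):
    """Return (benefit, cost) of spatial cuing relative to a neutral cue (ms)."""
    benefit = rt_neutral - rt_valid   # speedup from attention already at the target
    cost = rt_invalid - rt_neutral    # slowing from disengaging the wrong location
    return benefit, cost

# Hypothetical median RTs (ms): costs exceeding benefits, as in the epsilon-4 group.
benefit, cost = validity_effects(rt_valid=520, rt_neutral=560, rt_invalid=650)
print(benefit, cost)  # 40 90
```

Because the cost term isolates the disengage operation, a selective increase in costs, with benefits spared, is what points to a shifting rather than an engagement deficit.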
RTs to targets in the search task increased with cue size. There was no main effect of group, but the cue size effect varied with group, particularly at the smaller cues (see Figure 8A and 8B). A measure of the cue size effect was calculated as the slope of the regression of RT on all four cue sizes. The slope of this RT–cue size function reflects the extent of the postulated attention scaling mechanism: the lower the slope, the lower the effective use of the mechanism (see also Figure 5). Slope differed significantly across the ApoE groups, being lower in the ε4 group than in the ε2 or ε3 groups (Figure 8B). Thus, the spatial scaling of attention was reduced in individuals with the ε4 allele compared with those without the ApoE-4 allele.
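The slope measure is an ordinary least-squares regression of median RT on cue size. A sketch in plain Python; the four cue sizes and RTs are made up for illustration and are not the study's values:

```python
def rt_cue_slope(cue_sizes, median_rts):
    """Least-squares slope of the RT/cue-size function (ms per unit of cue size).

    A lower slope implies a reduced dynamic range of attentional scaling.
    """
    n = len(cue_sizes)
    mx = sum(cue_sizes) / n
    my = sum(median_rts) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(cue_sizes, median_rts))
    var = sum((x - mx) ** 2 for x in cue_sizes)
    return cov / var

# Hypothetical data: four cue sizes and the corresponding median RTs (ms).
print(rt_cue_slope([1, 2, 4, 8], [540, 560, 600, 680]))  # 20.0
```

Comparing this slope across genotype groups is then a single-number test of how strongly each group's search RT is modulated by cue precision.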
Finally, the vigilance task showed the expected decline in performance over time, indicating that the task reliably indexed vigilance decrement (Parasuraman, 1979). However, neither sensitivity d′ nor criterion c on this task differed significantly across ApoE groups. Moreover, the ε4-related group differences in the cued discrimination and cued visual search tasks were uncorrelated with measures of performance on this vigilance task.
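Sensitivity d′ and criterion c are the standard signal-detection indices, computed from hit and false-alarm proportions via the inverse normal transform. A brief sketch (the rates below are hypothetical, for illustration only):

```python
from statistics import NormalDist

def sdt_indices(hit_rate, fa_rate):
    """Signal-detection sensitivity d' and criterion c from hit and
    false-alarm proportions (rates must be strictly between 0 and 1)."""
    z = NormalDist().inv_cdf          # inverse standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical performance on a 20% target-probability vigilance task.
d, c = sdt_indices(hit_rate=0.85, fa_rate=0.10)
```

The finding that these indices were group-equivalent, while the cuing effects were not, is what supports attributing the ε4 differences to spatial attention rather than to arousal.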
The attentional changes associated with ApoE-4 are distinct in many respects from those associated with normal adult aging. Slowed attentional disengagement following invalid cues has been shown previously to occur only in older adults over the age of 75 (Greenwood & Parasuraman, 1994), well beyond the mean age (58 years) of the ε4 group in this study. Furthermore, the effect of cue precision on visual search increases in older adults until about age 75 (Greenwood & Parasuraman, 1999), in contrast to the present study in which the same effect decreased in middle-aged adults with an ε4 allele. Effects of cue size (precision) decrease in healthy, older adults after age 75 (Greenwood & Parasuraman, 1999) and in AD patients (Greenwood & Parasuraman, 1997; Parasuraman et al., 1995). Thus, the reduction in cue size effects during visual search in ApoE-4 carriers is different from the pattern seen in aging before age 75.
These findings support the hypothesis that middle-aged adults without dementia who have the ApoE-4 genotype exhibit deficits in components of visuospatial attention that are qualitatively similar to, though quantitatively smaller than, those of clinically diagnosed AD patients. Although the results need to be replicated and extended, it is notable that these findings were obtained in a relatively young group of adults who showed no deficits on standard neuropsychological tests. These attentional changes must also be examined in relation to other conditions (e.g., cerebrovascular disease) that can affect cognitive functioning in older adults, particularly in those with the ApoE-4 gene (Carmelli et al., 1998). Because individuals with cerebrovascular disease were excluded from the previously described studies of attention in ApoE-4 carriers without dementia, the attentional changes associated with ApoE cannot be attributed to the influence of vascular disease. Nevertheless, additional research on the impact of cerebrovascular disease on attentional functioning in ApoE-4 carriers is warranted, particularly to examine how specific these attentional changes are to the development of AD.
Greenwood et al. (2000) showed that sensitive probing of attentional operations may be revealing of the preclinical phase of AD, even though standard neuropsychological tests that assess whole-task performance may be insufficiently sensitive. However, additional longitudinal studies must be conducted to verify this view, which would be strengthened if other attentional component operations are examined (e.g., divided attention or phasic arousal following a warning stimulus). These functions are impaired and preserved, respectively, in mild AD patients (Parasuraman & Haxby, 1993). In addition, given that memory decline occurs prominently in early AD, studies using information-processing tests of memory in nondemented ApoE-4 carriers would be informative.
Severe deficits in spatial working memory are found in AD (Flicker, Bartus, & McCarthy, 1984; Freedman & Oscar-Berman, 1986). Simone and Baylis (1997) also found that AD patients showed a deficit on a delayed-response task involving retention of spatial information over a short delay. The effect was increased when distractors were included in the retention interval, indicating that attentional impairment magnified the memory deficit. One might, therefore, predict that spatial working memory is also impaired in nondemented ApoE-4 carriers.
We recently obtained preliminary findings supporting this prediction. A task was designed to examine the effect of spatial attention on memory for a visual location. A cue of varying size was used to direct attention with variable precision to a location indicated by a briefly visible black dot. Participants were instructed to remember the location of the dot over a delay of 2 s. At the end of the delay period, a second location was indicated with a briefly visible red dot and participants were required to indicate whether the red dot appeared in the same location as the previous black dot. This occurred on 50% of trials (termed the match condition). On the other 50% (the nonmatch condition), the red dot appeared at one of three distances away from the location of the black dot in a randomly selected direction. Cues to the first (black dot) location were either valid or invalid (i.e., cues either did or did not predict the location of the black dot). Furthermore, to manipulate the quality of the information about the region of the black dot, cues varied in size and, hence, in precision (Figure 9).
A sample of 75 healthy individuals varying in risk of AD was tested and grouped according to ApoE-ε4 gene dose (i.e., whether they possessed 0, 1, or 2 ε4 alleles), with genotypes of ε3/ε3, ε3/ε4, and ε4/ε4. In addition, a sample of individuals possessing the ε2 allele was obtained. The cues to subsequent event location altered the ability to retain a spatial representation in memory. The smaller the cued region, the better the memory, even though the cue was not present during the delay. This finding extends the effects of attention—previously shown to modulate perception and discrimination—to the modulation of memory. The speed with which the same–different decision was made (match RT) increased significantly with cue size (Figure 10) in the ε3/ε3 and ε3/ε4 groups, but not in the ε4/ε4 group. These differences produced a marginal Cue Size × Genotype Group interaction (p < .10). More definitive results await larger sample sizes. Nevertheless, speed of response was clearly optimized when cues were small and valid, indicating that spatial working memory, like perception, benefits from a constricted attentional focus at the target location. Moreover, that effect may be disrupted by ApoE genotype. ApoE-4 homozygotes exhibit a trend toward memory deficits (Bondi et al., 1995), and our finding that the ε4 homozygotes showed the lowest accuracy is consistent with that work. Moreover, those with the highest dose of the ε4 allele (the ε4/ε4 group) also appear to obtain the least benefit of cue precision, as we have reported in a non-memory search task (Greenwood et al., 2000).
These results suggest that attentional cuing enhances spatial working memory in both young and middle-aged adults, but that ε4/ε4 homozygotes may be impaired in their ability to form and retain memory for a restricted region of space. This may arise, in part, because of reduced use of top-down information about regions of space.
The ε4 allele of the ApoE gene is a major risk factor for late-onset AD (Saunders et al., 1993). This discovery sparked extensive research on the role of ApoE in normal and abnormal brain function. The work conducted to date shows that polymorphisms of the ApoE gene are associated with alterations in brain morphology and brain metabolism in otherwise healthy individuals, principally reductions in hippocampal volume and cerebral metabolism in temporoparietal cortex. These findings suggest that cognitive functioning should also be affected in healthy adults, but standard neuropsychological tests have provided only mixed evidence for changes in cognition associated with ApoE-4 (e.g., Bondi et al., 1999; B. J. Small et al., 2000; Yaffe et al., 1997). Furthermore, although the positive neuropsychological studies generally point to a deficit in the domain of memory, no study using standard neuropsychological measures has identified the specific cognitive operations that are affected by ApoE-4.
Recently, however, studies using information-processing approaches to dissect cognition into component operations have found impairments in attention and working memory in individual carriers of the ApoE-4 gene (Greenwood et al., 2000; Parasuraman, 2001). These changes were found in healthy middle-aged adults in their 50s who were clinically without dementia. Moreover, the attentional deficits occurred without nonspecific changes in vigilance and with preserved whole-task performance on standard neuropsychological tests of cognitive function. The attentional changes are qualitatively (but not quantitatively) the same as those reported previously in individuals in the early, mild stages of AD. Specifically, adults without dementia who have the ApoE-4 gene show the same selective pattern of attentional performance as do clinically diagnosed AD patients: (a) a deficit in covert attentional shifting (Parasuraman et al., 1992), (b) a reduction in the ability to scale spatial attention dynamically (Parasuraman et al., 1995, 2000a), and (c) no change in vigilance decrement (Parasuraman & Haxby, 1993). In addition, preliminary findings suggest that ApoE-4 homozygotes without dementia are impaired in components of spatial working memory.
These results have implications for an understanding of the genetic basis of normal attentional functioning and its variation in adulthood. Because ApoE-4 is a risk factor for AD, the findings are also relevant to the analysis of the preclinical stage of AD. What are the mechanisms by which the ApoE gene influences attention? We discuss different possibilities, focusing on modulation of cholinergic transmission to brain regions involved in attention.
A detailed answer to the question of how the ApoE gene influences attention must await the results of gene expression studies that examine changes in the proteins expressed by the ApoE gene with development, age, and other factors. A major possibility is ApoE-related alteration of cholinergic transmission to association cortical areas that are important for the attentional operations of shifting and scaling attention. There are several strands of evidence that when taken together are consistent with such a modulatory influence.
First, human neuroimaging studies point to the association of metabolic decline in the parietal cortex with the development and course of AD (Reiman et al., 1996; G. W. Small et al., 2000). De Leon et al. (2001) also recently reported that ApoE-4 carriers without dementia who declined to a diagnosis of AD over a 3-year period showed pronounced metabolic decline longitudinally in the temporoparietal cortex. Furthermore, regional blood flow in the parietal cortex—but in no other brain region—has been shown to predict years of survival of AD patients (Jagust, Haan, Reed, & Eberling, 1998).
Second, the posterior parietal cortex and subcortical structures form a distributed neural network that is involved in attentional shifting (Mesulam, 1981; Posner & Petersen, 1990), as revealed by single-unit studies in monkeys (Bushnell, Goldberg, & Robinson, 1981; Robinson et al., 1995) and human neuroimaging studies (Corbetta et al., 1993). A recent fMRI study found the inferior parietal/superior temporal area to be specifically involved in the disengage component of attentional shifting (Corbetta et al., 2000). The necessity of the parietal cortex for shifting attention has also been demonstrated in normal participants with the application of transcranial magnetic stimulation (TMS), which produces a reversible “lesion” of a particular brain area. TMS applied to the parietal cortex produces a neglect-like impairment of contralateral detection during bilateral, but not unilateral, visual stimulation (Pascual-Leone, Gomez-Tortosa, Grafman, & Alway, 1994) and also impairs performance on conjunction, but not feature, search (Ashbridge, Walsh, & Cowey, 1997). The role of the parietal cortex in the dynamic scaling of attention is less well established, although there is some supporting evidence from imaging (Fink et al., 1997) and lesion (Robertson, Lamb, & Knight, 1988) studies.
Third, the efficiency of the parietal cortex in carrying out these attentional operations depends on basal forebrain activation and the integrity of cholinergic input to the cortex (Everitt & Robbins, 1997; Marrocco & Davidson, 1998; McGaughy, Everitt, Robbins, & Sarter, 2000; Sarter, Givens, & Bruno, 2001; Wenk, 1993). Lesions of the cholinergic basal forebrain by ibotenic acid in monkeys lead to slowed disengagement of covert attention, without concomitant effects on memory (Voytko et al., 1994). Davidson and Marrocco (2000) also showed that infusion of scopolamine, a muscarinic cholinergic antagonist, into the intraparietal cortex in monkeys slowed RTs to peripheral cues, particularly when cues were invalid, suggesting a modulation of the disengagement effect. Davidson and Marrocco concluded that attentional shifting is facilitated by increased levels of acetylcholine and impaired by reductions of acetylcholine in the parietal cortex. Similar effects have been demonstrated in the rat, by administration of nicotine (facilitation) and scopolamine (impairment; Phillips, McAlonan, Robb, & Brown, 2000).
Pharmacological studies in humans provide converging evidence that the attentional shifting deficit in AD is due to reduced cholinergic innervation of the parietal cortex. The attention-shifting deficit in mild AD is linked to hypometabolism of the posterior parietal cortex (Parasuraman et al., 1992). Consistent with this finding, acetylcholine depletion in normal human participants by administration of scopolamine also leads to an AD-like pattern of hypometabolism in parietal and temporal cortical areas (Molchan et al., 1994). A cholinergic role in attentional scaling has also been established. Using the cued visual search task discussed previously, Levy, Parasuraman, Greenwood, Dukoff, and Sunderland (2000) found that the combined effects of scopolamine and the intrinsic cholinergic deficit in AD led to a complete abolition of the cue size effect, indicating that the scaling mechanism was no longer operative (see also Figure 5). Furthermore, cholinesterase inhibitors and acetylcholine agonists have been found to improve sustained attention in AD patients (Lawrence & Sahakian, 1995; Sahakian et al., 1993). Despite these supportive findings, there are two areas of uncertainty. First, there is only limited evidence for pharmacological enhancement of selective attention in AD (as opposed to sustained attention and vigilance). Second, the effects of scopolamine and other muscarinic antagonists have been well studied, but nicotinic receptors are also important in AD (Little, Johnson, Minichiello, Weingartner, & Sunderland, 1998; Whitehouse et al., 1986). The nicotinic system has been implicated in attention, in both animal and human studies (Rusted, Newhouse, & Levin, 2000), but further work is needed to determine the precise roles of the muscarinic and nicotinic systems in specific components of attention.
Various other sources of evidence also point to the dependence of the parietal cortex/attentional-shifting system on the integrity of the cholinergic system. Iyo et al. (1997) used PET with a radioactively labeled acetylcholine analog to obtain in vivo measures of acetylcholinesterase activity in early-onset AD patients and healthy controls. They found significant AD-related reductions in the parietal and temporal cortical areas, with the greatest reduction (38%) in the parietal cortex. (This result in AD patients with mild dementia obtained in vivo assumes importance in light of the finding by Davis et al., 1999, that cholinergic markers obtained from postmortem tissue are not significantly depleted in early AD cases.) Also, Marutle, Warpman, Bogdanovic, Lannfelt, and Nordberg (1999) reported that the parietal cortex was burdened with the highest β-amyloid load compared with other cortical regions in patients with the Swedish APP Alzheimer susceptibility mutation. In these early-onset AD cases, as well as in late-onset AD patients, nicotinic binding sites were correspondingly reduced in the parietal cortex (Marutle et al., 1999). A gene for a receptor of nerve growth factor (tyrosine kinase [trkA]) is expressed at lower levels in the Alzheimer parietal cortex compared with the healthy-aged parietal cortex (Hock et al., 1998).
Together, these pharmacological, neurophysiological, and neuroimaging data indicate that both attentional shifting and scaling are dependent on acetylcholine levels in the parietal cortex. These findings also indicate that the parietal mediators of the components of attention may be particularly susceptible to physiological changes related to AD pathology. As converging sources of evidence, these data support a link between cholinergic transmission from basal forebrain sources to target sites in the parietal and temporal cortices and the efficiency of component operations of spatial attention. The influence of the ApoE gene on this link has not yet been reliably established, although there are a number of promising lines of evidence.
First, increased ApoE-4 gene dose is associated with reduced hippocampal and cortical choline acetyltransferase (ChAT) activity (Poirier et al., 1995). This is clearly consistent with the notion that if the cholinergic deficit falls below a threshold, cognitive decline sets in, with the possibility of later development of AD (Bartus, 2000). Additional evidence comes from studies of genetically altered mice. Impairments in cognition and in ChAT levels in basal forebrain projections have been reported in ApoE-deficient (knockout) mice (Gordon, Grauer, Genis, Sehayek, & Michaelson, 1995). Raber et al. (1998) used a neuron-specific enolase promoter to express human ε3 and ε4 ApoE isoforms in ApoE knockout mice. Compared with those with ApoE-3, the ApoE-4 mice showed deficits in spatial working memory (a water maze task); this effect was greater in females (see also Raber et al., 2000). Of importance, deficits in spatial working memory in ApoE knockout mice have been shown to be reversed by cholinergic replacement therapy (Fisher, Brandeis, Chapman, Pittel, & Michaelson, 1998), including acetylcholinesterase inhibitors and muscarinic agonists (Chapman et al., 1998). These effects are consistent with our finding of a deficit in spatial working memory in human ApoE-4 carriers.
Polymorphisms in ApoE and other genes may influence the acetylcholine–attention link through modulation of acetylcholine receptor subunits. A polymorphism in chrna7, an acetylcholine receptor subunit gene, appears to control sensorimotor gating in schizophrenic patients (Freedman et al., 1997), a function that involves selective attention. Jones, Sudweeks, and Yakel (1999) proposed that the α7 and α4β2 subunits control cholinergic synaptic transmission in the hippocampus and cortex. These and other nicotinic receptors are becoming well characterized at the molecular level, but their specific functional role is still unclear (Weiland, Bertrand, & Leonard, 2000). Moreover, attentional operations have not been comprehensively assayed in studies examining the functional properties of these receptor subunits. Nevertheless, the initial findings are promising and suggest that the ApoE–attention link is a plausible one. Additional genetic studies will need to be carried out to further identify the molecular mechanisms that mediate the role of ApoE and other genes in parietal cholinergic and attentional function. As a highly pleiotropic protein, ApoE is involved in several aspects of lipid redistribution and metabolism. Various molecular roles of ApoE may therefore be involved in the modulation of cholinergic function in healthy adults and AD patients. These mechanisms include the effects of ApoE on oxidative stress (Ihara et al., 2000) and synaptic plasticity (Nathan et al., 1994), among others (Herz & Beffert, 2000).
In addition to ApoE, it will be important to know what other genes are linked to attention. Attentional performance is known to be heritable in persons with genetic abnormalities such as the fragile X syndrome (Finkel & Pedersen, 2000; Gecz & Mulley, 2000) as well as in neuropsychiatric disorders such as schizophrenia (Cannon et al., 2000). However, additional genetic studies of normal individuals are needed. For example, Fan, Wu, Fossella, and Posner (2001) recently provided preliminary evidence for the heritability of components of visual attention in normal adults. Further work is also needed to examine which other genes are linked to attentional function in the posterior parietal cortex and which genes are influenced by the loss of acetylcholine modulation. A first step in answering this question is the identification of genes whose expression is enriched in the posterior parietal cortex. One possibility is to use high-throughput cDNA microarray analysis on postmortem parietal cortex tissue samples. Ho et al. (2001) recently reported alterations in the expression of the synaptic vesicle protein synapsin II in entorhinal cortical tissue obtained postmortem from patients with early-stage AD. Another avenue for research would be to apply these new genetic analytic tools to parietal cortical tissue obtained from ApoE and chrna7 knockout mice.
What is the significance of changes in attention in healthy and otherwise asymptomatic individuals of middle age? The findings of Greenwood et al. (2000) are of interest because the attentional changes they observed are qualitatively the same as those reported previously in individuals in the early, mild stages of AD. Of course, the extent of the deficit is much greater in AD patients than in ApoE-4 carriers without dementia. Is the pattern in ApoE-4 carriers without dementia indicative of preclinical changes that could develop into dementia? This is a possibility, but validation may require long-term longitudinal studies. On the other hand, the results could reflect a direct effect of ApoE genotype on cognition, or a so-called “cognitive phenotype” (Reed et al., 1994). Such a view would suggest that the influence of ApoE-4 on cognition should be observed in individuals younger than the middle-aged participants tested by Greenwood et al., say adults in their 30s and 40s, or indeed, even younger adults. To our knowledge, no such study has yet been conducted, although Flory et al. (2000) found deficits in standard neuropsychological tests in ApoE-4 carriers in the 24–60 age range (mean age = 46 years).
The notion of a cognitive phenotype effect is given some support by work showing a link between cognition in early life and risk of AD in old age. Two research groups have shown that the level of cognitive functioning in early life predicts the risk of AD in later life, such that either higher IQ at age 11 (Whalley et al., 2000) or greater “idea density” in written work at age 20 (Snowdon et al., 1996) leads to reduced risk of AD later in life. Braak and Braak (1999) showed that AD-related pathology can be seen as early as the third decade of life. However, AD pathology is unlikely to be a major factor in 11-year-olds (Whalley et al., 2000).
Another possibility is that these findings reflect the consequences of early life environment, which has been linked to the later development of such chronic adult diseases as heart disease and diabetes (Osmond & Barker, 2000). Nutrient limitation in utero and in infancy leads to changes in structure and metabolism that are the origin of hypertension and diabetes in adulthood. AD pathology is seen preferentially in brain regions undergoing later maturation in early life (Rapoport, 1990). Moceri, Kukull, Emanuel, van Belle, and Larson (2000) have argued that features of the early life environment such as suburban residence and a low number of siblings are associated with reduced AD risk, consistent with the idea that lower levels of early brain maturation predispose to AD. Also consistent with this view is evidence for a link between education level and risk of AD (Katzman, 1993; Yu et al., 1989). Therefore, rather than affecting a cognitive phenotype, ApoE, which is known to play a role in neuron health and repair (Higgins et al., 1997), may influence neurotoxicity. This effect may be particularly potent when nonoptimal development has increased susceptibility to the cognitive consequences of neurotoxic events such as head injury (Horsburgh, McCulloch, Nilsen, Roses, & Nicol, 2000; Kutner, Erlanger, Tsai, Jordan, & Relkin, 2000; Lichtman, Seliger, Tycko, & Marder, 2000) and to neurotoxic agents such as beta amyloid (Knowles, Gomez-Isla, & Hyman, 1998).
The ApoE gene is associated with significant alterations in brain morphology and function in otherwise healthy adults. Among the brain regions affected are the hippocampus and the posterior parietal cortex. Consequently, middle-aged carriers of the ApoE-4 allele without dementia show deficits in component operations of spatial selective attention and working memory, compared with those with the 2 or 3 alleles. These deficits are qualitatively (but not quantitatively) similar to those exhibited by mildly demented persons with clinically diagnosed AD. The association between ApoE-4 and attention may arise through modulation of cholinergic neurotransmission to the posterior parietal cortex. Molecular genetic studies will be needed to identify the underlying mechanisms. Cognitive and molecular studies of attention and brain function in ApoE-4 carriers without dementia can contribute to knowledge on the genetics of visual attention and to a better understanding of the preclinical phase of AD.
This research was supported by Grant AG19653 from the National Institute on Aging. We thank John Fossella and Richard Marrocco for critical comments on an earlier version of this article.
Raja Parasuraman and Pamela M. Greenwood, Cognitive Science Laboratory, Catholic University of America; Trey Sunderland, Geriatric Psychiatry Branch, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland.