The dopamine receptor D4 gene (DRD4) 7-repeat allele has been found to interact with environmental factors such as parenting in children and peer attitudes in adults to influence aspects of behavior such as risk taking. We previously found that in toddlers, lower-quality parenting in combination with the 7-repeat allele of the DRD4 gene was associated with greater parent-reported Sensation Seeking (SS), but was unrelated to Effortful Control (EC). We now report findings from a follow-up assessment with the same sample of children showing that parenting quality interacts with the presence of the 7-repeat allele to predict EC in 3- to 4-year-old children. The change in these patterns of results may reflect the increased role of the executive attention network in older children and adults. However, due to the small sample size (N = 52) and the novelty of the results, these findings should be treated with caution and considered preliminary until they are replicated in an independent sample.
The study of attention has largely concerned how we select among various sensory events, but it also involves selection among conflicting actions. Prior to the late 1980s, these studies were dominated by efforts to locate bottlenecks between sensory input and response; a different view held that attentional limits reflected the need to maintain behavioral coherence rather than resulting from a bottleneck. In both cases, ideas of resource limits borrowed from economics were important. Early evidence on the anatomy of attention came from neurological investigations of lesioned patients, but the major impetus for the anatomical approach came from neuroimaging studies that provided evidence of brain networks related to orienting to sensory events and control of response tendencies. The existence of a functional anatomy has supported studies of the development of attention networks and of the role of neuromodulators and genetic polymorphisms in their construction. Together these developments have enhanced our understanding of attention and paved the way for significant applications to education, pathology, and the prevention of mental illness.
Although the study of brain states is an old one in neuroscience, there has been growing interest in brain state specification owing to MRI studies tracing brain connectivity at rest. In this review, we summarize recent research on three relatively well-described brain states: the resting, alert, and meditation states. We explore the neural correlates of maintaining a state or switching between states, and argue that the anterior cingulate cortex and striatum play a critical role in state maintenance, whereas the insula has a major role in switching between states. Brain state may serve as a predictor of performance in a variety of perceptual, memory, and problem solving tasks. Thus, understanding brain states is critical for understanding human performance.
In adults most cognitive and emotional self-regulation is carried out by a network of brain regions, including the anterior cingulate, insula and areas of the basal ganglia, related to executive attention. We propose that during infancy control systems depend primarily upon a brain network involved in orienting to sensory events that includes areas of the parietal lobe and frontal eye fields. Studies of human adults and alert monkeys show that the brain network involved in orienting to sensory events is moderated primarily by the nicotinic cholinergic system arising in the nucleus basalis. The executive attention network is primarily moderated by dopaminergic input from the ventral tegmental area. A change from cholinergic to dopaminergic modulation would be a consequence of this switch of control networks and may be important in understanding early development. We trace the attentional, emotional and behavioral changes in early development related to this developmental change in regulative networks and their modulators.
Neural activity that is evoked naturalistically in children during educational television viewing can be used to predict math and verbal knowledge.
It is not currently possible to measure the real-world thought process that a child has while observing an actual school lesson. However, if it could be done, children's neural processes would presumably be predictive of what they know. Such neural measures would shed new light on children's real-world thought. Toward that goal, this study examines neural processes that are evoked naturalistically, during educational television viewing. Children and adults all watched the same Sesame Street video during functional magnetic resonance imaging (fMRI). Whole-brain intersubject correlations between the neural timeseries from each child and a group of adults were used to derive maps of “neural maturity” for children. Neural maturity in the intraparietal sulcus (IPS), a region with a known role in basic numerical cognition, predicted children's formal mathematics abilities. In contrast, neural maturity in Broca's area correlated with children's verbal abilities, consistent with prior language research. Our data show that children's neural responses while watching complex real-world stimuli predict their cognitive abilities in a content-specific manner. This more ecologically natural paradigm, combined with the novel measure of “neural maturity,” provides a new method for studying real-world mathematics development in the brain.
In the real world, children learn new information by participating in classrooms, interacting with their family and friends, and watching educational videos. While previous neuroimaging research has typically used simple tasks and short-lasting stimuli, in this study we examined brain development using a more complex and naturalistic educational stimulus. Children and adults all watched the same Sesame Street video as we measured their neural activity using functional magnetic resonance imaging (fMRI). We examined the timecourses of neural activity over the length of the video for children and adults. We found that the degree to which children showed adult-like brain responses was correlated with their math and verbal knowledge levels. In the intraparietal sulcus, children's neural correlation with adults depended on their mathematics knowledge whereas in Broca's area, it depended on their verbal knowledge. Additional experiments showed that children's neural responses in the intraparietal sulcus are selectively driven by numerical content both when children are watching Sesame Street and when they engage in a number matching task. These convergent results highlight the broad role of the intraparietal sulcus in processing numerical information. In addition, our study validates the use of naturalistic stimuli and child-to-adult neural timecourse correlations for studying brain development. We suggest that this new approach can enrich our understanding of how children's brains process information in the real world.
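The child-to-adult "neural maturity" measure described above is, at its core, a correlation between a child's regional neural timecourse and the mean timecourse of an adult group. A minimal sketch of that computation, using hypothetical timecourse lists rather than real fMRI data:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

def neural_maturity(child_ts, adult_ts_group):
    """Correlate one child's regional timecourse with the adult group's
    mean timecourse; higher values indicate more adult-like responses.

    child_ts: sequence of activation values over time for one region
    adult_ts_group: list of such sequences, one per adult
    """
    adult_mean = [sum(vals) / len(vals) for vals in zip(*adult_ts_group)]
    return pearson(child_ts, adult_mean)
```

In the study itself this correlation was computed voxel-wise across the whole brain and the resulting maps related to math and verbal scores; the sketch shows only the core timecourse-correlation step.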
Here, we update our 1990 Annual Review of Neuroscience article, “The Attention System of the Human Brain.” The framework presented in the original article has helped to integrate behavioral, systems, cellular, and molecular approaches to common problems in attention research. Our framework has been both elaborated and expanded in subsequent years. Research on orienting and executive functions has supported the addition of new networks of brain regions. Developmental studies have shown important changes in control systems between infancy and childhood. In some cases, evidence has supported the role of specific genetic variations, often in conjunction with experience, that account for some of the individual differences in the efficiency of attentional networks. The findings have led to increased understanding of aspects of pathology and to some new interventions.
alerting network; executive network; orienting network; cingulo-opercular network; frontoparietal network
The term consciousness is an important one in the vernacular of the western literature in many fields. It is no wonder that scientists have assumed that consciousness will be found as a component of the human brain and that we will come to understand its neural basis. However, there is rather little in common between consciousness as the neurologist uses it to diagnose the vegetative state, as the feminist uses it in urging that male consciousness be raised about the economic plight of women, and as the philosopher uses it when defining the really hard question of the subjective state of awareness induced by sensory qualities. When faced with this kind of problem, it is usual to subdivide the term into more manageable, perhaps partly operational, definitions. Three meanings that capture aspects of consciousness are: (1) the neurology of the state of mind that allows coherent orientation to time and place, (2) the selection of sensory or memorial information for awareness, and (3) voluntary control over overt responses. In each of these cases the mechanisms of consciousness overlap with one or more of the attentional networks that have been studied with the methods of cognitive neuroscience. In this paper we explore the overlap and discuss how to exploit the growing knowledge of attentional networks to constrain ideas of consciousness.
attention networks; alerting; orienting; executive
Although the study of brain development in non-human animals is an old one, recent imaging methods have allowed non-invasive studies of the gray and white matter of the human brain over the lifespan. Classic animal studies show clearly that impoverished environments reduce cortical gray matter in relation to complex environments and cognitive and imaging studies in humans suggest which networks may be most influenced by poverty. Studies have been clear in showing the plasticity of many brain systems, but whether sensitivity to learning differs over the lifespan and for which networks is still unclear. A major task for current research is a successful integration of these methods to understand how development and learning shape the neural networks underlying achievements in literacy, numeracy, and attention. This paper seeks to foster further integration by reviewing the current state of knowledge relating brain changes to behavior and indicating possible future directions.
childhood poverty; brain networks; plasticity; attention; literacy; numeracy
Asthma is one of the most common, serious chronic diseases in pediatric and young adult populations. Health-risk behaviors, including cigarette smoking and alcohol use, may exacerbate chronic diseases and complicate their management. The aim of this study was to longitudinally analyze rates of cigarette smoking and alcohol use in adolescents and young adults who have asthma and those who do not have asthma. A secondary analysis of data from the National Longitudinal Study of Adolescent Health was undertaken. Individuals with asthma were found to exhibit increasing rates of cigarette smoking and alcohol use as they aged. When an adolescent with a chronic health issue begins health-risk-taking behaviors, behavior change interventions must be planned. Pediatric nurses, practitioners, and clinicians are uniquely positioned to assess for health-risk behaviors in youth with asthma and to intervene with plans of care that are tailored for the needs of this vulnerable population.
A combined behavioral and brain imaging study shows how sensory awareness and stimulus visibility can influence the dynamics of decision-making in humans.
Human decisions are based on accumulating evidence over time for different options. Here we ask a simple question: How is the accumulation of evidence affected by the level of awareness of the information? We examined the influence of awareness on decision-making using combined behavioral methods and magneto-encephalography (MEG). Participants were required to make decisions by accumulating evidence over a series of visually presented arrow stimuli whose visibility was modulated by masking. Behavioral results showed that participants could accumulate evidence under both high and low visibility. However, a top-down strategic modulation of the flow of incoming evidence was only present for stimuli with high visibility: once enough evidence had been accrued, participants strategically reduced the impact of new incoming stimuli. Also, decision-making speed and confidence were strongly modulated by the strength of the evidence for high-visibility but not low-visibility evidence, even though direct priming effects were identical for both types of stimuli. Neural recordings revealed that, while initial perceptual processing was independent of visibility, there was stronger top-down amplification for stimuli with high visibility than low visibility. Furthermore, neural markers of evidence accumulation over occipito-parietal cortex showed a strategic bias only for highly visible sensory information, speeding up processing and reducing neural computations related to the decision process. Our results indicate that the level of awareness of information changes decision-making: while accumulation of evidence already exists under low visibility conditions, high visibility allows evidence to be accumulated up to a higher level, leading to important strategic top-down changes in decision-making. Our results therefore suggest a potential role of awareness in deploying flexible strategies for biasing information acquisition in line with one's expectations and goals.
When making a decision, we gather evidence for the different options and ultimately choose on the basis of the accumulated evidence. A fundamental question is whether and how conscious awareness of the evidence changes this decision-making process. Here, we examined the influence of sensory awareness on decision-making using behavioral studies and magneto-encephalographic recordings in human participants. In our task, participants had to indicate the prevailing direction of five arrows presented on a screen that each pointed either left or right, and in different trials these arrows were either easy to see (high visibility) or difficult to see (low visibility). Behavioral and neural recordings show that evidence accumulation changed from a linear to a non-linear integration strategy with increasing stimulus visibility. In particular, the impact of later evidence was reduced when more evidence had been accrued, but only for highly visible information. By contrast, barely perceptible arrows contributed equally to a decision because participants needed to continue to accumulate evidence in order to make an accurate decision. These results suggest that consciousness may play a role in decision-making by biasing the accumulation of new evidence.
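The contrast described above, linear integration under low visibility versus a saturating strategy under high visibility, can be caricatured with a toy accumulator. The `gain_decay` parameter is a hypothetical construct for illustration, not the authors' fitted model: it down-weights later samples once evidence has accrued, mimicking the reduced impact of late evidence for highly visible stimuli.

```python
def accumulate(evidence, gain_decay=0.0):
    """Accumulate signed evidence samples (e.g., +1 = right arrow,
    -1 = left arrow) into a decision variable.

    gain_decay = 0 gives pure linear integration (every sample counts
    equally, as for barely visible stimuli); gain_decay > 0 shrinks the
    weight of new samples as accumulated evidence grows, a simple stand-in
    for the non-linear strategy reported under high visibility.
    """
    total = 0.0
    for sample in evidence:
        gain = 1.0 / (1.0 + gain_decay * abs(total))
        total += gain * sample
    return total
```

With five consistent rightward samples, the linear accumulator reaches 5.0, while the decaying-gain version saturates well below that, so late samples contribute less.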
Children show increasing control of emotions and behavior during their early years. Our studies suggest a shift in control from the brain's orienting network in infancy to the executive network by the age of 3–4 years. Our longitudinal study indicates that orienting influences both positive and negative affect, as measured by parent report in infancy. At 3–4 years of age, the dominant control of affect rests in a frontal brain network that involves the anterior cingulate gyrus. Connectivity of brain structures also changes from infancy to toddlerhood. Early connectivity of parietal and frontal areas is important in orienting; later connectivity involves midfrontal and anterior cingulate areas related to executive attention and self-regulation.
attention; connectivity; development; orienting
Research in cognitive neuroscience now considers the state of the brain prior to the task an important aspect of performance. Hypnosis seems to alter the brain state in a way that allows external input to dominate over internal goals. We examine how normal development may illuminate the hypnotic state.
Attention influences many aspects of cognitive development. Variations in the COMT gene, known to affect dopamine neurotransmission, have frequently been found to influence attention in adults and older children. In this paper we examined 2-year-old children and found that variation in the COMT gene influenced attention in a task involving looking to a sequence of visual stimuli. Because the influence of another dopamine-related gene (DRD4) has been shown to interact with parenting quality at this age, we explored parenting in relation to variations in the COMT gene. Variations in COMT interacted with parenting quality to influence our attention measure. The Val108/158Met polymorphism of COMT is commonly used to determine allelic groups, but recently haplotypes of several polymorphisms within this gene have been shown to better reflect perceived pain. Since attention and pain both involve the activation of the anterior cingulate gyrus in imaging studies, we compared the Val108/158Met influence with the COMT haplotypes and found the latter to be more predictive of attention. Our results confirm that important aspects of cognitive development, including attention, depend on the interaction of genes and early environment.
One current conceptualization of attention subdivides it into functions of alerting, orienting, and executive control. Alerting describes the function of tonically maintaining the alert state and phasically responding to a warning signal. Automatic and voluntary orienting are involved in the selection of information among multiple sensory inputs. Executive control describes a set of more complex operations that includes monitoring and resolving conflicts in order to control thoughts or behaviors. Converging evidence supports this theory of attention by showing that each function appears to be subserved by anatomically distinct networks in the brain and differentially innervated by various neuromodulatory systems. Although much research has been dedicated to understanding the functional separation of these networks in both healthy and disease states, the interaction and integration among these networks still remain unclear. In this study, we aimed to characterize possible behavioral interaction and integration in healthy adult volunteers using a revised attentional network test (ANT-R) with cue-target interval and cue validity manipulations. We found that whereas alerting improves overall response speed, it exerts a negative influence on executive control under certain conditions. A valid orienting cue enhances, but an invalid cue diminishes, the ability of executive control to overcome conflict. The results support the hypothesis of functional integration and interaction of these brain networks.
attention; attentional networks; alerting; orienting; executive control
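The three network functions above are conventionally estimated from reaction-time subtractions. A sketch of the classic Attention Network Test scoring logic follows; note that this shows only the basic subtractions, not the ANT-R's additional cue-target interval and cue-validity manipulations, and the dictionary keys are illustrative names:

```python
def ant_scores(rt):
    """Compute attention network scores from mean reaction times (ms).

    rt: dict of mean RTs by condition, with cue conditions
    ('no_cue', 'double_cue', 'center_cue', 'spatial_cue') and
    flanker conditions ('incongruent', 'congruent').

    Alerting: benefit of a warning signal (no cue vs. double cue).
    Orienting: benefit of spatial information (center vs. spatial cue).
    Executive: cost of conflict (incongruent vs. congruent flankers).
    """
    return {
        "alerting": rt["no_cue"] - rt["double_cue"],
        "orienting": rt["center_cue"] - rt["spatial_cue"],
        "executive": rt["incongruent"] - rt["congruent"],
    }
```

Larger alerting and orienting scores indicate greater benefit from the cues, while a larger executive score indicates greater difficulty overcoming conflict.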
The concept of self-regulation is central to the understanding of human development. Self-regulation allows effective socialization and predicts both psychological pathologies and levels of achievement in schools. What has been missing are neural mechanisms to provide understanding of the cellular and molecular basis for self-regulation. We show that self-regulation can be measured during childhood by parental reports and by self-reports of adolescents and adults. These reports are summarized by a higher order factor called effortful control, which reflects perceptions about the ability of a given person to regulate their behavior in accord with cultural norms. Throughout childhood effortful control is related to children’s performance in computerized conflict-related tasks. Conflict tasks have been shown in neuroimaging studies to activate specific brain networks of executive attention. Several brain areas work together at rest and during cognitive tasks to regulate competing brain activity and thus control resulting behavior. The anterior cingulate and insula contain cells, unique to humans and higher primates, that provide strong links to remote brain areas. During conflict tasks, anterior cingulate activity is correlated with activity in remote sensory and emotional systems, depending upon the information selected for the task. During adolescence the structure and activity of the anterior cingulate has been found to be correlated with self-reports of effortful control.
Studies have provided a perspective on how genes and environment act to shape the executive attention network, providing a physical basis for self-regulation. The anterior cingulate is regulated by dopamine. Genes that influence dopamine levels in the CNS have been shown to influence the efficiency of self-regulation. For example, alleles of the COMT gene that influence the efficiency of dopamine transmission are related to the ability to resolve conflict. Humans with disorders involving deletion of this gene exhibit large deficits in self-regulation. Alleles of other genes influencing dopamine and serotonin transmission have also been found to influence ability to resolve conflict in cognitive tasks. However, as is the case for many genes, the effectiveness of COMT alleles in shaping self-regulation depends upon cultural influences such as parenting. Studies find that aspects of parenting quality and parent training can influence child behavior and the efficiency of self-regulation.
During development, the network that relates to self-regulation undergoes important changes in connectivity. Infants can use parts of the self-regulatory network to detect errors in sensory information, but the network does not yet have sufficient connectivity to organize brain activity in a coherent way. During middle childhood, along with an increase in the projection cells that form remote connections among the dorsal anterior cingulate and prefrontal and parietal cortex, executive network connectivity increases and shifts from predominantly short- to longer-range connections. During this period specific exercises can influence network development and improve self-regulation. Understanding the physical basis of self-regulation has already cast light on individual differences in normal and pathological states and gives promise of allowing the design of methods to improve aspects of human development.
Attention; genetic alleles; neural networks; self-regulation
Medical respite programs offer medical, nursing, and other care as well as accommodation for homeless persons discharged from acute hospital stays. They represent a community-based adaptation of urban health systems to the specific needs of homeless persons. This paper examines whether post-hospital discharge to a homeless medical respite program was associated with a reduced chance of 90-day readmission compared to other disposition options. After adjusting for imbalances in patient characteristics using propensity scores, Respite patients were the only group significantly less likely to be readmitted within 90 days compared to those released to Own Care. Respite programs merit attention as a potentially efficacious service for homeless persons leaving the hospital.
homeless; readmission; retrospective studies; discharge planning; health services
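Propensity-score adjustment, as used in the study above, can take several forms; one common variant is inverse-probability weighting, in which each patient is weighted by the inverse of the estimated probability of receiving the disposition they actually received. A toy sketch under that assumption, with made-up variable names rather than the study's actual model:

```python
def ipw_rates(readmitted, respite, propensity):
    """Inverse-probability-weighted 90-day readmission rates.

    readmitted: 0/1 readmission outcome per patient
    respite:    True if discharged to the respite program
    propensity: estimated probability of respite referral, per patient,
                from a model of patient characteristics (assumed given here)

    Weighting respite patients by 1/p and comparison patients by 1/(1-p)
    rebalances the two groups on the covariates used to build p, so the
    weighted rates are comparable.
    """
    wt_num = wt_den = wc_num = wc_den = 0.0
    for y, t, p in zip(readmitted, respite, propensity):
        if t:
            w = 1.0 / p
            wt_num += w * y
            wt_den += w
        else:
            w = 1.0 / (1.0 - p)
            wc_num += w * y
            wc_den += w
    return wt_num / wt_den, wc_num / wc_den
```

This is only one way to use propensity scores; matching or stratification on the score, which the study may equally have used, follows the same logic of comparing like with like.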
Brain imaging genetic research involves a multitude of methods and spans many traditional levels of analysis. Given the vast permutations among several million common genetic variants with thousands of brain tissue voxels and a wide array of cognitive tasks that activate specific brain systems, we are prompted to develop specific hypotheses that synthesize converging evidence and state clear predictions about the anatomical sources, magnitude and direction (increases vs. decreases) of allele- and task-specific brain activity associations. To begin to develop a framework for shaping our imaging genetic hypotheses, we focus on previous results and the wider imaging genetic literature. Particular emphasis is placed on converging evidence that links system-level and biochemical studies with models of synaptic function. In shaping our own imaging genetic hypotheses on the development of Attention Networks, we review relevant literature on core models of synaptic physiology and development in the anterior cingulate cortex.
To examine how much distress children report in response to violence that they have witnessed and how this is associated with parental reports of children’s behavior.
As part of a study of in utero exposure to cocaine, children completed the Levonn interview for assessing children’s symptoms of distress in response to witnessing violence. The children’s caregivers completed the Exposure to Violence Interview (EVI), a caretaker-report measure of the child’s exposure to violent events during the last 12 months. The EVI was analyzed as a 3-level variable: no exposure, low exposure, and high exposure. The caregivers also completed the Children’s Behavior Checklist (CBCL).
Of 94 six-year-old children, 58% had no exposure to violence, 36% had low exposure, and 6% had high exposure, according to caretaker reports. The children’s mean Levonn score was 64 (SD = 19.3), and the mean CBCL total T-score was 53 (SD = 10.2). In multiple regression analyses with gender, low and high exposure on the EVI, Levonn score, and prenatal cocaine exposure status as predictors, the Levonn score explained 4.8% of the total variance in children’s CBCL internalizing scores, 9.1% of the total variance in CBCL externalizing scores, and 12.2% of the total variance in CBCL total scores (P = .04, P = .004, and P < .001, respectively).
After accounting for the caretaker’s report of the level of the child’s exposure to violence, the child’s own report significantly increased the amount of variance in predicting child behavior problems with the CBCL. These findings indicate that clinicians and researchers should elicit children’s own accounts of exposure to violence in addition to the caretakers’ when attempting to understand children’s behavior.
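The key statistic in the study above is incremental variance explained: how much the child's own report (Levonn) adds over and above the caretaker's report. For a single predictor added to a single-predictor base model, this increment equals the squared semipartial correlation. A toy illustration with fabricated numbers, not the study's data or its exact multi-predictor model:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

def incremental_r2(y, base, added):
    """Extra share of variance in y explained by `added` over and above
    the single predictor `base` (squared semipartial correlation),
    analogous to the 4.8-12.2% increments reported for the Levonn score."""
    r_yb = pearson(y, base)
    r_ya = pearson(y, added)
    r_ba = pearson(base, added)
    return (r_ya - r_yb * r_ba) ** 2 / (1 - r_ba ** 2)
```

With orthogonal predictors the formula reduces to the simple squared correlation of the added predictor with the outcome; with correlated predictors it properly discounts the shared portion.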
Voluntarily shifting attention to a location of the visual field improves the perception of events that occur there. Regions of frontal cortex are thought to provide the top-down control signal that initiates a shift of attention, but because of the temporal limitations of functional brain imaging, the timing and sequence of attentional-control operations remain unknown. We used a new analytical technique (beamformer spatial filtering) to reconstruct the anatomical sources of low-frequency brain waves in humans associated with attentional control across time. Following a signal to shift attention, control activity was seen in parietal cortex 100–200 ms before activity was seen in frontal cortex. Parietal cortex was then reactivated prior to anticipatory biasing of activity in occipital cortex. The magnitudes of early parietal activations were strongly predictive of the degree of attentional improvement in perceptual performance. These results show that parietal cortex, not frontal cortex, provides the initial signals to shift attention and indicate that top-down attentional control is not purely top down.
To extract important details about objects in the environment, people must focus their attention on a specific location in space at any given moment. Research using functional magnetic resonance imaging (fMRI) has suggested that regions of the frontal and parietal lobes work together to control our ability to direct attention to a specific location in space in preparation for an expected visual object. However, the sluggishness of the hemodynamic response has made it difficult to obtain information from fMRI about the timing of activity. Electroencephalography (EEG) has provided information about the timing of neural activity, but the limitations of traditional source estimation techniques have made it difficult to obtain information about the precise location in the brain that the EEG signals are coming from. Thus, the sequence of activities within this frontal-parietal network remains unclear. We used a recently developed electrical neuroimaging technique—called beamforming—to localize the neural generators of low-frequency EEG signals, which enabled us to determine both the location and temporal sequence of activations in the brain during shifts of visuospatial attention. Our results indicate that low-frequency signals in parietal cortex provide the initial signal to shift attention.
Localization of low-frequency brain rhythms reveals that within the frontal-parietal attentional control network, the parietal lobes provide the initial signal to shift attention in space.
All humans, regardless of their culture and education, possess an intuitive understanding of number. Behavioural evidence suggests that numerical competence may be present early on in infancy. Here, we present brain-imaging evidence for distinct cerebral coding of number and object identity in 3-mo-old infants. We compared the visual event-related potentials evoked by unforeseen changes either in the identity of objects forming a set, or in the cardinality of this set. In adults and 4-y-old children, number sense relies on a dorsal system of bilateral intraparietal areas, different from the ventral occipitotemporal system sensitive to object identity. Scalp voltage topographies and cortical source modelling revealed a similar distinction in 3-mo-olds, with changes in object identity activating ventral temporal areas, whereas changes in number involved an additional right parietoprefrontal network. These results underscore the developmental continuity of number sense by pointing to early functional biases in brain organization that may channel subsequent learning to restricted brain areas.
Behavioural experiments indicate that infants aged 4½ months or older possess an early “number sense” that, for instance, enables them to detect changes in the approximate number of objects in a set. However, the neural bases of this competence are unknown. We recorded the electrical activity evoked by the brain on the surface of the scalp as 3-mo-old infants were watching images of sets of objects. Most images depicted the same objects and contained the same number of objects, but occasionally the number or the identity of the objects changed. As indicated by the voltage potential at the surface of the scalp, the infants' brains reacted when either object identity or number changes were introduced. Using a 3-D model of the infant head, we reconstructed the cortical sources of these responses. Brain areas responding to object or number changes are distinct, and reveal a basic ventral/dorsal organization already in place in the infant brain. As in adults and children, object identity in infants is encoded along a ventral pathway in the temporal lobes, whereas number activates an additional right parietoprefrontal network. These results underscore the developmental continuity of number sense by pointing to early functional biases in brain organization.
Cerebral imaging reveals that human infants are sensitive to numerical quantity at a very early age and that the basic dorsal/ventral functional organization is already in place in the infant brain.
When a flashed stimulus is followed by a backward mask, subjects fail to perceive it unless the target-mask interval exceeds a threshold duration of about 50 ms. Models of conscious access postulate that this threshold is associated with the time needed to establish sustained activity in recurrent cortical loops, but the brain areas involved and their timing remain debated. We used high-density recordings of event-related potentials (ERPs) and cortical source reconstruction to assess the time course of human brain activity evoked by masked stimuli and to determine neural events during which brain activity correlates with conscious reports. Target-mask stimulus onset asynchrony (SOA) was varied in small steps, allowing us to ask which ERP events show the characteristic nonlinear dependence with SOA seen in subjective and objective reports. The results separate distinct stages in mask-target interactions, indicating that a considerable amount of subliminal processing can occur early on in the occipito-temporal pathway (<250 ms) and pointing to a late (>270 ms) and highly distributed fronto-parieto-temporal activation as a correlate of conscious reportability.
Understanding the neural mechanisms that distinguish between conscious and nonconscious processes is a crucial issue in cognitive neuroscience. In this study, we focused on the transition by which a visual stimulus crosses the threshold to consciousness, i.e., becomes visible. We used a backward masking paradigm in which the visibility of a briefly presented stimulus (the “target”) is reduced by a second stimulus (the “mask”) presented shortly after the first; human participants reported the visibility of the target. When the delay between target and mask stimuli exceeds a threshold value, the masked stimulus becomes visible; below this threshold, it remains invisible. During the task, we recorded electrical brain activity from the scalp and reconstructed the cortical sources corresponding to this activity. Conscious perception of masked stimuli corresponded to activity in a broadly distributed fronto-parieto-temporal network, occurring from about 300 ms after stimulus presentation. We conclude that this late stage, which could be clearly separated from earlier neural events associated with subliminal processing and mask-target interactions, can be regarded as a marker of consciousness.
The sequence of neural events associated with the unfolding of conscious awareness is revealed by comparing electrical brain responses to visual stimuli above and below the behavioral threshold for perception.
OBJECTIVE: Given limited prior evidence of high rates of cervical cancer in Haitian immigrant women in the U.S., this study was designed to examine self-reported Pap smear screening rates for Haitian immigrant women and compare them to rates for women of other ethnicities. METHODS: Multi-ethnic women at least 40 years of age living in neighborhoods with large Haitian immigrant populations in eastern Massachusetts were surveyed in 2000-2002. Multivariate logistic regression analyses were used to examine the effect of demographic and health care characteristics on Pap smear rates. RESULTS: Overall, 81% (95% confidence interval 79%, 84%) of women in the study sample reported having had a Pap smear within three years. In unadjusted analyses, Pap smear rates differed by ethnicity (p=0.003), with women identified as Haitian having a lower crude Pap smear rate (78%) than women identified as African American (87%), English-speaking Caribbean (88%), or Latina (92%). Women identified as Haitian had a higher rate than women identified as non-Hispanic white (74%). Adjustment for differences in demographic factors known to predict Pap smear acquisition (age, marital status, education level, and household income) only partially accounted for the observed difference in Pap smear rates. However, adjustment for these variables as well as those related to health care access (single site for primary care, health insurance status, and physician gender) eliminated the ethnic difference in Pap smear rates. CONCLUSIONS: The lower crude Pap smear rate for Haitian immigrants relative to other women of color was in part due to differences in (1) utilization of a single source for primary care, (2) health insurance, and (3) care provided by female physicians. Public health programs, such as the cancer prevention programs currently utilized in eastern Massachusetts, may influence these factors. 
Thus, the relatively high Pap rate among women in this study may reflect the success of these programs. Public health and elected officials will need to consider closely how implementing or withdrawing these programs may impact immigrant and minority communities.
To determine whether changes in the demographics and illness burden of Medicare patients hospitalized for acute myocardial infarction (AMI) from 1995 through 1999 can explain an observed rise (from 32 percent to 34 percent) in one-year mortality over that period.
Utilization data from the Centers for Medicare and Medicaid Services (CMS) fee-for-service claims (MedPAR, Outpatient, and Carrier Standard Analytic Files); patient demographics and date of death from CMS Denominator and Vital Status files. For over 1.5 million AMI discharges in 1995–1999, we retained diagnoses recorded during the year prior to, and during, the case-defining admission.
We fit logistic regression models to predict one-year mortality for the 1995 cases and applied them to the 1996–1999 files. The CORE model uses age, sex, and original reason for Medicare entitlement to predict mortality. Three other models use the CORE variables plus morbidity indicators from well-known morbidity classification methods (Charlson, DCG, and AHRQ's CCS). Regressions were used as is, without pruning to eliminate clinical or statistical anomalies. Each model references the same diagnoses: those recorded during the pre-admission and index-admission periods. We compared each model's ability to predict mortality and used each to calculate risk-adjusted mortality in 1996–1999.
The comprehensive morbidity classifications (DCG and CCS) led to more accurate predictions than the Charlson, which in turn dominated the CORE model (validated C-statistics: 0.81, 0.82, 0.74, and 0.66, respectively). Using the CORE model for risk adjustment reduced, but did not eliminate, the mortality increase. In contrast, adjustment using any of the morbidity models produced essentially flat adjusted-mortality trends over 1996–1999.
Prediction models based on claims-derived demographics and morbidity profiles can be extremely accurate. While one-year post-AMI mortality in Medicare may not be worsening, outcomes appear not to have continued to improve as they had in the prior decade. Rich morbidity information is available in claims data, especially when longitudinally tracked across multiple settings of care, and is important in setting performance targets and evaluating trends.
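The risk-adjustment logic described above can be sketched in code: fit a mortality model on a baseline-year cohort, apply it to a later cohort with a different case mix, assess discrimination with a C-statistic, and form an observed/expected adjusted rate. The cohort simulator, coefficients, and variable set below are illustrative assumptions for a minimal sketch, not the study's actual CMS claims models:

```python
# Hypothetical sketch of claims-based risk adjustment (illustrative synthetic
# data; NOT the study's models). Fit a logistic model on a baseline year,
# apply it to a later, sicker cohort, and compare observed vs. expected rates.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def simulate_cohort(n, frailty_shift=0.0):
    """Synthetic AMI cohort: age, sex, and a morbidity count drive mortality."""
    age = rng.normal(76, 7, n)
    sex = rng.integers(0, 2, n)                  # 1 = male (assumed coding)
    morbidity = rng.poisson(2.0 + frailty_shift, n)
    logit = -8.0 + 0.08 * age + 0.2 * sex + 0.45 * morbidity
    died = rng.random(n) < 1 / (1 + np.exp(-logit))
    return np.column_stack([age, sex, morbidity]), died.astype(int)

# Baseline ("1995") cohort: fit the prediction model here.
X95, y95 = simulate_cohort(20_000)
model = LogisticRegression(max_iter=1000).fit(X95, y95)

# Later ("1999") cohort with a sicker case mix: apply the baseline model.
X99, y99 = simulate_cohort(20_000, frailty_shift=0.5)
expected = model.predict_proba(X99)[:, 1].mean()  # case-mix-expected rate
observed = y99.mean()

# Discrimination (C-statistic) of the baseline model on the later cohort.
c_stat = roc_auc_score(y99, model.predict_proba(X99)[:, 1])

# Indirect standardization: observed/expected times the baseline rate.
adjusted = observed / expected * y95.mean()
print(f"C-statistic {c_stat:.2f}, observed {observed:.3f}, "
      f"expected {expected:.3f}, adjusted {adjusted:.3f}")
```

If the morbidity variables capture the worsening case mix, the expected rate rises with the observed rate and the adjusted rate stays near the baseline, which is the "essentially flat" pattern the morbidity models produced.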
Keywords: risk adjustment; Charlson; DCG; CCS; AMI; event-centered database