Evidence that tic behaviour in individuals with Tourette syndrome reflects difficulties inhibiting prepotent motor actions is mixed. Response conflict tasks produce sensitive measures of response interference from prepotent motor impulses and the proficiency of inhibiting these impulses as an act of cognitive control. We tested the hypothesis that individuals with Tourette syndrome show a deficit in inhibiting prepotent motor actions.
Healthy controls and older adolescents/adults with persistent Tourette syndrome, without a history of obsessive–compulsive disorder or attention-deficit/hyperactivity disorder and presenting with stable mood functioning (i.e., no mood symptoms beyond well-treated anxiety or depression), participated in this study. They performed a Simon task that induced conflict between prepotent actions and goal-directed actions. A novel theoretical framework distinguished group differences in acting impulsively (i.e., fast motor errors) from the proficiency of inhibiting interference by prepotent actions (i.e., slope of interference reduction).
We included 27 controls and 28 individuals with Tourette syndrome in our study. Both groups showed similar susceptibility to making fast, impulsive motor errors (Tourette syndrome 26% v. control 23%; p = 0.10). The reduction of the interference effect across the response time distribution (slope m) was significantly less pronounced among participants with Tourette syndrome than among controls (Tourette syndrome: m = −0.07 v. control: m = −0.23; p = 0.022), consistent with deficient inhibitory control over prepotent actions in Tourette syndrome.
This study does not directly address the role of psychiatric comorbidities or medication effects on inhibitory control over impulsive actions in individuals with Tourette syndrome.
The results offer empirical evidence for deficient inhibitory control over prepotent motor actions in individuals with persistent Tourette syndrome with minimal to absent psychiatric comorbidities. These findings also suggest that the frontal–basal ganglia circuits involved in suppressing unwanted motor actions may underlie deficient inhibitory control abilities in individuals with Tourette syndrome.
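The two distributional measures used above, the rate of fast impulsive errors and the slope of the interference-reduction delta plot, can be illustrated with a short sketch. This is not the study's analysis code; the trial coding, the RT cutoff defining "fast" responses, and the number of RT bins are illustrative assumptions.

```python
import numpy as np

def fast_error_rate(rt, correct, congruent, cutoff_s=0.4):
    """Proportion of errors among fast responses on conflict (incongruent) trials.
    A higher rate indicates greater susceptibility to impulsive motor errors."""
    fast_conflict = (~congruent) & (rt < cutoff_s)
    if fast_conflict.sum() == 0:
        return np.nan
    return 1.0 - correct[fast_conflict].mean()

def delta_slope(rt, congruent, n_bins=5):
    """Slope m of the interference effect (incongruent minus congruent RT)
    across RT bins; a more negative slope indicates more proficient
    suppression of the prepotent impulse."""
    def bin_means(x):
        # sort RTs and average within equal-sized quantile bins
        return np.array([b.mean() for b in np.array_split(np.sort(x), n_bins)])
    delta = bin_means(rt[~congruent]) - bin_means(rt[congruent])
    mean_rt = (bin_means(rt[~congruent]) + bin_means(rt[congruent])) / 2
    m, _ = np.polyfit(mean_rt, delta, 1)  # linear fit: slope of the delta plot
    return m
```

With a constant interference effect across the RT range, the slope is near zero; an effect that shrinks for slower responses yields a negative slope, the signature of increasingly engaged suppression.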
The functional role of the right inferior frontal cortex (rIFC) in mediating human behavior is the subject of ongoing debate. Activation of the rIFC has been associated both with response inhibition and with signaling action adaptation demands resulting from unpredicted events. The goal of this study is to investigate the role of rIFC by combining a go/no-go paradigm with paired-pulse transcranial magnetic stimulation (ppTMS) over rIFC and the primary motor cortex (M1) to probe the functional connectivity between these brain areas. Participants performed a go/no-go task with 20% or 80% of the trials requiring response inhibition (no-go trials) in a classic and a reversed version of the task, respectively. Responses were slower on infrequent compared to frequent go trials, while commission errors were more prevalent on infrequent compared to frequent no-go trials. We hypothesized that if rIFC is involved primarily in response inhibition, then rIFC should exert an inhibitory influence over M1 on no-go (inhibition) trials regardless of no-go probability. If, by contrast, rIFC has a role on unexpected trials other than just response inhibition, then rIFC should influence M1 on infrequent trials regardless of response demands. We observed that rIFC suppressed M1 excitability during frequent no-go trials, but not during infrequent no-go trials, suggesting that the role of rIFC in response inhibition is context dependent rather than generic. Importantly, rIFC was found to facilitate M1 excitability on all infrequent trials, irrespective of whether the infrequent event involved response inhibition, a finding more in line with a predictive coding framework of cognitive control.
rIFC; go/no-go task; paired-pulse; TMS; motor cortex; prediction; inhibition
Risk-taking behavior is characterized by the pursuit of reward in spite of potential negative consequences. Dopamine neurotransmission along the mesocorticolimbic pathway is a potential modulator of risk behavior. In patients with Parkinson's disease (PD), impulse control disorder (ICD) can result from dopaminergic medication use, particularly dopamine agonists (DAA). Behaviors associated with ICD include hypersexuality as well as compulsive gambling, shopping, and eating, and are potentially linked to alterations in risk processing. Using the Balloon Analogue Risk Task, we assessed the role of agonist therapy on risk-taking behavior in PD patients with (n = 22) and without (n = 19) active ICD symptoms. Patients performed the task both ‘on’ and ‘off’ DAA. DAA increased risk-taking in PD patients with active ICD symptoms, but did not affect risk behavior of PD controls. DAA dose was also important in explaining risk behavior. Both groups similarly reduced their risk-taking in high compared to low risk conditions and following the occurrence of a negative consequence, suggesting that ICD patients do not necessarily differ in their ability to process and adjust to some aspects of negative consequences. Our findings suggest dopaminergic augmentation of risk-taking behavior as a potential contributing mechanism for the emergence of ICD in PD patients.
Impulse Control Disorders; Dopamine Agonists; Parkinson Disease; Risk behavior
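For readers unfamiliar with the Balloon Analogue Risk Task, the standard risk index is the mean number of pumps on balloons that were cashed out before exploding ("adjusted average pumps"), since exploded balloons truncate the intended pump count. A minimal sketch, assuming trials are recorded as (pumps, exploded) pairs; this is not the study's code:

```python
def adjusted_average_pumps(trials):
    """Mean pumps on cashed-out (non-exploded) balloons, the conventional
    BART risk-taking index. trials: iterable of (n_pumps, exploded) tuples."""
    cashed = [n_pumps for n_pumps, exploded in trials if not exploded]
    return sum(cashed) / len(cashed) if cashed else float("nan")
```

Higher values indicate greater willingness to keep pumping despite the rising probability of losing the accumulated reward.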
Reward-based decision-learning refers to the process of learning to select those actions that lead to rewards while avoiding actions that lead to punishments. This process, known to rely on dopaminergic activity in striatal brain regions, is compromised in Parkinson’s disease (PD). We hypothesized that such decision-learning deficits are alleviated by induced positive affect, which is thought to incur transient boosts in midbrain and striatal dopaminergic activity. Computational measures of probabilistic reward-based decision-learning were determined for 51 patients diagnosed with PD. Previous work has shown these measures to rely on the caudate nucleus (outcome evaluation during the early phases of learning) and the putamen (reward prediction during later phases of learning). We observed that induced positive affect facilitated learning, through its effects on reward prediction rather than outcome evaluation. Viewing a few minutes of comedy clips served to remedy dopamine-related problems associated with frontostriatal circuitry and, consequently, learning to predict which actions will yield reward.
Parkinson’s disease; positive affect; frontostriatal circuitry; probabilistic learning
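The computational measures mentioned above separate outcome evaluation (trial-by-trial prediction errors) from reward prediction (the learned value carried forward). A minimal Rescorla-Wagner-style sketch of that distinction; the learning rate and outcome coding are illustrative assumptions, not the study's actual model:

```python
def rescorla_wagner(outcomes, alpha=0.2, v0=0.0):
    """Track reward predictions (values) and prediction errors across a
    sequence of binary outcomes (1 = reward, 0 = no reward).
    alpha is the learning rate; v0 the initial prediction."""
    v, values, errors = v0, [], []
    for r in outcomes:
        pe = r - v           # prediction error: outcome-evaluation signal
        v = v + alpha * pe   # updated reward prediction
        errors.append(pe)
        values.append(v)
    return values, errors
```

Early in learning the prediction-error term dominates (outcome evaluation); later, as values converge, behavior is driven mainly by the learned reward prediction, matching the early-caudate versus late-putamen distinction described above.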
Performance on cognitive control tasks deteriorates when control tasks are performed together with other control tasks, that is, if simultaneous cognitive control is required. Surprisingly, this is also observed if control tasks are preceded by other control tasks, that is, if sequential cognitive control is required. The typical explanation for the latter finding is that previous acts of cognitive control deplete a common resource, just like a muscle becomes fatigued after repeated usage. An alternative explanation, however, is that previous acts of cognitive control reduce motivation to match allocated resources to required resources. In this paper we formalize these muscle and motivation accounts, and show that they yield differential predictions regarding the interaction between simultaneous and sequential cognitive control. These predictions were tested using a paradigm where participants had to perform multiple stop-signal tasks, which varied in their demands on simultaneous and sequential control. Results of two studies supported predictions derived from the motivation account. Therefore, we conclude that the effects of sequential cognitive control are best explained in terms of a reduction of motivation to match allocated to required resources.
cognitive control; resource depletion; ego-depletion; motivation; stop-signal task; stimulus response compatibility; formal models; multilevel analysis
This study examined stopping and performance adjustments in four age groups (M ages: 8, 12, 21, and 76 years). All participants performed three tasks: a standard two-choice task and two versions of that task in which stop-signal trials were inserted, requiring either the suppression of the response activated by the choice stimulus (global stop task) or the suppression of the response when one stop-signal was presented but not when the other stop-signal occurred (selective stop task). The results showed that global stopping was faster than selective stopping in all age groups. Global stopping matured more rapidly than selective stopping. The developmental gain in stopping was considerably more pronounced than the loss observed during senescence. All age groups slowed their responses on trials without a stop-signal in the stop task compared to trials in the choice task, the elderly in particular. In addition, all age groups slowed on trials following stop-signal trials, except the elderly, who did not slow following successful inhibits. By contrast, the slowing following failed inhibits was disproportionately larger in the elderly compared to young adults. Finally, sequential effects did not alter the pattern of performance adjustments. The results were interpreted in terms of developmental change in the balance between proactive and reactive control.
stop-signal paradigm; development; cognitive aging; lifespan; cognitive control
Past studies show beneficial as well as detrimental effects of subthalamic nucleus deep-brain stimulation on impulsive behaviour. We address this paradox by investigating individuals with Parkinson’s disease treated with subthalamic nucleus stimulation (n = 17) and healthy controls without Parkinson’s disease (n = 17) on performance in a Simon task. In this reaction time task, conflict between premature response impulses and goal-directed action selection is manipulated. We applied distributional analytic methods to separate the strength of the initial response impulse from the proficiency of inhibitory control engaged subsequently to suppress the impulse. Patients with Parkinson’s disease were tested when stimulation was either turned on or off. Mean conflict interference effects did not differ between controls and patients, or within patients when stimulation was on versus off. In contrast, distributional analyses revealed two dissociable effects of subthalamic nucleus stimulation. Fast response errors indicated that stimulation increased impulsive, premature responding in high conflict situations. Later in the reaction process, however, stimulation improved the proficiency with which inhibitory control was engaged to suppress these impulses selectively, thereby facilitating selection of the correct action. This temporal dissociation supports a conceptual framework for resolving past paradoxical findings and further highlights that dynamic aspects of impulse and inhibitory control underlying goal-directed behaviour rely in part on neural circuitry inclusive of the subthalamic nucleus.
Parkinson’s disease; deep-brain stimulation; response inhibition; impulsivity; subthalamic nucleus
Khat consumption has increased over recent decades in Eastern Africa and has become a global phenomenon, spreading to ethnic communities in the rest of the world, such as The Netherlands, the United Kingdom, Canada, and the United States. Very little is known, however, about the relation between khat use and cognitive control functions.
We studied whether khat use is associated with changes in working memory (WM) and cognitive flexibility, two central cognitive control functions.
Khat users and khat-free controls were matched in terms of sex, ethnicity, age, alcohol and cannabis consumption, and IQ (Raven's progressive matrices). Groups were tested on cognitive flexibility, as measured by a Global-Local task, and on WM using an N-back task.
Khat users performed significantly worse than controls on tasks tapping into cognitive flexibility as well as monitoring of information in WM.
The present findings suggest that khat use impairs both cognitive flexibility and the updating of information in WM. The inability to monitor information in WM and to adjust behavior rapidly and flexibly may have repercussions for daily life activities.
To date, no studies have systematically examined the cognitive consequences of khat use. This study compared the ability to inhibit and execute behavioral responses in adult khat users and khat-free controls, matched in terms of age, race, gender distribution, level of intelligence, and alcohol and cannabis consumption. Response inhibition and response execution were measured with a stop-signal paradigm. Results show that users and non-users are comparable in terms of response execution, but users need significantly more time than non-users to inhibit responses to stop signals.
khat; SSRT; dopamine; stop-signal task; chronic use; long-term effect
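The SSRT keyword above refers to stop-signal reaction time, the latency of the inhibition process. It cannot be observed directly and is typically estimated from the go-RT distribution, the mean stop-signal delay (SSD), and the probability of responding despite a stop signal. A sketch of the standard integration method; the exact estimation procedure used in the study is assumed, not confirmed:

```python
import numpy as np

def ssrt_integration(go_rt, mean_ssd, p_respond_given_stop):
    """Integration method: SSRT is the go-RT quantile corresponding to
    P(respond | stop signal) minus the mean stop-signal delay."""
    go_rt = np.sort(np.asarray(go_rt))
    # index of the nth fastest go RT, with n = p * number of go trials
    n = int(np.ceil(p_respond_given_stop * len(go_rt))) - 1
    nth_rt = go_rt[max(n, 0)]
    return nth_rt - mean_ssd
```

For example, with go RTs spread over 401-500 ms, a mean SSD of 200 ms, and responses on half the stop trials, the estimate is the median go RT minus the SSD.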
Speaking is a complex natural behavior that most people master very well. Nevertheless, systematic investigation of the factors that affect adaptive control over speech production is relatively scarce. The present experiments quantified and compared inhibitory control over manual and verbal responses using the stop-signal paradigm. In tasks with only two response alternatives, verbal expressions were slower than manual responses, but the stopping latencies of hand and verbal actions were comparable. When engaged in a standard picture-naming task using a large set of pictures, verbal stopping latencies were considerably prolonged. Interestingly, stopping was slower for naming words that are less frequently used compared to words that are used more frequently. These results indicate that adaptive action control over speech production is affected by lexical processing. This notion is compatible with current theories on speech self-monitoring. Finally, stopping latencies varied with individual differences in impulsivity, indicating that specifically dysfunctional impulsivity, and not functional impulsivity, is associated with slower verbal stopping.
response inhibition; stop task; cognitive control; word frequency; word production; picture naming
Increasing evidence suggests that religious practice induces systematic biases in attentional control. We used Navon's global–local task to compare attentional bias in Taiwanese Zen Buddhists and Taiwanese atheists, two groups brought up in the same country and culture and matched with respect to race, intelligence, sex, and age. Given the Buddhist emphasis on compassion for the physical and social environment, we expected a more global bias in Buddhist than in atheist participants. In line with these expectations, Buddhists showed a larger global-precedence effect and increased interference from global distracters when processing local information. This pattern reinforces the idea that people's attentional processing style reflects biases rewarded by their religious practices.
Buddhism; attention; global precedence
Processing irrelevant visual information sometimes activates incorrect response impulses. The engagement of cognitive control mechanisms to suppress these impulses and make proactive adjustments to reduce the future impact of incorrect impulses may rely on the integrity of frontal–basal ganglia circuitry. Using a Simon task, we investigated the effects of basal ganglia dysfunction produced by Parkinson's disease (PD) on both on-line (within-trial) and proactive (between-trial) control efforts to reduce interference produced by the activation of an incorrect response. As a novel feature, we applied distributional analyses, guided by the activation–suppression model, to differentiate the strength of incorrect response activation and the proficiency of suppression engaged to counter this activation. For situations requiring on-line control, PD (n = 52) and healthy control (n = 30) groups showed similar mean interference effects (i.e., Simon effects) on reaction time (RT) and accuracy. Distributional analyses showed that although the strength of incorrect response impulses was similar between the groups, PD patients were less proficient at suppressing these impulses. Both groups demonstrated equivalent and effective proactive control of response interference on mean RT and accuracy rates. However, PD patients were less effective at reducing the strength of incorrect response activation proactively. Among PD patients, motor symptom severity was associated with difficulties in on-line, but not in proactive, control of response impulses. These results suggest that basal ganglia dysfunction produced by PD has selective effects on cognitive control mechanisms engaged to resolve response conflict, with primary deficits in the on-line suppression of incorrect responses occurring in the context of a relatively spared ability to adjust control proactively to minimize future conflict.
Substance dependence is associated with executive function deficits, but the nature of these deficits and the effects of different drugs and of sex on them have not been fully clarified. Therefore, we compared the performance of alcohol- (n = 33; 18 women), cocaine- (n = 27; 14 women), and methamphetamine-dependent individuals (n = 38; 25 women) with sex-matched healthy comparisons (n = 36; 17 women) on complex decision-making as measured with the Iowa Gambling Task, working memory, cognitive flexibility, and response inhibition. Cocaine- and methamphetamine-dependent individuals were impaired on complex decision-making, working memory, and cognitive flexibility, but not on response inhibition. The deficits in working memory and cognitive flexibility were milder than the decision-making deficits and did not change as a function of memory load or task switching. Interestingly, decision-making was significantly more impaired in women addicted to cocaine or methamphetamine than in men addicted to these drugs. Together, these findings suggest that drug of choice and sex have different effects on executive functioning, which, if replicated, may help tailor interventions.
Homosexuals are believed to have a “sixth sense” for recognizing each other, an ability referred to as gaydar. We considered that a homosexual orientation might entail systematic practice in processing relatively specific, local perceptual features, which might lead to a corresponding chronic bias of attentional control. This was tested by comparing male and female homosexuals and heterosexuals, brought up in the same country and culture and matched in terms of race, intelligence, sex, mood, age, personality, religious background, educational style, and socio-economic situation, in their efficiency to process global and local features of hierarchically constructed visual stimuli. Both homosexuals and heterosexuals showed better performance on global features, the standard global-precedence effect. However, this effect was significantly reduced in homosexuals, suggesting a relative preference for detail. Findings are taken to demonstrate chronic, generalized biases in attentional control parameters that reflect the selective reward provided by the respective sexual orientation.
sexual orientation; attention; global precedence
Interest in the influence of videogame experience on our daily life is constantly growing. “First Person Shooter” (FPS) games require players to develop a flexible mindset to rapidly react to fast-moving visual and auditory stimuli, and to switch back and forth between different subtasks. This study investigated whether, and to what degree, experience with such videogames generalizes to other cognitive-control tasks. Videogame players (VGPs) and individuals with little to no videogame experience (NVGPs) performed a task-switching paradigm that provides a relatively well-established diagnostic measure of cognitive flexibility. As predicted, VGPs showed smaller switching costs (i.e., greater cognitive flexibility) than NVGPs. Our findings support the idea that playing FPS games promotes cognitive flexibility.
videogame; task-switching; cognitive flexibility
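The switching costs mentioned above are simply the mean RT difference between task-switch and task-repeat trials. A minimal sketch, assuming a trial-level task-label coding that is not taken from the study itself:

```python
import numpy as np

def switch_cost(rts, tasks):
    """Mean RT on task-switch trials minus mean RT on task-repeat trials.
    tasks: sequence of task labels per trial; the first trial is neither
    a switch nor a repeat and is excluded."""
    rts, tasks = np.asarray(rts, dtype=float), np.asarray(tasks)
    is_switch = tasks[1:] != tasks[:-1]  # did the task change from trial n-1 to n?
    return rts[1:][is_switch].mean() - rts[1:][~is_switch].mean()
```

A smaller (more positive values shrinking toward zero) switch cost indicates greater cognitive flexibility, the pattern reported for VGPs.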
To resist rather than heed temptations is easier said than done. Since tempting actions are often contextually inappropriate, selective suppression is invoked to inhibit such actions. Thus far, laboratory tasks have not been very successful in highlighting these processes. We suggest that this is for three reasons. First, it is important to dissociate between an early susceptibility to making stimulus-driven, impulsive, but erroneous actions, and the subsequent selective suppression of these impulses that facilitates the selection of the correct action. Second, studies have focused on mean or median reaction times (RT), which conceals the temporal dynamics of action control. Third, studies have focused on group means, while considering individual differences as a source of error variance. Here, we present an overview of recent behavioral and imaging studies that overcame these limitations by analyzing RT distributions. As will become clear, this approach has revealed variations in inhibitory control over impulsive actions as a function of task instructions, conflict probability, and between-trial adjustments (following conflict or following an error trial) that are hidden if mean RTs are analyzed. Next, we discuss a selection of behavioral as well as imaging studies to illustrate that individual differences are meaningful and help understand selective suppression during action selection within samples of young and healthy individuals, but also within clinical samples of patients diagnosed with attention-deficit/hyperactivity disorder or Parkinson's disease.
action control; response inhibition; prefrontal cortex; basal ganglia; interference control
The inhibitory control of actions has been claimed to rely on dopaminergic pathways. Given that this hypothesis is mainly based on patient and drug studies, some authors have questioned its validity and suggested that beneficial effects of dopaminergic stimulants on response inhibition may be limited to cases of suboptimal inhibitory functioning. We present evidence that, in carefully selected healthy adults, spontaneous eyeblink rate, a marker of central dopaminergic functioning, reliably predicts the efficiency in inhibiting unwanted action tendencies in a stop-signal task. These findings support the assumption of a modulatory role for dopamine in inhibitory action control.
Response inhibition; Dopamine; Spontaneous eyeblink
Cocaine is Europe's second most popular recreational drug after cannabis, but very little is known about possible cognitive impairments in the emerging type of recreational cocaine user (monthly consumption). We asked whether recreational use of cocaine impacts early attentional selection processes. Cocaine-free polydrug controls (n = 18) and cocaine polydrug users (n = 18) were matched on sex, age, alcohol consumption, and IQ (using Raven's progressive matrices), and were tested using the Global-Local task to measure the scope of attention. Cocaine polydrug users attended significantly more to local aspects of attended events, which fits with the idea that a reduced scope of attention may be associated with the perpetuation of the use of the drug.
Despite the abundance of evidence that human perception is penetrated by beliefs and expectations, scientific research so far has entirely neglected the possible impact of religious background on attention. Here we show that Dutch Calvinists and atheists, brought up in the same country and culture and controlled for race, intelligence, sex, and age, differ with respect to the way they attend to and process the global and local features of complex visual stimuli: Calvinists attend less to global aspects of perceived events, which fits with the idea that people's attentional processing style reflects possible biases rewarded by their religious belief system.
Chronic use of cocaine is associated with a reduced density of dopaminergic D2 receptors in the striatum, with negative consequences for cognitive control processes. Increasing evidence suggests that cognitive control is also affected in recreational cocaine consumers. This study aimed at linking these observations to dopaminergic malfunction by studying the spontaneous eyeblink rate (EBR), a marker of striatal dopaminergic functioning, in adult recreational users and a cocaine-free sample matched on age, race, gender, and personality traits. Analyses showed that EBR was significantly reduced in recreational users compared to cocaine-free controls, suggesting that cocaine use induces hypoactivity in the subcortical dopamine system.
Chronic use of cocaine is associated with impairment in response inhibition, but it is an open question whether, and to what degree, findings from chronic users generalize to the emerging type of recreational user. This study compared the ability to inhibit and execute behavioral responses in adult recreational users and a cocaine-free matched sample controlled for age, race, gender distribution, level of intelligence, and alcohol consumption. Response inhibition and response execution were measured with a stop-signal paradigm. Results show that users and non-users are comparable in terms of response execution, but users need significantly more time than non-users to inhibit responses to stop signals. Interestingly, the magnitude of the inhibitory deficit was positively correlated with individuals' lifetime cocaine exposure, suggesting that the magnitude of the impairment is proportional to the degree of cocaine consumed.