Human cognition is characterized by the ability to parse and evaluate a stream of constantly changing environmental stimuli in order to choose the most appropriate response in evolving conditions. The dorsal anterior cingulate cortex (dACC) is thought to play an important role in regulating cognitive control over goal-directed behavior. Various theories postulate its involvement in linking reward-related information to action1,5,6, monitoring for conflict between competing responses2,7,8, or detecting the likelihood of error commission3,9,10. Despite substantial information from functional magnetic resonance imaging (fMRI), event-related potential, and lesion studies, considerable debate continues regarding the neurophysiological basis of its regulatory role.
We studied dACC function using a combination of fMRI, single-neuronal recordings, and pre- and post-lesion behavior in human subjects undergoing surgical cingulotomy—a procedure in which a precise stereotactically targeted lesion is created in the dACC. Microelectrode recordings, which are routinely performed during the procedure1,11,12, allowed us to record from individual dACC neurons. Six subjects participated; in four of these we also obtained a preoperative fMRI using the same task, and in four we recorded behavioral responses with the same task immediately following cingulotomy.
Subjects performed the multi-source interference task (MSIT)13, a Stroop-like task in which they viewed a cue consisting of three numbers and had to indicate via button-press the unique number (‘target’) that differed from the other two numbers (‘distracters’) (). By varying the position of the target and the identity of the distracters, the task established four distinct trial types (), which were presented to the subject randomly. These trial types contained three levels (Type 0, 1, and 2 trials; ) of cognitive interference, operationally defined here as the tendency of an irrelevant stimulus feature (e.g., position of the target) to impede simultaneous processing of the relevant stimulus feature (e.g., identity of the target).
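The trial-type taxonomy above can be made concrete with a small sketch. Assuming the cue is encoded as a three-digit string with 0 as the neutral distracter (an assumption about encoding for illustration, not taken from the paper), a hypothetical `classify_msit` helper counts the two interference sources:

```python
def classify_msit(cue: str) -> int:
    """Return the interference level (0, 1, or 2) of an MSIT cue.

    Two sources of interference are counted:
      - flanker interference: the distracters are themselves valid
        responses (1-3) rather than the neutral digit 0;
      - spatial (Simon) interference: the target's position (1-3, left
        to right) does not match its identity, so position and button
        conflict.
    """
    digits = [int(c) for c in cue]
    # the target is the digit that appears exactly once
    target = next(d for d in digits if digits.count(d) == 1)
    position = digits.index(target) + 1
    distracters = [d for d in digits if d != target]
    flanker = all(d in (1, 2, 3) for d in distracters)
    spatial = position != target
    return int(flanker) + int(spatial)
```

For example, "100" has a valid target in the matching position with neutral distracters (Type 0), "010" adds only spatial interference (Type 1), and "221" combines spatial and flanker interference (Type 2).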
Fig. 1 Behavioral task, functional MRI, and subject performance. (A) The Multi-Source Interference Task. (B) Four trial types are possible, based on whether there is spatial congruence between the position of the target and correct button response, and whether …
When comparing high-interference (Type 2) to non-interference (Type 0) trials, there was increased fMRI signal within the dACC (), indicating increased neuronal population activity during trials with greater cognitive interference. Other cortical regions known to be involved in this decision-making network, such as the dorsolateral prefrontal cortex (DLPFC), were similarly activated to a greater degree in the high-interference condition (Figure S1). The spatial distribution and magnitude of these changes were similar to those previously observed in healthy volunteers14,15, suggesting that dACC function in our subject population was spared and comparable to that of normal subjects. We co-registered the postoperative MRI () with the preoperative fMRI, confirming that the recording and lesion site co-localized with the region of fMRI activation.
During intraoperative microelectrode recordings, subjects performed the task accurately, with an error rate of 1.4%. Reaction times (RTs) were modulated by degree of interference in a dose-dependent fashion (, S2; p < 1 × 10−20, ANOVA). The trial type-dependent reaction times and low error rates were consistent with the tendency to sacrifice speed for accuracy that is often observed in Stroop-like tasks8,16.
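The dose-dependent RT analysis above is a one-way ANOVA across the three interference levels. A minimal sketch, using simulated RTs rather than the study's data:

```python
import random

from scipy.stats import f_oneway

random.seed(0)
# Simulate dose-dependent RTs (seconds) for the three interference
# levels: Type 0 < Type 1 < Type 2. Values are illustrative only.
rts = {t: [random.gauss(0.6 + 0.15 * t, 0.05) for _ in range(200)]
       for t in (0, 1, 2)}
# One-way ANOVA: does mean RT differ across interference levels?
F, p = f_oneway(rts[0], rts[1], rts[2])
print(f"F = {F:.1f}, p = {p:.2e}")
```

With a graded effect of this size, the ANOVA rejects the null of equal mean RTs decisively, mirroring the dose-dependent pattern reported above.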
We recorded 59 well-isolated single dACC neurons, with an average baseline firing rate of 5.7 ± 0.7 (mean ± s.e.m.) spikes/sec. We identified three distinct sub-populations of neurons based on their maximal task-responsiveness: those firing preferentially (1) before the cue (n = 12; 20%); (2) after presentation of the cue (n = 24; 41%); and (3) after the behavioral choice (n = 23; 39%). The largest group, the cue-responsive neurons, showed distinct modulation of firing based on the degree of interference present in the cue (p = 0.02, ANOVA, Table S1). Paralleling the pattern for RTs, firing rates for Type 2 trials were higher than those for Type 1 trials, which were in turn higher than those for Type 0 trials. This effect during the cue epoch was observable at the level of individual neurons (), as well as at the level of the cue-neuron population (). Inclusion of the entire recorded neuronal population produced similar effects (Figure S3), and using raw rather than normalized rates did not change this result. Neuronal activity within the dACC thus correlated with the degree of cognitive interference present in the cue.
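As a rough sketch of the normalization step mentioned above, each neuron's cue-epoch rates can be z-scored against its own baseline before pooling across the population. The data below are simulated and `pooled_anova` is a hypothetical helper, not the paper's analysis code:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(4)

def pooled_anova(neurons):
    """Z-score each neuron's cue-epoch rates against its own baseline,
    pool across neurons, and test for modulation by trial type."""
    pooled = {0: [], 1: [], 2: []}
    for cell in neurons:
        mu, sd = np.mean(cell["baseline"]), np.std(cell["baseline"])
        for t in (0, 1, 2):
            pooled[t].extend((np.asarray(cell[t]) - mu) / sd)
    return f_oneway(pooled[0], pooled[1], pooled[2])

# Simulated population: baseline rates differ across neurons, and the
# cue response grows with interference level (illustrative numbers).
neurons = []
for _ in range(20):
    base = rng.normal(5.0, 1.0)
    cell = {"baseline": rng.normal(base, 1.0, size=50)}
    for t in (0, 1, 2):
        cell[t] = rng.normal(base + 0.8 * t, 1.0, size=30)
    neurons.append(cell)
F, p = pooled_anova(neurons)
```

Z-scoring removes per-neuron baseline differences so that neurons with very different firing rates can be averaged on a common scale; as the text notes, the graded effect here survives with raw rates as well.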
Fig. 2 Individual and population neuronal responses. (A) Example neuron showing modulation of firing based on cue-related interference. Rasters for Type 0 (green), 1 (blue), and 2 (red) trials are shown aligned to the cue (black line) and choice (gray line).
The trial type-dependent modulation in firing rate could reflect either dACC neuron sensitivity to the amount of conflict engendered by the cue2,17 or sensitivity to the number of potential responses activated by the cue18 (Supplementary Note 1). To distinguish between these possibilities, we identified trials in which the number of potential responses remained constant (two), but the amount of conflict varied (one or two types of conflict). Firing rates were significantly higher (p = 6.4 × 10−3, Mann-Whitney test) in higher-conflict trials, indicating that dACC neurons were encoding conflict per se, and not the number of potential responses (Figure S4). Reaction times for the higher-conflict trials were also significantly longer (p = 1.5 × 10−4, t-test), providing behavioral evidence for the increase in perceived conflict. In a two-way ANOVA including all trials (with degree of conflict as one variable and number of possible responses as the other), the degree of conflict was a significant independent predictor of firing rate (p = 5.7 × 10−3), whereas the number of possible responses was not.
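The matched comparison above, holding the number of potential responses fixed while varying conflict, reduces to a two-sample rank test on firing rates. A hedged sketch with simulated rates (not the recorded data):

```python
import random

from scipy.stats import mannwhitneyu

random.seed(1)
# Simulated cue-epoch firing rates (spikes/s) for trials matched on the
# number of potential responses (two) but differing in the number of
# conflict types present in the cue. Values are illustrative only.
one_conflict = [random.gauss(6.0, 1.5) for _ in range(150)]
two_conflict = [random.gauss(7.0, 1.5) for _ in range(150)]
# Non-parametric comparison, as in the analysis described above.
U, p = mannwhitneyu(one_conflict, two_conflict, alternative="two-sided")
```

A rank-based test is a natural choice here because single-trial spike counts are discrete and often skewed, so a t-test's normality assumption is questionable.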
Current models of dACC function, whether predicated upon conflict monitoring2,8,17, reinforcement learning3,19, or reward-based decision making1,20,21, require that future dACC activity reflect past experience, but modulation of dACC firing based on recent history has not been demonstrated at the single-neuronal level. To determine whether dACC neuronal firing rates are influenced by previous activity, we separated Type 0 and Type 2 trials based on whether they were immediately preceded by a trial containing interference (Type 1 or 2) or not (Type 0). In both cases, dACC neuronal activity increased more rapidly following the cue when the preceding trial contained interference (). The average magnitude of the cue-neuron signal was also greater in trials preceded by interference. This finding held for the entire neuronal population as well (Figure S5), and was not altered by using raw rather than normalized rates.
Fig. 3 Effect of previous trial on dACC firing and reaction time. (A) Activity was greater for current non-interference trials immediately preceded by interference trials (1,2→0; black) than non-interference trials (0→0; purple). (B) Similarly, …
The association between previous- and current-trial activity was maintained across all successive trial pairs. On a trial-by-trial basis including all trial types, previous-trial activity during the cue period significantly correlated with current-trial activity (r = 0.15 for cue neurons; r = 0.12 for all neurons; Supplementary Note 2), demonstrating that dACC neurons encode information about both the current task context and the recent past. This neural-neural correlation was not simply an effect of drift in the recordings (Supplementary Note 3).
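The trial-by-trial analysis above amounts to a lag-1 correlation of the cue-period activity series. A minimal sketch, with an AR(1)-style simulated series standing in for the recorded firing rates:

```python
import random

from scipy.stats import pearsonr

random.seed(2)
# Simulate cue-period activity with a modest carry-over from one trial
# to the next (AR(1)-like; mean 5 spikes/s, lag-1 coefficient 0.3).
rates = [5.0]
for _ in range(999):
    rates.append(0.7 * 5.0 + 0.3 * rates[-1] + random.gauss(0.0, 1.0))
# Correlate previous-trial activity with current-trial activity (lag 1).
r, p = pearsonr(rates[:-1], rates[1:])
```

The recovered correlation sits near the generating coefficient, illustrating how a history-dependent signal of the magnitude reported above can be detected trial by trial.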
The behavioral correlate of this neuronal pattern of activity depended upon the identity of the current trial. Reaction times for Type 0 trials correlated positively with previous-trial activity (r = 0.13), meaning that preceding elevated activity (consistent with a previous difficult trial) predicted a longer RT on the current non-interference trial. Reaction times during Type 2 trials, however, correlated negatively with previous-trial activity (r = −0.09), meaning that preceding elevated activity predicted a shorter RT on the current interference trial. Taken together, these findings predict that RTs for a particular trial type should be shorter when the preceding trial was of the same type, and longer when it was of a different type (Supplementary Note 4).
The behavioral responses bore out these predictions. Reaction times during non-interference trials were shorter when preceded by another non-interference trial (0→0) than by an interference trial (1,2→0) (, S6A). Conversely, RTs during high-interference trials were shorter when preceded by an interference trial (1,2→2) than by a non-interference trial (0→2) (, S6B). This history-dependence of dACC neuronal activity provides a neurophysiological basis for the current data and for previous observations of behavioral adaptations, known as micro-adjustments, conflict adaptation, or the Gratton effect2,4,8,17,22,23.
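The sequence-conditioned RT comparison above can be sketched as a small helper that splits non-interference trials by what preceded them. Trial types and RTs below are illustrative stand-ins, not the MSIT data:

```python
from statistics import mean

def sequence_effect(trial_types, rts):
    """Mean RT on non-interference trials preceded by interference minus
    mean RT on those preceded by non-interference (a positive value
    means slowing after conflict, i.e. conflict adaptation)."""
    after_conflict, after_none = [], []
    for prev, cur, rt in zip(trial_types, trial_types[1:], rts[1:]):
        if cur == 0:  # current trial is non-interference
            (after_conflict if prev > 0 else after_none).append(rt)
    return mean(after_conflict) - mean(after_none)

# Illustrative trial sequence and RTs in seconds (not the study's data).
types = [0, 0, 1, 0, 2, 0, 0, 0]
rts = [0.60, 0.58, 0.75, 0.66, 0.80, 0.68, 0.59, 0.57]
effect = sequence_effect(types, rts)  # positive: slower after conflict
```

The mirror-image comparison for high-interference trials (1,2→2 vs. 0→2) follows the same pattern with the condition `cur == 2` and the sign of the expected effect reversed.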
We analyzed post-cingulotomy task performance, thereby capturing the acute behavioral manifestations of a precise, stereotyped, reproducible lesion to the previously recorded area (Figure S7). The error rate following cingulotomy was 1.3%, indicating that subjects’ ability to perform the task was unchanged. The post-cingulotomy reaction time distribution was similar to that observed prior to the lesion (, S8A), with longer RTs associated with increasing interference (p < 1 × 10−12, ANOVA). Furthermore, there was no difference in pre- vs. post-cingulotomy mean RT (p = 0.76, t-test). Thus the dACC lesion did not significantly disrupt the subjects’ ability to perform the task, nor did it affect the dependence of RT on the cognitive load presented by the current stimulus.
Fig. 4 Abolition of behavioral adaptation following a targeted dACC lesion. RTs were recorded following cingulotomy, in which a stereotactic lesion was created precisely in the region of the dACC from which fMRI signals and microelectrode recordings were obtained.
Strikingly, cingulotomy abolished the history-dependent modulation of reaction times. The pre-lesion trial-to-trial adaptations in RT were significantly reduced after cingulotomy for both non-interference (p = 2.2 × 10−8, bootstrap test) and high-interference trials (p = 7.1 × 10−3). Consistently, the difference in RT attributable to the previous trial that existed before the lesion (, S6A, B) disappeared after the lesion (, S8B, C); that is, reaction times did not depend upon the preceding trial type. This effect was observable at the population level and at the level of individual subjects (Figure S9). Thus, although dACC lesions did not globally degrade subject performance, they eliminated the dependency of behavioral responses on recent experience.
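A bootstrap comparison of the kind cited above can be sketched as follows; the per-session sequence effects are simulated placeholders, not the study's numbers, and `bootstrap_p` is a hypothetical helper:

```python
import random

random.seed(3)

def mean(xs):
    return sum(xs) / len(xs)

def bootstrap_p(pre, post, n_boot=2000):
    """Fraction of resamples in which the post-lesion mean effect is at
    least as large as the pre-lesion mean effect (a one-sided test of
    whether the adaptation effect shrank after the lesion)."""
    count = 0
    for _ in range(n_boot):
        pre_s = [random.choice(pre) for _ in pre]
        post_s = [random.choice(post) for _ in post]
        count += mean(post_s) >= mean(pre_s)
    return count / n_boot

# Illustrative per-session sequence effects (seconds): clearly positive
# before the lesion, near zero after; not the study's data.
pre_effects = [0.08, 0.06, 0.09, 0.07, 0.10, 0.05]
post_effects = [0.01, -0.01, 0.00, 0.02, -0.02, 0.01]
p_boot = bootstrap_p(pre_effects, post_effects)
```

Resampling with replacement makes no distributional assumption about the sequence effects, which is useful when only a handful of per-session estimates are available.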
These trial-to-trial behavioral adjustments (faster RTs when the preceding trial type was the same as the current one) are concordant with those reported by others2,4,8,17,22,23. During high-interference trials preceded by a high-interference trial, however, we observed increased dACC single-neuron activity, whereas others have reported decreased BOLD fMRI signal2,17. This apparent discrepancy may be explained by the fact that the peak fMRI signal (which occurs 5-7 seconds after the cue) reflects input synaptic activity evoked by both the appearance of the cue and the evaluation of the response (which together occur within the first second after the cue), whereas we recorded output spiking activity occurring within 500-750 ms after the cue. These complementary measures may thus reflect the spatiotemporal dynamics of conflict processing in the dACC.
In this task, increasing interference within the cue could variably be interpreted as representing increasing conflict between competing potential responses8, likelihood of error commission3, or energetic decision-making cost20. Consistent with these theories, we find that dACC activity correlates with these manifestations of cognitive demand. However, though the dACC is modulated by the cognitive load within the current task context, its function is not essential for generating the load-dependent behavioral response16,24, as interference-dependent behavior was not altered following dACC ablation. In contrast, an intact dACC is required for trial-to-trial behavioral adjustments.
Previous studies have proposed that the dACC monitors for conflict between competing responses7,8 and drives behavior towards efficient strategies20. Whereas previous human fMRI2,17 and lesion26 studies have implicated the dACC in this function, single-unit data supporting this theory have been lacking. Moreover, non-human primate single-unit recording18,21 studies have arrived at opposite conclusions and cast doubt on the conflict monitoring theory27. We reconcile these issues by demonstrating fMRI and single-neuronal conflict signals in the human dACC, as well as behavioral adjustments that disappear after a precisely targeted lesion. Our results support the view that the dACC is specifically responsible for providing a continuously updated account of predicted demand on cognitive resources. This account is particularly sensitive to relative shifts in situational complexity from instance to instance, weighted by the recent past. The salient influence of current dACC activity on future neuronal activity and behavior permits implementation of behavioral adjustments that optimize performance. In situations in which cognitive demands remain constant, this signal facilitates efficiency by accelerating responses. In situations involving rapidly changing demands, it promotes accuracy by retarding responses.