Strain differences were apparent for most dependent measures of the Go/No-go task, including hits, false alarms, and latency to respond on Go and No-go trials, as well as the index of discrimination, d′. However, there were no significant differences among strains for rate of precue responding or efficiency. The strain differences observed likely indicate significant genetic influences on this task. Possible influences of early rearing environment, such as differences in maternal care, cannot be ruled out as a source of some variation among strains, but such maternal differences among strains are likely also associated with genetic differences. Future studies using cross-fostering techniques would address the possible impact of early environment on these impulsivity measures, which to the best of our knowledge has not been examined.
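The discrimination index d′ can be illustrated with a short sketch, assuming the standard signal-detection definition d′ = z(hit rate) − z(false alarm rate); the log-linear correction for extreme rates shown here is a common convention and is not necessarily the one used in the original analysis.

```python
# Sketch of the signal-detection index d' = z(hit rate) - z(false alarm rate).
# The log-linear correction below is one common way to keep rates strictly
# between 0 and 1; it is an illustrative assumption, not this study's method.
from statistics import NormalDist

def d_prime(hits, go_trials, false_alarms, nogo_trials):
    hit_rate = (hits + 0.5) / (go_trials + 1)
    fa_rate = (false_alarms + 0.5) / (nogo_trials + 1)
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(fa_rate)

# A strain that responds on most Go trials but also on many No-go trials
# shows poor discrimination (d' near zero), as for C57BL/6J and NZB/B1NJ.
poor = d_prime(45, 50, 40, 50)
good = d_prime(45, 50, 5, 50)
```

A strain can thus earn many reinforcers on Go trials while still producing a d′ near zero, which is why hits alone cannot index learning of the No-go contingency.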
Both false alarms (errors of commission) and rate of precue responding have been used in prior research as measures of behavioral inhibition or motor impulsivity (e.g., Mitchell, 2004). However, other interpretations of both measures are possible. One interpretation is that false alarms reflect the ability of the subject to learn the instrumental omission contingency. This suggestion stems from the observation that two strains (C57BL/6J and NZB/B1NJ) exhibited no appreciable discrimination between the Go and No-go signals. Although they did learn to respond during the Go signal (), it could be argued that they did not learn the No-go contingency. However, it could also be argued that ignoring the No-go signal is the essence of a failure of inhibition. This highlights one of the limitations of the current task: it cannot easily differentiate between a failure to learn the No-go signal and a complete failure of inhibition. A second interpretation of the false alarms measure is that it reflects the degree of generalization of the approach behavior elicited by the Go signal. This suggestion stems from the observation that Go and No-go response differences could also be partially explained in terms of classical conditioning processes, specifically sign tracking (autoshaping: Brown & Jenkins, 1968). That is, the light above the nose poke hole where food is delivered may elicit approach behavior because it is a spatially restricted visual cue proximal to the food delivery location. It should be noted, however, that in the archetypal sign tracking procedure food presentation is independent of responding, whereas in our procedure food delivery is contingent on nose poking. The tone, being spatially diffuse, is less likely to acquire such a conditioned response, or may instead elicit an orienting conditioned response, a behavior incompatible with nose poking (Holland, 1977). Thus, differences in sign tracking for these two stimuli could contribute significantly to the apparent instrumental discrimination, and the number of No-go responses may index the extent of generalization of the sign tracking response from the Go trials. Genetic differences in sign tracking have previously been noted in mice (e.g., O’Connell, 1980) and rats (e.g., Kearns et al., 2006), so it is entirely reasonable that differences in the generalization of a Pavlovian conditioned approach response could underlie strain differences in false alarms. However, generalization of approach responding to a situation in which it is inappropriate (No-go trials) is not incompatible with the use of false alarms as a measure of failure to inhibit responding; this account simply provides a Pavlovian rather than an instrumental mechanism for the behavior. Future studies could examine the extent to which false alarms are determined by generalization of sign tracking by counterbalancing the stimuli associated with the Go and No-go trials. Additional manipulations of the intensity of the light and tone stimuli could also be used to determine the role of strain differences in responses to stimulus salience. Finally, alternative interpretations of precue responding are also possible. The nose poking that constitutes precue responding can be viewed as an anticipatory, classically conditioned response to the onset of the houselight, a conditioned stimulus signaling the possible delivery of food in the food receptacle. Thus, individual differences in precue responding could also reflect an individual’s ability to acquire conditioned responses in general, rather than impaired behavioral inhibition.
It is somewhat surprising that strain differences were observed for the false alarm measure but not for precue responding, given that the two measures were highly correlated (r = 0.84). There are several differences between the measures that might account for the lack of a strain effect on precue responding. First, the contingencies operating on precue responding are different. A response during the No-go signal is followed by the tone and houselights being turned off and a 10-s timeout period, i.e., an omission or negative punishment contingency is in effect, whereas a response during the precue period does not alter the stimulus conditions in the chambers. Thus, the strain effects may reflect differential sensitivity to the omission contingency, which would not affect precue response rate. This hypothesis could be investigated by examining the first five sessions of the experimental phase to determine whether some strains exhibited decreases in the number of false alarms, suggesting that the omission contingency was affecting behavior. While analyses indicated a significant decline in the number of false alarms over the first five sessions [F(4, 304) = 3.54, p < 0.01], presumably indicating that mice were learning to withhold responses, the strain × session interaction was not statistically significant [F(36, 304) = 1.23, p = 0.18], indicating that strain differences in this decline were not reliable. Future studies specifically targeted at determining whether there are strain differences in sensitivity to omission contingencies might be valuable, given that differences in sensitivity to positive reinforcement contingencies are known to exist and to contribute to problems such as drug and alcohol abuse (e.g., Robinson and Berridge, 1993). Further, it should be noted that, while no strain differences were identified for rate of precue responding, this does not negate its value as a measure of impulsivity.
Levels of behavioral inhibition do not appear to result from strain differences in the motivational properties of the reinforcer, as measured using sucrose consumption and sucrose preference: the correlations between these measures and both false alarms and rate of precue responding were not significant. However, this does not indicate that performance would be insensitive to changes in motivation, e.g., if the level of food restriction were altered. Further, locomotor activity and level of habituation also did not correlate with precue responding or false alarms, indicating that the Go/No-go task does not simply reflect these strains’ levels of activity.
A major goal of the current study was to examine genetic correlations between Go/No-go task performance and previous measures of ethanol responses. Our data indicated that both false alarms and precue responding were significantly positively correlated with severity of ethanol withdrawal, as measured by scoring handling-induced convulsions (Kosobud & Crabbe, 1986). The chronic withdrawal data were drawn from Metten and Crabbe (2005) and were obtained after mice were exposed to ethanol for 72 hours using the vapor inhalation method (e.g., Crabbe et al., 1983), with scores corrected for strain differences in the air/pyrazole control condition. The acute withdrawal data were drawn from Metten and Crabbe (1994) and were obtained following a single 4 g/kg ethanol injection, with scores corrected for average baseline scores. The connection between augmented activity in food-restricted animals in the face of signals indicating that food is unavailable and higher levels of central nervous system excitability during alcohol withdrawal is not entirely clear, but it may reflect the action of common activating mechanisms that are initiated when homeostasis is disrupted.
It is somewhat surprising that these impulsivity measures are positively correlated with severity of withdrawal, as other data have indicated that withdrawal severity exhibits a negative genetic correlation with ethanol drinking (Metten et al., 1998). Indeed, there was a tendency for false alarms and precue responding to exhibit a negative relationship with consumption, i.e., higher numbers of false alarms and precue responses (lower inhibition, higher impulsivity) were associated with lower ethanol consumption. Instead, it would be anticipated that higher levels of impulsivity should be associated with higher levels of ethanol consumption. Future research to understand this relationship, and the genetic networks that are responsible, is required to resolve these apparent contradictions.
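The genetic correlations discussed here are estimated at the level of strain means, i.e., as a Pearson correlation computed across one mean value per inbred strain for each trait. A minimal sketch of that calculation (the strain values below are hypothetical, not data from this study):

```python
# Strain-mean (genetic) correlation: one mean score per inbred strain
# for each trait, then an ordinary Pearson r across strains.
# The strain values below are hypothetical illustrations only.
from math import sqrt

def pearson_r(x, y):
    # Pearson correlation between two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

false_alarm_means = [12.0, 30.5, 18.2, 40.1, 25.3]  # one value per strain
withdrawal_scores = [1.1, 3.0, 1.8, 4.2, 2.4]       # one value per strain
r = pearson_r(false_alarm_means, withdrawal_scores)
```

Because each strain contributes a single point, statistical power is limited by the number of strains rather than the number of mice, which is one reason modest strain-mean correlations can fail to reach significance.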
Previous work indicated that strain-dependent levels of efficiency on an appetitive nose poke task could predict strain differences in consumption of 10% ethanol (Logue et al., 1998). The measure of efficiency used by Logue and colleagues has aspects in common with our measure of precue responding, in that both are determined by the number of responses that are extraneous to earning reinforcers. Contrary to the results of the Logue and colleagues study, we observed negative correlations between ethanol consumption, at a number of concentrations, and precue responding. While these correlations were not statistically significant given the small sample size, they appeared relatively similar across concentrations, suggesting that they are robust. Unfortunately, direct comparisons between measures obtained in the Logue and colleagues study and the current study are difficult for several reasons. First, the extent to which the strains used in the current study and in that of Logue and colleagues overlapped is somewhat unclear, as Logue and colleagues did not provide full strain designations, e.g., A versus A/HeJ. Second, while some of their strains originated at The Jackson Laboratory, as did all of the strains in the current study, others were bred on site or were obtained from a different vendor. Third, measures in the Logue and colleagues study were not stable for all strains at the time of testing, so their results may have been compromised. Fourth, the current study used a procedure in which precue responding was punished by resetting the precue period, whereas this was not the case in the Logue study. Thus, our study may be detecting a relationship between sensitivity to this form of punishment and ethanol consumption, a possibility that Logue and colleagues could not have addressed.
Our previous work using lines of mice selectively bred for high and low levels of ethanol consumption did identify a greater level of impulsivity in the high ethanol drinking line when measured using the Go/No-go task, but not when measured using a delay discounting task (Wilhelm et al., 2007; but also see Oberlin and Grahame, 2009). In rats bred for high and low alcohol drinking, impulsivity measured with a delay discounting task was higher in the high drinking line (Wilhelm and Mitchell, 2008); these rats were not examined using the Go/No-go task. Outcomes of genetic correlation analyses using inbred strain panels often do not agree with those from selected lines; we have addressed possible reasons for this in a previous publication (Crabbe et al., 1990). Examining Go/No-go performance in lines selectively bred for withdrawal severity would be an appropriate way to determine whether the significant correlations between impulsivity and that ethanol-related trait are reliable. In addition, it would be useful to conduct studies examining whether the genetic correlation observed in this study can also be observed within subjects, i.e., whether individuals exhibiting higher levels of impulsivity also exhibit greater withdrawal severity, examined both acutely and chronically. Finally, given the evidence that performance on the Go/No-go task is controlled to some degree by genetic factors, additional research identifying the gene networks contributing to the relative sensitivity of the different strains to the positive and negative outcomes that presumably drive performance on this task would be valuable.