If people are brought into the laboratory and given alcohol, there are pronounced individual differences in many responses to the drug. Some participants in alcohol challenge protocols show a cluster of low-level responses to alcohol, determined by observing post-drinking changes in subjective, motor, and physiological effects at a given dose. Individuals characterized as having a low Level of Response (LR) to alcohol have been shown to be at increased risk for a lifetime diagnosis of alcohol dependence (AD), and this relationship between low LR and AD appears to be in part genetic. LR to alcohol is an area where achieving greater consilience between the human and rodent phenotypes would seem to be highly likely. However, despite extensive data from both human and rodent studies, few attempts have been made to evaluate the human and animal data systematically in order to understand which aspects of LR are most directly comparable across species and thus most promising for further study. We review four general aspects of LR that could be compared between humans and laboratory animals: 1) behavioral measures of subjective intoxication; 2) body sway; 3) endocrine responses; and 4) stimulant, autonomic and electrophysiological responses. None of these aspects of LR provides a completely face-valid direct comparison across species. The low subjective response is one of the most replicated findings in humans, but because, as usually assessed, it may reflect aversively valenced and/or positively valenced responses to alcohol, it is unclear which rodent responses are analogous. Alcohol-stimulated heart rate appears to be consistent in animal and human studies, although at-risk subjects appear to be more, rather than less, sensitive to alcohol on this measure.
The hormone and electrophysiological data offer strong possibilities of understanding the neurobiological mechanisms, but the rodent data in particular are rather sparse and unsystematic. Therefore, we suggest that more effort is still needed to collect data using refined measures designed to be more directly comparable in humans and animals. Additionally, the genetically mediated mechanisms underlying this endophenotype need to be characterized further across species.
In the early 1980s, Marc Schuckit embarked on a landmark prospective study comparing two cohorts of young men. One group was Family History Positive (FHP) for alcoholism (sons of alcoholics), and the other Family History Negative (FHN; having no first- or second-degree relatives who were alcoholic). His research group brought them into the laboratory and gave them alcohol (sometimes referred to as an “alcohol challenge”), and assessed a number of their responses to alcohol and placebo. Among these were body sway, plasma cortisol and prolactin levels, electrophysiological measures, and subjective responses to alcohol (self-reported “high,” etc.). He found that FHP individuals were less responsive to alcohol than FHN individuals. Examples of these early findings are presented in (Schuckit 1985) and (Schuckit, Gold 1988). These findings have been confirmed by a number of groups, including a meta-analysis of studies evaluating subjective feelings following an alcohol challenge (Pollock 1992).
These studies did not go unchallenged, and some investigators reported the opposite – that FHN individuals were more responsive to alcohol [e.g. (McCaul et al. 1990)]. David Newlin surveyed this literature and attempted to explain the contradiction (Newlin, Thomson 1990). In essence, he found that the differences the Schuckit group described generally emerged beginning around one hour after ingestion, when individuals were at the peak or on the descending limb of the blood alcohol curve. Those who found FHN individuals to be more responsive to alcohol tended to find changes earlier after ingestion. The overall suggestion of this analysis was that FHP individuals may really differ most in the development of acute functional tolerance. Because they develop more tolerance, and more quickly, they appear less affected than FHN individuals an hour or so after ingestion. Acute functional tolerance was first documented by Mellanby (1919), who placed dogs on a treadmill and slowly infused alcohol intravenously, leading to first rising and then falling blood alcohol levels (BAL). He noted that the dogs stumbled more when BAL was rising than they did at the same BAL during the descending portion of the curve, showing that they had very rapidly become tolerant to alcohol’s intoxicating effects. Acute functional tolerance has since been studied extensively in rats and mice (Kalant et al. 1971;Erwin, Deitrich 1996;Ponomarev, Crabbe 2002).
Why is this important? When the Schuckit group followed these individuals up 10 and 20 years later, they found that FHP individuals were more likely to develop alcohol-related problems associated with alcohol use disorders (AUD) or alcohol dependence (AD) in later life (Schuckit, Smith 1996). They also found that individuals characterized as having “low response (LR) to alcohol” (most, although not all, of whom were also FHP) had a higher risk for developing alcohol-related problems. Subsequent hierarchical regression/path analysis/structural equation modeling studies have shown that both the low LR phenotype and FHP status are predictive of alcohol problems in later life, together and independently, and that other variables modulate their predictive value as well [e.g. (Schuckit, Smith 2000;Schuckit et al. 2004;Schuckit, Smith 2006a;Schuckit, Smith 2006b;Schuckit et al. 2008;Trim et al., 2009)].
Subsequent studies of LR in humans and laboratory animals are abundant. However, the phenotypic targets of such studies across species are not necessarily the same. Because of the inherent power of animal studies to explore neurobiological mechanisms, including genetic sources of influence, it would be advantageous to study related phenotypes across species. We consider four different domains of LR in humans and attempt to evaluate their potential as targets for the further development of more consilient phenotypes in humans and in laboratory animals. In discussing the animal literature, we concentrate on rodents and review evidence for the genetic relationship between “low response” in rodents (variously defined) and two alcohol-related traits – two-bottle preference drinking, and the severity of withdrawal after chronic alcohol exposure. As there is no “gold standard” end point in rodents (the search for a better one is among the goals of the consilience project), we have focused on these two traits because they are very commonly studied, have some degree of face validity, and have been extensively genetically characterized.
Two-bottle preference tests can be conducted in many ways, but the most common is to offer animals one water bottle and one bottle with an alcohol solution, usually 10%, a little less than most table wines (McClearn, Rodgers 1959;Mardones, Segovia-Riquelme 1983). Access is most often continuous, and the amount drunk from each bottle is measured each day. Intake is expressed as (g alcohol) / (kg body weight), or as the proportion of total fluid taken from the alcohol bottle (preference ratio). To measure withdrawal, animals are made dependent. A low grade of dependence can be induced by a single (or repeated) high-dose bolus injection or intragastric infusion of alcohol (McQuarrie, Fingl 1958;Majchrowicz 1975). To achieve more severe dependence, alcohol is administered chronically in a liquid diet (Tabakoff et al. 1977) or the animal is confined to a sealed chamber filled with alcohol vapor for a period of days (Goldstein, Pal 1971). Recently, interest has increased in exposing animals to chronic, intermittent periods of vapor, which increases the intensity of withdrawal (Becker, Lopez 2004). Dependence cannot be measured directly, but must be inferred from the appearance of signs of withdrawal when drug administration ceases (Kalant et al. 1971). In mice, the most frequent index of withdrawal is the appearance of a handling induced convulsion that waxes and wanes in severity (Goldstein, Pal 1971). In rats, other physical and behavioral signs are used. Both species share with humans a wide spectrum of withdrawal symptoms (Friedman 1980).
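For concreteness, the two intake measures just described can be sketched in code. The function names, ethanol density constant, and example values below are illustrative assumptions, not taken from any cited study:

```python
# Hypothetical sketch of the two standard two-bottle intake measures.
ETHANOL_DENSITY_G_PER_ML = 0.789  # g/mL for pure ethanol

def intake_g_per_kg(alcohol_ml: float, solution_pct: float, body_weight_g: float) -> float:
    """Grams of ethanol consumed per kilogram of body weight."""
    grams_ethanol = alcohol_ml * (solution_pct / 100.0) * ETHANOL_DENSITY_G_PER_ML
    return grams_ethanol / (body_weight_g / 1000.0)

def preference_ratio(alcohol_ml: float, water_ml: float) -> float:
    """Proportion of total fluid taken from the alcohol bottle."""
    return alcohol_ml / (alcohol_ml + water_ml)

# Example: a 25 g mouse drinking 3 mL of 10% (v/v) ethanol and 2 mL of water in a day.
dose = intake_g_per_kg(alcohol_ml=3.0, solution_pct=10.0, body_weight_g=25.0)
pref = preference_ratio(alcohol_ml=3.0, water_ml=2.0)
```

For the hypothetical mouse in the example, this yields roughly 9.5 g/kg/day and a preference ratio of 0.6.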
To assess potential genetic correlation between a prospective index of LR and either of these endpoints in rodents, we examined two types of evidence (Crabbe et al. 1990). First, there are published data on several inbred strains of mice for two-bottle alcohol preference drinking (Phillips et al. 1994;Belknap et al. 1993a;Wahlsten et al. 2006) and withdrawal severity (Belknap et al. 1993b;Crabbe 1998;Metten, Crabbe 2005). The extent to which strain mean values for these targeted traits correlate with putative low response measures estimates the degree of genetic co-determination of the traits. Second, there are numerous rat lines (P, HAD-1, HAD-2, AA, sP, msP, and UChB - see Table 1) that have been selectively bred for high alcohol preference, using virtually identical methods [for reviews, see (Bell et al. 2006;Ciccocioppo et al. 2006;Colombo et al. 2006;Quintanilla et al. 2006;Sommer et al. 2006)]. These have been compared with their counterpart lines bred for low drinking (NP, LAD-1, LAD-2, ANA, sNP, UChA) on a number of behaviors. We reason that if several of these pairs of rat lines differ consistently on a behavioral response to alcohol in the direction predicted by the work of the Schuckit group, this would suggest substantial genetic correlation of that trait with drinking. There are also rat lines bred for high (HARF) vs. low (LARF) alcohol preference drinking during a limited access test (Le et al. 2001), and high (HAP-1, HAP-2) and low (LAP-1, LAP-2) preference drinking mouse lines (Grahame et al. 1999). Mouse lines selectively bred for severe (Withdrawal Seizure-Prone, WSP-1 and WSP-2) versus mild (Withdrawal Seizure-Resistant, WSR-1 and WSR-2) handling-induced convulsions during withdrawal after chronic alcohol vapor inhalation have provided some relevant data (Crabbe 1996). Other pairs of mouse lines selected for alcohol-related traits [e.g., FAST/SLOW, HAW/LAW (Phillips et al. 1991;Metten et al. 1998), see below] have also been useful.
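The consistency argument just outlined can be made semi-quantitative with a simple sign test: if the direction of each independently selected line-pair difference were random, the chance that most pairs agree with the predicted direction is binomial. The sketch below is a generic illustration with placeholder counts, not an analysis of real data:

```python
from math import comb

def sign_test_p(n_pairs: int, n_predicted: int) -> float:
    """One-sided binomial probability of observing at least n_predicted of
    n_pairs line pairs differing in the predicted direction, under the null
    hypothesis that each pair's direction is a fair coin flip (p = 0.5)."""
    return sum(comb(n_pairs, k) for k in range(n_predicted, n_pairs + 1)) / 2 ** n_pairs

# Placeholder example: suppose 6 of 7 independently selected high/low pairs
# differed in the direction predicted by the low-LR hypothesis.
p = sign_test_p(n_pairs=7, n_predicted=6)
```

With these placeholder counts, p = 0.0625, illustrating why consistency across many independent selection replicates is persuasive even without within-study statistics.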
Finally, there are QTL data for human AD diagnoses, and for the low LR trait, and we touch upon the evidence for consistency in this area as well [for a more thorough comparison of the QTL data, see Ehlers this issue (Ehlers et al. 2010)]. We did not explore the data from the nearly 100 genes that have been targeted in mice for overexpression, deletion, or knockdown; the gene targeting data have been reviewed elsewhere (Crabbe et al., 2006).
Very early in the discovery of the low LR phenotype, Schuckit and colleagues found that the less intense response to alcohol was not just subjective but also extended to hormonal responses. They initially found that sons of alcoholics demonstrated lower cortisol levels after drinking (Schuckit et al. 1987a). In that same year they also reported lower prolactin levels following alcohol challenge in the same group of individuals (Schuckit et al. 1987b), and finally, they reported lower ACTH levels in sons of alcoholics following high doses of alcohol (Schuckit et al. 1988). These findings were partially replicated in a sample of sons of alcoholics selected from the Collaborative Study on the Genetics of Alcoholism (COGA), where a lower intensity of response to alcohol, as measured by changes in cortisol, was found (Schuckit et al. 1996). Although hormone studies have the advantage of using objective measures that may be closer to the mechanisms underlying LR than subjective measures, relatively few other human studies have used hormone responses to index LR. In one study, Asians with ALDH2*2 alleles, who are at very low risk for the development of alcoholism, were found to have significantly higher cortisol levels after alcohol challenge than Asians who did not have that allele (Wall et al. 1994). Additionally, a group of Native Americans at high risk for the development of alcoholism had a severely diminished cortisol response following an alcohol challenge (Garcia-Andrade et al. 1997). Some data also suggest that alcohol may dampen cortisol, ACTH, vasopressin and prolactin responses to stress to a larger extent in sons of alcoholics compared to controls (Croissant, Olbrich 2004;Zimmermann et al. 2004;Zimmermann et al. 2009), although this has not been found in all studies (Dai et al. 2007).
Release of CRF by alcohol and/or stress also elicits release of beta-endorphin, an endogenous opioid peptide. Beta-endorphin plays a role in brain reward pathways and is affected by all drugs of abuse, including alcohol, and alcohol’s induction of beta-endorphin appears to be at least in part under genetic control (Froehlich et al. 2000). Moreover, it has been suggested that low levels of endogenous opioid peptides, including beta-endorphin, may characterize individuals at genetic risk for developing AD. Changes in beta-endorphin synthesis, processing and release after chronic alcohol exposure have been studied frequently, and a confusing array of results has been reported, probably due to differences in subjects, exposure regimens, doses of alcohol, and timing of hormone assessments [for review, see (Gianoulakis 2004)]. The mu opioid receptor antagonist naltrexone is one of the few available treatments for AD, with a polymorphism of the OPRM1 (opioid receptor mu 1) receptor gene providing some predictability for naltrexone’s clinical efficacy [e.g., (Ooteman et al. 2009)].
Rodents, too, show elevated corticosterone (analogous to human cortisol), prolactin, and beta-endorphin after an alcohol injection. Rats genetically selected for high alcohol preference (AA) have been shown to have a blunted corticosterone response to a high dose of alcohol compared to a line selected for low preference (ANA), but only under conditions of social isolation (Apter, Eriksson 2006). AA rats showed greater POMC levels than non-drinking ANAs, and lower baseline beta-endorphin release, but the two lines did not differ in alcohol-stimulated release (De Waele et al. 1994). However, the HPA system does appear to regulate alcohol intake by AA, but not ANA, rats: adrenalectomy and corticosterone replacement altered alcohol intake in AA, but not ANA, rats (Fahlke, Eriksson 2000). In P rats, lower levels of CRF were found in the amygdala, hypothalamus, prefrontal cortex, and cingulate cortex, compared with NP rats (Ehlers et al. 1992). In sP rats, higher levels of serum corticosterone have been reported, compared with sNP rats, suggesting differences in HPA activity (Bano et al. 1998). An examination of the data from selected mouse lines revealed that WSP and WSR mice had similar corticosterone responses to acute alcohol injections (D. A. Finn, personal communication). In addition, when exposed to alcohol vapor and tested at the time of withdrawal, both selected lines had equivalent elevations of corticosterone (Beckley et al. 2008;Tanchuck et al. 2009). Differences in opioid receptor levels between the selectively bred high and low alcohol-consuming rat lines have been reported [AA vs. ANA: (De Waele et al. 1995;Soini et al. 1998;Soini et al. 1999); P vs. NP: (McBride et al. 1998); HAD-1 vs. LAD-1: (Learn et al. 2001); sP vs. sNP: (Fadda et al. 1999)].
However, interpretation is difficult given that: (a) different brain regions were assessed and/or found to display line differences; (b) differences in direction of effect were seen among the different sets of selected lines (e.g., for mu-opioid receptor levels, AA < or > ANA, P > NP, HAD-1 < LAD-1); and (c) there were mixed results as to the relative importance of the mu- versus the kappa-opioid receptor. Regarding this last point, there is evidence, from family-based studies, of a significant association between variation in the OPRK1 (opioid receptor kappa 1) gene and a higher risk for developing alcoholism (Edenberg et al. 2008).
Another source of evidence for genetic correlation between alcohol-induced hormone release and genetic tendencies toward drinking or withdrawal is the BXD recombinant inbred (RI) panel of mouse strains. This panel was developed for gene mapping from an intercross of the C57BL/6J and DBA/2J inbred strains, and has been tested for acute (Belknap et al. 1993b) and chronic (Crabbe 1998) withdrawal severity and for preference for 10% alcohol in the two-bottle choice test (Phillips et al. 1994). Data are archived on GeneNetwork (www.genenetwork.org). These strains have also been tested for corticosterone levels at several time points after different doses of alcohol (Roberts et al. 1995). However, when we examined the correlations of strain means, neither two-bottle preference drinking nor withdrawal severity was significantly correlated with any corticosterone response measure across 17-25 strains (*r* < 0.33). Finally, we searched for similar data from standard inbred strains of mice. The only substantial strain survey we could find was an old study from the laboratory of Donald Keith (Crabbe et al. 1983) that assessed resting pituitary beta-endorphin levels in 10 inbred strains that were also characterized for chronic withdrawal severity and drinking. The genetic correlations, however, were non-significant (*r* < 0.20). A few other studies reporting corticosterone levels examined too few mouse strains to make such analyses meaningful.
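The strain-mean correlation approach used here reduces to a Pearson correlation in which each strain contributes a single mean per trait. A minimal sketch, using invented placeholder strain means rather than the archived GeneNetwork data:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation of two equal-length lists of strain means."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

# Invented placeholder strain means, one value per strain (real panels in
# the text ranged from 10 to 25 strains).
preference = [0.12, 0.35, 0.55, 0.20, 0.78, 0.41, 0.63, 0.29]   # preference ratio
corticosterone = [310, 275, 240, 300, 210, 265, 250, 290]       # post-alcohol level

r = pearson_r(preference, corticosterone)
# Each strain mean counts as one observation, so r is tested with n - 2
# degrees of freedom; small panels need large |r| to reach significance.
```

The design choice worth noting is that individual animals within a strain do not add degrees of freedom for the genetic correlation; only the number of strains does, which is why correlations below about 0.33 were non-significant across 17-25 strains.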
Few studies have tried to determine the mechanisms of these differences in hormone levels following alcohol challenge. In one study, FHP and FHN nonalcoholic men were given alcohol and subsequently ovine CRF (oCRF); FHP men were found to have similar ACTH responses to oCRF in both the alcohol and placebo sessions, whereas FHN men had blunted ACTH responses during the alcohol session (Waltman et al. 1994). In adult rats, acute alcohol increases corticosterone and ACTH levels primarily by stimulating the hypothalamic release of CRF and possibly vasopressin [see (Lee et al. 2004);(Rivier 1996)]. Effects of acute alcohol on beta-endorphin release are biphasic, as higher doses reduce hormone levels. Taken together, these studies are consistent with the idea that the hormonal component of low LR may in part reside in the hypothalamus.
In summary, the human low LR hormone differences reported are rather small, and the low LR and normal response (or FHP and FHN) populations overlap substantially. The animal data are relatively sparse, although there are clearly genetic differences in hormone responses to alcohol, and innate differences in hormone and opioid receptor levels between genetically selected rat lines and among inbred mouse strains, that could be examined in more detail. Overall, this is an area where we would be close to mechanisms, as rodent and human stress axes appear to work rather similarly. The available data, however, are not terribly encouraging, as the rodent data do not point strongly toward genetic co-regulation of hormone responses to alcohol challenge and either drinking or withdrawal severity.
One of the most studied and replicated measures of LR to alcohol is the use of scales that index how intensely an individual feels the effects of alcohol following a challenge dose given in the laboratory. Subjective response to the effects of alcohol lends itself to establishing consilience, because the association between this response and alcohol consumption/breath alcohol concentration is under substantial genetic control [cf. (Viken et al. 2003;Heath, Martin 1991)]. The most commonly used scale is the Subjective High Assessment Scale (SHAS) (Judd et al. 1977). This scale asks subjects to rate themselves, every 15-30 minutes following alcohol administration and usually for a period of 210 minutes, on a number of items including: feeling the effects of alcohol overall; feeling drunk, high, clumsy, confused, dizzy, great, terrible, floating; having difficulty concentrating; and other such items [see (Schuckit et al. 2009)]. It has been used in a variety of studies in a number of populations, including adult children of alcoholics [aCOAs: see (Schuckit, Gold 1988;Eng et al. 2005;Schuckit et al. 1996)], Asians with the ALDH2*2 allele (Wall et al. 1992), and Native Americans (Garcia-Andrade et al. 1997). It appears to index LR to alcohol reliably in these populations and has been shown to account for almost 50% of the variance in response to alcohol in FHP men (Schuckit, Gold 1988).
One criticism of the SHAS is that it primarily indexes effects of alcohol that may be sedative or anxiolytic and does not adequately measure feelings of “activation.” The Biphasic Alcohol Effects Scale (BAES) was developed to allow the subjective rating of both stimulant and sedative effects (Martin et al. 1993). The use of this scale has been limited, and one study did not find statistically significant differences between alcohol and placebo (Davidson et al. 2002). Another study found that aCOAs and controls had similar levels of stimulation following alcohol, but that aCOAs had a lower baseline level of stimulation than adult non-COAs (Erblich et al. 2003). Additionally, alcoholics given a low dose of alcohol were found to report more stimulation than social drinkers in a bar-laboratory setting (Thomas et al. 2004). In a novel study using the BAES, Morzorati and colleagues (Morzorati et al. 2002) examined both an initial response to alcohol and adaptation to this response in FHP and FHN individuals. These authors reported that, when breath alcohol content (BrAC) was clamped (i.e., maintained) at 60 mg% for approximately 2 hr, FHP individuals reported a greater response to the alcohol challenge at the beginning of the clamp, but their self-reports did not differ from those of FHN individuals at the end of the clamp experiment. This finding suggests that FHP individuals had developed within-session (i.e., acute) tolerance. Although these results appear interesting, the utility of the BAES has been somewhat limited by its lack of use in large studies across a number of populations, the limited range of alcohol doses tested so far, and the fact that the original instrument did not allow for baseline testing and disclosed to subjects that they received alcohol (Rueger et al. 2009).
Another approach to finding consilience in measures of subjective responses to alcohol between rodents and humans may be the use of the Self Report of the Effects of Ethanol (SRE) questionnaire. The SRE, developed by Schuckit, indexes the number of drinks necessary to produce a specific effect and is typically used as a retrospective report of the need for more drinks to experience the “effects” of alcohol during the development of AUD and AD (Schuckit et al. 1997a;Schuckit et al. 1997b). This measure was recently found to be more robust than results of the alcohol challenge in a regression analysis predicting AUD outcomes [see (Schuckit et al. 2009)]. It is possible that measures of the number of “drink equivalents” that an animal will take in a drinking paradigm might model the SRE measure in humans; then, the genetic contributions to sensitivity (e.g., QTLs, genetic correlations) for the two phenotypes could be compared. However, there remains the problem of establishing in rodents what the relevant behavioral end point might be that would cue the animal to stop drinking, which we discuss in the next section.
Indices of subjective response are generally obtained through self-report, a behavior we cannot obtain directly from rodents. Nor are there obvious behavioral sequelae of feeling “high” in humans that could be examined in rodents. If “feeling effects” in humans is modeled by behaviors reflecting “reward” in rodents (e.g., selection of the drug-associated environment in a conditioned place preference test, operant responding for alcohol during extinction), there are numerous assays for rodents, but none is considered completely satisfactory in its current incarnation [see Stephens et al, this issue (Stephens et al. 2010)]. Animals can clearly recognize the subjective effects of alcohol, as it is an effective stimulus for drug discrimination studies (Barry, III 1991). However, such studies are directed at demonstrating discrimination between drugged and non-drugged states, or between the effects of one drug versus another, and not between drug state and the state following delivery of another reward.
If animals are repeatedly administered alcohol in the presence of distinctive cues in one location and saline in a different location with different cues, one can then test the animal in an environment where both locations are available. If the animal elects to spend the majority of its time in the drug-paired environment, one infers that it found the subjective experience of the drug pleasant, even though it is not under the influence of the drug during the test. One study compared P and NP rats in a place conditioning paradigm that resulted in a conditioned place aversion (CPA). Aversion to the alcohol-paired location was less pronounced in P rats, suggesting that they were less sensitive to the aversive stimulus effects of alcohol than NP rats (Stewart et al. 1996). On the other hand, in another study, msP rats showed a pronounced conditioned place preference (CPP) (Ciccocioppo et al. 1999). It should be noted that the msP line has no direct comparator line selected for low drinking; nonetheless, the direction of the effect was opposite to that seen in the P and NP lines. Another behavioral index of the subjective effects of alcohol is the conditioned taste aversion (CTA) assay. Drinking of a novel flavored solution (usually a sweet solution or a weak saline solution) is paired with an injection of alcohol. If selectively bred high alcohol-consuming lines of rats self-administer greater amounts of alcohol than their low alcohol-consuming counterparts because of a low response to alcohol’s aversive properties, it would be predicted that the high drinking lines would display a weaker CTA to a flavored solution whose consumption is paired with (i.e., followed by) a moderate dose of alcohol. Such were the findings in a study examining HAD-2 and LAD-2 rats, with HAD-2 rats displaying a low CTA to saccharin compared with LAD-2 rats (N. E. Badia-Elder, personal communication). In a study examining P versus NP rats (Froehlich et al.
1988), P rats were less affected, as indicated by a more modest decrease in saccharin intake following conditioning with a moderate dose (1.0 g/kg) of alcohol than NP rats, and showed a transient facilitation of saccharin intake after conditioning with a low dose (0.25 g/kg) of alcohol, whereas NP rats displayed no effect at the latter dose. Similarly, sP rats were not affected by conditioning with a 1.0 g/kg dose of alcohol, whereas sNP rats displayed a strong CTA to saccharin after conditioning with this dose (Brunetti et al. 2002). Similar findings have been reported for the UChB versus UChA rat lines, with the high alcohol-drinking UChB line displaying a very low CTA to saccharin after conditioning with either alcohol (Quintanilla et al. 2001) or alcohol’s first metabolite, acetaldehyde (Quintanilla et al. 2002) compared with the low alcohol-drinking UChA line. When the selectively bred high alcohol-preferring (HAP-1 and HAP-2 replicates) mice and their low alcohol-preferring (LAP-1 and LAP-2 replicates) counterparts were examined, it was found that, as with the rat lines, HAP-1 and HAP-2 mice showed a significantly lesser CTA to sodium chloride than LAP-1 and LAP-2 mice (Chester et al. 2003). This highly consistent set of findings suggests that low sensitivity to some aversive effects of alcohol accompanies a tendency to prefer alcohol solutions.
Alcohol CTA and CPP have also been examined in the WSP and WSR selected lines, as well as a pair of lines selected for a few generations for High (HAW) vs Low (LAW) Alcohol Withdrawal from the F2 intercross of C57BL/6J and DBA/2J mice (Chester et al. 1998;Risinger et al. 1994). WSP and HAW mice developed stronger CPP than WSR and LAW, respectively. While WSP mice were slightly less sensitive than WSR mice in the CTA test, HAW and LAW lines did not differ. Collectively, these data from mice suggest that enhanced sensitivity to the rewarding effects of alcohol is correlated with a genetic tendency to experience severe withdrawal, and that a tendency toward reduced sensitivity to an aversive effect of alcohol may also be associated.
Together, and especially in the highly consistent CTA data, these findings suggest that low genetic sensitivity to some aversive effects of alcohol accompanies a tendency to prefer alcohol solutions. Data linking effects of alcohol in place conditioning paradigms are sparser and less consistent. In part, these studies are complicated by a species difference: whereas rats tend to develop place aversions to an alcohol cue, mice tend to develop CPP (Cunningham, Phillips 2003). Interestingly, mice have been shown to develop either CPP or CPA depending on the timing of the administration of alcohol. This has been attributed to an initial short-lived aversion to alcohol’s stimulus effects, followed by a delayed, positive response to the drug (Cunningham et al. 2002). This raises the possibility that place conditioning measures could be used in mice to dissociate the perceived subjective effects of alcohol on the rising vs. the falling limb of the BAL curve.
Finally, the presumed palatability of a solution can be inferred from facial expressions and/or gustatory-associated behaviors when a novel flavor is tasted [cf. (Bachmanov et al. 2003;Kiefer 1995)]. One would hypothesize that selectively bred high alcohol-consuming rat lines would display more appetitive/ingestive responses and fewer aversive responses to oral infusions of alcohol than their low alcohol-consuming counterparts. Findings from a study with AA versus ANA rats support this hypothesis (Badia-Elder, Kiefer 1999). In addition, although the msP rat line does not have a low alcohol-consuming counterpart, it is noteworthy that aversive reactivity to oral infusion of alcohol is virtually absent in this line (Polidori et al. 1998). However, the hypothesis of increased appetitive/ingestive responding in high versus low alcohol-consuming rat lines does not always hold in the alcohol-naïve state (i.e., under initial test conditions). For example, P and NP rats have similar initial orofacial and behavioral responses to the taste and smell of alcohol. However, P, but not NP, rats significantly increase their appetitive/ingestive responses and significantly decrease their aversive responses after chronic access to alcohol and 4 weeks of abstinence (Bice, Kiefer 1990). Similar findings have been reported for the HAD-1 and LAD-1 lines of rats (Kiefer et al. 1995). It is noteworthy that chronic alcohol access increased appetitive/ingestive responses and decreased aversive responses to oral alcohol infusion in nonselected Holtzman-derived rats, but 4 weeks of abstinence abolished the changes in appetitive/ingestive responding and nearly abolished the changes in aversive responding to alcohol (Kiefer et al. 1994).
Therefore, the findings in the P and HAD-1 rat lines suggest selective breeding for alcohol preference predisposes an animal to maintain behaviors associated with increased rewarding/reinforcing effects of alcohol or, conversely, maintain behaviors associated with reduced aversive effects of alcohol after chronic alcohol consumption.
Although not directly related to subjective responses to alcohol, there is a substantial, positive association between the intake of sweets and alcohol. While a review of this literature is beyond the scope of the present article, a brief discussion is warranted due to overlapping QTLs or other genetic indicators between alcohol preference and saccharin/sweet preference in mice and rats (Blednov et al. 2008;Blizard, McClearn 2000;Foroud et al. 2002;Terenina-Rigaldie et al. 2003). Clinically, an individual’s proclivity for sweets is a strong predictor for alcohol consumption/abuse [cf. (Kampov-Polevoy et al. 1997;Kampov-Polevoy et al. 1999;Kampov-Polevoy et al. 2003)]. AA rats display greater preference for saccharin and/or palatable solutions than ANA rats (Kampov-Polevoy et al. 1996;Sinclair et al. 1992), as has been shown for the P versus NP (Kampov-Polevoy et al. 1996;Overstreet et al. 1993;Sinclair et al. 1992;Stewart et al. 1994) and the UChB versus UChA (Tampier, Quintanilla 2005) lines. When the selectively bred HAP and LAP mouse lines were evaluated, they, like the majority of the rat selected lines, displayed a positive association between alcohol preference over water and levels of saccharin intake (Grahame et al. 1999). However, the positive association between alcohol preference and saccharin/sweet preference does not appear to hold for the sP versus sNP lines (Agabio et al. 2000).
The subjective response to alcohol’s effects in humans is complicated by an initial stimulant phase during the ascending limb of the BAL curve and subsequent responses reflecting unpleasant feelings during the descending limb of the BAL curve, and further complicated by the simultaneous development of tolerance. Most evidence links low LR to the aversive aspects of alcohol with subsequent AD outcomes. In rodents, the subjective response to alcohol’s effects is difficult to model. Nevertheless, there appears to be a fairly robust negative genetic relationship between sensitivity to some aversive aspects of alcohol intoxication and tendency toward high two-bottle preference drinking, consistent with the low LR hypothesis. Why this is more robust in taste conditioning assays than in place conditioning assays may become clear as the basis for these two different learned responses is better elucidated [see Stephens et al., this issue (Stephens et al. 2010)].
Schuckit (Schuckit 1985) was the first to describe a measure that indexes the amount of body sway or static ataxia a person displays following an alcohol challenge. The Schuckit group initially used this measure in a study of 68 individuals who were either FHP or FHN for AD. They demonstrated that at baseline, the two groups were virtually identical; however, the level of body sway was significantly less for the FHP group after an alcohol challenge, especially at the low (0.75 mL/kg) dose. These findings were replicated in a separate lab with a sample of women, where those who were FHP also showed smaller alcohol-induced increases in body sway (Lex et al. 1988). These effects have been replicated in several other populations as well [see (Schuckit 2009)].
The method employed in LR studies by Schuckit’s group to measure body sway involves attachment of the subject to a harness with pulleys, and reading body sway using magnetic sensors (Schuckit 1985). Body sway is converted to an index of low or high response by summing anterior/posterior and lateral sway. To achieve postural steadiness, an individual must simultaneously integrate: visual and proprioceptive sensory information; biomechanical stimuli generated by gravity; and cognitive processes affected by movement, intentions regarding movement, and spatial orientation, just to name a few factors (Horak 2006). Even considering only the biomechanical inputs, the method used to determine body sway LR to alcohol is relatively crude compared to force platforms or other devices [e.g. (Ledin, Odkvist 1991;Goebel et al. 1995;Letz et al. 1996)]. In order to model this effect in rodents more directly it would be important to know exactly which aspects of the complex phenomena resulting in body sway are affected by alcohol in order to target the specific brain systems mediating these aspects. Recent studies have demonstrated that static balance changes caused by acute alcohol are mainly due to a low frequency (0-1 Hz) transversal sway seen when the subject has his or her eyes closed and that the acute response differs from the postural sway seen in some alcoholics (Ando et al. 2008). It has also been demonstrated that alcohol does not cause changes in mean center of foot pressure or frequency (Noda et al. 2004) and that it appears to affect more the amount of body sway rather than the displacement or periodicity of the center of foot pressure (Noda et al. 2005). Significant gender differences have also been found in these body sway parameters (Kitabayashi et al. 2004). 
If these sophisticated methods could be applied to the at-risk human populations, and if a more precise understanding of the neural source of the deficits could be identified, then perhaps the mechanisms underlying alcohol-induced body sway could be further elucidated in a well-chosen rodent assay.
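The harness method described above collapses a multidimensional balance phenomenon into a single number by summing the anterior/posterior and lateral sway components, with the alcohol effect taken as the change from the pre-drink baseline. A minimal sketch of that arithmetic follows; the function names and numeric values are illustrative assumptions, not the published scoring procedure:

```python
# Illustrative sketch of a body-sway LR index. The scoring here is a
# simplification: sum the anterior/posterior and lateral sway components,
# then take the post-drink change relative to baseline. Names and numbers
# are hypothetical, not the published protocol.

def sway_index(ap_sway: float, lateral_sway: float) -> float:
    """Total sway: sum of anterior/posterior and lateral components."""
    return ap_sway + lateral_sway

def alcohol_induced_change(baseline: tuple, post_drink: tuple) -> float:
    """Change in total sway from the pre-drink baseline to the post-drink test."""
    return sway_index(*post_drink) - sway_index(*baseline)

# A participant whose sway barely increases after the alcohol challenge falls
# toward the "low response" end of the distribution.
change = alcohol_induced_change(baseline=(2.0, 1.5), post_drink=(2.4, 1.7))
print(round(change, 2))
```

The point of the sketch is only that the index folds several distinct biomechanical and sensory contributions into one scalar, which is part of why mapping it onto a specific rodent assay is so difficult.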
There are numerous simple rodent behavioral assays that appear to resemble “ataxia” in the broadest sense, and those who use some of them frequently cite “balance” as the construct targeted. Some work has been done that attempts to map specific behaviors (e.g., wobbling gait) to specific neural circuits. Many of these behaviors show deficits after intoxication with an administered alcohol dose. However, very little work has specifically addressed the intoxicating effects of alcohol in rodents at a mechanistic level. In all cases examined, these behaviors reflect genetic differences in sensitivity to alcohol. However, when 8 inbred mouse strains were screened for sensitivity to alcohol-induced intoxication using 18 variables derived from 11 separate behavioral assays, the pattern of results suggested that alcohol sensitivity in most tasks was influenced by task-specific sets of genes (Crabbe et al. 2005). That is, the genetic correlations across tasks were low—strains that were very intoxicated on one measure were not necessarily very intoxicated on another. The tasks were diverse, and included activity stimulation and wobbling gait. Other tasks included the rotarod, where mice are placed on a horizontally-oriented dowel which is then rotated at increasing speeds until they fall, and the balance beam, where intoxicated mice show a characteristic foot slip. All such tasks involve many neural systems, including but not limited to vision, proprioceptive feedback, gait patterning, and balance, but also motivation and learning. The pattern of results suggested that these assays do not target a single, monolithic functional domain (e.g., “balance” or “ataxia”).
Thus, in mice (and probably in rats), we have no clear idea which assay(s) we should employ to model human LR body sway. Three types of rodent behavioral assessments after an alcohol challenge have been suggested in various publications to model the LR phenotype in humans—locomotor stimulation (discussed later along with heart rate activation and EEG effects); “ataxia” (using many different assays, some of which were just discussed); and loss of righting reflex (LORR). When mice or rats are given a high dose of alcohol and are placed on their backs a few seconds later, they are unable to roll over onto their stomachs; this act of rolling over is called the righting reflex (RR). The duration of the LORR (or, more accurately, the brain alcohol concentration at which the RR is lost) differs substantially across genotypes (McClearn, Kakihana 1981;Browman, Crabbe 2000;Draski, Deitrich 1996). We correlated the published inbred strain withdrawal severity and preference drinking data with the aforementioned 18 indices of strain sensitivity to alcohol-stimulated (motor) activity, alcohol intoxication on the “ataxia” measures, and alcohol-induced LORR [the latter collated in (Crabbe et al. 2005)]. In addition, we explored other published data from multi-strain comparisons not included in the above publication, as well as unpublished data sets of inbred strain means.
For preference drinking, there were comparable data for 6 inbred strains. For 16 measures, there was little to no evidence of correlation across strains (r < 0.37). For only one measure was there the hint of a relationship: preference drinking was negatively correlated with improvement of accelerating rotarod performance after a low dose of alcohol (r = −.69, p = .13). For withdrawal severity, with data from 7 inbred strains, only the acute hypothermic response was related (r = −.70, p = .08). This latter negative relationship, however, is not consistent with earlier reports that WSP and WSR mice do not differ in sensitivity to alcohol on a number of the same measures assessed in the inbred strain panel, including acute hypothermia (Metten, Crabbe 1996). These correlations did not reach statistical significance, which is not surprising given the few data points available, and would need to be tested in a larger number of strains. But, overall, they do not help in the selection of one measure of intoxication in mice over another for future studies. Neither were any of the measures of sedative response strongly associated with the severity of withdrawal. Finally, using the BXD RI set of data and other published reports [e.g. (Phillips et al. 1996)], we also failed to find strong relationships between either drinking or withdrawal and measures of sedative sensitivity.
It has been postulated that the development of tolerance to alcohol-induced effects is positively associated with a propensity to consume high levels of alcohol [e.g. (Koob et al. 1998;Le, Mayer 1996)]. We found some support for this hypothesis in the animal literature, as it relates to motor impairment. High-drinking AA (Nikander, Pekkanen 1977;Le, Kiianmaa 1988;Rusi et al. 1977), HAD-1 (Suwaki et al. 2001), P (Rodd et al. 2004;Bell et al. 2001), and UChB (Tampier, Mardones 1999) rats have lower sensitivity and/or develop tolerance more quickly to the motor-impairing effects of alcohol than their low-drinking ANA, LAD-1, NP, and UChA counterparts. Contrary to this hypothesized association are the findings that HAD-2 rats do not differ from LAD-2 rats in this regard (Suwaki et al. 2001) and that HARF rats display greater alcohol-induced motor impairment than LARF rats (Shram et al. 2004). In addition, findings with the selected HAP and LAP mouse lines did not support the hypothesis, with the observation that these lines did not differ in initial sensitivity or the development of tolerance to alcohol-induced motor ataxia (Grahame et al. 2000). Nor was tolerance in the grid test of alcohol-induced motor impairment correlated with preference or drinking across the extensive set of BXD RI strains (Phillips et al. 1996).
In discussions appearing in the literature, one rodent assay appears frequently and is often asserted to model human LR—the duration of the LORR after an anesthetic dose of alcohol. Why this would be considered a reasonable surrogate for human body sway is puzzling. While there are strong genetic contributions to LORR, and mouse and rat lines have been bred to differ markedly in LORR to high-dose alcohol (McClearn, Kakihana 1981;Deitrich 1993), the available evidence shows that the genes influencing LORR sensitivity in rodents do not overlap substantially with those affecting other measures of physical intoxication across inbred strains (Crabbe et al. 2005) or in the BXD RI series (Browman, Crabbe 2000). Nor did LORR appear to be genetically correlated with alcohol drinking or withdrawal in rodents such as the WSP and WSR lines (Crabbe, Kosobud 1986).
It is not known which of the many potential neural mechanisms in humans lead to low vs. high LR in body sway. If the human phenotypes were explored in more mechanistic detail, they could, depending on the results, be modeled rather directly in rodents. Here is an area where we see promise for greater consilience, as the animal resources are considerable.
Other measures have been proposed to predict alcoholism risk. Possibly, the acute locomotor stimulation seen after low doses of alcohol, particularly in mice, may serve as a model of some of the activating effects in humans. However, correlations between acute stimulation in standard inbred mouse strains (Crabbe et al. 2003;Cunningham 1995) or in the BXD RI strains (Phillips et al. 1996) and drinking are not robust. Neither are the correlations substantial with withdrawal severity in either data set (Metten, Crabbe 2005;Crabbe 1998). Furthermore, WSP and WSR mice do not differ in acute stimulation after alcohol (Crabbe et al. 1988), nor do HAP and LAP mice (Grahame et al. 2000).
Unlike mice, heterogeneous stock rats generally do not show a robust stimulant response to alcohol, but a survey of the data from selectively bred high versus low alcohol-consuming rat lines revealed a consistent positive association between alcohol-induced motor stimulation and a propensity for high alcohol intake. In an early study, it was reported that adult P rats displayed alcohol-induced stimulation, whereas NP rats did not (Waller et al. 1986). A recent comprehensive study in both male and female adolescent P versus NP, HAD-1 versus LAD-1, and HAD-2 versus LAD-2 rats (Rodd et al. 2004) reported that all of the high alcohol-consuming lines displayed alcohol-induced stimulation at lower doses (0.25 to 0.75 g/kg), whereas the low alcohol-consuming lines did not. In addition, using an operant protocol, Krimmer and Schechter (Krimmer, Schechter 1991) reported alcohol-induced motor stimulation in HAD-1, but not LAD-1, rats. The University of Chile lines also show the above differences, such that UChB rats displayed alcohol-induced motor stimulation at lower doses (0.25 and 0.50 g/kg), whereas UChA rats were unaffected (Quintanilla 1999). Similarly, sP rats display low-dose alcohol-induced locomotor activation, whereas sNP rats do not (Agabio et al. 2001). While direct comparisons with low alcohol-consuming rats cannot be made with alcohol self-administration procedures, it is noteworthy that AA (Paivarinta, Korpi 1993), P (Bell et al. 2002;Melendez et al. 2002), and sP (Colombo et al. 1998) rats display alcohol-induced locomotor activation during self-administration. Finally, HAP and LAP mice do not differ in sensitivity to alcohol-induced locomotor activation, but HAP mice display dose-dependent sensitization whereas LAP mice do not (Grahame et al. 2000).
Another suggested measure is sensitivity to the effect of alcohol to stimulate heart rate. This would parallel the behavioral measure of locomotor stimulation, but on a physiological level. Support for heart rate stimulation as a measure of reinforcement/reward comes from the findings that (a) contingent and noncontingent electrical stimulation of the ventral tegmental area stimulates heart rate in rats (Burgess et al. 1993), (b) increases in heart rate paralleled performance of a conditioned response reinforced by petting in dogs (Kostarczyk, Fonberg 1982), and (c) in humans, monetary and positive feedback increase heart rate (Fowles et al. 1982), with a positive association between level of incentive and magnitude of the heart rate increase (Tranel et al. 1982). This latter effect is important given the changes in perceived magnitude of reward and the measurement of drug reinforcement/reward discussed by Stephens et al. in this issue (Stephens et al. 2010).
Alcohol reinforcement is directly associated with heart rate/autonomic arousal, such that (a) alcohol consumption increases heart rate in nonalcoholic normotensive human subjects (Grassi et al. 1989;Higgins et al. 1993;Ireland et al. 1984;Iwase et al. 1995); (b) increased heart rate is associated with self-reports of the stimulating effects of alcohol using the BAES (Brunelle et al. 2007); similarly, (c) in FHP individuals, increased heart rate during alcohol consumption is positively associated with the hedonic properties of alcohol (Assaad et al. 2003); and (d) FHP individuals experiencing the greatest increases in heart rate during alcohol consumption scored highest on measures of impulsivity and sensation-seeking (Brunelle et al. 2004), with these personality traits being strong predictors of alcohol abuse [e.g. (Andrucci et al. 1989)].
Regarding craving and “cue-reactivity,” contextual cues associated with drug and alcohol self-administration and “mental representation of these cues” can elicit heart rate/autonomic arousal (cue-reactivity) in humans (Childress et al. 1993;O’Brien et al. 1992;Rajan et al. 1998). “Cue-reactivity” describes an organism’s (primarily clinical populations) reactivity, often manifested as “craving” and generally autonomic in nature, to stimuli associated with drug or alcohol consumption [c.f. (Newlin 1992;Rohsenow et al. 1990;Vogel-Sprott 1995)]. Additionally, alcoholics display greater autonomic arousal than controls in the presence of these cues (Drummond et al. 1990). In another study, when subjects were told that a beverage contained alcohol, alcoholics displayed increases in heart rate, and their level of dependence determined the duration of this heart rate increase (Stormark et al. 1998). However, with an alcohol placebo procedure, decreases in heart rate have also been reported [c.f. (Newlin 1985)]. In addition, Native Americans at high risk for alcoholism were not found to have increases in heart rate following an alcohol challenge (Garcia-Andrade et al. 1997) and, not unexpectedly, increased heart rate following an alcohol challenge was found to be highly associated with “feeling terrible” in Asian men with the ALDH2*2 allele [c.f. (Wall et al. 1992)]. Overall, it is not clear to what extent these heart rate measures may share genetic risk factors with other measures of intoxication (Conrod et al. 2001;Ray et al. 2006). Nevertheless, heart rate changes may provide a useful model to explore in rodents.
We know of only five rodent studies that have examined changes in heart rate during limited-access alcohol consumption. Two studies in heterogeneous stock rats indicate that, as with the clinical research, alcohol consumption results in heart rate/autonomic stimulation (El-Mas, Abdel-Rahman 2007;Ristuccia, Spear 2008). Three studies with P rats found this as well (Bell et al. 2002;Bell et al. 2008;Bell et al. 2007). Moreover, all of the studies using P rats reported that the increases in heart rate could be conditioned to the test environment, with alcohol-consuming P rats displaying increased heart rate during the 90 min pre-test session, when only water was available. The earliest study (Bell et al. 2002) reported that whereas saccharin self-administration resulted in a short-term (~15 min) increase in heart rate, alcohol self-administration resulted in a much longer increase in heart rate (the entire length of the 90 min test session). The subsequent study (Bell et al. 2007) reported that alcohol self-administration during the peri-adolescent window of development (post-natal days 30 through 72) resulted in lower basal heart rate when tested during adulthood. These authors also reported that the heart rate level induced by alcohol self-administration did not differ between peri-adolescent alcohol-experienced and peri-adolescent alcohol-naïve P rats (i.e., given the decreased basal heart rate displayed by the peri-adolescent alcohol-experienced P rats, the magnitude of heart rate increase induced by alcohol consumption was greater in this group compared with the peri-adolescent alcohol-naïve group). The most recent study (Bell et al. 2008) revealed that the heart rate increases induced by alcohol consumption could be conditioned to the test environment not only during the pre-test session, during which only water was available, but also during the usual test session when alcohol was normally available but was withheld from one group of animals.
A number of studies have demonstrated that central electrophysiological responses to alcohol differ in human subjects with low LR to alcohol. EEG alpha activity at baseline has been demonstrated to predict low level of response as indexed by the SHAS, with high alpha predicting a low response [see (Ehlers, Schuckit 1991;Ehlers et al. 2004)]. Both electroencephalogram (EEG) and event-related potential (ERP) responses to alcohol challenge have also been demonstrated to be attenuated in those high-risk individuals who reflect a low level of response to alcohol [see (Ehlers, Schuckit 1991;Ehlers, Schuckit 1990;Ehlers et al. 2004;Ehlers et al. 1999;Ehlers et al. 1998)]. In one study P300 latencies were found to return to normal more quickly in FHP subjects following an alcohol challenge as compared to FHN subjects (Schuckit et al. 1988). On the other hand, increased beta activity following alcohol challenge has also been reported in FHP subjects (Ehlers, Schuckit 1991).
Electrophysiological responses to alcohol have also been examined to a limited extent in lines of rats selectively bred for differences in alcohol preference. P rats were shown to have a lower response to alcohol than NP rats as measured by the N1-ERP (Ehlers et al. 1991). In addition, the Indiana HAD rats have been shown to have lower levels of response to alcohol in several electrophysiological measures (Slawecki et al. 2000).
Together, these human and rodent findings suggest a promising area of research that could provide better consilience between human and animal physiological responses to alcohol. In addition, with the substantial clinical literature examining alcohol-induced heart rate changes in alcoholics versus controls or FHP versus FHN individuals, future work with rodents should include similar comparisons. It should be noted, however, that the locomotor and heart rate changes tend to indicate increased sensitivity in at-risk subjects administered alcohol, and therefore are in the direction opposite to a uniform interpretation of an underlying low LR phenotype. The locomotor activation data are complicated by the puzzling fact that the species (rat) not generally activated shows the relationship with drinking quite clearly, while the data in mice (usually strongly activated) are generally negative. In the area of electrophysiological responses, more work needs to be done to study a larger array of selectively bred rat and mouse lines and other informative genotypes for their electrophysiological responses to alcohol using measures that are equivalent to those studied in humans.
Adolescence appears to be a period during which both humans and animals display lower responses to many of alcohol’s aversive/moderate- to high-dose effects compared with their adult counterparts, which may promote excessive alcohol intake and the development of AD (Spear, Varlinskaya 2005;Ehlers et al. 2006;Spear 2000;Spear 2004;Witt 1994;Witt 2006). Clinically, it appears that adolescent individuals are also less affected by alcohol withdrawal than their adult counterparts (Martin, Winters 1998). Reports in the preclinical literature are much more abundant and specific on these developmental differences. For instance, there is evidence that during early and/or late alcohol withdrawal, adolescent, compared with adult, rats display (a) lower anxiety as measured using the elevated plus-maze (Doremus et al. 2003), (b) attenuated suppression of social interactions [(Varlinskaya, Spear 2004) but see (Wills et al. 2008;Wills et al. 2009)], (c) less distress as measured by ultrasonic vocalizations (Brasser, Spear 2002), and (d) decreased seizure threshold [(Acheson et al. 1999) but see (Wills et al. 2008)]. Similarly, after an acute alcohol challenge, adolescent, compared with adult, rats display (a) shorter duration of the LORR and/or elevated BALs upon recovery from alcohol-induced sedation (York, Chan 1993;Little et al. 1996;Pian et al. 2008;Silveri, Spear 1998); and (b) decreased motor impairment (Silveri, Spear 2001;White et al. 2002a;White et al. 2002b). In addition, the existing literature suggests that adolescent rats develop tolerance to aversive/moderate- to high-dose effects of alcohol more quickly and/or to a greater extent than that observed in adult rats. These developmental differences for alcohol’s effects are seen when assessing (a) social interaction (Varlinskaya, Spear 2006), (b) LORR (Silveri, Spear 2004;Pian et al. 2008), (c) hypothermia (Swartzwelder et al. 1998), (d) motor impairment (Cook et al. 2008), and (e) electrophysiological effects (Pian et al. 2008). It is noteworthy that developmental differences in alcohol-associated pharmacokinetics cannot fully explain these findings [see (Spear, Varlinskaya 2006) for a cogent discussion and references], although clearly some of the variance is due to ontogenetic differences in alcohol absorption and clearance [c.f. (Walker, Ehlers 2009)].
Whether adolescent LR has any firm correspondence with the low LR link to AD outcomes is unknown. LR as discussed in earlier sections is generally assessed in young adulthood. No prospective, laboratory-based alcohol challenge studies are likely to be done with peri-adolescents for ethical reasons, which limits the opportunity to pursue any links. However, the importance of early self-exposure to alcohol is unquestioned, and this is an area where continued attention should be directed. For example, it may be possible to ascertain useful information based on self-reports or by employing real-time self-recording measures of actual adolescent drinking.
A low LR to alcohol as defined by multiple objective and subjective measures has been successfully established as being a prospective risk factor for developing alcohol-related problems (Pollock 1992;Eng et al. 2005;Erblich, Earleywine 1999;Schuckit et al. 2005a;Schuckit et al. 2005b;Schuckit et al. 2006;Schuckit 2002). Low LR is envisioned as a risk factor for heavy drinking since individuals with low LR must theoretically drink more to obtain the desired effects of alcohol. Heavy drinking behavior, in turn, is hypothesized to promote associations with heavy-drinking peers, alter expectations of alcohol’s effects and contribute to acquired tolerance [see (Schuckit 2009)]. Interestingly, low LR does not seem to alter the course of alcoholism. In one study, while low LR was associated with a high risk for AD it was not related to most aspects of the course of alcohol problems once dependence developed (Schuckit, Smith 2001). This suggests that while low LR may be a risk factor for heavy drinking and subsequently dependence on alcohol it does not appear to be associated with any specific symptoms or subtypes of alcohol dependence.
In order to study the mechanisms underlying low LR it would be important to know whether a single set of mechanisms contributes to all measures of low LR or if different mechanisms underlie the individual measures. Given the range of domains considered in this review, it seems unlikely that the human low LR is a monolithic phenotype. Most studies exploring the genetic basis for low LR as a risk factor in humans have employed individual response variables, differentiating among subjective, body sway, and hormonal variables. Others, however, have created composite “low LR” indices. Several studies have attempted to sort out the genetic and environmental contributions to paths that link family history, LR and eventual diagnosis using statistical approaches including multiple regression, path analysis, or structural equation modeling. These have typically used the composite variables, making it difficult to tell exactly which aspects of LR might be predictive or have different mechanisms (Schuckit, Smith 1996;Schuckit, Smith 2000;Trim et al. 2009). Despite these difficulties, linkage and association studies have found some evidence for genomic localization of risk-promoting alleles for the global low LR phenotype [see, e.g., (Hinckers et al. 2006;Joslyn et al. 2008;Schuckit et al. 1999;Schuckit et al. 2001;Schuckit et al. 2005b)], although each such linkage signal may of course reflect the influence of only some particular aspect of low LR.
Table 2 summarizes the material we have reviewed in this article. We surveyed data from rodents genetically predisposed to drink or not drink, and those predisposed to experience more or less severe withdrawal after dependence on alcohol had been established. In addition to the complexity of the human LR target, the inherent problems of not being able to truly know what motivates animals to drink alcohol or why they elect to drink more or less alcohol make choosing or developing an LR phenotype in animals that closely parallels the human situation difficult. However, it might not be necessary to understand this in order to successfully identify parallel mechanisms that may underlie LR across species. For instance, even if the phenotypes and motives for drinking in humans and rodents are either opaque or appear different to the experimenter, it may be that some physiological mechanisms that underlie LR (variously described) seen in some rodent genotypes and in humans are very similar and that the same sets of genes underlie those physiological mechanisms. If this is true, then identifying a smaller set of LR-related phenotypes in humans and in mouse or rat by which to compare QTL data across species using syntenic mapping could be successful. This was accomplished for a small set of alcohol-related phenotypes [see Ehlers et al., this issue (Ehlers et al. 2010)] and in fact some QTLs were found to be syntenic, even though the exact phenotypes between humans and mice did not necessarily overlap. These studies demonstrate how this technique might be useful in the search for genes underlying alcohol-related phenotypes in multiple species. However, these findings also suggest that establishing exact phenotypic matches in humans and rodents may not be necessary or even optimal for determining whether similar genes influence a range of alcohol-related behaviors across species.
We nonetheless advocate the pursuit of more consilient phenotypes as an additional important path for future studies, as such phenotypes would lead to better understanding of the more complex “LR” phenotypes in all species, and should help to clarify their genetic complexity.
Thanks to Deb Finn and Nancy Badia-Elder for sharing unpublished data, and to Fay Horak for helpful comments. Thanks to Pam Metten and Andy Cameron for correlational analyses of mouse strain data. These studies were supported by a grant from the Department of Veterans Affairs (JCC), and by NIAAA Grants AA10760 (JCC), AA13519 (JCC), AA13522 (RLB), AA006059 (CLE), AA010201 (CLE), AA06420 (CLE), AA014339 (CLE) and U54 RR0250204 (CLE).