Traumatic brain injury (TBI) represents a significant cause of death and disability in industrialized countries. Of particular importance to patients is the chronic effect that TBI has on cognitive function. Therapeutic strategies have been difficult to evaluate because of the complexity of injuries and variety of patient presentations within a TBI population. However, pharmacotherapies targeting dopamine (DA) have consistently shown benefits in attention, behavioral outcome, executive function, and memory. Still, it remains unclear what aspect of TBI pathology is targeted by DA therapies and what time-course of treatment is most beneficial for patient outcomes. Fortunately, ongoing research in animal models has begun to elucidate the pathophysiology of DA alterations after TBI. The purpose of this review is to discuss clinical and experimental research examining DAergic therapies after TBI, which will in turn elucidate the importance of DA for cognitive function/dysfunction after TBI as well as highlight the areas that require further study.
Traumatic brain injury (TBI) is the leading cause of death and disability in individuals less than 45 years of age in industrialized countries (Bruns and Hauser, 2003). Each year, an estimated 1.4 million Americans experience a TBI, and 80,000 to 90,000 suffer long-term substantial loss of function (Rutland-Brown et al., 2006). Clinical studies have shown that 10–15% of individuals with mild TBI have persistent cognitive and behavioral complaints. Outcomes from moderate TBI are much less favorable, with some estimates suggesting that 50% of these individuals endure long-term injury-related disabilities (Kraus et al., 2005). This places an enormous economic burden on the U.S. healthcare system, with an estimated cost of $9–10 billion in acute care and rehabilitation annually. This cost is in addition to lost earnings, social services, and the cost to family members who must care for TBI survivors. TBI also represents a global healthcare crisis, with an estimated 2% of the world's population suffering from chronic symptoms of brain trauma, equating to more than 120 million individuals (NIH, 1998; Ragnarsson, 2002). For these reasons, it has been a long-sought goal of TBI researchers to understand the mechanisms of chronic disability after TBI to help develop treatment strategies that may assist patients with cognitive recovery.
However, researching chronic disability following TBI has posed a unique challenge to both clinical and experimental researchers. TBI is a highly variable and extremely complex phenomenon. Following the acute primary injury, which often consists of a focal contusion and more diffuse structural damage, there are a series of subsequent secondary responses, which include, but are not limited to, excitotoxicity, ischemia, oxidative stress, and ongoing structural and chemical alterations (Kochanek, 1993; DeKosky et al., 1998; Park et al., 2008). Traditionally, research in recovery of function after TBI has focused on preventing or manipulating early events in order to prevent chronic dysfunction. Drugs inhibiting apoptosis, blocking glutamate-induced excitotoxicity, or attenuating oxidative stress were designed to reduce cell loss with the premise that neuronal sparing would enhance recovery (Faden et al., 1989; Jennings et al., 2008). Unfortunately, the neuroprotective effects observed in TBI laboratories have not translated successfully to the clinic (Gualtieri, 1988; Tolias and Bullock, 2004). In contrast, therapeutics used during the rehabilitative phase have shown more promise in addressing long-term disability, although they do not necessarily demonstrate the same level of neuroprotection as drugs designed to inhibit apoptosis or block excitotoxicity (Gualtieri, 1988; Rees et al., 2007).
The failure of translating experimental preventative strategies to clinical efficacy has raised the question about what events in TBI are crucial for long-term outcome. The development of clinically relevant small animal models has greatly assisted the understanding of both acute and chronic TBI-induced alterations in brain chemistry. The two most widely used models of TBI are fluid percussion (FP; Dixon et al., 1987) and controlled cortical impact (CCI; Lighthall, 1988; Lighthall et al., 1989; Dixon et al., 1991; Kline et al., 2001; Kline and Dixon, 2001). Both models produce clinically relevant brain pathology as well as behavioral and cognitive dysfunction in rats and mice (Dixon et al., 1987, 1991; Hamm et al., 1992, 1996a,b; Fox et al., 1998, 1999; Kline et al., 2002, 2007a,b; Wagner et al., 2002, 2004; Cheng et al., 2007, 2008; Hoffman et al., 2008a,b). Animal studies (Bramlett and Dietrich, 2002; Lifshitz et al., 2007) and human positron emission tomography (PET) imaging (Langfitt et al., 1986; Fontaine et al., 1999; Donnemiller et al., 2000) have shown that in addition to overt damage (e.g., cortical lesions and hippocampal cell loss), there exist areas of chronic dysfunction previously unappreciated, particularly in the striatum and thalamus, which are regions known to have important roles in cognitive, motor, and emotional processing (Vertes, 2006).
The aims of this review are to: (1) highlight the role of dopamine (DA) in cognition and its functional anatomy relevant to TBI; (2) outline clinical research that has demonstrated potential efficacy of DAergic medications in the treatment of TBI; (3) provide an overview of observed changes in DA signaling and anatomy in experimental models, and outline the importance that these alterations have on cognitive and behavioral deficits; and (4) assess future areas of DA systems research in TBI. This review is not meant to be an exhaustive discussion of DA and cognition, but rather is intended to highlight the role(s) DA plays in persistent cognitive dysfunction after TBI, which may provide insight into potential mechanisms and therapeutic targets for chronic cognitive dysfunction. For recent reviews on DA cellular function and DAergic mediated cognition, see Verheij and Cools (2008) and El-Ghundi et al. (2007), respectively.
DA represents a unique signaling system within the central nervous system (CNS) due to its role as both a neurotransmitter and neuromodulator. Furthermore, DA receptors are abundantly expressed in brain areas known to be damaged after TBI, such as the frontal cortex and striatum, which are important for cognitive function (Seeman et al., 1978; Baron et al., 1985; McDonald et al., 2002; Chudasama and Robbins, 2006). The hippocampus, which is also critical for cognitive function, does not have a high level of DA receptor expression, but is dependent on DA activity to modulate function (Lemon and Manahan-Vaughan, 2006; O'Carroll et al., 2006; Granado et al., 2008).
Cognitive disorders experienced by TBI patients can present immediately after the initial injury or evolve during the subsequent months to years. Regardless of presentation, many patients live with sustained alterations in cognition and behavior for the rest of their lives (Millis et al., 2001). Non-pharmacological options for TBI patients experiencing cognitive and behavioral dysfunction are limited, with cognitive training paradigms often being the only consistent treatment provided. However, cognitive training effectiveness has not been fully validated, and program implementation can be variable (Turner-Stokes et al., 2005).
Persistent cognitive deficits can be categorized into one of three general domains: attention and processing speed, memory, and executive function (Gronwall, 1976; Levin and Grossman, 1978; Gronwall and Wrightson, 1981; Stuss et al., 1985; Binder, 1986, 1987; McMillan and Glucksman, 1987; Levin et al., 1988a,b; Gentilini et al., 1989; Stuss et al., 1989; Leininger et al., 1990; Binder, 1997; Binder et al., 1997; McMillan, 1997). Of these, memory difficulties are the most commonly reported and most difficult for patients and caregivers (Binder, 1987). In addition to cognitive difficulties, TBI survivors often experience behavioral difficulties characterized by enhanced emotional lability and alterations in affect (Fugate et al., 1997). Importantly, a prior TBI has been shown to be a risk factor for developing psychiatric and psychotic disorders (Koponen et al., 2002). Due to its often diffuse nature and the variety of cognitive disturbances experienced post-TBI, it has been difficult to localize a single disruption in neural function that could explain such an array of events. Indeed, persistent deficits experienced by TBI patients are probably due to a wide spectrum of different neural system dysfunctions. However, that does not preclude the possibility of utilizing a targeted therapeutic strategy to enhance cognitive recovery after TBI. DA is a particularly important system to study in this context because it is known to have an important role in physiological events relevant to cognition and in numerous systems also affected by TBI, including the hippocampus, striatum, and frontal cortex.
Fig. 1 is a depiction of the rat CNS demonstrating an overlay of injury processes that occur with a TBI and their direct relationship to DAergic pathways important to cognition. As illustrated, TBI can have widespread effects on brain anatomy and function within DAergic regions. What is not depicted in Fig. 1 is the indirect effect of TBI on DA signaling, including disruptions in glutamatergic cortico-striatal projections and striatal gamma-aminobutyric acid (GABA)ergic outputs.
Ascending DAergic pathways in the CNS can be divided into two predominant systems: (1) the nigrostriatal pathway [substantia nigra (SN) innervating striatum], and (2) the mesocorticolimbic pathway [ventral tegmental area (VTA) projecting to the prefrontal cortex (PFC), hippocampus, amygdala, and nucleus accumbens (NAcc)] (Alexander and Crutcher, 1990; Graybiel, 1990). Projections from the mesocorticolimbic pathway are believed to be involved in modulating memory consolidation (Ploeger et al., 1991, 1994; Cools et al., 1993; Setlow and McGaugh, 1998; Coccurello et al., 2000), motivation (Mitchell and Gratton, 1994; Salamone, 1994; Baldo and Kelley, 2007), and drug reinforcement and addiction (Carelli, 2002; Schultz, 2004; Salamone et al., 2005; Berridge, 2006; Di Chiara and Bassareo, 2007; Ikemoto, 2007; Sutton and Beninger, 1999). Change in the release of DA in the mesocorticolimbic system is also associated with neuropsychiatric disorders, arousal, stress, and addiction (Tidey and Miczek, 1996; Viggiano et al., 2003; Sonuga-Barke, 2005). The nigrostriatal system is predominantly associated with voluntary movement (Hornykiewicz, 1966; Seeman and Niznik, 1990; Jackson and Westlind-Danielsson, 1994), but it has also been shown to be important for behavioral events including reward processing (Wickens et al., 2007) and acquisition of spatial learning and memory (Mura and Feldon, 2003). Research investigating Parkinson's disease (PD) has also shown that dysfunctional nigrostriatal signaling has implications for other cognitive functions including memory, executive function, and attention (Tamaru, 1997; Ridley et al., 2006).
The striatum, which includes the NAcc and caudate putamen, exists as part of an anatomic network that subserves functions associated with the dorsolateral PFC (DLPFC), but also receives inputs from numerous other brain areas including the hippocampus and limbic cortex. The DLPFC has dense projections to the head of the caudate and there exist reciprocal pathways back to the DLPFC through the thalamus (Middleton and Strick, 2000). Due to this complex relationship with surrounding cortical and subcortical structures through DAergic projections, the striatum is in a prime location for mediating human cognition. Studies have demonstrated that both the striatum and DLPFC are important for executive function and working memory (WM) (Crosson, 2003).
In Huntington's disease (HD), projections from the caudate to the frontal lobes are disrupted and result in significant motor, attentional, and executive dysfunction (Brandt et al., 1988; Zakzanis and Kaplan, 1999). Experiments producing lesions in the striatum suggest that the caudate, in particular, plays a specific role in cognition. Damage to the caudate produces deficits that resemble damage to corresponding projection targets of the PFC (Divac et al., 1967). In nonhuman primates, metabolic activity within the striatum has also been linked to specific changes in WM task performance (Levy et al., 1997). In a recent review, Grahn et al. (2009) examine the role of the basal ganglia in learning and memory, concluding that the goal-directed behaviors subserved by basal ganglia function are crucial to all forms of normal behavior.
Human neuroimaging studies also support the role of the striatum in cognition. PET studies using [18F]dopa in individuals with PD show a correlation between DA depletion and neuropsychological performance (Broussolle et al., 1999; Marie et al., 1999; Bruck et al., 2001; Duchesne et al., 2002). PET imaging studies in HD using [11C]raclopride indicate that striatal DA receptor subtype 2 (D2) binding is decreased and is sensitive to cognitive performance on a variety of tasks including executive function, attention, and WM (Backman et al., 1997; Lawrence et al., 1998). While striatal dysfunction is associated with cognitive sequelae of HD and PD, DAergic system dysfunction within the PFC has been strongly tied to attentional and cognitive symptoms associated with schizophrenia and attention deficit hyperactivity disorder (ADHD; Heilman et al., 1991; Tassin, 1992; Knable et al., 1997; Tanaka, 2006).
These clinical studies strongly suggest that the striatum and PFC are functionally important for a variety of cognitive behaviors, and that alterations in DA signaling appear to be the underlying cause of cognitive dysfunction in a variety of disease states. In TBI, it is known that both the striatum and PFC are vulnerable to damage. The known effects of TBI on the striatum include axonal degeneration (Ding et al., 2001), neuronal cell loss (Dunn-Meynell and Levin, 1997), and ischemia (Dietrich et al., 1994). Effects on the PFC include decreased glucose metabolism (Fontaine et al., 1999), changes in frontal lobe blood flow during memory tasks (Ricker et al., 2001), and hypoactivation with memory tasks (Sanchez-Carrion et al., 2008). Furthermore, experimental models of TBI have shown fairly robust effects on hippocampal neurons as demonstrated by significant loss in the CA2 and CA3 regions (Dixon et al., 1987; Hicks et al., 1993; Smith et al., 1994). While the hippocampus is not generally thought of in reference to DA signaling, it does have glutamatergic projections to the striatum that are important to the activity of GABAergic medium spiny neurons. Striatal medium spiny neurons also receive DAergic input from the VTA and SN (Meredith et al., 1990; Pennartz and Kitai, 1991). Furthermore, the hippocampus receives DAergic projections from the SN, and hippocampal DA receptors have been shown to facilitate the maintenance of long-term potentiation (LTP), which is hypothesized to be the physiologic basis for memory formation and consolidation (Li et al., 2003; Lemon and Manahan-Vaughan, 2006).
Following TBI, patients often demonstrate confusion as well as an inability to concentrate. They are also easily distracted, have difficulties performing more than one task at a time, and require increased time to perform tasks (Gentilini et al., 1989; Draper and Ponsford, 2008). While these impairments are broad in nature, they share a common element of attentional processing. TBI patients consistently show impairments on measures of processing speed, including the Symbol Digit Modalities Task and Digit Symbol Coding. Attentional processing is believed to be a widely distributed cognitive function that involves both cortical and subcortical pathways including striatal and thalamic inputs and reticular activation. However, it is generally accepted that DA function plays a significant role in the ability to focus attention (Cohen and Servan-Schreiber, 1992; Wise et al., 1996; Brennan and Arnsten, 2008). In ADHD, DAergic drug treatments have been shown to be effective in treating attentional disorders (Solanto, 1998). This does not rule out other possible neural dysfunctions as being the underlying cause of attentional difficulties after TBI, but it does provide compelling reasons to examine DAergic function in TBI.
Attention processing can be difficult to localize given that there are multiple modalities of attentional function, including auditory, tactile, and visual attention (Arciniegas et al., 2000; Spence and Gallace, 2007; Adair and Barrett, 2008). Furthermore, attention can refer to a wide variety of different cognitive processes that are both voluntary and involuntary and involve several brain systems including the parietal cortices, basal ganglia, PFC, and anterior cingulate cortex (Raz, 2004). In contrast, the anatomy of memory function is comparatively better understood, with specific memory processing events (e.g., retrieval, consolidation) ascribed to specific brain regions (Izquierdo and Medina, 1997; Izquierdo et al., 1997). In particular, damage to the hippocampus has historically been associated with reproducible deficits in spatial and temporal memory processing (Buckley, 2005). In experimental TBI it is known that the hippocampus is exquisitely sensitive to both acute apoptotic events and excitotoxicity (Kotapka et al., 1991; Hicks et al., 1993; Dietrich et al., 1994; Smith et al., 1994). Regionalization of memory function and the reduction of hippocampal memory processing to spatial and temporal memory are simplistic views of memory processing in the CNS, but have helped guide research in the mechanisms of cognitive dysfunction following diffuse injuries such as TBI. However, at least one study (Lyeth et al., 1990) showed spatial memory impairments without clear neuronal loss in the hippocampus. Furthermore, strategies to reduce neuronal cell loss in the hippocampus using N-methyl-d-aspartic acid (NMDA) receptor antagonists and inhibitors of apoptosis have not always demonstrated behavioral improvements that correlate well with the level of neuronal sparing (Tolias and Bullock, 2004).
These findings have challenged the practice of attributing cognitive dysfunction to discrete damage in specific brain regions after TBI and raised a new set of challenges for TBI research to look beyond anatomic damage and into functional studies.
Numerous studies have shown that after TBI there is a dysfunction in hippocampal LTP (Reeves et al., 1995; Sanders et al., 2000). The reasons for this impairment are not entirely clear. Falo et al. (2006, 2008) have suggested that the impairments may be due to changes in synaptic composition and dysfunction in normal molecular processes that influence synaptic plasticity in the hippocampus, including dysfunction in scaffolding proteins. What has not been examined is the role that DA may have in this process. It is known that blockade of D1/D5 receptors in the hippocampus eliminates late LTP and can even affect early LTP, suggesting that synaptic plasticity in the hippocampus depends upon a synergistic interaction between glutamate and DA (Frey et al., 1991; Granado et al., 2008). This raises the question about what other structures and what other events, besides hippocampal neuronal loss, could also be relevant to memory function after TBI.
The PFC and corticostriatal DA signaling system have, in addition to the hippocampus, been shown to be important for memory formation. The aspect of memory generally assigned to the PFC has been that of WM. WM commonly refers to those cognitive processes that provide the capacity to maintain and manipulate a limited amount of information over a brief period of time (Baddeley et al., 1986; Baddeley, 1992). WM incorporates aspects of divided attention, which is vulnerable to disruption after TBI (Stuss et al., 1985; Levin, 1990; Ponsford and Kinsella, 1992; McDowell et al., 1997). For example, a study of patients one year following treatment for severe TBI consistently found that these individuals were significantly impaired on the Paced Auditory Serial Addition Test, a demanding task that recruits WM (Levin, 1990). WM deficits can be particularly disabling given the critical role of WM in overall intellectual functioning (Smith et al., 1996). WM processes are, arguably, an encoding process for long-term memory (Johnson, 1992). WM is also important for a wide variety of cognitive skills, such as problem solving, planning, and active listening (Jonides, 1995). Individuals with significant WM deficits have great difficulty recording features from a changing environment and keeping them in mind in order to guide behavior (Smith et al., 1996). As such, WM is a primary and critical component of all aspects of cognition, and impairment in this cognitive domain can be particularly disruptive for everyday functioning.
TBI also causes an impairment in executive control or executive functioning, a critical aspect of cognition (Hanks et al., 1999; McDonald et al., 2002). Although individuals and clinicians often report or emphasize “memory” as being a primary functional concern, executive control dysfunction might actually be the most disabling aspect of cognitive compromise after brain injury (Mateer, 1999; Millis et al., 2001). It is known that impairments in executive control can compromise other aspects of cognition, such as memory for verbal information (Tremont et al., 2000) and visual information (Ricker et al., 1994; Lange et al., 2000; Ricker et al., 2001).
Although not as well studied as cognitive deficits after TBI, it is widely recognized that TBI patients experience alterations in emotional control and general behavior (Oddy et al., 1985; Arciniegas et al., 2000). Dyer et al. (2006) reported that of three groups, TBI, spinal cord injury, and non-injured, the TBI group was more likely to be rated worse in areas of impulsivity and verbal aggressiveness. Patients with TBI are also known to have increased rates of depression (Seel et al., 2003; Jorge et al., 2004; Moldover et al., 2004). The mesocorticolimbic DA signaling pathway has been implicated in emotional and behavioral disturbances (Mega and Cummings, 1994). In fact, in schizophrenia, many of the observed affective mood disorders are proposed to be due to changes in DA signaling within the mesocorticolimbic signaling pathway (Abi-Dargham and Moore, 2003).
FP and CCI rodent models of TBI have been used extensively to produce deficits reminiscent of those seen clinically. Both models produce spatial learning and WM deficits in adult rats (Lyeth et al., 1990; Smith et al., 1991; Hamm et al., 1992, 1993, 1996a,b; Hicks et al., 1993; Colicos et al., 1996; Dixon et al., 1996, 1997; Scheff et al., 1997; Kline et al., 2000, 2002, 2004, 2007a,b, 2008; Wagner et al., 2002) and mice (Smith et al., 1994; Fox et al., 1998; Whalen et al., 1999) as tested in the Morris water maze (MWM). One concern when utilizing the MWM paradigm of memory testing is that this task represents a particularly stressful environment to animals. Furthermore, the MWM task is both a learning and memory task, and it is possible that dysfunction in MWM performance is, in part, due to defects in learning and coping strategies associated with damage in thalamic structures (Markowitsch, 1982; Aggleton and Brown, 1999; Van der Werf et al., 2003). Importantly, experimental TBI has also been shown to cause dysfunction in less stressful paradigms including open field exploration and radial arm maze tasks (Lyeth et al., 1990; Soblosky et al., 1996; Lindner et al., 1998; Enomoto et al., 2005; Wagner et al., 2007b). While open field exploration tasks typically assess mobility and anxiety, changes in the exploration of novel environments are considered an important indicator of learning and memory function in rats if other factors (anxiety, olfactory sensitivity, etc.) are properly controlled for (File, 2001; Christoffersen et al., 2008). Interestingly, deficits post-TBI have also been observed in passive-avoidance tasks, which are typically considered to be a hippocampal independent measure of learning (Hamm et al., 1994; Hogg et al., 1998; Milman et al., 2005). Due to its versatility and consistent results, the MWM task remains a useful measure of memory dysfunction following experimental TBI. However, it is important to consider its limitations and the contribution of other cognitive processes to MWM deficits.
Analysis of the affected brain regions in animal models coupled with clinical studies have helped identify the brain areas thought to be responsible for cognitive processes commonly affected by TBI. The PFC, hippocampus, striatum, and limbic structures have all been shown to be sensitive to damage after TBI (Dixon et al., 1987; Lighthall, 1988; Lighthall et al., 1989). The degree of damage to these structures depends in large part on the localization and severity of TBI. Mild TBI models may only show diffuse white matter damage, cortical cell loss, and some hippocampal cell loss (Hicks et al., 1993; Sanders et al., 2001), while more moderate to severe injuries show greater degrees of both cortical and subcortical neuronal death and damage to structures beyond the contusion site including ischemic alterations in subcortical structures (Dietrich et al., 1994; Hellmich et al., 2005). There are also differences in structural damage depending upon the location of the insult (Lighthall et al., 1989; Thompson et al., 2005).
Animal studies of spatial learning and WM, coupled with clinical research examining executive function, attention, and behavior have consistently demonstrated prolonged cognitive dysfunction after TBI. Due to its importance to WM, executive function, behavior and emotion, and psychosis, DA represents a promising avenue of research in TBI therapy. TBI also shares some similarities to PD and other DA disorders. Similar to PD, TBI patients can experience memory impairment, bradykinetic motor dysfunctions, and decreases in cognitive processing speed. TBI patients also exhibit attention difficulties similar to ADHD, such as easy distractibility. Unlike pure disorders in DA function, TBI has its own unique set of concerns including inflammation, white matter damage, disruption of other neurotransmitter systems, and a unique series of temporal events in DA effects that are unlike other insults. For these reasons, understanding the DAergic signaling system and its relation to brain function is an important step in the proper utilization of DA targeted treatment strategies.
In the brain, DAergic neurons arise from the VTA and SN and project to the striatum, cortex, limbic system, and hypothalamus (Graybiel, 1990). DA influences a number of physiologic functions including hormone secretion, movement control, motivation, emotion, and cognitive processing (Jackson and Westlind-Danielsson, 1994; Floresco and Magyar, 2006).
The unique effects of DA at each of its terminal sites are mediated by membrane receptors belonging to the large family of seven transmembrane domain (7TM) G-protein coupled receptors. Activation of DA receptors leads to alterations in intracellular second messengers either through formation or inhibition. Cloning experiments have identified five different DA receptors that are divided into two groups based upon their structural and pharmacological properties: D1-like and D2-like receptors (Bunzow et al., 1988; Dal Toso et al., 1989; Dearry et al., 1990; Zhou et al., 1990). Each of these receptor families has unique differences in regional CNS expression and unique intracellular signaling pathways that will be discussed briefly in the context of cognition. For a more complete review of the complex biochemical signaling pathways related to each receptor and the information gained from knockout mouse studies (D1–D5) see Gonon et al. (2000) and Tan et al. (2003).
In addition to different subsets of receptor populations throughout regions of the brain, DA effects are regulated through a complex control of DA release, re-uptake, and metabolism. Changes in a number of DA constituents, including the DA transporter (DAT) and tyrosine hydroxylase (TH), can effectively alter DA extracellular concentrations and its physiologic effect without directly affecting binding or receptor response.
DA is synthesized first by the hydroxylation of the amino acid l-tyrosine to 3,4-dihydroxy-l-phenylalanine via the enzyme TH and is then decarboxylated by l-amino acid decarboxylase to DA. DA can be inactivated either by reuptake via the DAT and subsequent enzymatic breakdown by catechol-O-methyl transferase and monoamine oxidase or repackaged into vesicles for reuse (for further clarification of these systems and the role they may play in TBI injury see Section 4).
The DAT is particularly important to presynaptic DA regulation, and differences in DAT expression due to genetic polymorphisms have been implicated in a number of diseases including bipolar disorder and ADHD (Greenwood et al., 2001; Thapar et al., 2005; Swanson et al., 2007). Furthermore, DAT inhibition is the mechanism of action for a number of pharmaceuticals designed to enhance DA neurotransmission as well as drugs of abuse, such as cocaine (Hitri et al., 1994; Madras et al., 1994).
The DAT is a Na+/Cl–-dependent plasma membrane neurotransmitter transporter containing twelve transmembrane domains with both the amino and the carboxyl termini located on the intracellular side of the membrane (Hersch et al., 1997). DAT terminates the action of vesicular DA release at the synapse via reuptake of extracellular DA (Torres et al., 2003) and acts as a reverse transporter of DA under basal conditions (Borland and Michael, 2004). Regional distribution of DAT has been found in areas of the brain with established DAergic circuitry including mesostriatal, mesolimbic, and mesocortical pathways (Ciliax et al., 1999). The rate at which DAT removes DA from the synapse has a profound effect on the amount of DA in the cell. This is best evidenced by the severe cognitive deficits, motor abnormalities, and hyperactivity seen in DAT knockout mice (Perona et al., 2008). DAT activity and expression have also been shown to change during normal aging (Bannon et al., 1992) and to differ between males and females (Piccini, 2003).
There are many ways in which the DAT may be chronically regulated, including gene transcription, post-translational modifications, oligomerization, and trafficking (Doolen and Zahniser, 2001). Second messenger systems, in particular protein kinase C (PKC) activation, affect transporter activity (Huff et al., 1997). Activation of PKC via substrate and inhibitor binding appears to regulate DAT by altering its cellular distribution; specifically by altering levels of membrane bound DAT (Daniels and Amara, 1999). In addition, DA itself can regulate DAT via its interaction with the transporter or pre-synaptic autoreceptors (Williams and Galli, 2006). The DAT is also the target of several "DAT-blockers," including amphetamines and methylphenidate (MPD), which inhibit the action of DAT and, to a lesser extent, the other monoamine transporters. In addition, amphetamines trigger a signal cascade thought to involve PKC or mitogen-activated protein kinase (MAPK) that leads to the internalization of DAT molecules, which are normally expressed on the neuron's surface (Kahlig et al., 2004).
The D1-like receptor family includes D1 and D5 receptors. For the purpose of this review, we will focus on the D1 receptor because the D5 receptor is less well studied in TBI. The distribution of the D1 receptor differs depending upon the brain region being examined. Within the frontal cortex, D1 receptors are localized on post-synaptic dendrites of both pyramidal and non-pyramidal neurons. Similar structural localization of D1 receptors can be seen in the hippocampus and limbic cortex (Dearry et al., 1990; Monsma et al., 1990; Zhou et al., 1990). However, within the striatum, D1 receptors have been localized to non-synaptic dendritic spines, suggesting that DA activity within the striatum is unique compared to cortical structures (Hersch et al., 1995; Caillé et al., 1996).
D1 receptor activation is classically associated with the stimulation of adenylate cyclase (AC) and subsequent production of cAMP. However, it has also been demonstrated that D1 receptors are able to activate phosphoinositide hydrolysis (Undie et al., 1994) and inhibit arachidonic acid release (Schinelli et al., 1994). These varied receptor signaling mechanisms demonstrate the wide range of effects that D1 agonism or antagonism can have in the CNS. Activation of phosphoinositide hydrolysis is known to affect intracellular trafficking, cell growth, differentiation, and survival (Undie et al., 1994, 2000; Ming et al., 2006; Liu et al., 2008). Arachidonic acid release is important for inflammatory signaling and neuronal functions such as LTP and synaptic plasticity (Tassoni et al., 2008). While both of these D1 receptor actions are important to DAergic function and have the potential to mediate insults to DA containing neurons affected by TBI, it is D1 receptor mediated control over intracellular cyclic adenosine monophosphate (cAMP) signaling that represents perhaps the most pertinent aspect of D1 signaling to cognition. Manipulation of cAMP levels can affect cAMP response element binding protein (CREB) through both MAPK dependent and MAPK independent pathways (Greengard, 1976; Schulman, 1995; Waltereit and Weller, 2003). In DA containing neurons, there is an intracellular signaling molecule, the DA- and cAMP-regulated phosphoprotein of 32 kDa (DARPP-32), which is known to mediate a number of important cellular signaling events and is intimately involved in cAMP signaling (Ouimet et al., 1984; Walaas and Greengard, 1984; Hemmings and Greengard, 1986; Greengard et al., 1999).
D1 knockout mice demonstrate spatial learning deficits (El-Ghundi et al., 1999), and D1a knockout mice show broad impairments in the initiation of activity and in the learning of cue-related tasks (Smith et al., 1998). However, the dichotomy of impaired versus unimpaired receptors may be a simplistic view of DA interaction at the D1 receptor. A review by Williams and Castner (2006) describes the tight quantitative control exerted by DAergic signaling and its relation to dysfunction. They point out that D1 receptor mediation of PFC function is often dependent upon both the concentration and the temporal sequence of DA release. A series of papers examining DA signaling and calcium regulation describes a paradigm in which the kinetics of CREB phosphorylation differ depending upon the length of incubation with a D1 agonist (Liu and Graybiel, 1996, 1998a,b). Ruskin and Marshall (1997) have shown that D1 receptor mediated induction of Fos in striatal neurons is dependent on D2 activation. Goldman-Rakic (1995) has described at least three possible cellular mechanisms for DAergic control of WM in the PFC: (1) direct synaptic control over pyramidal neuron activity, (2) nonsynaptic interaction with DA receptors located on distal spines of PFC pyramidal neurons, or (3) indirect modulation through GABAergic interneurons. Frey et al. (1991) have demonstrated that blockade of D1 receptors in the hippocampus impairs the maintenance of hippocampal late-phase LTP, which may be another reason DA dysfunction impairs memory. This alteration in hippocampal LTP associated with D1 receptor changes is believed to contribute to the spatial memory deficits observed in D1 receptor knockout mice (Matthies et al., 1997; Granado et al., 2008).
Recent studies in PD (Cooper et al., 1991; Postle et al., 1997) have shown that WM dysfunction is often one of the first cognitive symptoms experienced by patients. This WM deficit is believed to be D1 receptor-mediated because D1 agonist treatment has been shown to alleviate impairments in young and aged non-human primates (Arnsten et al., 1994; Cai and Arnsten, 1997; Castner and Goldman-Rakic, 2004), as well as in other conditions characterized by prefrontal DA loss, such as chronic stress (Mizoguchi et al., 2000), chronic neuroleptic treatment (Castner et al., 2000), and 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP)-induced deficits in delayed response performance (Schneider et al., 1994a,b). How hippocampal DA loss is associated with these processes is unclear, but in non-PD paradigms, the administration of D1 agonists has been shown to facilitate hippocampal LTP (Otmakhova and Lisman, 1996).
The D2-like receptor family comprises D2, D3, and D4 receptors. For the purpose of this review, the focus will be on D2 receptors because D3 and D4 receptors are less well studied in TBI. D2 receptors differ from D1 in that they are associated both with terminal receptor activation and with presynaptic autoreceptor function, and they demonstrate much greater expression in the striatum compared to other brain regions. However, like D1, D2 receptor expression in the striatum and frontal cortex is of particular interest in cognitive research. Interestingly, within the striatum, the pre- and post-synaptic localization of D2 receptors maintains a much closer proximity to actual synapses, as opposed to the more diffuse location of D1 receptors on spines.
D2 receptor activation is more complex than D1 activation for a number of reasons. For example, D2 receptors can act through inhibitory G-proteins that lead to an inhibition of AC and cAMP (Cote et al., 1983; Onali et al., 1985; Weiss et al., 1985) or independently of cAMP pathways (Memo et al., 1986). D2 receptors have also been shown to inhibit phosphoinositide hydrolysis and subsequent Ca2+ mobilization (Vallar and Meldolesi, 1989; Picetti et al., 1997). An interesting caveat of D2 receptor activation is that in vitro studies utilizing co-immunoprecipitation methods and in vivo studies utilizing fluorescence and bioluminescence resonance energy transfer (FRET and BRET) analyses have shown that many D2 receptor systems exist in heterodimerized pairs (Canals et al., 2003; Fuxe et al., 2005). D2 receptors have been shown to form heterodimers with adenosine A2a receptors and metabotropic glutamate-5 (Mglut-5) receptors (Diaz-Cabiale et al., 2002; Fuxe et al., 2003, 2005; Hillion et al., 2002; Ferré et al., 2008). A2a and Mglut-5 heterodimers are of particular interest for TBI research due to their role in striatal LTP and long-term depression (LTD) and in the subsequent plasticity of striatal connections (Ferré et al., 2002; Fuxe et al., 2003). The identification of heterodimer systems within the striatum indicates that D1 and D2 have control over CREB signaling within striatal neurons. Further, they modulate LTP and LTD in concert with the glutamatergic and adenosinergic transmitter systems (Calabresi et al., 2000, 2007; Centonze et al., 1999, 2003).
In PD, the contribution of neurotransmitters beyond DA is important to therapeutic manipulation as utilization of these other receptor systems provides an alternative way to manipulate DAergic signaling (Kase et al., 2000). D2–A2a interactions proposed by Fuxe et al. (2005) allow for precise alterations in intracellular signaling pathways that have been associated with cognitive function and striatal output. Mglut-5 receptors have also been implicated as important for control of DA signaling at the D2 receptor in a similar fashion to the A2a receptor.
DA signaling cascades in the striatum are involved in numerous physiologic functions including synaptic plasticity, movement regulation, and even the modulation of other neurotransmitter systems (e.g., acetylcholine, calcium, glutamate, and GABA). DARPP-32 is a cytoplasmic phosphoprotein found in 95% of the medium spiny neurons in the striatum (Fig. 2), and it plays a central role in nearly all DA mediated events (Greengard et al., 1999; Calabresi et al., 2000; Svenningsson et al., 2005; Valjent et al., 2005). Two distinct phosphorylation sites, threonine-34 (T34) and threonine-75 (T75), make DARPP-32 a bifunctional signal transduction molecule that controls the activities of both protein phosphatase-1 (PP1) and protein kinase A (PKA) (Halpain et al., 1990; Nishi et al., 2002). The activity of both PP1 and PKA tightly regulates gene transcription related to numerous important cellular functions including neurotrophic factor production, regulation of synaptic plasticity, and cell homeostasis.
DA-induced phosphorylation of ionotropic glutamate and GABA receptors is attenuated in DARPP-32 knockout mice, suggesting that DARPP-32 plays an important role in regulating excitatory neurotransmission (Yan et al., 1999; Flores-Hernandez et al., 2000, 2002). Ethanol reinforcement, which is known to act through regulation of NMDA receptors, is also reduced in DARPP-32 knockout models (Risinger et al., 2001; Maldve et al., 2002). Studies assessing the behavioral effects of other drugs of abuse, including cocaine and morphine, have shown that DARPP-32 knockout mice demonstrate lower levels of psychomotor activation following drug administration compared to wild-type mice (Borgkvist et al., 2007). Further evidence using knockout mice indicates that DARPP-32 is necessary for the induction of striatal LTD and LTP, both important processes in models of memory acquisition and consolidation (Calabresi et al., 2007). Furthermore, after hypoxia-ischemia injury, DARPP-32 phosphorylation states were shown to be important to membrane potential, glutamate receptor activity, and oxidative stress (Yang et al., 2007).
DA acts through D1 receptor mediated increases in PKA to promote DARPP-32 phosphorylation at its T34 site, which leads to an inhibition of PP1. Additionally, DA, glutamate, and adenosine act on protein phosphatase 2B (PP2B), also known as calcineurin, and protein phosphatase 2A (PP2A) to decrease phosphorylation at T34 and increase phosphorylation at T75. A reduction in phosphorylation at T34 subsequently removes the inhibitory effect of DARPP-32 on PP1 (Greengard et al., 1999; Nairn et al., 2004; Svenningsson et al., 2004). The regulation of PP1 and PKA by DARPP-32 allows convergent DA, glutamate, and adenosine signaling to alter the phosphorylation state of NMDA receptor subunits, the sodium and potassium adenosine triphosphatase (Na/K ATPase), and members of the extracellular regulated kinase (ERK) pathway (Bertorello et al., 1990; Blank et al., 1997; Fienberg et al., 1998; Snyder et al., 1998; Dudman et al., 2003; Hakansson et al., 2004).
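The bifunctional logic described above can be expressed as a toy decision rule, where D1/PKA tone drives phospho-T34 (inhibiting PP1) and opposing phosphatase-linked signaling drives phospho-T75 (inhibiting PKA). This is a minimal conceptual sketch only; the thresholds, scales, and function names are illustrative assumptions, not a quantitative model of DARPP-32 biochemistry.

```python
# Conceptual sketch (not a quantitative model) of the DARPP-32 "switch":
# phosphorylation at T34 (via D1/PKA) inhibits PP1, while
# dephosphorylation at T34 with phosphorylation at T75 (favored by
# PP2B/PP2A-linked signaling) instead inhibits PKA.
# All inputs, thresholds, and names here are illustrative assumptions.

def darpp32_state(d1_tone: float, pp2b_activity: float) -> dict:
    """Return qualitative PP1/PKA status given relative D1-receptor
    tone and PP2B (calcineurin) activity, both scaled to [0, 1]."""
    # D1 activation -> PKA -> phospho-T34; PP2B opposes T34 phosphorylation.
    p_t34 = max(0.0, d1_tone - pp2b_activity)
    # Phosphatase-linked signaling favors phospho-T75, which inhibits PKA.
    p_t75 = max(0.0, pp2b_activity - d1_tone)
    return {
        "phospho_T34": p_t34,
        "phospho_T75": p_t75,
        "PP1_inhibited": p_t34 > 0.5,   # T34-phosphorylated form inhibits PP1
        "PKA_inhibited": p_t75 > 0.5,   # T75-phosphorylated form inhibits PKA
    }

# Strong D1 tone with low phosphatase activity -> PP1 inhibited.
print(darpp32_state(d1_tone=0.9, pp2b_activity=0.1))
```

The point of the sketch is only that the same molecule routes signaling toward PP1 inhibition or PKA inhibition depending on which phosphorylation site dominates.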
Given the wide range of cellular processes mediated by DA, its interaction with DARPP-32 in the striatum, and the effect that DA action at its different receptor systems have on PFC and hippocampal function, it is clear that even minor disturbances in DA function could have significant implications for CNS function.
The efficacy of DA receptor agonists suggests that TBI patients benefit from the promotion of central DAergic transmission. This could be a sign that DA release is suppressed after injury, that DA uptake is overactive, or some combination of the two. Alternatively, it might be the case that DA activity remains normal after injury, but that basal DA activity is inadequate in the face of injury-induced disruptions. Given that a few studies have also shown benefits with DA antagonists, it must also be recognized that TBI comprises a complex series of temporally specific injuries to a wide range of brain structures. It is important to acknowledge that any systemic treatment without a well-defined window of therapy could be both beneficial and detrimental to the recovery process. For this reason, understanding the effect of TBI on DA transmission is crucial for proper therapeutic management and for promoting optimal recovery.
Evidence that DA systems are altered in humans after TBI is predominantly based on reports that neurostimulants are beneficial in attenuating cognitive deficits (Goldstein, 2003; McAllister et al., 2004) and data showing altered DA transporter binding after TBI (Donnemiller et al., 2000). Donnemiller et al. (2000) used single photon emission computed tomography (SPECT) to show that striatal DAT binding is decreased in patients 4–5 months after severe TBI, even in cases where no anatomical evidence of direct striatal injury exists.
After experimental TBI, alterations in catecholamine systems have been found in various brain regions and have been shown to be time-dependent (Huger and Patrick, 1979; Dunn-Meynell et al., 1994; McIntosh et al., 1994; Massucci et al., 2004). For example, regional increases in DA levels at acute time points after acceleration–deceleration brain injury were reported by Huger and Patrick (1979). Transient increases in DA in both the striatum, up to 6 h, and the hypothalamus, up to 24 h, have been identified utilizing microdialysis (McIntosh et al., 1994). Interestingly, in the McIntosh et al. (1994) study, cortical tissue DA levels were actually depressed for up to 2 weeks post-FPI. In a CCI model of brain injury, there were significant increases in rat brain tissue DA levels and metabolism at 1 h in the contralateral frontal cortex and at 1 day in the ipsilateral frontal cortex (Massucci et al., 2004). Tissue DA levels were also elevated in both the ipsilateral and contralateral striatum at 1 h compared to sham animals. Metabolism of DA, measured by the dihydroxyphenylacetic acid (DOPAC)/DA ratio, demonstrated a bilateral increase at 1 h in the striatum of CCI injured rats (Massucci et al., 2004). TH activity and catecholamine increases (both DA and NE) have also been seen in the prelimbic and infralimbic cortices, areas critical to PFC function, up to 2 weeks following CCI (Kobori et al., 2006). Increases in human cerebrospinal fluid (CSF) DA and its metabolites post-TBI have been shown to depend on both gender and genetic variations in the DAT. Furthermore, the systemic administration of DA as an inotropic agent in TBI patients was also associated with higher CSF DA (Wagner et al., 2007c). Increased DA metabolism may represent a compensatory response to increased tissue DA levels or may be due to further direct effects of TBI on DA regulation.
The consequences of these changes can be either beneficial in that DA neurotransmission is restored after TBI or could potentially be deleterious due to DA-induced oxidative stress.
Fig. 3 shows a summary of noted changes in DA at the cellular level after TBI, which includes alterations in TH, DAT, and DA receptors.
Given the importance of DA receptors to cognition, alterations in receptor expression have been another area of interest following TBI. Henry et al. (1997) reported a transient decrease in striatal D1 receptors immediately after injury, followed by an increase at 1 day, and a subsequent return to pre-injury levels. No significant alterations in D2 receptor binding were reported in the study by Henry and colleagues. Direct analysis of DA D2 receptor protein via Western blots has shown no significant reduction in striatal D2 receptor or D1 receptor expression at 2 weeks in a rat model of TBI (Wagner et al., 2005, 2009). This finding suggests that DAergic dysfunction following TBI is not entirely mediated by changes in DA receptors.
In addition to temporal alterations in tissue DA levels there have also been changes observed in TH, the rate-limiting enzyme in catecholamine synthesis. TBI has been shown to increase TH protein in the rat frontal cortex at 28 days post-injury (Yan et al., 2001). Measured increases in TH protein are most likely due to enhanced synthesis as phospho-TH is also increased (Kobori et al., 2006). In contrast, DA beta hydroxylase protein levels were not altered after TBI suggesting that the increase in TH occurred predominantly in DAergic axons (Yan et al., 2001). The absence of a decrease in TH positive SN neurons further differentiates TBI from PD. Increases in TH protein have also been observed in the striatum with a similar temporal profile (Wagner et al., 2005, 2009; Yan et al., 2007).
DAT is a crucial protein in the regulation of DA neurotransmission, playing a central role in determining the duration of action of DA by rapidly taking up extracellular DA into pre-synaptic terminals after release (Horn, 1990; Gainetdinov et al., 1998). Studies in animals lacking expression of the DAT gene (Gainetdinov, 2008; Wu et al., 2007) suggest that this protein is perhaps the single most important determinant of the extraneuronal concentration and duration of DA. Differences in the number of uptake sites (Nirenberg et al., 1997; Sesack et al., 1998) in different brain regions provide DA with different extracellular lifetimes (Garris et al., 1994). Regional decreases in total DAT expression have been reported after CCI (Wagner et al., 2005, 2009). Alterations in DAT expression suggest that improvements in cognition and neurobehavioral recovery reported in experimental (Kline et al., 1994, 2000; Goldstein, 2003) and clinical (Whyte et al., 1997, 2004) TBI studies with the use of DAT inhibitors may, in part, confer their beneficial effects by increasing striatal extracellular DA in a post-injury environment where cortical influences on striatal DA neurotransmission may be impaired.
Recent work suggests that both frontal cortex and striatal decreases in total DAT expression post-TBI are gender specific and occur primarily in males (Wagner et al., 2005). Estrogen is known to have both a developmental and a signaling role in DA systems, so it is not surprising that TBI has different effects on DA signaling in males versus females. Specifically, Wagner et al. (2005) showed that, when compared to gender-matched controls, male rats demonstrated a proportionally larger decrease in DAT expression compared to females. Furthermore, it was shown that environmental enrichment, which improves cognitive recovery (Hamm et al., 1996a; Passineau et al., 2001; Wagner et al., 2002; Kline et al., 2007b; Hoffman et al., 2008b), exerted larger effects on post-injury DAT reductions in females compared to males. What remains unclear is whether the DAT changes that occur in males and females following TBI are beneficial or detrimental. Interestingly, there is evidence that other members of the catecholamine metabolism system are altered post-TBI. Specifically, there have been noted increases in catechol-O-methyl transferase expression 24 h post-TBI that persist for up to 14 days in the microglia of the injured hippocampus, suggesting a possible compensation for observed changes in DAT activity and providing further evidence of DA dysfunction (Redell and Dash, 2007).
Studies examining DA neurotransmission have demonstrated reduced evoked DA overflow and altered kinetics of DA clearance in the striatum when assessed utilizing fast scan cyclic voltammetry with a medial forebrain bundle stimulation paradigm (Wagner et al., 2005). Recent work suggests that daily treatment with MPD for two weeks after CCI reverses deficits in DA neurotransmission. Interestingly, after two weeks of MPD treatment, there were no significant changes noted in DAT expression despite robust changes in DA neurotransmission and kinetic parameters (Wagner et al., 2009). Functional changes in DAT activity and trafficking, as well as other changes in DA receptor function, may be responsible for the observed MPD-mediated effects on DA neurotransmission.
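DA clearance measured by fast scan cyclic voltammetry is conventionally described with Michaelis-Menten uptake parameters (Vmax, Km). The sketch below illustrates, under purely hypothetical parameter values, how a reduced uptake Vmax slows the clearance of an evoked DA transient; it is not a reconstruction of the cited measurements.

```python
# Illustrative Michaelis-Menten clearance of an evoked DA transient.
# The Vmax/Km values below are hypothetical placeholders, not measured
# post-TBI values from the studies cited above.

def clearance_time(da0_uM: float, vmax: float, km: float,
                   dt: float = 0.001, threshold: float = 0.05) -> float:
    """Time (s) for extracellular DA to fall from da0_uM to threshold,
    assuming uptake rate = Vmax * [DA] / (Km + [DA])."""
    da, t = da0_uM, 0.0
    while da > threshold:
        da -= vmax * da / (km + da) * dt  # Euler step of the uptake ODE
        t += dt
    return t

normal = clearance_time(da0_uM=1.0, vmax=4.0, km=0.2)   # nominal uptake
injured = clearance_time(da0_uM=1.0, vmax=2.0, km=0.2)  # halved Vmax
print(normal, injured)  # a lower Vmax yields slower clearance
```

The qualitative point is simply that a deficit in transporter capacity (lower Vmax) prolongs the extracellular lifetime of released DA, which is the kind of kinetic change voltammetry studies are designed to detect.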
What effect these alterations in DAT, evoked DA release, and TH have on DA signaling has only begun to be elucidated. Ongoing research into these molecular events will help researchers develop targeted therapeutic strategies that specifically address TBI-induced deficits in DA signaling.
In 2006 the Neurotrauma Foundation (NTF) published an excellent review of current clinical recommendations for TBI management in both the acute and rehabilitative phases (Warden et al., 2006). As part of its review the NTF identified three drugs with DAergic effects as current viable options to assist with cognitive recovery. The identified pharmacotherapies were MPD, amantadine hydrochloride (AMH), and bromocriptine. MPD was recommended to enhance attentional function and speed of processing. Both MPD and AMH were considered reasonable options to enhance general cognitive function after TBI. Bromocriptine was recommended to enhance executive function after TBI. The NTF also acknowledged that there remains limited clinical evidence for long-term benefits with DAergic medications, but that initial clinical case reports and clinical studies have shown promise for stimulants (e.g., MPD), AMH, and bromocriptine. Both clinical data concerning the efficacy of DA agonists and a substantial amount of literature in animal models demonstrate improved functional recovery with DA agonists post-TBI. For the purpose of this review we are presenting research dealing with DA receptor agonists in adult TBI populations and corresponding animal studies.
Stimulants are often employed to assist in the rehabilitation of individuals with TBI. The most commonly used stimulants are amphetamine (AMPH) and MPD. Both AMPH and MPD have been used as drugs of choice for the treatment of ADHD. AMPH is a CNS stimulant that possesses two main mechanisms of action. AMPH acts on presynaptic nerve terminals to inhibit the reuptake of serotonin, norepinephrine (NE), and DA by acting as a substrate for monoamine transporters, including the DAT, causing a downregulation of transporter expression (Kahlig and Galli, 2003). AMPH also increases monoamine secretion via exchange diffusion and reverse transport (Haracz et al., 1998; Volkow et al., 2002a,b; Fleckenstein et al., 2007). In contrast MPD's predominant mechanism of action is via blockade of the DAT (Volkow et al., 2002a,b, 1998). Cognitive benefits on attention observed with AMPH and MPD treatment have been associated with the effective increase in DA caused by their administration (Volkow et al., 2001, 2004; Schiffer et al., 2006).
The administration of AMPH in the clinic to treat TBI is not as widely practiced as the accepted pharmacotherapeutic strategy of providing MPD. Thus, this section will focus almost entirely on MPD. However, we would be remiss if we did not mention a report by Evans et al. (1987), who administered d-AMPH following closed head TBI and found that the treatment enhanced processing speed and improved memory. Regarding MPD, multiple studies have demonstrated its effectiveness in treating cognitive dysfunction after brain trauma (Table 1). In a double blind study, Gualtieri (1988) found that MPD treatment improved performance on measures of nonverbal fluency and selective attention as well as self-report measures in a subset of TBI “responders”. Another clinical trial found that low dose MPD treatment following moderate to severe TBI improved functional outcome at day 30 over controls as measured through the Disability Rating Scale (DRS) (Plenger et al., 1996). Kaelin et al. (1996) showed a trend toward improved DRS scores and a significant improvement in attention with low dose MPD treatment after TBI. Other recent studies have demonstrated beneficial effects of MPD on attention (Whyte et al., 1997) and information processing speed (Whyte et al., 2004) in individuals with TBI. However, a few studies have shown no effect on cognitive outcomes with MPD treatment (Mooney and Haas, 1993; Speech et al., 1993; Tiberti et al., 1998). It must be noted that in the Speech et al. (1993) study the authors acknowledged that statistical power was low, and in the Tiberti et al. (1998) study the outcome measurement was memory function following organic amnesia, which may not be as profoundly influenced as attentional processing.
To better understand the mechanism of action responsible for the cognitive benefits gained with CNS stimulant treatment, specifically MPD, animal studies have examined both the behavioral and biochemical aspects of MPD treatment post-TBI.
Peak levels of MPD in the plasma and brain following intravenous injection occur within 20 min in awake animals and correspond with peak striatal DA levels as measured by microdialysis (Huff and Davies, 2002). Intraperitoneal and oral administration of MPD in rodents shows peak striatal DA levels after 40 min (Gerasimov et al., 2000). It has also been demonstrated that MPD has a shorter half-life in rats than in humans (Kuczenski and Segal, 2005). In order to overcome these limitations, doses are often larger to maintain drug levels at a therapeutic target over longer time intervals. Regardless of these limitations, experimental models utilizing a MPD treatment paradigm have been able to demonstrate cognitive benefit after both cortical ablation and TBI injuries (Kline et al., 1994, 2000). Specifically, a single administration of MPD followed by significant symptom relevant experience (i.e., beam walking experience) enhanced recovery of motor function following sensorimotor cortex lesions (Kline et al., 1994). Moreover, daily MPD treatments beginning as late as 24 h after TBI in rats resulted in significantly attenuated spatial memory performance deficits versus saline treatment (Kline et al., 2000). Wagner et al. (2009) showed that daily treatment with MPD (5 mg/kg) post-CCI resulted in increased DA overflow and Vmax. There was no associated effect of MPD treatment on DAT localization or DA receptor expression; however, there was a significant increase in c-fos expression with MPD treatment.
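The pharmacokinetic point above (a shorter MPD half-life in rats, hence larger doses to sustain exposure) can be illustrated with a simple one-compartment, first-order elimination model. The half-life values below are illustrative assumptions, not the measured values from Kuczenski and Segal (2005).

```python
# Sketch of why a shorter half-life motivates larger rodent doses:
# a one-compartment model with first-order elimination. Half-life
# values are illustrative assumptions only.
import math

def concentration(dose: float, half_life_h: float, t_h: float) -> float:
    """Relative drug level at t_h hours after a bolus dose
    (volume of distribution normalized to 1)."""
    k = math.log(2) / half_life_h  # first-order elimination rate constant
    return dose * math.exp(-k * t_h)

# With a shorter half-life (e.g., ~1 h in rat vs. ~3 h in human,
# hypothetical values), the same dose decays far faster:
rat = concentration(dose=1.0, half_life_h=1.0, t_h=4.0)
human = concentration(dose=1.0, half_life_h=3.0, t_h=4.0)
print(rat, human)  # the rat level is much lower at 4 h
```

Under this simple model, matching exposure over a fixed interval in the faster-eliminating species requires a proportionally larger or more frequent dose, which is the rationale behind the larger experimental dosing noted above.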
Interestingly, while the exact mechanisms of MPD benefit on cognition after TBI are still not understood, gender differences have been observed. Wagner et al. (2007a) showed that injured males treated with MPD had no change in active behaviors and displayed significant improvements in a MWM task indicating that daily treatment with MPD results in enhanced cognition without noticeable motor enhancement. However, MPD treated female rats did not show significant improvement on the MWM task, but did exhibit increased swim speed, which was not observed in males. The researchers note that the dosing regimen was based on previous studies using male rats (5 mg/kg; Kline et al., 2000), and a lower dosing of MPD for injured females may be required to augment learning and memory pathways without inducing stereotypical behaviors that may interfere with attention and learning processes. The finding of increased motor sensitivity to MPD in females may be due to hormonal influences on regional DAT densities post-TBI and/or differences in DAT modification/function (Bosse et al., 1997). In a previous paper Wagner et al. (2005) demonstrated gender specific alterations in DAT expression after TBI. Furthermore, estrogen is known to exert effects on DAergic neuron development (Kipp et al., 2006) and has neuroprotective properties independent of other drug treatments (Gibson et al., 2008). Estrogen has also been shown to act as a signaling molecule within the DA system (Kuppers and Beyer, 1999; Kuppers et al., 2000).
AMPH use in experimental models of TBI and selective cortical injury models has also been shown to accelerate recovery. The benefits of AMPH have been reported in FP (Dhillon et al., 1998) and selective lesion studies (Feeney et al., 1981; Hovda et al., 1989; M'Harzi et al., 1988; Chudasama et al., 2005). AMPH treatment has been shown to reduce the accumulation of free fatty acids and lactate following FP in the cortex and hippocampus (Dhillon et al., 1998) and to attenuate decreases in cerebral glucose utilization (Queen et al., 1997). However, the most interesting effect of AMPH treatment may be its ability to induce hippocampal brain derived neurotrophic factor (BDNF) following brain injury (Griesbach et al., 2008). This is not surprising given that AMPH treatment is known to induce use dependent plasticity and synaptogenesis (Butefisch et al., 2002) and has been strongly linked to plastic alterations following brain injury (Goldstein, 2003; Ramic et al., 2006). Interestingly, when combined with exercise, AMPH treatment no longer increased BDNF (Griesbach et al., 2008). This finding suggests that the benefits of AMPH treatment may be associated with its ability to enhance plastic responses in an injured brain and that combination therapies are not necessarily more beneficial.
An important caveat to AMPH studies is that while AMPH treatment does increase levels of all monoamines (Fleckenstein et al., 2007), the beneficial effects of AMPH on motor recovery have only been reproduced by intraventricular administration of NE (Boyeson and Feeney, 1990). This does not rule out a positive role for DA facilitation with AMPH treatment in other cognitive processes, but suggests that the benefits of AMPH on motor recovery may not be due simply to increases in DA release. This is supported by evidence that DA antagonists, such as haloperidol (which will be discussed later), can block the beneficial effects of AMPH treatment (Feeney et al., 1982; Hovda and Feeney, 1985).
AMH is a water soluble salt that has the capability of crossing all cellular membranes including those of the CNS. AMH was originally used as an antiviral agent for influenza type A. Subsequent studies showed it to be effective in treating PD and multiple sclerosis (Godwin-Austen et al., 1970; Rinne et al., 1972; Cohen and Fisher, 1989). Though the mechanism of action for AMH treatment of PD and multiple sclerosis is not completely understood, biochemical studies have demonstrated that AMH increases extracellular DA concentrations by blocking reuptake and by facilitating the synthesis of DA (Von Voigtlander and Moore, 1971; Bak et al., 1972; Gianutsos et al., 1985). In addition to acting at pre-synaptic targets, AMH has been demonstrated to act post-synaptically by increasing post-synaptic DA receptor density (Gianutsos et al., 1985) or altering their conformation (Allen, 1983). Evidence of a post-synaptic mechanism is clinically promising because the mechanisms of actions may not depend solely on the presence of surviving pre-synaptic terminals. Because the mechanism of action of AMH differs from other DA releasing drugs (see Gualtieri et al., 1989 for review), it is likely that the DAergic effects of AMH are a combination of pre-synaptic and post-synaptic effects.
AMH has been found to be effective at treating cognitive dysfunction post-TBI in both clinical trials and case reports (Table 2). While the level of evidence for AMH benefit is not as well developed as for MPD in adult TBI, clinical studies suggest general cognitive improvements with AMH administration after TBI. Zafonte et al. (1998) and Wu and Garmel (2005) reported improved scores on the activities of daily living scales in case reports of patients treated with AMH. Case reports (Chandler et al., 1988; Kraus and Maki, 1997) also indicate general improvements in global functioning. In patients demonstrating indications of diffuse axonal injury after TBI, AMH appeared to be effective in improving cognition independent of the timing of administration (Meythaler et al., 2002). Kraus et al. (2005) showed that improvements in executive function measurements correlated with increases in left PFC glucose metabolism in TBI patients receiving AMH treatment. As reported with other pharmacotherapies (e.g., MPD), AMH did not confer benefits in all studies conducted (Schneider et al., 1999; Hughes et al., 2005). However, in a review of reports of AMH use after TBI, Sawyer et al. (2008) concluded that it appears to safely improve both arousal and cognition.
The mechanisms of AMH effects are still poorly understood making animal studies of its effects an important step in improving clinical use. In humans, AMH has variable absorption rates and steady state plasma concentrations are typically reached within 4 to 7 days. Doses of AMH given to patients are also somewhat variable due to a lack of correlation between plasma concentrations and therapeutic effects (Aoki and Sitar, 1988). In vitro studies have shown that doses required to affect DA uptake are higher than those used clinically with no significant difference in DA kinetics until concentrations of 40–80 mg/kg were given (Baldessarini et al., 1972; Brown and Redfern, 1976; Page et al., 2000). In brain injury patients the optimal dose of AMH has ranged from 50 to 400 mg/day given orally (Gualtieri et al., 1989). Unfortunately, AMH has not been as extensively researched in experimental models of TBI as CNS stimulants have. However, one study using daily treatment of AMH (10 mg/kg) did show significantly improved spatial memory performance compared to saline treated rats following TBI (Dixon et al., 1999).
Bromocriptine is a specific D2 receptor agonist that possesses a rather complex mechanism of action. At high doses (above 10 mg/kg), bromocriptine binds to both highly sensitive presynaptic D2 autoreceptors and less sensitive postsynaptic D2 receptors, causing the expected inhibition of DA release and metabolism. Interestingly, at lower doses (2.5 and 5 mg/kg), bromocriptine has been shown via microdialysis to increase extracellular DA levels in rats (Brannan et al., 1993). However, even at low doses of bromocriptine there is an associated delayed reduction in DA metabolites consistent with autoreceptor activation (Brannan et al., 1993; Pagliari et al., 1995). In vitro studies have suggested that at low concentrations bromocriptine can act as a partial D2 antagonist (Lieberman and Goldstein, 1985), which may explain an initial increase in DA release. In vivo studies have not been able to provide conclusive evidence of bromocriptine antagonist activity; however, low doses of bromocriptine are characteristically associated with inhibition of DA neuronal firing (Jackson et al., 1990). Consequently, the reason behind this concentration-dependent effect of bromocriptine remains unclear. It may be a consequence of D2 receptor location, affinity, or bromocriptine's activity as a partial D1 antagonist and mixed agonist-antagonist at D2 receptors (Lieberman and Goldstein, 1985; Tan and Jankovic, 2001). What has been shown consistently is that bromocriptine requires DA in order to produce any behavioral effects (Jackson et al., 1988), and alterations in DA concentration effectively alter bromocriptine's activity at pre- versus postsynaptic D2 receptors (Maruya et al., 2003). Furthermore, a single administration of bromocriptine can alter D2 receptor binding for subsequent treatments (Jackson et al., 1988). These are important caveats to consider in TBI research evaluating bromocriptine as a potential treatment strategy.
In humans, bromocriptine has an oral availability of approximately 30–40% and reaches peak levels about 1–2 h after administration. Following discontinuation, bromocriptine remains in the system for up to 12 h. Low-dose treatments for PD range from 5 to 30 mg/day, while high-dose treatments for more advanced PD are within 31–100 mg/day (Lieberman and Goldstein, 1985; Deleu et al., 2002). Bromocriptine is less well studied in clinical research compared to AMH and MPD (Table 3). Past case reports (Ben Smail et al., 2006; Karli et al., 1999) showed improvements in executive function after administering bromocriptine. McDowell et al. (1998) also demonstrated improvements in executive function with a single 2.5 mg bromocriptine administration. Improvements in digit span, list learning, and motivation in bromocriptine-treated patients (maximum of 10 mg/day) that persisted for at least two weeks after bromocriptine withdrawal were also reported (Powell et al., 1996). Bromocriptine did not appear to be effective in addressing moderate to severe TBI patients' attentional difficulties during a postacute phase of recovery at a dose of 5 mg twice daily (Whyte et al., 2008). However, that study employed a relatively high steady-state dose of bromocriptine (10 mg/day) over a more prolonged treatment timeline than previously studied in TBI (Whyte et al., 2008). It may be that the dosing of bromocriptine in head trauma patients needs to be specifically titrated given the DAergic alterations caused by TBI. It is also possible that higher doses of bromocriptine for prolonged periods negatively impact DA kinetics in this patient population.
In rats, plasma concentrations of bromocriptine peak at 15–30 min, with delayed maximal D2 binding occurring for a period of up to 3 h (Maurer et al., 1983; Atsumi et al., 2003). Dosing for rats is generally lower than that for humans and is based on the findings from behavioral studies showing that higher doses (10–40 mg/kg) induce motor activation in normal rats (Jackson et al., 1988; Brannan et al., 1993). Rats receiving delayed (i.e., 24 h post-injury) and chronic (i.e., daily for 18 days) pharmacological treatment with bromocriptine (5 mg/kg) exhibited both enhanced WM and acquisition of spatial learning in a MWM task (Kline et al., 2002). In a follow-up study, Kline et al. (2004) demonstrated that bromocriptine-treated rats exhibited enhanced spatial learning as they were more adept at locating a hidden platform in a MWM task and also displayed increased hippocampal neuronal protection following TBI compared to vehicle-treated controls. Furthermore, the data showed that bromocriptine attenuated TBI-induced oxidative stress (Kline et al., 2004).
The administration of selegiline (l-deprenyl) once daily for seven days beginning 24 h following FP injury has been reported to improve cognitive function in the MWM and enhance neuroplasticity (Zhu et al., 2000). l-Deprenyl is used to enhance the action of DA by inhibiting its main catabolic enzyme in the brain, monoamine oxidase-B. Additionally, Newburn and Newburn (2005) showed that selegiline has potential clinical benefits in the treatment of post-TBI apathy.
Atomoxetine administered at a dose of 1 mg/kg one day following lateral fluid percussion injury in rats showed improvement in Morris water maze performance compared to vehicle (Reid and Hamm, 2008). Atomoxetine is typically used as a non-stimulant drug for treatment of ADHD. While its mechanism of action is predominantly through inhibition of the NE transporter it has been shown to increase extracellular DA in the PFC (Bymaster et al., 2002).
Antipsychotic drugs, in particular haloperidol and risperidone, have been administered to TBI patients to treat agitation and psychotic symptoms that may be related to the injury. Unfortunately, there are very few clinical studies on cognitive effects in TBI patients following antipsychotic administration, which has limited conclusions on the potential consequences of antipsychotic use in TBI populations (Elovic et al., 2008). However, animal studies have demonstrated negative consequences of antipsychotic administration following TBI, particularly with typical antipsychotics.
Haloperidol and risperidone have multiple CNS effects, but one of the predominant effects is strong central DA receptor inhibition that can produce akinesia and pseudoparkinsonism. Given this profound DAergic component, the question of what effect these antipsychotics might have on the recovery process after TBI is important. Animal models have demonstrated that antipsychotics impair the recovery process and in some instances exacerbate TBI-induced behavioral deficits. For instance, Feeney and colleagues demonstrated that even a single administration of haloperidol provided after TBI to adult rodents delayed motor recovery (Feeney et al., 1982). Moreover, administration of haloperidol after the rats had recovered, as indicated by normal beam-walking, led to a reinstatement of the deficits (Feeney et al., 1982). Similar findings were reported by Goldstein and Bullman (2002). Other studies have shown that antipsychotic drugs given after brain trauma impair not only motor recovery but also cognitive function. Wilson et al. (2003) showed that haloperidol led to slower acquisition of spatial learning in a water maze task. Interestingly, no impairment was noted by Wilson et al. (2003) with administration of the atypical antipsychotic olanzapine following TBI. One possible explanation is the relatively low activity of olanzapine at the D2 receptor relative to haloperidol (Tauscher et al., 2004).
Recent studies from our laboratory have demonstrated that prolonged exposure to the typical and atypical antipsychotics haloperidol and risperidone, respectively, after TBI impairs motor recovery and hinders the acquisition of spatial learning and memory retention (Kline et al., 2007a, 2008; Hoffman et al., 2008a). Risperidone and haloperidol also impaired performance in uninjured controls (Hoffman et al., 2008a,b; Kline et al., 2008). Haloperidol has also been shown to block the benefits of AMPH treatment (Hovda and Feeney, 1985). Interestingly, while single or multiple low doses of risperidone and haloperidol appear to be innocuous to recovery after TBI, chronic high-dose treatments are uniformly detrimental (Kline et al., 2007a,b).
A number of studies have also shown improvements in WM and spatial memory with both early (Tang et al., 1997) and late (Kobori and Dash, 2006) administration of DA antagonists. For example, Kobori and Dash (2006) demonstrated that a single administration of the DA D1 antagonist SCH23390 at 14 days post-injury in rats improved WM for up to a week. Tang et al. (1997) showed an improvement in functional recovery with D2 receptor-specific antagonists given immediately post-injury and a synergistic effect when combined with D1 receptor antagonism in mice. Given that both haloperidol and risperidone have a higher affinity for D2 receptors (Cohen, 1994; Reimold et al., 2007), it may be that blockade of D2 receptors at later time-points is the event most associated with negative outcomes. This is not unreasonable, as it has been shown that D1 receptor activation is also important for inflammatory and immunological responses, including phosphoinositide hydrolysis and arachidonic acid release, as discussed previously in this review. A potential explanation is that inhibition of D1 receptors after TBI beneficially affects the injury response, while DA agonists provide cognitive benefits at later time-points.
A significant amount of work remains in TBI research to fully explore the extent and consequences of DAergic dysfunction after TBI. Are the observed alterations in DAT and TH protein levels a result of injury and ongoing biochemical damage, or are they a response to initial changes in DA levels in the cortex and subcortical structures? Furthermore, although no overt cellular damage has been identified within nigrostriatal and mesocortical DAergic pathways, the possibility of axonal disruptions and biochemical alterations remains. Interestingly, the PD literature suggests that subtle changes in oxidative stress related to DA signaling occur within the SN prior to significant cell loss.
Animal models of TBI consistently produce widespread excitotoxic damage and increased amounts of oxidative stress in a number of different brain regions (Palmer et al., 1993; Rao et al., 1999). DA is known to possess excitotoxic properties (Olney et al., 1990), and DAergic fibers have been shown to modulate striatal glutamatergic excitotoxicity (Chapman et al., 1989; Filloux and Wamsley, 1991). The initial increases in DA observed post-TBI may precipitate excitotoxic disruption and oxidative damage to DAergic cellular function that leads to the observed alterations in DA kinetics and decreased evoked DA release at later time-points.
Observed changes in DAT expression in TBI also raise an intriguing set of considerations. As discussed in the section concerning DAT activity, the DAT is regulated by a number of genetic factors and shows variation in activity and expression with both age and gender. Alterations in DAT expression can alter the kinetics of DA release, as demonstrated in DAT knockdown models (Zhuang et al., 2001), as can changes in DAT cellular localization (Pristupa et al., 1998). Decreases in evoked DA overflow Vmax following CCI may be explained by either changes in expression or changes in membrane-bound DAT associated with DAT trafficking (Wagner et al., 2005, 2009). Given that a number of the current DA receptor agonist therapies act through a DAT-mediated mechanism, it is necessary to fully understand the role of DAT changes in TBI in order to provide efficacious DA therapies.
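One way to see why a reduced Vmax matters functionally is a back-of-the-envelope Michaelis–Menten clearance model, in which DAT uptake rate saturates at Vmax as extracellular DA rises. The sketch below uses illustrative parameter values, not measurements from the cited CCI studies; halving Vmax (as might follow from less membrane-bound DAT) prolongs clearance of the same DA transient:

```python
# Michaelis-Menten clearance of an extracellular DA transient:
#   d[DA]/dt = -Vmax * [DA] / (Km + [DA])
# A lower Vmax (e.g., less membrane-bound DAT) slows clearance.
def clearance_time(da0: float, vmax: float, km: float,
                   dt: float = 0.001, threshold: float = 0.05) -> float:
    """Euler-integrate until [DA] falls below `threshold`; return seconds."""
    da, t = da0, 0.0
    while da > threshold:
        da -= dt * vmax * da / (km + da)
        t += dt
    return t

# Illustrative values (concentrations in uM, Vmax in uM/s):
t_intact = clearance_time(da0=1.0, vmax=4.0, km=0.2)
t_reduced = clearance_time(da0=1.0, vmax=2.0, km=0.2)
print(f"intact Vmax: {t_intact:.2f} s, halved Vmax: {t_reduced:.2f} s")
```

In this toy model the halved-Vmax condition takes roughly twice as long to clear the same transient, illustrating how a trafficking- or expression-driven Vmax decrease could prolong DA signaling without any change in release itself.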
In addition to documented biochemical alterations in DA signaling following TBI, there remains the possibility of structural changes. TBI is known to cause diffuse white matter injury and significant axonal disruptions throughout the CNS (Smith et al., 2003). However, it remains unclear if similar effects can be observed within DAergic systems. Increases in TH staining in both the PFC and striatum may represent regrowth of DAergic fibers that occurs as a consequence of DA synapse or axonal pathology arising acutely following TBI.
Regrowth or collateral sprouting of catecholaminergic axons has already been demonstrated in experimentally induced lesions of adult CNS neurons (Katzman et al., 1971; Gilad and Reis, 1979; Fritschy and Grzanna, 1992). Moreover, in both PD patients and animals with experimental Parkinsonism, neural transplants or the supply of neurotrophic factors may promote regrowth of DAergic fibers in the striatum and reverse lesion-induced behavioral deficits (Kordower et al., 1991; Kopin, 1993; Tomac et al., 1995). The occurrence of spontaneous regrowth of DAergic fibers after partial nigrostriatal denervation has already been suggested (Onn et al., 1986) and a long-term increase in the amount of striatal TH has been observed after 6-hydroxydopamine injection in the SN pars compacta (Pasinetti et al., 1991; Blanchard et al., 1995). It is thus possible that the TBI-induced expression of TH in the nigrostriatal system might share similar mechanisms. However, further studies evaluating DA turnover, TH activity, DOPAC/DA and DA/TH in the nigrostriatal system after TBI are needed to confirm whether there is such a compensatory mechanism after TBI.
Furthermore, the striatum is a heterogeneous region, and a varied profile of DA kinetics has been reported within different areas of the dorsal (caudate and putamen) and ventral (NAcc) striatum (May and Wightman, 1989a,b; Bergstrom et al., 2001). It is possible that caudate subregions are differentially affected by CCI. Differences may also exist in damage to the NAcc core versus the shell. Subregion damage could also differentially affect PFC function given the regionally specific cortical connections that exist within basal ganglia anatomy (Grahn et al., 2009). While many studies have also identified alterations in the PFC associated with TBI, future studies will likely need to evaluate DA neurotransmission in multiple DA regions outside the caudate putamen, as well as subregions within the caudate, to fully characterize the effects of trauma on these heterogeneous structures.
DAergic dysfunction after TBI may not be limited to neurotransmitter release, concentration, and metabolism. Preliminary research in our group suggests that TBI can cause alterations in DARPP-32 phosphorylation, thereby altering a number of important intracellular signaling molecules. The importance of DARPP-32 to DAergic synaptic plasticity and modulation of other neurotransmitter systems has broad-reaching implications for medium spiny neuron function in the striatum and represents another possible level of DA dysfunction following TBI.
Alterations in receptor expression also remain an area where further research is necessary. While studies to date have not demonstrated any overt alteration in DA receptors, there are a number of areas that require further consideration. D1 and D2 receptors depend upon a complex interplay with other receptor systems within the striatum and other DA systems. In particular, recent research in PD has identified a series of heterodimeric complexes of the D2 receptor with other neurotransmitter receptor systems, including the adenosine A2a and Mglut5 receptors. It has been proposed in PD research that A2a antagonists can augment DAergic treatments. Interestingly, both A2a and Mglut5 receptors have also been shown to play heavily into DA's role in striatal plasticity and intracellular calcium signaling.
This review has sought to summarize the evidence supporting a DAergic hypothesis of cognitive dysfunction after TBI and to provide a context for the use of DA-targeted therapies during patient rehabilitation. A concise overview of DA's role in cognitive deficits relevant to TBI, of the use of DA therapies in clinical populations to benefit cognitive recovery, and of the alterations in DA neurotransmission observed post-TBI in both clinical and experimental studies allows the construction of an early schematic of the effect of TBI on DA to help guide future endeavors.
Given that TBI causes damage to areas known to be involved in DAergic processing, and that research in other disease states has established that DA is important for cognitive outcomes, it is logical that TBI should produce clinical outcomes similar to what is observed in other disorders of DAergic function. Indeed, the clinical picture of behavioral and cognitive dysfunctions after TBI shares a number of commonalities with other disorders of DA dysfunction, such as PD and HD.
To date, clinical studies have consistently demonstrated that pharmacotherapies that enhance DA post-TBI are beneficial to memory, attention, and executive function. However, clinical studies examining DAergic therapies have a number of limitations. Small patient populations, variations in treatment protocols, lack of proper controls including the absence of randomized clinical trials, and a poor definition of TBI make it difficult to identify who would benefit the most from DA enhancement therapy and the proper time-course of therapeutic intervention.
Ongoing animal and clinical studies have begun to better characterize DAergic dysfunction following TBI, a task that remains difficult given the diffuse nature and variable presentation of TBI. Nonetheless, studies in both animals and humans have identified a series of temporally specific alterations in DA neurotransmission that occur after TBI (Fig. 3). An acute hyperactive phase, characterized by increases in tissue DA levels and an initial increase in D1 receptors, is subsequently followed by hypofunction in DA signaling characterized by decreases in evoked DA overflow and alterations in both DAT and TH expression. Animal studies demonstrating the benefit of D1 antagonists and D2 agonists would suggest that the critical event in DAergic dysfunction following TBI is related primarily to signaling through the DA D2 receptor population.
Understanding the temporal alterations in DA following TBI and the mechanisms of dysfunction at a cellular and systems level will allow DAergic therapies to be better tailored to specifically address the character of dysfunction in TBI populations during recovery. Furthermore, given the importance of genetic differences in DA kinetics and the role of gender in DA signaling, it is important to utilize animal models of injury to better understand how these factors affect potential treatments. Doing so will help answer long-standing questions in TBI rehabilitation about how best to optimize neuropharmacology strategies. Understanding the role of DA in cognitive recovery following TBI also adds another layer of consideration to the use of acute-phase medicines that may affect CNS DA systems.
Clinical studies coupled with animal research have clearly demonstrated that DA-targeted therapies represent an important clinical option in the treatment of memory, learning, and executive function deficits that persist following a TBI. However, clinical studies have so far failed to identify the most beneficial dose and time period of administration for DAergic therapy. Furthermore, conclusions from DA enhancement therapeutic trials are complicated by the nature of TBI itself. TBI is a complex disease with multiple primary and secondary etiologies. Poor stratification of patients, often based almost exclusively upon the Glasgow Coma Scale, has made it difficult to identify the patient populations most likely to benefit from targeted DA therapy. Even considering these limitations, there remains enough evidence to support further studies into the role of DA in persistent deficits following TBI. We propose three goals for ongoing animal and clinical studies in DAergic signaling following TBI.
First, there needs to be a coherent understanding of the temporal alterations in DA signaling that occur following TBI. Efforts should be made to understand what occurs acutely, during recovery, and chronically. A better understanding of molecular events will allow clinicians and researchers to effectively analyze clinical successes and, more importantly, clinical failures. Without this basic understanding it is difficult to ascertain why some promising DAergic therapies fail in TBI patients. Second, research in animal models is necessary to examine dosing, time-course of administration, and possible combination therapies. The natural dichotomy in DA anatomy provided by D1 versus D2 receptor populations, together with other relevant neurotransmitter systems, provides a real opportunity to utilize adjunct therapies in addition to direct DA-enhancing strategies. Third, clinical research in TBI patients must make efforts to stratify results based upon gender, genetic markers (in particular DAT expression profiles), and injury profiles. Doing so will allow clinicians to identify which patients are most likely to benefit from DAergic therapy.