Trends Neurosci. Author manuscript; available in PMC 2012 October 1.
Published in final edited form as:
PMCID: PMC3183413
NIHMSID: NIHMS313421

A neoHebbian framework for episodic memory; role of dopamine-dependent late LTP

Abstract

According to the Hebb rule, the change in the strength of a synapse depends only on the local interaction of presynaptic and postsynaptic events. Studies at many types of synapses indicate that the early phase of long-term potentiation (LTP) has Hebbian properties. However, it is now clear that the Hebb rule does not account for late LTP; this requires an additional signal that is non-local. For novel information and motivational events such as rewards, this signal at hippocampal CA1 synapses is mediated by the neuromodulator dopamine. In this Review, we discuss recent experimental findings that support the view that this “neoHebbian” framework can account for memory behavior in a variety of learning situations.

Introduction

The idea that association underlies memory was already emphasized by the Greeks [1]. The discovery of neurons and synapses raised the question of whether these structures are modifiable by experience and could thereby mediate association. A potential answer to this question was proposed by Donald Hebb [2]. According to the Hebb rule, associations are encoded if synaptic plasticity obeys a simple rule: the synapse between cell A and cell B will be strengthened if two conditions are met: 1) the synapse from cell A onto cell B is active; and 2) cell B responds to this and other inputs by strong depolarization that triggers action potentials. Thus, if cell A represented object A and cell B represented object B, the co-occurrence of the two objects would, by the Hebb rule, strengthen the synaptic linkage between these cells. This link would subsequently be evident when only object A was presented because it would lead to the firing of cell B, thus bringing object B to mind by association.
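The two conditions of the Hebb rule can be sketched as a simple weight update in which a synapse strengthens only when presynaptic activity coincides with strong postsynaptic firing. This is a minimal illustrative sketch in Python, not a biophysical model; the learning rate and starting weight are arbitrary choices for illustration:

```python
# Minimal two-factor Hebbian update: the synapse from cell A to cell B
# strengthens only when A is active AND B fires; otherwise it is
# unchanged (capturing the synapse specificity discussed below).
def hebbian_update(w, pre_active, post_firing, lr=0.1):
    """Return the new synaptic weight under the two-factor Hebb rule."""
    if pre_active and post_firing:
        return w + lr  # both conditions met: strengthen the synapse
    return w           # either condition absent: no change

w = 0.5
w = hebbian_update(w, pre_active=True, post_firing=True)   # strengthened
w = hebbian_update(w, pre_active=False, post_firing=True)  # unchanged
```

In this toy scheme, presenting object A alone later drives cell B through the strengthened weight, which is the associative recall described above.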

Experimental support for the Hebb rule came through the study of long-term potentiation (LTP), an activity-dependent change in the strength of synapses. LTP has been found in many brain regions but has been most extensively studied in the CA1 region of the hippocampus. Early experiments showed that LTP in CA1 is governed by the Hebb rule: the induction of LTP requires both presynaptic input and strong postsynaptic depolarization (the role of Na+ spikes remains unclear) [3]. Furthermore, preventing strong depolarization by injecting negative current prevents the induction of LTP [4]. Implicit in Hebb’s rule is that LTP is specific to the synapses at which the rule is met (e.g., inactive synapses should be unaffected despite the strong postsynaptic depolarization). The ability to induce LTP at single visualizable synaptic connections by 2-photon uncaging of glutamate has directly confirmed the synapse specificity of LTP [5].

Other work has revealed some of the molecular mechanisms that underlie Hebbian plasticity, allowing tests of the role of this plasticity in memory. Remarkably, the Hebbian computation in CA1 is done by a single type of molecule, a type of glutamate-activated channel termed the NMDA receptor (NMDAR). The opening of the NMDAR is Hebbian: the channel opens only if there is both presynaptic glutamate release and strong postsynaptic depolarization. When these channels open, the resulting influx of Ca2+ activates the enzyme Ca2+/Calmodulin-Dependent Protein Kinase II (CaMKII), which then triggers the local biochemical changes that strengthen the synapse. Experiments show that genetic or pharmacological interference with NMDARs or CaMKII strongly interferes with memory formation [6]. There is thus little doubt that a Hebbian form of LTP is important for memory formation.

But is the simple form of association envisioned by Hebb the whole story? From our daily experiences, we know that items may co-occur but be only briefly registered in conscious memory if we don’t attach importance to them. This suggests that there are additional factors that determine whether information is stored persistently. Importantly, studies of LTP (mostly in CA1) show that there is an additional factor: the persistence of LTP depends not just on the two factors of the Hebbian condition (glutamate release and postsynaptic depolarization), but also on a third, the action of the neurotransmitter dopamine [7] (see Box 1). Crucially, this dopamine release depends on systems-level processes that include motivation and stimulus novelty [8] (Box 2, Box 2 Figure I). We term this “neoHebbian” to indicate that in addition to the 2-factor Hebbian process, stable synaptic modification requires a third signal dependent on a systems-level computation.
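The three-factor idea can be sketched by extending the two-factor update: the Hebbian coincidence induces an early change, but the change persists only if the third, non-local dopamine signal is also present. This is an illustrative Python sketch under our own simplifying assumptions (binary signals, an arbitrary learning rate), not a quantitative model of LTP:

```python
# NeoHebbian three-factor sketch: the Hebbian coincidence (pre AND post)
# induces early LTP, but the potentiation is stabilized (late LTP) only
# if a third, non-local dopamine signal is also present.
def synaptic_change(pre, post, dopamine, lr=0.1):
    """Return (early_ltp, persists) for one induction event."""
    hebbian = pre and post             # two-factor Hebbian condition
    early_ltp = lr if hebbian else 0.0
    persists = hebbian and dopamine    # third factor gates late LTP
    return early_ltp, persists

synaptic_change(True, True, dopamine=False)  # early LTP that later decays
synaptic_change(True, True, dopamine=True)   # early LTP that is stabilized
```

Note that dopamine alone produces no change here; as discussed later in the text, it acts only as a permissive signal for an NMDAR-dependent Hebbian event.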

Box 1

Mechanisms by which dopamine stimulates the protein synthesis required for late LTP

Dopamine enhances protein synthesis within dendrites of hippocampal neurons [85]. Other results point to a role of brain-derived neurotrophic factor (BDNF) in triggering protein synthesis [86]. Experimental results suggest that D1 receptor- and BDNF-mediated pathways interact in the activation of extracellular signal-regulated kinase (ERK) 1/2, a mitogen-activated protein kinase (MAPK) [87]. Active ERK1/2 can then induce nuclear transcription [via cyclic AMP response element-binding protein (CREB)], regulate translation [e.g. by activating the eukaryotic translation initiation factor 4E (eIF4E)], and stimulate ribosomal function. Hippocampal late LTP is blocked when ERK1/2 activation is blocked by a dominant-negative mutation of MEK1, the kinase that directly phosphorylates ERK1/2 [88]. The pathway for D1-dependent control of ERK1/2 has been most thoroughly studied in the striatum. Activation of cAMP, protein kinase A (PKA), and the dopamine- and cAMP-regulated phosphoprotein DARPP-32 leads to the inhibition of protein phosphatase 1 (PP1). ERK1/2 activation occurs because inhibited PP1 can no longer dephosphorylate (and thereby activate) the striatal-enriched protein tyrosine phosphatase (STEP) that dephosphorylates active ERK1/2 [89].

Box 2

What do dopamine neurons signal?

Novelty

In primates, novel stimuli can trigger burst firing of dopamine cells, a response that habituates as stimuli become familiar [44]. Microdialysis [90, 91] and fast-scan cyclic voltammetry [92] show that various types of novelty produce dopamine release in the rat striatum. These include social novelty [90], novel tastes [91], and novel environments [92]. Placing a rat in a novel environment leads to dopamine release in the hippocampus [19], and there is converging evidence from fMRI studies that the human SN/VTA is also activated by novelty (see main text and Figure I).

Reward prediction errors

In reinforcement learning, dopamine responses conform to a signed prediction error; they increase for better-than-expected (reward) outcomes and decrease for worse-than-expected (aversive) outcomes [43]. In rare cases, clinical requirements provide an opportunity to obtain direct recordings from the SN [e.g. in Parkinson’s disease (PD) patients] [52]. SN neurons display higher spike rates to unexpected gains and lower rates to unexpected losses, with a timing (i.e. 150–375 ms) that is largely consistent with non-human primate studies. In the basal ganglia, dopamine release can reinforce action choices (‘go’ responses) within the so-called direct pathway to actively obtain rewards, whereas dips in dopamine firing can reinforce behavioral inhibition (‘no go’ responses) within the indirect pathway to passively avoid aversive outcomes [93] (note that an elevation of dopamine enhances LTP in the striatum, whereas a drop enhances LTD [94–96]). This functional architecture provides a plausible mechanism for instrumental learning of active responses through positive reinforcement and of passive avoidance responses through punishment.
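The signed prediction error described above corresponds to the temporal-difference (TD) error of reinforcement learning. A minimal Python sketch follows; the discount factor and the value estimates are illustrative assumptions, not values from the studies cited:

```python
# Temporal-difference (TD) prediction error: dopamine responses increase
# when an outcome is better than predicted (delta > 0) and decrease when
# it is worse (delta < 0). All numeric values here are illustrative.
def td_error(reward, v_next, v_current, gamma=0.9):
    """delta = r + gamma * V(s') - V(s)"""
    return reward + gamma * v_next - v_current

td_error(reward=1.0, v_next=0.0, v_current=0.2)  # positive: better than expected
td_error(reward=0.0, v_next=0.0, v_current=0.2)  # negative: worse than expected
```

In the direct/indirect pathway scheme of the main text, a positive delta (dopamine elevation) would favor LTP and ‘go’ learning, whereas a negative delta (a dip) would favor LTD and ‘no go’ learning.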

Aversive events

It is now well established that dopamine neurons can also fire in response to unexpected aversive outcomes and cues predicting punishments [45]. In monkeys, a substantial fraction of dopamine neurons in the dorsolateral SN/VTA are responsive to conditioned cues predicting aversive events; in contrast, those in the more ventromedial portions of the SN/VTA are inhibited by aversive stimuli [97] (but see [98]). Similar findings have been demonstrated in anesthetized rats, where some VTA dopamine neurons are excited by noxious stimuli [99], whereas others are inhibited [100]. Recent studies in freely behaving mice have shown that VTA dopaminergic neurons can exhibit responses to different valences. Thus, although 89% of VTA dopaminergic neurons respond with activation to a conditioned reward, they also respond to fearful events, with the majority of the neurons showing suppression. Interestingly, 25% respond with excitation to both positive and fearful events [101]. It is important to note that the fearful stimuli are maintained over the time course of the fear-inducing event, in contrast to the transient nature of the brief noxious stimuli mentioned above. Indeed, sustained aversive stimuli or stressors (e.g. repetitive footshocks or physical restraint) increase the number of dopamine neurons firing over longer periods of time [102]. This response is consistent with the time course of the increase in dopamine release produced by sustained aversive events measured with microdialysis [103, 104].

Interactions between valence and action

The human lateral SN/VTA has been similarly shown to be activated during anticipation (i.e. before execution of motor responses) to acquire a reward or to avoid a punishment [53]. This suggests that SN/VTA responses to aversive events are observed primarily under conditions that require active, rather than passive, avoidance. This view is largely consistent with the suggestion that dopamine may promote a learning process that enhances the motivational salience, the ‘incentive salience’, of reward cues [105] and with linking dopamine to ‘wanting’ rather than ‘liking’ in the field of addiction [106].

Attention/Salience

An alternative interpretation favored in the attention and learning field is that both rewards and aversive events generally produce an increase in dopamine that enhances learning/attention [107, 108] and thereby enhances salience.

Box 2 Figure I


Activation of SN/VTA by different types of stimuli. A Stimulus novelty: Activation was induced in the right medial SN/VTA by rarely appearing novel scenes (novel oddballs) compared with rarely appearing familiar scenes (familiar oddballs). The data were obtained from healthy older adults (Reprinted with permission from [109]). B Incidental stimulus encoding: fMRI responses in bilateral medial SN/VTA were elicited during the incidental encoding of images of scenes that correlate with scene recognition memory after a 24 hour retention interval (Reprinted with permission from [110]). C Cues that predict novelty: fMRI response in right SN/VTA was elicited by symbolic cues that predict images of novel scenes as compared to cues predicting familiar scenes (Reprinted with permission from [64]). The SN/VTA is visible as a bilaterally bright area in the midbrain. Colour bars depict t-values.

In this review, we will first summarize the evidence for dopaminergic involvement in LTP and learning. Next, we will briefly discuss the different types of stimuli that can lead to dopamine release. We will then evaluate a simple idea: that all stimuli leading to dopamine release should enhance memory. This has now been examined in both animal models and in humans, and the results have interesting implications for how learning occurs. Finally, we will discuss the tag-and-capture model [9], which makes further predictions about how stimuli interact to affect memory. Readers interested in additional perspectives may find other recent reviews useful (e.g. [10–12]).

NeoHebbian synaptic plasticity: role of dopamine in late LTP in CA1

Early experiments revealed that dopamine antagonists applied during the induction of LTP by multiple tetani (generally 100 Hz for 1 s) had an effect that was strongly time-dependent [7]. Whereas the first phase of LTP was not strongly affected, the potentiation measured an hour after induction was strongly attenuated. LTP at these later times (termed late LTP) is dependent on protein synthesis, whereas the LTP at early times is not [13]. It is therefore thought that dopamine antagonists prevent late LTP because they prevent the required protein synthesis (see Box 1).

Much additional research has strengthened the evidence that dopamine is required for late LTP in CA1 and identified the key role of the D1 class of dopamine receptors. Late LTP is blocked by D1 antagonists [14] (Fig. 1A) and by genetic deletion of D1 receptors (Fig. 1C) [15, 16]. The experiments described above were done in the slice preparation; similar effects of D1 antagonists have been observed when experiments are done in vivo [16, 17] (Fig. 1B). Furthermore, depletion of dopamine selectively blocks late LTP [18], a block that can be reversed by a D1 agonist (Fig. 1D). Together these results make a strong case for the involvement of dopamine in late LTP. Other experiments have directly measured event-dependent release of dopamine in the hippocampus [19]; less is known about the location of dopaminergic axons and the release/uptake mechanisms (Box 5).

Box 5

Outstanding questions

How can the neoHebbian framework be tested pharmacologically in humans?

An important question to address further is whether dopamine is involved in the stabilization of human memory. Available dopaminergic drugs have substantial D2 receptor affinity, and selective D1/D5 compounds are not readily available for use in healthy humans (but see [123]). Another approach is to enhance dopamine levels with L-DOPA. This has been investigated mainly with respect to working memory [124, 125], but there is one report that L-DOPA improved long-term memory [126].

Does the neoHebbian framework apply to extrahippocampal MTL structures?

Protein synthesis-dependent forms of consolidation are not restricted to the hippocampus [127]. It would therefore be important to test whether dopamine plays a role in consolidation of non-episodic memory such as stimulus-familiarity.

How does dopamine affect encoding?

We have focused in this review on how dopamine affects consolidation. However, dopamine may be released in anticipation of information under various conditions [64] (Box 2), and this could affect memory encoding. There are many ways this could occur: 1) Pathways can be turned off or on by dopamine [128]; 2) Network oscillations can be regulated [129, 130]; 3) Inhibition can be attenuated by a postsynaptic D3-dependent process [131]; 4) Encoding could be inhibited due to hyperpolarization by dopamine D2 receptors, leading to an encoding of only strong stimuli (i.e. increasing the signal-to-noise ratio [132]); 5) Explorative behavior could be energized (particularly in response to novelty) [133–135]; 6) Working memory capacity and attentional control [125] could be affected.

How can the neoHebbian framework be separated from encoding-related effects of dopamine?

Concern about encoding effects (see above) would be obviated if the “strong” stimulus that induces dopamine release enhances the memory for a ‘weak’ (e.g. ‘boring’ or pre-familiarized) stimulus given before the strong stimulus.

Do other neuromodulators contribute to neoHebbian plasticity?

Cholinergic [136] and noradrenergic [10] projections to the MTL can also modulate LTP and long-term memory. Activation of the brainstem locus coeruleus, the origin of noradrenergic projections to the CNS, can enhance protein synthesis in the dentate gyrus [137], and hippocampal stimulation can lead to increased hippocampal release of noradrenaline [138]. There is also crosstalk: the primary cholinergic drive of the hippocampus is strongly affected by dopamine [139], as is the locus coeruleus [140]. It is possible that noradrenaline is more important than dopamine for consolidation of long-term memory for arousing and negative emotional events [141]. Finally, serotonin may be necessary to consolidate memories about those types of aversive episodes that were associated with action inhibition (see [53] for a discussion).

How does the neoHebbian framework interact with off-line replay?

Whereas the Hebbian NMDAR-dependent processes in the slice occur only during the initial induction of LTP, events in vivo also involve NMDAR-dependent processes triggered by hippocampal replay processes [142]. In such replay, cells that were co-active during experience are reactivated during quiet wakefulness or sleep (reviewed in [143]). Such processes underlie consolidation events that both fix information at synapses and allow transfer of information from one region to another. Dopamine antagonists given only during such consolidation disrupt memory measured many hours later [144, 145].

Is there a tension between the penumbra and reward-related memory enhancement by dopamine?

It will be important to learn more about the time course of dopamine release, the duration of synaptic tags and the lifetime of plasticity related proteins, all of which will determine the properties of the penumbra. This may help to resolve the following puzzle. Reward-related SN/VTA activation improves memory for the rewarded stimulus, but not for the non-rewarded stimuli given in close temporal proximity [57], implying either a very short or very stimulus-specific penumbra. This is at odds with the observation that novelty-related activation of the SN/VTA has a long (ca. 30 minutes) penumbra that affects memory for unrelated information (e.g. exposure to novel scenes can affect memory for words) [74]. One possible resolution is that the duration or stimulus-specificity of the penumbra depends on the type of motivational event that triggers dopamine release.

Where is dopamine released in the hippocampus?

Older methods for labeling dopamine axons can give ambiguous results. Modern genetically-based labeling methods are needed to determine the location and density of these axons.

Fig. 1
Dopamine is required for late LTP in the CA1 region of the hippocampus in rodents. A. Under control conditions (closed black circles), robust LTP was induced after high frequency stimulation at time 0 min (as indicated by arrows) [14]. A significant reduction ...

Two types of experiments demonstrate that dopamine can strengthen the synaptic potentiation produced by learning itself. In vivo recordings show that CA3/CA1 synapses are strengthened over time during trace eyeblink conditioning but that this strengthening is greatly reduced either in D1 receptor knockout mice or after reducing D1 receptors by virally delivered siRNA [16]. Another line of experiments examined the learning that stabilizes place fields in CA1 after the animal enters a new environment. It was found that D1 antagonists prevent this stabilization [20].

Late-phase LTP can be induced in the slice preparation without dopamine application, a result that seemed to indicate that dopamine was not required. However, the block of LTP by dopamine antagonists suggests that dopamine is indeed required. This apparent contradiction was resolved by the demonstration that the extracellular stimulation procedure used to induce LTP releases endogenous dopamine from axon terminals in the hippocampus [7]. Another quandary was posed by early work showing that dopamine agonists alone could produce potentiation [21]. However, it was subsequently shown that this potentiation depended on the action of test pulses given during agonist application and that activation of NMDARs by the test pulses was required [22]. Thus, it is now clear that dopamine by itself does not produce potentiation; rather, it acts as a permissive signal for the late-phase LTP induced by an NMDAR-dependent process.

Results such as those shown in Fig. 1 (top traces) suggest that early LTP is completely unaffected by dopamine antagonists. Subsequent work, however, showed that, with less strong induction protocols, there is a partial inhibition of early LTP [23]. Thus, both early and late LTP are dopamine-dependent, but late LTP has a stronger dependence. Based on these results, one would predict that dopamine antagonists would produce a weak reduction of memory at short times after learning but a strong reduction at later times. In the next section we discuss experiments that tested this prediction.

D1 antagonists block 1 day old memory without affecting early memory

Plasticity in the hippocampus subserves the particular form of memory called episodic memory [24]. This form is rapidly acquired and is referenced to a specific time and place. The investigation of the role of dopamine in episodic memory has been facilitated by the development of a rodent model of episodic-like memory: after learning several reward-location associations in an environment, rats rapidly (i.e., in only one trial) update their memory if one of the reward-location associations is changed [25]. This one-trial memory for novel associations is critically dependent on the hippocampus and on glutamatergic neurotransmission during encoding [25, 26]. Intrahippocampal infusion of the D1/D5 dopaminergic antagonist SCH23390 just before encoding causes impaired memory for the new reward-location association measured 24 hours later, but has no effect on early memory measured at 30 minutes [26] (Fig. 2). A similar observation was made for one-trial encoding of the location of the escape platform in the Morris water maze [27]. These data demonstrate that D1/D5 receptor-mediated effects do not strongly affect encoding or early memory, but do affect memory at later times. Dopamine has no effect on the previously formed memories and is not required at the time of retrieval [27]. Further insight into the role of dopamine receptors in memory will come from the use of hippocampal-specific knockout of dopamine receptors (e.g. Sarinana and Tonegawa, Soc for Neurosci abstract 608.11, 2010).

Fig. 2
A D1 antagonist blocks long-term memory but not early memory (30min) for novel episodic-like information in rats. A In this event-arena apparatus, six sand wells are available to which the rat can run to collect food. Animals are trained over several ...

Firing properties of dopaminergic neurons

Given the importance of dopamine release for LTP and memory, it becomes crucial to understand the properties of dopaminergic neurons and the kinds of stimuli that lead to dopamine release. Midbrain dopamine neurons consist of three largely contiguous cell groups: the retrorubral field (cell group A8 in the rat nomenclature), the substantia nigra pars compacta (SNc, A9), and the ventral tegmental area (VTA, A10) [28, 29]. In primates, the SNc is further divided into a dorsal and ventral tier [30]. Although the SN and VTA are known as dopaminergic regions, some cells in these regions are GABAergic or glutamatergic, and others release both glutamate and dopamine [31, 32]. Dopamine cells can be identified by their electrophysiological waveform signature and firing characteristics [33–35].

Dopamine cells can be silent, fire single spikes somewhat irregularly (tonic mode), or fire brief bursts of spikes. Studies performed both in vivo and in vitro show that the tonic mode is driven by intrinsic pacemaker conductances [34, 36]. This firing is modulated by GABAergic inputs from the ventral pallidum, a structure in which cells have high spontaneous activity [37]. As a result of this inhibition, up to 50% of dopamine neurons are in the silent mode at baseline. Activation of the ventral subiculum (an output structure of the hippocampus) excites the nucleus accumbens (Nac), which, in turn, inhibits the ventral pallidum. This reduces the inhibition of dopamine cells and thereby increases the number of tonically firing cells [38, 39]. Tonically firing neurons (but not silent neurons) can be driven into a bursting mode by stimuli such as those associated with reward (see below). These bursts last about 200–500 msec, have interspike intervals <100 msec, and generally contain about 3–5 spikes. Burst firing in dopaminergic neurons is potently driven by the brainstem pedunculopontine tegmentum (PPTg), which provides a glutamatergic input that acts via NMDARs ([39, 40]; see Box 3). Recent work demonstrated the functional importance of these bursts: selective genetic inactivation of NMDARs in dopaminergic cells blocked burst production and prevented conditioned behavioral responses that were dependent on dopamine [41]; conversely, simulating burst firing using channelrhodopsin is sufficient to induce behavioral conditioning [42]. These results indicate that burst firing controls some downstream functions, but others may depend on how many cells are in the tonic mode (see Boxes 3 and 4).
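The burst criteria given above (interspike intervals under 100 msec, roughly 3–5 spikes per burst) can be illustrated with a simple spike-train grouping sketch in Python. The thresholds come from the text; the spike times are invented for illustration:

```python
# Group spike times (in msec) into bursts using the interspike-interval
# criterion from the text: consecutive spikes closer than 100 msec
# belong to the same burst, and a burst needs at least min_spikes spikes.
def find_bursts(spike_times_ms, max_isi=100.0, min_spikes=3):
    bursts = []
    current = [spike_times_ms[0]] if spike_times_ms else []
    for t_prev, t in zip(spike_times_ms, spike_times_ms[1:]):
        if t - t_prev < max_isi:
            current.append(t)       # still within the same burst
        else:
            if len(current) >= min_spikes:
                bursts.append(current)
            current = [t]           # gap too long: start a new group
    if len(current) >= min_spikes:
        bursts.append(current)
    return bursts

# A 4-spike burst followed by isolated tonic spikes:
find_bursts([0, 40, 90, 150, 600, 900])  # -> [[0, 40, 90, 150]]
```

Isolated tonic spikes (intervals well over 100 msec) never reach the spike-count threshold and so are not classified as bursts.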

Box 3

Circuits that control the firing of dopaminergic cells

The novelty processing hierarchy of the MTL (Figure I) is part of the circuitry that can regulate dopamine responses to novelty. The perirhinal cortex [65], for instance, shows much stronger responses to novel items than familiar items and may form connections to the PPTg (Figure I), a structure that regulates the SN/VTA. Strong evidence for hippocampal involvement in novelty-dependent dopamine release comes from experiments showing that this release is blocked by inactivating the subiculum, an output structure of the hippocampus that projects, by a polysynaptic pathway, to the SN/VTA (Figure I). In humans, hippocampal event-related potentials signaling unexpected events precede those in the Nac [111], which is compatible with the possibility that information flows from the hippocampus to the Nac and from there to the SN/VTA (Figure I). Hippocampal novelty signals incorporate context information by signaling mismatches for learned object–spatial location associations [112], temporal sequences [113] and during integrative encoding [49].

The PPTg regulates burst firing of dopamine cells rather than their tonic resting activity [39]. It is known to be driven by prefrontal afferents, and responds to single sensory events from different modalities with earlier burst firing than dopamine neurons [114]. It is unclear to what extent PPTg responses are modified by contextual factors and/or conditioning [115] or simply relay ‘accurately timed and attended sensory information’ [114]. The PPTg drive [116] depends on a permissive ‘gating’ input from the laterodorsal tegmentum (LDT) [107, 116, 117]. The LDT, in turn, receives substantial input from the medial prefrontal cortex (PFC) [118], allowing the PFC to indirectly affect mesolimbic dopaminergic neuron activity [118].

Another brain region known to participate in the modulation of dopamine neurons is the lateral habenula. The lateral habenula provides SN/VTA dopamine neurons with a negative reward-prediction signal through the fasciculus retroflexus fiber bundles [119]. The rostromedial tegmental nucleus (RMTg), a GABAergic afferent to midbrain dopaminergic neurons [120], may be the interface between the lateral habenula and the SN/VTA.

Box 3 Figure I


Connectivity within the MTL and the hippocampus-VTA loop. Information about objects and their spatial location converges in the entorhinal cortex via the perirhinal and parahippocampal cortices. These inputs can be integrated in the hippocampus towards ‘object-in-context’ information. Contextual representations in the hippocampus are not limited to spatial information, but may also include the temporal sequence of occurrence and anticipatory states associated with current goals and expectations. Hippocampal contextual novelty signals generated on the basis of such representations can increase the pool of tonically active dopamine neurons in the SN/VTA. This could allow perirhinal cortical information about stimulus-novelty to generate phasic dopamine responses (italics in the figure indicate that this possibility needs experimental testing). The loop is completed by ascending dopaminergic fibers that innervate the hippocampus and other MTL structures.

Box 4

Phasic and tonic dopamine signals affect both memory and behavior

The tonic and phasic dopamine signals have consequences for both memory and behavior. In a predictable, familiar (and safe) environment, the ventral subiculum would be in a low-activity state, causing the number of tonically firing dopaminergic neurons to be minimal. A phasic stimulus would produce burst firing only in the tonically firing dopamine neurons, leading to a moderate dopamine output. As a result, the attentional state of the animal would be little affected. However, if the animal is in an unpredictable, novel (and presumably less safe) context (e.g. hunting), a novelty signal from the ventral subiculum would raise the fraction of dopamine cells that are tonically firing. Under these conditions, a behaviorally relevant stimulus would cause a stronger phasic dopamine signal [107] (referred to as ‘saliency capture’ in [54, 61]).

A crucial control point in this process is the ventral striatum, which receives input from both the subiculum and PFC. Electrophysiological and behavioral studies suggest that these two inputs compete for regulation of the ventral striatum and are regulated in an opposite manner by the dopamine system. Tonic dopamine, which works through the higher-affinity D2 receptors, attenuates medial PFC (mPFC) inputs. In contrast, higher-concentration phasic dopamine signals activate D1 receptors and thereby enhance subicular input [121]. In a condition in which the animal’s behavior leads to a better than expected outcome and phasic release (i.e. positive prediction error), the subicular input would be potentiated, causing the animal to maintain focus on the current task, thus allowing rewarded behaviors to be maintained. However, in the absence of an expected reward, or if a worse than expected outcome occurs (i.e. negative prediction error), the consequent depression of dopaminergic neuron activity would attenuate dopamine-mediated potentiation of the ventral subicular input while at the same time removing inhibition of the mPFC input. Consequently, the mPFC would drive the animal to switch response strategies [122].

Dopamine release by reward, novelty and aversive stimuli

The types of stimuli that increase burst firing of dopaminergic neurons have been extensively studied in monkeys. Classical work showed the importance of reward as a stimulus for the bursting of these cells [43], while subsequent studies have demonstrated that a novel stimulus (in the absence of reward) can also produce burst firing [44]. More recent work has demonstrated conditions under which dopamine cells can be driven by aversive stimuli (reviewed in [45]; see also Box 2).

An important recent advance has been the ability to use functional magnetic resonance imaging (fMRI) to determine whether similar classes of stimuli activate the dopamine system in humans [46–49] (reviewed in [50]). Currently, most fMRI studies do not have sufficient resolution to distinguish subregions of the SN/VTA (this will become achievable with ultra-high-field fMRI, which offers sub-millimeter resolution). A further difficulty is that fMRI cannot distinguish between tonic and bursting modes. Despite these difficulties, there is converging evidence for a close relationship between the fMRI response in the SN/VTA and dopamine release (reviewed in [50]). Such studies (e.g. [48, 51]) have demonstrated that the human SN/VTA is activated by rewards, a finding confirmed by a recent study using electrophysiological recordings [52]. Other work using fMRI has demonstrated that activation can also occur due to aversive events [53] as well as novel stimuli [54]. Thus, novelty, reward stimuli and aversive stimuli are all able to activate the dopamine system in both monkeys and humans.

Effect of Motivation (Reward) on Behavioral Memory

It is to an organism’s advantage to recollect circumstances which led to rewards. Indeed, according to the neoHebbian framework, activating the SN/VTA by a reward should enhance hippocampal-dependent episodic memory of information present at the time of reward. Furthermore, this improvement should be small with short delays when early LTP is present, but large at much longer delays when late LTP is present. A cognitive paradigm used to test these predictions in humans involved activating the SN/VTA at the time of encoding by giving reward. It was found that the prospect of receiving reward (e.g., monetary gain) improved long-term episodic memory for novel stimuli in incidental [47, 55–57] and intentional [46] encoding paradigms. This reward-related memory enhancement was associated with a co-activation of SN/VTA, striatum, and hippocampus, as detected by fMRI [46, 47]. Memory enhancement after long retention intervals (e.g. 24 hours) has been consistently found [56, 57]. Moreover, the enhancement was greater at late timepoints than at early ones (i.e., 3 weeks vs. 20 minutes) [47].

It is noteworthy that earlier cognitive psychology experiments on the effects of reward on memory used shorter retention intervals than the aforementioned studies (e.g. [58, 59]). Such studies did not show consistent memory enhancement and led to the idea that motivation was unimportant for learning. From the perspective of the neoHebbian framework, this negative result is understandable; i.e., the SN/VTA activity-related enhancement of memory would be more apparent if testing is conducted after long retention intervals. However, more studies are needed to clarify whether reward-related activation of the SN/VTA is associated with memory enhancement only after long retention intervals, compatible with the effects of intra-hippocampal injections of specific dopamine antagonists (e.g., [26]) (see Boxes 4 and 5).

Importance of the novelty and aversive signals for memory

In changing environments, it is vital to form long-lasting memories for novel events and, by definition, this requires encoding and consolidation after a single exposure. The role of hippocampal dopamine in consolidating novel associations after only one trial in rats [26] raises the question of whether the human dopaminergic midbrain is activated by trial-unique novelty. Indeed, fMRI studies (for examples see Box 2 Figure 1) confirm SN/VTA activation by associative novelty [60] and stimulus novelty [61], but less so by other salient events such as rareness or negative emotional content [61]. Interestingly, the personality trait of novelty seeking is correlated with the magnitude of the SN/VTA novelty fMRI response even when novel stimuli predict the absence of reward [56], suggesting that this effect of novelty is hard to change by learning. In further support of such hard-wiring, positron emission tomography (PET) studies in humans show that novelty-seeking traits are inversely associated with D2-like receptor availability in the SN/VTA region [62] and are positively correlated with white matter connectivity between the hippocampus and the ventral striatum [63]. The human SN/VTA is also activated by cues that predict novel images, or by unexpected novel images that follow familiarity-predictive cues (i.e., an ‘unexpected novelty’ response) [64].

Some progress has been made in understanding the circuitry involved in novelty-dependent dopamine release (see Box 3). There is now converging evidence that different forms of novelty signals are computed along the medial temporal lobe (MTL) processing hierarchy from the parahippocampal and perirhinal cortices to the entorhinal cortex and from there to the hippocampus (see Box 3). In the perirhinal cortex, for instance, stimulus-related incoming information can be compared to stored information and a stimulus-related novelty signal is computed [65]. The hippocampus, on the other hand, integrates information about stimuli and their spatial location into a contextual representation and can compute various types of novel changes in these relationships.

It may seem evident that a general-purpose system like episodic memory should also enable long-term memory consolidation for events associated with aversive outcomes. Surprisingly, there is relatively little work comparing the consolidation of rewarding and aversive events (e.g. losses) in humans (although there is extensive work on the consolidation of emotionally negative stimuli). However, the observation that there are dopaminergic responses to aversive events (see Box 2) suggests that dopaminergic consolidation mechanisms similar to those seen with novelty and reward may apply. Consistent with this, learning of responses to aversive stimuli in rodents (i.e., during fear conditioning) is inhibited by dopaminergic antagonists [66–68]. Moreover, dopamine is required for association between stimuli and aversive events in the amygdala [69].

The synaptic tag and capture model: role of plasticity-related proteins

As noted before, the finding that late LTP is more strongly dependent on dopamine than early LTP has been influential because it inspired investigation into the effect of dopamine on early vs. late learning. This suggests that other physiological findings regarding the role of dopamine in LTP might also be relevant to behavior. There are indeed several additional results that fall into this category and these have been incorporated into the Synaptic Tagging and Capture model developed by Frey and Morris [9].

According to this model, weak stimulation induces only early LTP. In contrast, stronger stimulation produces the dopamine-dependent protein synthesis that allows late LTP. The proteins required for late LTP are termed plasticity-related proteins (PRPs). Given the synapse specificity of LTP, these proteins are presumed to interact only with synapses that have undergone Hebbian LTP. Frey and Morris propose that this specificity occurs because LTP immediately produces a synapse-specific “tag” and that PRPs can only produce late LTP at tagged synapses. However, the synapses that produce the PRPs can be different from those at which the tag is set. It appears that both the tag and the PRPs have considerable lifetimes. This follows from the finding that stimulation of protein synthesis and the induction of Hebbian plasticity do not need to be coincident; indeed, the trigger for protein synthesis can occur either before or after Hebbian plasticity (with intervening periods of many minutes). Taken together, these findings suggest that there is a “penumbra” surrounding events that cause dopamine release; the memory for events that occur before or after the dopamine release would depend not only on their own properties, but also on whether they fell within the penumbra of a dopamine-releasing stimulus.
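As a purely illustrative sketch of the tag-and-capture logic just described, the following toy rule treats the tag and the plasticity-related proteins (PRPs) as all-or-none signals with fixed lifetimes. The lifetimes, the rectangular time windows, and the function itself are our assumptions for illustration, not values from Frey and Morris [9]:

```python
def late_ltp_occurs(t_weak, t_strong, tag_lifetime=60.0, prp_lifetime=60.0):
    """Toy synaptic tagging-and-capture rule (times in minutes).

    A weak event at t_weak sets a synapse-specific tag (early LTP only);
    a strong, dopamine-releasing event at t_strong triggers synthesis of
    plasticity-related proteins (PRPs). Late LTP results only if the tag
    and the PRPs coexist at some moment. Because both signals persist,
    the strong event can either precede or follow the weak one -- this
    two-sided window is the "penumbra".
    """
    tag_window = (t_weak, t_weak + tag_lifetime)
    prp_window = (t_strong, t_strong + prp_lifetime)
    # Two intervals overlap iff each starts before the other ends.
    return tag_window[0] < prp_window[1] and prp_window[0] < tag_window[1]

# Strong event 30 min after the weak one: within the penumbra.
assert late_ltp_occurs(t_weak=0.0, t_strong=30.0)
# Strong event 30 min before the weak one: also within the penumbra.
assert late_ltp_occurs(t_weak=30.0, t_strong=0.0)
# Strong event hours later: the tag has decayed; only early LTP remains.
assert not late_ltp_occurs(t_weak=0.0, t_strong=300.0)
```

Note that in this sketch the duration of the penumbra is simply the sum of the tag and PRP lifetimes; in reality these decay gradually and need not be symmetric in time.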

Testing the synaptic tag and capture model at the behavioral level

A number of animal behavioral studies have used novelty exploration as a ‘strong’ event that is likely to induce dopamine release [70–72]. In agreement with the synaptic tagging and capture model, exploration of a novel environment can make a memory for a ‘weak’ event more persistent [70–72]. For instance, a single exposure to the location of a small reward is rapidly forgotten in rodents within a few hours, but persists for at least 24 hours if it is preceded or followed (within 30 minutes) by novelty exploration [72]. Also consistent with the synaptic tagging and capture hypothesis, intrahippocampal application of the D1/D5 receptor antagonist SCH23390 during novelty exploration diminishes memory retention [72].

Similar to the enhancement of long-term memory by reward in humans [47, 56, 57, 73], 24-hour memory is enhanced in rats for rewarded locations (even after only a single exposure); moreover, this long-term memory can be blocked by a D1/D5 receptor antagonist at the time of encoding [72]. That this drug-induced memory impairment is not due to an effect of dopamine on encoding is nicely demonstrated by the rescue of long-term memory if there is a preceding exposure to a novel environment [72].

Studies of LTP have shown that strong stimuli can enhance the encoding of weak stimuli even if the strong stimulus is given many minutes before or after the weak one. We term this temporal smearing the “penumbra” of the strong stimulus. Experiments have been conducted in humans that specifically looked for this penumbra. Individuals were exposed to novel photographs of natural scenes (‘strong events’, like those one would expect to see in the magazine National Geographic) for 5 minutes before being presented with the ‘weak’ encoding events: single words that were pre-familiarized on the day before and hence were not by themselves novel. The result was a higher proportion of words that were recollected (as opposed to merely recognized on the basis of familiarity) when compared to individuals who were exposed to familiar pictures prior to encoding. This effect was stronger 24 hours after encoding than 30 minutes after encoding [74]. Of note, pre-exposure to familiar, but emotionally very negative, photographs did not cause a comparable enhancement. It is therefore unlikely that affective arousal can account for the memory enhancement after novelty exposure. Hence, the behavioral penumbra seen after exposure to novelty in humans is compatible with the synaptic tag and capture model.

Although the concept of a penumbra is promising, further work to investigate its experimental basis is required. To understand factors that determine the duration of the penumbra, measurements need to be made on the duration of the dopamine signal in the hippocampus. The dopamine transporter (DAT), which rapidly clears dopamine from the synaptic cleft, is much less abundant in the hippocampus than in the striatum [75], so the duration of dopamine elevations may be relatively long. Recent elegant work is providing information about the spatial and temporal properties of protein-synthesis processes during LTP [76], which may also influence the time course of the penumbra. Behavioral experiments suggest that there are differences in the penumbra duration for novel as compared to rewarding events [57] (see Box 5). Perhaps a mechanistic understanding of the underlying events will be able to explain this difference. Additional experiments at the behavioral level are needed to examine the role of attention or arousal in the penumbra effect: if strong stimulation enhances the memory of a subsequent weak stimulus, perhaps this is due to stimulation of attention or arousal by the strong stimulus or to effects at encoding (see Box 5).

Concluding Comments

The results reviewed here indicate that the neoHebbian framework has explanatory power in the understanding of memory. A key insight to emerge from physiological experiments is that LTP at short times is relatively insensitive to dopamine, whereas LTP with longer delays is sensitive. Moreover, other properties of LTP (weak/strong; penumbra) appear also to have predictive power in explaining properties of memory.

The idea that long-term memory depends on more than the Hebbian condition makes sense, given the likelihood that too much information written into memory might lead to overwriting (and thus degradation) of pre-existing memories [77]. According to the neoHebbian model, the default condition is that information is stored automatically in the hippocampus by early LTP based on a local Hebbian process. Generally, this newly stored information will fade away as early LTP declines. Only if a systems-wide computation determines that there is a high level of novelty or motivational salience will dopamine be released, allowing the biochemical processes of late LTP to incorporate the information into long-term memory.

The conditions that lead to activation of the human dopamine system, as judged by fMRI signals, are similar to the conditions leading to activation in non-primate mammals. It therefore seems likely that these principles may apply across all mammals. Nevertheless, experiments that further test the neoHebbian ideas in humans are warranted. An important prediction is that long-term memory would be affected positively (but possibly with an inverted U-shape pattern) by drugs that increase dopamine availability, such as levodopa (L-DOPA), which might enhance the burst-evoked release of dopamine. There has been remarkably little work on the effect of dopaminergic manipulations on human long-term memory. The long-term safety of available dopaminergic drugs is an important problem that needs to be overcome. Alternatively, progress may be made by taking advantage of genetic diversity in the human population with respect to genes that strongly affect dopaminergic function, which has already proven valuable for studying the human reward system (e.g. [78]).

A better understanding of the human dopaminergic system is relevant to several human neurological or psychiatric disorders and to aging. Parkinson’s disease (PD) usually begins with loss of nigrostriatal dopamine cells in the ventral tier SNc, leaving the dorsal tier projection neurons to the hippocampus, ventral striatum, and cortex intact [79]. It is thus understandable that severe episodic memory problems are not a symptom of early PD [80]. However, at late stages, when all types of dopaminergic projection neurons are affected, memory problems do become severe. In schizophrenia, the enhanced activation of the hippocampal system may increase dopamine release and skew behavioral choice towards repetitive behavior (see Box 3). In aging, there is now converging evidence from human imaging studies for a structural degeneration of the VTA/SN (for reviews, see [54, 81]). Consistent findings have also been recently reported in aged rats [82]. The possible negative effect of such a decline on the ability of information to enter into long-term memory has not yet been adequately considered.

Understanding the role of dopamine in memory may suggest ways to improve learning and teaching methods. First, based on the penumbra hypothesis, exposure to novel or salient information could be used to improve the long-term persistence of other information given in temporal proximity. Second, strategies based on the ability of external reward or reward anticipation to release dopamine might prove useful in increasing the retention of learned material. Finally, just as with positive external feedback on performance, memory persistence might benefit from internally generated reward signals. For instance, the ability to recollect newly acquired information may be intrinsically rewarding. In fact, the study of human learning has revealed an interesting puzzle: long-term retention is not helped by simple re-exposure to recently learned material but is greatly helped by retesting, even when subjects already know the answer [83]. One interesting possibility is that retesting provides an opportunity to generate intrinsic reward signals, thereby enhancing long-term persistence of newly learned material.

Acknowledgments

This research was supported by National Institutes of Health grants (R01 NS027337 and R01 DA027807 to JL, MH57440 and NS15408 to A.A.G), the Deutsche Forschungsgemeinschaft (SFB 776, TP A7 and KFO 163 to E.D.), the German Centre for Neurodegenerative Disorders (DZNE), and a National Science Foundation (NSF) grant (SBE-0935288) to J.L.

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

References

1. Cicero. On Oratory and Orators (Landmarks in Rhetoric and Public Address). Watson JS, translator; Potter D, editor. Southern Illinois University Press; 1986.
2. Hebb DO. The Organization of Behavior: A Neuropsychological Theory. John Wiley and Sons, Inc; 1949.
3. Wigstrom H, Gustafsson B. Postsynaptic control of hippocampal long-term potentiation. J Physiol (Paris) 1986;81:228–236. [PubMed]
4. Malinow R, Miller JP. Postsynaptic hyperpolarization during conditioning reversibly blocks induction of long-term potentiation. Nature. 1986;320:529–530. [PubMed]
5. Matsuzaki M, et al. Structural basis of long-term potentiation in single dendritic spines. Nature. 2004;429:761–766. [PMC free article] [PubMed]
6. Lisman J, et al. The molecular basis of CaMKII function in synaptic and behavioural memory. Nature Reviews Neuroscience. 2002;3:175–190. [PubMed]
7. Frey U, et al. Dopaminergic antagonists prevent long-term maintenance of posttetanic LTP in the CA1 region of rat hippocampal slices. Brain research. 1990;522:69–75. [PubMed]
8. Lisman JE, Grace AA. The hippocampal-VTA loop: controlling the entry of information into long-term memory. Neuron. 2005;46:703–713. [PubMed]
9. Frey U, Morris RG. Synaptic tagging and long-term potentiation. Nature. 1997;385:533–536. [PubMed]
10. Frey S, Frey JU. ‘Synaptic tagging’ and ‘cross-tagging’ and related associative reinforcement processes of functional plasticity as the cellular basis for memory formation. Progress in brain research. 2008;169:117–143. [PubMed]
11. Shohamy D, Adcock RA. Dopamine and adaptive memory. Trends Cogn Sci. 2010;14:464–472. [PubMed]
12. Redondo RL, Morris RG. Making memories last: the synaptic tagging and capture hypothesis. Nature Reviews Neuroscience. 2011;12:17–30. [PubMed]
13. Frey U, et al. Effects of cAMP simulate a late stage of LTP in hippocampal CA1 neurons. Science. 1993;260:1661–1664. [PubMed]
14. O’Carroll CM, Morris RG. Heterosynaptic co-activation of glutamatergic and dopaminergic afferents is required to induce persistent long-term potentiation. Neuropharmacology. 2004;47:324–332. [PubMed]
15. Matthies H, et al. Dopamine D1-deficient mutant mice do not express the late phase of hippocampal long-term potentiation. Neuroreport. 1997;8:3533–3535. [PubMed]
16. Granado N, et al. D1 but not D5 dopamine receptors are critical for LTP, spatial learning, and LTP-Induced arc and zif268 expression in the hippocampus. Cereb Cortex. 2008;18:1–12. [PubMed]
17. Swanson-Park JL, et al. A double dissociation within the hippocampus of dopamine D1/D5 receptor and beta-adrenergic receptor contributions to the persistence of long-term potentiation. Neuroscience. 1999;92:485–497. [PubMed]
18. Yang HW, et al. Change in bi-directional plasticity at CA1 synapses in hippocampal slices taken from 6-hydroxydopamine-treated rats: the role of endogenous norepinephrine. The European journal of neuroscience. 2002;16:1117–1128. [PubMed]
19. Ihalainen JA, et al. Comparison of dopamine and noradrenaline release in mouse prefrontal cortex, striatum and hippocampus using microdialysis. Neurosci Lett. 1999;277:71–74. [PubMed]
20. Kentros CG, et al. Increased attention to spatial context increases both place field stability and spatial memory. Neuron. 2004;42:283–295. [PubMed]
21. Huang YY, Kandel ER. D1/D5 receptor agonists induce a protein synthesis-dependent late potentiation in the CA1 region of the hippocampus. Proceedings of the National Academy of Sciences of the United States of America. 1995;92:2446–2450. [PubMed]
22. Navakkode S, et al. Synergistic requirements for the induction of dopaminergic D1/D5-receptor-mediated LTP in hippocampal slices of rat CA1 in vitro. Neuropharmacology. 2007;52:1547–1554. [PubMed]
23. Otmakhova NA, Lisman JE. D1/D5 dopamine receptor activation increases the magnitude of early long-term potentiation at CA1 hippocampal synapses. J Neurosci. 1996;16:7478–7486. [PubMed]
24. Tulving E. Memory and consciousness. Canadian Psychology. 1985;26:1–12.
25. Tse D, et al. Schemas and memory consolidation. Science. 2007;316:76–82. [PubMed]
26. Bethus I, et al. Dopamine and memory: modulation of the persistence of memory for novel hippocampal NMDA receptor-dependent paired associates. J Neurosci. 2010;30:1610–1618. [PubMed]
27. O’Carroll CM, et al. Dopaminergic modulation of the persistence of one-trial hippocampus-dependent memory. Learning & Memory. 2006;13:760–769. [PubMed]
28. Dahlström A, Fuxe K. Evidence for the existence of monoamine containing neurons in the central nervous system. I. Demonstration of monoamines in the cell bodies of brain stem neurons. Acta Physiol Scand. 1964;62:1–80. [PubMed]
29. Beckstead RM, et al. Efferent connections of the substantia nigra and ventral tegmental area in the rat. Brain research. 1979;175:191–217. [PubMed]
30. Haber SN, Knutson B. The reward circuit: linking primate anatomy and human imaging. Neuropsychopharmacology. 2010;35:4–26. [PMC free article] [PubMed]
31. Nair-Roberts RG, et al. Stereological estimates of dopaminergic, GABAergic and glutamatergic neurons in the ventral tegmental area, substantia nigra and retrorubral field in the rat. Neuroscience. 2008;152:1024–1031. [PMC free article] [PubMed]
32. Hnasko TS, et al. Vesicular glutamate transport promotes dopamine storage and glutamate corelease in vivo. Neuron. 2010;65:643–656. [PMC free article] [PubMed]
33. Grace AA, Bunney BS. Nigral dopamine neurons: intracellular recording and identification with L-dopa injection and histofluorescence. Science. 1980;210:654–656. [PubMed]
34. Grace AA, Bunney BS. Intracellular and extracellular electrophysiology of nigral dopaminergic neurons--1. Identification and characterization. Neuroscience. 1983;10:301–315. [PubMed]
35. Guyenet PG, Aghajanian GK. Antidromic identification of dopaminergic and other output neurons of the rat substantia nigra. Brain research. 1978;150:69–84. [PubMed]
36. Grace AA, Onn SP. Morphology and electrophysiological properties of immunocytochemically identified rat dopamine neurons recorded in vitro. J Neurosci. 1989;9:3463–3481. [PubMed]
37. Yim CY, Mogenson GJ. Response of ventral pallidal neurons to amygdala stimulation and its modulation by dopamine projections to nucleus accumbens. Journal of Neurophysiology. 1983;50:148–161. [PubMed]
38. Floresco SB, et al. Glutamatergic afferents from the hippocampus to the nucleus accumbens regulate activity of ventral tegmental area dopamine neurons. Journal of Neuroscience. 2001;21:4915–4922. [PubMed]
39. Floresco SB, et al. Afferent modulation of dopamine neuron firing differentially regulates tonic and phasic dopamine transmission. Nature neuroscience. 2003;6:968–973. [PubMed]
40. Lodge DJ, Grace AA. The hippocampus modulates dopamine neuron responsivity by regulating the intensity of phasic neuron activation. Neuropsychopharmacology. 2006;31:1356–1361. [PubMed]
41. Zweifel LS, et al. Disruption of NMDAR-dependent burst firing by dopamine neurons provides selective assessment of phasic dopamine-dependent behavior. Proceedings of the National Academy of Sciences of the United States of America. 2009;106:7281–7288. [PubMed]
42. Tsai HC, et al. Phasic firing in dopaminergic neurons is sufficient for behavioral conditioning. Science. 2009;324:1080–1084. [PubMed]
43. Schultz W. Multiple dopamine functions at different time courses. Annual review of neuroscience. 2007;30:259–288. [PubMed]
44. Ljungberg T, et al. Responses of monkey dopamine neurons during learning of behavioral reactions. J Neurophysiol. 1992;67:145–163. [PubMed]
45. Bromberg-Martin ES, et al. Dopamine in motivational control: rewarding, aversive, and alerting. Neuron. 2010;68:815–834. [PMC free article] [PubMed]
46. Adcock RA, et al. Reward-motivated learning: mesolimbic activation precedes memory formation. Neuron. 2006;50:507–517. [PubMed]
47. Wittmann BC, et al. Reward-related FMRI activation of dopaminergic midbrain is associated with enhanced hippocampus-dependent long-term memory formation. Neuron. 2005;45:459–467. [PubMed]
48. D’Ardenne K, et al. BOLD responses reflecting dopaminergic signals in the human ventral tegmental area. Science. 2008;319:1264–1267. [PubMed]
49. Shohamy D, Wagner AD. Integrating memories in the human brain: hippocampal-midbrain encoding of overlapping events. Neuron. 2008;60:378–389. [PMC free article] [PubMed]
50. Duzel E, et al. Functional imaging of the human dopaminergic midbrain. Trends in neurosciences. 2009;32:321–328. [PubMed]
51. Guitart-Masip M, et al. Contextual novelty changes reward representations in the striatum. J Neurosci. 2010;30:1721–1726. [PMC free article] [PubMed]
52. Zaghloul KA, et al. Human substantia nigra neurons encode unexpected financial rewards. Science. 2009;323:1496–1499. [PMC free article] [PubMed]
53. Guitart-Masip M, et al. Action dominates valence in anticipatory representations in the human striatum and dopaminergic midbrain. Journal of Neuroscience. 2011;31:7867–7875. [PMC free article] [PubMed]
54. Duzel E, et al. Novelty-related motivation of anticipation and exploration by dopamine (NOMAD): implications for healthy aging. Neuroscience and biobehavioral reviews. 2010;34:660–669. [PubMed]
55. Callan DE, Schweighofer N. Positive and negative modulation of word learning by reward anticipation. Human brain mapping. 2008;29:237–249. [PubMed]
56. Krebs RM, et al. Personality traits are differentially associated with patterns of reward and novelty processing in the human substantia nigra/ventral tegmental area. Biological psychiatry. 2009;65:103–110. [PubMed]
57. Wittmann B, et al. Behavioural specifications of reward-associated long-term memory enhancement in humans. Learning and Memory. 2011;18:296–300. [PMC free article] [PubMed]
58. Craik FIM, Tulving E. Depth of Processing and the Retention of Words in Episodic Memory. Journal of Experimental Psychology: General. 1975;3:268–294.
59. Ngaosuvan L, Mantyla T. Rewarded remembering: dissociations between self-rated motivation and memory performance. Scandinavian journal of psychology. 2005;46:323–330. [PubMed]
60. Schott BH, et al. Activation of midbrain structures by associative novelty and the formation of explicit memory in humans. Learning & Memory. 2004;11:383–387. [PubMed]
61. Bunzeck N, Duzel E. Absolute stimulus-novelty is coded by the human substantia nigra/VTA. Neuron. 2006;3:369–379. [PubMed]
62. Zald DH, et al. Midbrain dopamine receptor availability is inversely associated with novelty-seeking traits in humans. J Neurosci. 2008;28:14372–14378. [PMC free article] [PubMed]
63. Cohen MX, et al. Connectivity-based segregation of the human striatum predicts personality characteristics. Nature neuroscience. 2009;12:32–34. [PubMed]
64. Wittmann BC, et al. Anticipation of novelty recruits reward system and hippocampus while promoting recollection. Neuroimage. 2007;38:194–202. [PMC free article] [PubMed]
65. Xiang JZ, Brown MW. Differential neuronal encoding of novelty, familiarity and recency in regions of the anterior temporal lobe. Neuropharmacology. 1998;37:657–676. [PubMed]
66. Fadok JP, et al. Dopamine is necessary for cue-dependent fear conditioning. J Neurosci. 2009;29:11089–11097. [PMC free article] [PubMed]
67. Ortiz O, et al. Associative learning and CA3-CA1 synaptic plasticity are impaired in D1R null, Drd1a−/− mice and in hippocampal siRNA silenced Drd1a mice. J Neurosci. 2010;30:12288–12300. [PubMed]
68. Shen YL, et al. Dopamine receptor antagonists impair place conditioning after acute stress in rats. Behav Pharmacol. 2010;21:77–82. [PubMed]
69. Rosenkranz JA, Grace AA. Dopamine-mediated modulation of odour-evoked amygdala potentials during pavlovian conditioning. Nature. 2002;417:282–287. [PubMed]
70. Moncada D, Viola H. Induction of long-term memory by exposure to novelty requires protein synthesis: evidence for a behavioral tagging. J Neurosci. 2007;27:7476–7481. [PubMed]
71. Ballarini F, et al. Behavioral tagging is a general mechanism of long-term memory formation. Proceedings of the National Academy of Sciences of the United States of America. 2009;106:14599–14604. [PubMed]
72. Wang SH, et al. Relevance of synaptic tagging and capture to the persistence of long-term potentiation and everyday spatial memory. Proceedings of the National Academy of Sciences of the United States of America. 2010;107:19537–19542. [PubMed]
73. Bunzeck N, et al. A common mechanism for adaptive scaling of reward and novelty. Human brain mapping. 2010;31:1380–1394. [PMC free article] [PubMed]
74. Fenker DB, et al. Novel scenes improve recollection and recall of words. J Cogn Neurosci. 2008;20:1250–1265. [PubMed]
75. Schott BH, et al. The dopaminergic midbrain participates in human episodic memory formation: evidence from genetic imaging. J Neurosci. 2006;26:1407–1417. [PubMed]
76. Govindarajan A, et al. The dendritic branch is the preferred integrative unit for protein synthesis-dependent LTP. Neuron. 2011;69:132–146. [PMC free article] [PubMed]
77. Blumenfeld B, et al. Dynamics of memory representations in networks with novelty-facilitated synaptic plasticity. Neuron. 2006;52:383–394. [PubMed]
78. Dreher JC, et al. Variation in dopamine genes influences responsivity of the human reward system. Proceedings of the National Academy of Sciences of the United States of America. 2009;106:617–622. [PubMed]
79. Fearnley JM, Lees AJ. Ageing and Parkinson’s disease: substantia nigra regional selectivity. Brain. 1991;114 (Pt 5):2283–2301. [PubMed]
80. Perry RJ, Hodges JR. Spectrum of memory dysfunction in degenerative disease. Current opinion in neurology. 1996;9:281–285. [PubMed]
81. Backman L, et al. Linking cognitive aging to alterations in dopamine neurotransmitter functioning: recent data and future avenues. Neuroscience and biobehavioral reviews. 2010;34:670–677. [PubMed]
82. Sanchez HL, et al. Dopaminergic mesencephalic systems and behavioral performance in very old rats. Neuroscience. 2008;154:1598–1606. [PubMed]
83. Karpicke JD, Roediger HL, 3rd. The critical importance of retrieval for learning. Science. 2008;319:966–968. [PubMed]
84. Osten P, et al. Protein synthesis-dependent formation of protein kinase Mzeta in long-term potentiation. J Neurosci. 1996;16:2444–2451. [PubMed]
85. Smith WB, et al. Dopaminergic stimulation of local protein synthesis enhances surface expression of GluR1 and synaptic transmission in hippocampal neurons. Neuron. 2005;45:765–779. [PubMed]
86. Lu Y, et al. BDNF: a key regulator for protein synthesis-dependent LTP and long-term memory? Neurobiology of learning and memory. 2008;89:312–323. [PMC free article] [PubMed]
87. Yoshii A, Constantine-Paton M. Postsynaptic BDNF-TrkB signaling in synapse maturation, plasticity, and disease. Dev Neurobiol. 2010;70:304–322. [PMC free article] [PubMed]
88. Kelleher RJ, 3rd, et al. Translational control by MAPK signaling in long-term synaptic plasticity and memory. Cell. 2004;116:467–479. [PubMed]
89. Valjent E, et al. Regulation of a protein phosphatase cascade allows convergent dopamine and glutamate signals to activate ERK in the striatum. Proceedings of the National Academy of Sciences of the United States of America. 2005;102:491–496. [PubMed]
90. De Leonibus E, et al. Distinct kinds of novelty processing differentially increase extracellular dopamine in different brain regions. The European journal of neuroscience. 2006;23:1332–1340. [PubMed]
91. Bassareo V, et al. Differential Expression of Motivational Stimulus Properties by Dopamine in Nucleus Accumbens Shell versus Core and Prefrontal Cortex. J Neurosci. 2002;22:4709–4719. [PubMed]
92. Rebec GV. Real-time assessments of dopamine function during behavior: single-unit recording, iontophoresis, and fast-scan cyclic voltammetry in awake, unrestrained rats. Alcohol Clin Exp Res. 1998;22:32–40. [PubMed]
93. Frank MJ, Fossella JA. Neurogenetics and pharmacology of learning, motivation, and cognition. Neuropsychopharmacology. 2011;36:133–152. [PMC free article] [PubMed]
94. Goto Y, Grace AA. Dopamine-dependent interactions between limbic and prefrontal cortical plasticity in the nucleus accumbens: disruption by cocaine sensitization. Neuron. 2005;47:255–266. [PubMed]
95. Belujon P, et al. Aberrant striatal plasticity is specifically associated with dyskinesia following levodopa treatment. Mov Disord. 2010;25:1568–1576. [PMC free article] [PubMed]
96. Calabresi P, et al. Dopamine-mediated regulation of corticostriatal synaptic plasticity. Trends in neurosciences. 2007;30:211–219. [PubMed]
97. Matsumoto M, Hikosaka O. Two types of dopamine neuron distinctly convey positive and negative motivational signals. Nature. 2009;459:837–841. [PMC free article] [PubMed]
98. Mirenowicz J, Schultz W. Preferential activation of midbrain dopamine neurons by appetitive rather than aversive stimuli. Nature. 1996;379:449–451.
99. Brischoux F, et al. Phasic excitation of dopamine neurons in ventral VTA by noxious stimuli. Proc Natl Acad Sci U S A. 2009;106:4894–4899.
100. Brown MT, et al. Activity of neurochemically heterogeneous dopaminergic neurons in the substantia nigra during spontaneous and driven changes in brain state. J Neurosci. 2009;29:2915–2925.
101. Wang DV, Tsien JZ. Convergent processing of both positive and negative motivational signals by the VTA dopamine neuronal populations. PLoS One. 2011;6:e17047.
102. Valenti O, et al. Aversive stimuli alter ventral tegmental area dopamine neuron activity via a common action in the ventral hippocampus. J Neurosci. 2011;31:4280–4289.
103. Piazza PV, Le Moal M. The role of stress in drug self-administration. Trends Pharmacol Sci. 1998;19:67–74.
104. Imperato A, et al. Repeated stressful experiences differently affect the time-dependent responses of the mesolimbic dopamine system to the stressor. Brain Res. 1993;601:333–336.
105. Flagel SB, et al. A selective role for dopamine in stimulus-reward learning. Nature. 2010;469:53–57.
106. Hnasko TS, et al. Morphine reward in dopamine-deficient mice. Nature. 2005;438:854–857.
107. Grace AA, et al. Regulation of firing of dopaminergic neurons and control of goal-directed behaviors. Trends Neurosci. 2007;30:220–227.
108. Grace AA. Dopamine system dysregulation by the ventral subiculum as the common pathophysiological basis for schizophrenia psychosis, psychostimulant abuse, and stress. Neurotox Res. 2010;18:367–376.
109. Bunzeck N, et al. Mesolimbic novelty processing in older adults. Cereb Cortex. 2007;17:2940–2948.
110. Bunzeck N, et al. Contextual interaction between novelty and reward processing within the mesolimbic system. Hum Brain Mapp. 2011.
111. Axmacher N, et al. Intracranial EEG correlates of expectancy and memory formation in the human hippocampus and nucleus accumbens. Neuron. 2010;65:541–549.
112. Eichenbaum H, Lipton PA. Towards a functional organization of the medial temporal lobe memory system: role of the parahippocampal and medial entorhinal cortical areas. Hippocampus. 2008;18:1314–1324.
113. Kumaran D, Maguire EA. An unexpected sequence of events: mismatch detection in the human hippocampus. PLoS Biol. 2006;4:e424.
114. Pan WX, Hyland BI. Pedunculopontine tegmental nucleus controls conditioned responses of midbrain dopamine neurons in behaving rats. J Neurosci. 2005;25:4725–4732.
115. Dormont JF, et al. The role of the pedunculopontine tegmental nucleus in relation to conditioned motor performance in the cat. I. Context-dependent and reinforcement-related single unit activity. Exp Brain Res. 1998;121:401–410.
116. Mesulam MM, et al. Human reticular formation: cholinergic neurons of the pedunculopontine and laterodorsal tegmental nuclei and some cytochemical comparisons to forebrain cholinergic neurons. J Comp Neurol. 1989;283:611–633.
117. Lodge DJ, Grace AA. The laterodorsal tegmentum is essential for burst firing of ventral tegmental area dopamine neurons. Proc Natl Acad Sci U S A. 2006;103:5167–5172.
118. Sesack SR, et al. Anatomical substrates for glutamate-dopamine interactions: evidence for specificity of connections and extrasynaptic actions. Ann N Y Acad Sci. 2003;1003:36–52.
119. Matsumoto M, Hikosaka O. Lateral habenula as a source of negative reward signals in dopamine neurons. Nature. 2007;447:1111–1115.
120. Jhou TC, et al. The rostromedial tegmental nucleus (RMTg), a GABAergic afferent to midbrain dopamine neurons, encodes aversive stimuli and inhibits motor responses. Neuron. 2009;61:786–800.
121. Goto Y, Grace AA. Dopaminergic modulation of limbic and cortical drive of nucleus accumbens in goal-directed behavior. Nat Neurosci. 2005;8:805–812.
122. Goto Y, Grace AA. Limbic and cortical information processing in the nucleus accumbens. Trends Neurosci. 2008;31:552–558.
123. Fischer H, et al. Simulating neurocognitive aging: effects of a dopaminergic antagonist on brain activity during working memory. Biol Psychiatry. 2010;67:575–580.
124. Dagher A, Robbins TW. Personality, addiction, dopamine: insights from Parkinson’s disease. Neuron. 2009;61:502–510.
125. Cools R. Dopaminergic modulation of cognitive function-implications for L-DOPA treatment in Parkinson’s disease. Neurosci Biobehav Rev. 2006;30:1–23.
126. Knecht S, et al. Levodopa: faster and better word learning in normal humans. Ann Neurol. 2004;56:20–26.
127. Balderas I, et al. The consolidation of object and context recognition memory involve different regions of the temporal lobe. Learn Mem. 2008;15:618–624.
128. Otmakhova NA, Lisman JE. Dopamine selectively inhibits the direct cortical pathway to the CA1 hippocampal region. J Neurosci. 1999;19:1437–1445.
129. Peters Y, et al. Prefrontal cortical up states are synchronized with ventral tegmental area activity. Synapse. 2004;52:143–152.
130. Benchenane K, et al. Coherent theta oscillations and reorganization of spike timing in the hippocampal-prefrontal network upon learning. Neuron. 2010;66:921–936.
131. Swant J, et al. Postsynaptic dopamine D3 receptor modulation of evoked IPSCs via GABA(A) receptor endocytosis in rat hippocampus. Hippocampus. 2008;18:492–502.
132. O’Donnell P. Dopamine gating of forebrain neural ensembles. Eur J Neurosci. 2003;17:429–435.
133. Kakade S, Dayan P. Dopamine: generalization and bonuses. Neural Netw. 2002;15:549–559.
134. Fanselow MS, Dong HW. Are the dorsal and ventral hippocampus functionally distinct structures? Neuron. 2010;65:7–19.
135. Puryear CB, et al. Conjunctive encoding of movement and reward by ventral tegmental area neurons in the freely navigating rodent. Behav Neurosci. 2010;124:234–247.
136. Sarter M, et al. Unraveling the attentional functions of cortical cholinergic inputs: interactions between signal-driven and cognitive modulation of signal detection. Brain Res Brain Res Rev. 2005;48:98–111.
137. Walling SG, Harley CW. Locus ceruleus activation initiates delayed synaptic potentiation of perforant path input to the dentate gyrus in awake rats: a novel beta-adrenergic- and protein synthesis-dependent mammalian plasticity mechanism. J Neurosci. 2004;24:598–604.
138. Neugebauer F, et al. Modulation of extracellular monoamine transmitter concentrations in the hippocampus after weak and strong tetanization of the perforant path in freely moving rats. Brain Res. 2009;1273:29–38.
139. Fitch TE, et al. Dopamine D1/5 receptor modulation of firing rate and bidirectional theta burst firing in medial septal/vertical limb of diagonal band neurons in vivo. J Neurophysiol. 2006;95:2808–2820.
140. Aston-Jones G, Cohen JD. An integrative theory of locus coeruleus-norepinephrine function: adaptive gain and optimal performance. Annu Rev Neurosci. 2005;28:403–450.
141. Tully K, Bolshakov VY. Emotional enhancement of memory: how norepinephrine enables synaptic plasticity. Mol Brain. 2010;3:15.
142. Shimizu E, et al. NMDA receptor-dependent synaptic reinforcement as a crucial process for memory consolidation. Science. 2000;290:1170–1174.
143. O’Neill J, et al. Play it again: reactivation of waking experience and memory. Trends Neurosci. 2010;33:220–229.
144. Rossato JI, et al. Dopamine controls persistence of long-term memory storage. Science. 2009;325:1017–1020.
145. Guzman-Ramos K, et al. Off-line concomitant release of dopamine and glutamate involvement in taste memory consolidation. J Neurochem. 2010;114:226–236.