Neuronal oscillations appear throughout the nervous system, in structures as diverse as the cerebral cortex, hippocampus, subcortical nuclei and sense organs. Whether neural rhythms contribute to normal function, are merely epiphenomena, or even interfere with physiological processing are topics of vigorous debate. Sensory pathways are ideal for investigation of oscillatory activity because their inputs can be defined. Thus, we will focus on sensory systems as we ask how neural oscillations arise and how they might encode information about the stimulus. We will highlight recent work in the early visual pathway that shows how oscillations can multiplex different types of signals to increase the amount of information that spike trains encode and transmit. Last, we will describe oscillation-based models of visual processing and explore how they might guide further research.
Oscillatory neural activity has been observed in the early stages of virtually every sensory system of animals, including insects, frogs and primates. For example, periodic activity has been recorded from olfactory organs (Bressler and Freeman, 1980; Laurent, 2002) as well as somatosensory (Ahissar and Vaadia, 1990; Ahissar et al., 2000), visual (Laufer and Verzeano, 1967; Neuenschwander and Singer, 1996; Arai et al., 2004) and auditory pathways (Langner, 1992; Eguiluz et al., 2000; Roberts and Rutherford, 2008).
The strength of neural oscillations can be assessed in the time and frequency domains. In the time domain, oscillations in spike trains are often characterized by means of periodic peaks in the autocorrelogram (Figures 1A,B); in the frequency domain, they are defined on the basis of peaks in the power spectrum (Figures 1C,D). However, autocorrelograms are subject to confounds caused by the refractory period, and spectral peaks often fail to reveal weak rhythms. A new method, the oscillation score (Muresan et al., 2008), reduces these problems. It combines analyses in the time and frequency domains to indicate the strength of oscillations as a single dimensionless number. Also, oscillations shared by local groups of cells can be detected in population responses, such as the local field potential (LFP), or in patterns of synaptic input.
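To make the frequency-domain assessment concrete, the sketch below computes a crude spectral measure of oscillation strength for a synthetic spike train: the peak power in the gamma band divided by the mean power elsewhere. It is a simplified stand-in for the published oscillation score, not the Muresan et al. (2008) algorithm; the band limits, simulation parameters and function names are ours.

```python
import numpy as np

def binned_spike_train(rate_hz, osc_hz, depth, dur_s, dt, rng):
    """Poisson spikes (one bin = dt) whose rate is sinusoidally
    modulated at osc_hz with the given modulation depth."""
    t = np.arange(0, dur_s, dt)
    rate = rate_hz * (1.0 + depth * np.sin(2 * np.pi * osc_hz * t))
    return (rng.random(t.size) < rate * dt).astype(float)

def oscillation_score(spikes, dt, band=(30.0, 80.0)):
    """Crude score: peak power inside the band divided by the mean
    power outside it (a stand-in for the Muresan et al. measure)."""
    x = spikes - spikes.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=dt)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    baseline = power[(freqs > 2.0) & ~in_band].mean()
    return power[in_band].max() / baseline

rng = np.random.default_rng(0)
osc = binned_spike_train(30.0, 50.0, 0.9, 20.0, 1e-3, rng)   # 50 Hz rhythm
flat = binned_spike_train(30.0, 50.0, 0.0, 20.0, 1e-3, rng)  # no rhythm
print(oscillation_score(osc, 1e-3), oscillation_score(flat, 1e-3))
```

The oscillating train yields a score far above that of the flat Poisson control, illustrating how a single dimensionless number can summarize rhythm strength.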
The mechanisms that generate oscillations are specific to sensory modality and brain region. Oscillations can be inherited from the temporal structure of the stimulus, as in the auditory system (Figure 2A); originate from periodic movements of mechanical sensors, as in the rodent's whisker system (Figure 2B); be generated or amplified by electrical resonances of individual cells, as in auditory hair cells (Figure 2C); or arise from intrinsic activity in recurrent neural networks, as in the olfactory system (Figure 2D). At times, these mechanisms operate cooperatively to generate oscillations.
Many studies have explored functional roles of neural oscillations. Work in sensory systems suggests that oscillations might improve the chance that neural activity propagates from one stage to the next. For example, oscillations might synchronize convergent inputs to a single postsynaptic cell (Usrey et al., 1998; Bruno and Sakmann, 2006) or drive downstream neurons at their preferred input frequency (Nowak et al., 1997; Hutcheon and Yarom, 2000; Fellous et al., 2001). Further, work in the auditory, somatosensory, olfactory and visual systems suggests that oscillations carry specific types of information about the stimulus.
Neural oscillations in the auditory system can be directly induced by the stimulus (Figure 2A) and are often amplified by mechanical and electrical resonances (Figures 2B,C). The periodicity of the spikes is phase-locked to the acoustic signal, and can be used to increase sensitivity to different features in the stimulus and sharpen tuning to specific frequencies (Eguiluz et al., 2000; Roberts and Rutherford, 2008). Information about the phase of the acoustic signal is also essential for spatial localization (Gerstner et al., 1996). Further, psychophysical studies suggest that the representation of acoustic phase in periodic spike trains forms the basis for the dual quality of pitch perception (Licklider, 1951; Langner, 1992; Patterson, 2000).
Cyclical movements of the whiskers are reflected in neural oscillations in the rodent somatosensory system (Figure 2B). The neural oscillations form a reference signal such that the spatial location of an obstacle is encoded in the phase at the time the whisker registers a contact (Ahissar et al., 2000; Szwed et al., 2003). In addition, high frequency oscillations have been proposed to encode the textures of surfaces (Brecht, 2006). Thus, work from the somatosensory system suggests that spatial structure in the stimulus is transformed to temporal structure in the spike train.
Finally, in the olfactory system, neural oscillations are internally generated by recurrent feedback among neurons in sensory organs (Figure 2D) (Bressler and Freeman, 1980). Thus, this case is different from the two previous examples, in which the neural oscillations represent external variables. Exploring the role of intrinsic versus extrinsic dynamics poses unique difficulties, but nonetheless has proven key to understanding how some forms of sensory information are analyzed (Freeman and Viana Di Prisco, 1986; Laurent, 2002). The olfactory bulb of the zebrafish provides a good illustration. Information about complementary features of odors, identity and category, is encoded in the oscillating spike trains of single populations of neurons (Friedrich et al., 2004).
The discovery that the amount of force applied to the skin modulates the firing rate of peripheral nerves provided great insight into neural coding (Adrian and Zotterman, 1926). It led to the realization that features of sensory stimuli, like pressure, sound level or visual contrast are encoded by instantaneous spike rate. Rate coding of sensory information is, perhaps, the most successful paradigm for understanding how neurons convey information.
Techniques to estimate the amount of information transmitted by changes in spike rate (MacKay and McCulloch, 1952; Grüsser et al., 1962; Rieke et al., 1999; Brenner et al., 2000) are well established. But what if neural firing patterns are shaped not only by stimulus-evoked changes in rate, but also by intrinsic or extrinsic oscillations? How might one separate and measure the contribution of each component to the transmission of neural information? The next section discusses how to address this question.
The influence of a stimulus on the spike rate of a sensory neuron is usually estimated by recording neural responses to multiple repeats of the same stimulus. Firing rate during repeats of the stimulus is then averaged to obtain a peri-stimulus time histogram (PSTH). Thus, structure in the PSTH reflects temporal changes in the features of the stimulus to which the cell responded (Figure 3A).
By applying Shannon's information theory, it is possible to quantify how much information the averaged spike rate (PSTH) conveys about the stimulus (MacKay and McCulloch, 1952; Grüsser et al., 1962; Rieke et al., 1999; Brenner et al., 2000). The result of this analysis, the information rate I(spike at t | stimulus), is usually given in bits per spike (or bits per second). In simple terms, the information rate corresponds to the mean number of yes/no questions one would have to ask in order to gather the amount of information conveyed by each action potential, or each second of the spike train.
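For intuition, the single-spike information can be computed directly from a time-varying rate using the formula of Brenner et al. (2000), I = ⟨(r/r̄) log2(r/r̄)⟩ averaged over time. The toy rates below are ours, chosen so the answers are exact:

```python
import numpy as np

def bits_per_spike(psth):
    """Single-spike information, I = <(r/rbar) * log2(r/rbar)>_t,
    following Brenner et al. (2000)."""
    r = np.asarray(psth, dtype=float)
    p = r / r.mean()
    terms = np.zeros_like(p)
    nz = p > 0                  # bins with zero rate contribute nothing
    terms[nz] = p[nz] * np.log2(p[nz])
    return terms.mean()

# A flat rate conveys no information; a rate confined to half the
# time bins conveys exactly one bit per spike.
flat = np.full(1000, 20.0)
half = np.concatenate([np.full(500, 40.0), np.zeros(500)])
print(bits_per_spike(flat))     # -> 0.0
print(bits_per_spike(half))     # -> 1.0
```

The second example matches the yes/no-question intuition: each spike answers exactly one binary question, namely in which half of the trial it occurred.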
Now consider the case of an oscillating neuron in a sensory pathway. The spike rate encodes the temporal structure of a stimulus and is also modulated by an oscillation that is not phase-locked to the onset of the stimulus (Figure 3B). Therefore, the spikes are phase-locked rather than stimulus-locked. The PSTH, which reports averaged stimulus-evoked changes in spike timing, will not show evidence of the oscillation. This is because the phase of the oscillation varies randomly with respect to the stimulus onset; the periodic component evident during single trials of the stimulus is lost in the average. Thus, the information that the spikes carry about the oscillatory signal cannot be estimated by the standard methods.
We developed a new technique to estimate the amount of information that oscillating spike trains transmit (Koepsell and Sommer, 2008). It uses an oscillatory reference signal that can be extracted from various sources, for example, the simultaneously recorded LFP or presynaptic inputs. The reference signal is used to assign each spike in the train at time t a phase ϕ(t) (Figure 3C). If there is a fixed relationship between the phase of the oscillation and the generation of spikes, the histogram of spike phases has a certain structure. Specifically, the histogram has a single peak if spikes are locked to a particular phase of the oscillation (Figure 3D). Similarly, the shift-predictor of the histogram is flat (Figure 3E) if the phase of oscillation varies relative to the onset of the stimulus.
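The phase assignment itself is straightforward to sketch: take the analytic signal of the (band-limited) reference via the Hilbert transform and look up its instantaneous phase at each spike time. The synthetic reference and spike times below are illustrative, not recorded data:

```python
import numpy as np
from scipy.signal import hilbert

def spike_phases(spike_times, lfp, dt):
    """Assign each spike the instantaneous phase phi(t) of the
    reference oscillation, via the analytic (Hilbert) signal."""
    phase = np.angle(hilbert(lfp - lfp.mean()))
    idx = np.round(np.asarray(spike_times) / dt).astype(int)
    return phase[idx]

# Synthetic example: a 40 Hz reference with one spike per cycle,
# always at the same point of the cycle (i.e., phase-locked spikes).
dt = 1e-3
t = np.arange(0, 2.0, dt)
lfp = np.sin(2 * np.pi * 40.0 * t)
spike_times = np.arange(0.25, 2.0, 1 / 40.0)
phases = spike_phases(spike_times, lfp, dt)
print(np.abs(np.mean(np.exp(1j * phases))))  # close to 1 when locked
```

The printed quantity is the resultant length of the phase distribution: near 1 for phase-locked spikes (a single-peaked phase histogram), near 0 for spikes fired at random phases.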
We illustrate this technique by analyzing an oscillating spike train (Figures 3B–E) that carries information about two messages: the stimulus feature that the spike rate encodes and the phase of the oscillation (Figures 3C,D). Further, changes in firing rate evoked by the stimulus versus those induced by the oscillation occupy separate bands of the power spectrum of the spike train, so the two messages do not interfere with each other.
Conventional methods of estimating information in spike rates do not account for information encoded by oscillation phase, since they depend solely on the PSTH (which only reports changes in spike timing with respect to the stimulus). Our technique, the multiconditional direct method, measures both stimulus-based and oscillation-based information (Koepsell and Sommer, 2008). It uses a two-dimensional response histogram whose bins contain the average response for a given latency from stimulus onset and a given phase of the oscillation. The method yields estimates of I(spike at t | stimulus, ϕ(t)), the information contained in a single spike about the stimulus and the oscillation.
Because the multiconditional direct method relies on a higher dimensional response histogram than conventional PSTH-based methods, it requires substantial amounts of data. Such large datasets are difficult to acquire in the laboratory. Thus, we developed an alternative method to analyze small datasets, the phase de-jittering method (Koepsell et al., 2009). This method, in essence, uses the reference signal to align, or de-jitter, the oscillation phase across different repeats of the stimulus. This is done by shifting spikes in time, but with displacements so small that the structure of the spike train at low frequencies is unaltered. After de-jittering, the oscillation is retained in the PSTH and therefore it is possible to use conventional PSTH-based methods to estimate the information. Comparisons of the full multiconditional method and the de-jittering method applied to a surrogate dataset show that both yield comparable results (Koepsell and Sommer, 2008).
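The essence of de-jittering can be sketched in a few lines: each trial's spikes are shifted by at most one oscillation cycle so that the oscillation phase is aligned across trials. This toy version assumes perfectly phase-locked synthetic spikes and known per-trial phase offsets; in the real method the offsets are estimated from a recorded reference signal:

```python
import numpy as np

def dejitter(trial_spike_times, trial_phase_offsets, osc_hz):
    """Shift each trial's spikes so phase zero coincides across trials.
    Shifts are under one cycle, leaving low-frequency structure intact."""
    period = 1.0 / osc_hz
    return [np.asarray(s) - (phi / (2 * np.pi)) * period
            for s, phi in zip(trial_spike_times, trial_phase_offsets)]

def locking(trial_list, f):
    """Strength of periodic modulation at f in the pooled spike times."""
    t = np.concatenate(trial_list)
    return np.abs(np.mean(np.exp(-2j * np.pi * f * t)))

rng = np.random.default_rng(1)
f = 50.0
trials, offsets = [], []
for _ in range(40):
    phi = rng.uniform(0, 2 * np.pi)
    # spikes locked to the oscillation, whose phase varies across trials
    trials.append((np.arange(25) + phi / (2 * np.pi)) / f)
    offsets.append(phi)

print(locking(trials, f), locking(dejitter(trials, offsets, f), f))
```

Before de-jittering the pooled spikes show almost no modulation at 50 Hz (the phase averages out across trials, as in the raw PSTH); afterwards the modulation is fully recovered.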
In the previous section we explained how information multiplexing can occur in a single neuron, that is, the spike train conveys information about two different messages. We found that the two signals occupy separate frequency bands – a scheme called frequency division multiplexing. Spike trains that encode dual messages in this form are common in sensory pathways. Usually, information encoded by spike rate occupies the lower part of the frequency spectrum. This position reflects the temporal structure in the stimulus; the spectral power of natural signals decays with increasing temporal frequency (Ruderman and Bialek, 1994). By contrast, information encoded by the oscillations often resides in a separate high-frequency band, such as the gamma frequency range. For example, the frequency band of the stimulus to which auditory neurons phase-lock is usually higher than the acoustic modulation spectrum (Langner, 1992).
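The separation of frequency-multiplexed channels can be sketched with an ideal filter: because the rate-coded and oscillation-based signals occupy disjoint bands, each can be recovered by masking the other's band in the Fourier domain. The synthetic signal and cutoff below are illustrative:

```python
import numpy as np

def split_bands(x, dt, cutoff_hz):
    """Separate a signal into low- and high-frequency channels with a
    brick-wall filter -- the essence of frequency division multiplexing."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=dt)
    low = X.copy()
    low[freqs > cutoff_hz] = 0
    high = X - low
    return np.fft.irfft(low, n=x.size), np.fft.irfft(high, n=x.size)

dt = 1e-3
t = np.arange(0, 2.0, dt)
slow = 1.0 + 0.5 * np.sin(2 * np.pi * 3.0 * t)   # stimulus-driven rate
gamma = 0.3 * np.sin(2 * np.pi * 60.0 * t)       # oscillatory channel
low, high = split_bands(slow + gamma, dt, 20.0)  # recovers each channel
```

Because the two components live on opposite sides of the 20 Hz cutoff, `low` reproduces the stimulus-driven envelope and `high` the gamma channel, with neither interfering with the other.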
The transmission of oscillatory signals in spike trains can be used in various ways to convey sensory information. The frequency, phase or coherence of the oscillatory signal can convey sensory information in itself, as in the auditory system (Langner, 1992). The hippocampus provides an example of spike phase coding, in which the relative phases between spikes and an oscillatory signal carry sensory information. Specifically, the location of the animal can be resolved by decoding the relative phase between spikes fired by a place cell and the activity that produces the theta rhythm (O'Keefe and Recce, 1993). Further, oscillatory signals can enable time division multiplexing within individual spike trains. Here, the idea is that different phases of the oscillation define temporal windows in which particular features of the stimulus are selectively transmitted. Thus, the rates measured in these different time windows encode different types of sensory signals. Such a coding scheme has been found in the olfactory system (Friedrich et al., 2004) and has also been proposed in theoretical work for other sensory systems (Masquelier et al., 2009; Nadasdy, 2009; Panzeri et al., 2010).
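A time division multiplexing readout is equally easy to sketch: bin spikes by oscillation phase and read out a separate rate from each phase window. The toy data and the two-window split below are our own illustration, not a model of any particular system:

```python
import numpy as np

def rates_by_phase_window(spike_phase_lists, n_windows=2):
    """Count spikes separately in each phase window of the cycle,
    yielding one rate channel per window per trial."""
    edges = np.linspace(-np.pi, np.pi, n_windows + 1)
    return np.array([np.histogram(p, bins=edges)[0]
                     for p in spike_phase_lists])

rng = np.random.default_rng(2)
msg_a, msg_b = 7, 3                  # spike counts carrying two messages
phases = np.concatenate([
    rng.uniform(-np.pi, 0, msg_a),   # window 1 carries message A
    rng.uniform(0, np.pi, msg_b),    # window 2 carries message B
])
counts = rates_by_phase_window([phases])
print(counts)                        # -> [[7 3]]
```

The counts read from the two windows recover the two messages independently, even though both are carried by a single spike train.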
Ongoing oscillations, those generated by the internal dynamics of the system, have been found at all stages of visual processing, from the retina to the cortex (Munk and Neuenschwander, 2000). In the cortex, oscillations reflect visual information in several different ways. Information about the stimulus is conveyed by the synchrony between two oscillating spike trains (Eckhorn et al., 1988; Gray and Singer, 1989; Samonds et al., 2006), in the relative phase between spikes and oscillations in the LFP (Montemurro et al., 2008; Kayser et al., 2009) and in the oscillations themselves (Kayser and Konig, 2004; Berens et al., 2008; Mazzoni et al., 2008). In addition to endogenous rhythms, the cortex also seems to inherit oscillations that emerge at earlier stages of the visual system (Neuenschwander and Singer, 1996; Castelo-Branco et al., 1998). The oscillations are observed both in the absence (Doty et al., 1964; Heiss and Bornschein, 1966) and in the presence of anesthetics [barbiturates (Heiss and Bornschein, 1965; Laufer and Verzeano, 1967), N2O and halothane (Neuenschwander and Singer, 1996; Castelo-Branco et al., 1998), propofol (Koepsell et al., 2009)].
We recently asked how oscillations in the retina might be used by the thalamus to transmit information downstream. In particular, we asked how the spike trains of a single thalamic relay cell can transmit two separate streams of information, one encoded by firing rate and the other in oscillations (Koepsell et al., 2009). We used the technique of whole-cell recording in vivo, which allowed us to detect retinothalamic synaptic potentials and the action potentials they evoked from single relay cells. In other words, we were able to reconstruct the spike trains of the inputs and outputs of single relay cells. Often, we found that both spike trains had an oscillatory component. To explore whether these oscillations were able to encode information, we used the phase of the oscillation of the retinal inputs to de-jitter the timing of thalamic spikes across repeated trials of the stimulus (see Figures 1 and 3C–E). The result of the realignment was dramatic, as illustrated in Figure 4A. Although the oscillation was not visible in the raw PSTH, it generated a pronounced modulation in the amplitude of the PSTH made from the de-jittered signal (Figure 4B).
We then estimated the information in the de-jittered spike train. The results showed that most relay cells that received periodic synaptic inputs transmitted a significant amount of information in the gamma frequency band. For some cells, the amount of information in the oscillation-based (high frequency) channel was severalfold higher than that conveyed by the rate-coded (low frequency) channel; compare Figure 4C with Figure 4D.
Gamma oscillations in retina and thalamus provide a novel channel that is able to convey information to the cortex. How might this channel contribute to visual function? In the following we outline various hypotheses about the potential roles for the new channel and how they might be tested.
One possibility we explore is the case in which the oscillatory trend of the retinal cell does not contain information about the visual stimulus. Even in this situation, the oscillations might increase the amount of information about local retinal features transmitted by the thalamic rate code. They would do so by a process akin to amplitude modulation, in which information about the retinal feature is reproduced in the frequency band of the oscillations. This redundant information could be read out and decoded in the cortex by various mechanisms, such as coincidence detection of afferent inputs or by the relative phase of the thalamic and cortical oscillations. A specific role for the second channel could be de-noising. Further, the amplitude modulation of the afferent spike train generates a signal that might enable cortical oscillations (e.g., by adjusting relative phases of the two oscillations) to route sensory information or to direct attention to a particular feature. (For discussion of potential roles of cortical oscillations in analyzing afferent input, see Buzsaki and Draguhn, 2004; Sejnowski and Paulsen, 2006; Fries et al., 2007).
A second possibility is that retinal oscillations are influenced by the stimulus, specifically, by displacements of the retinal image caused by eye movements. Thus, periodic activity in the retina might encode spatial information in the temporal domain, as in the whisker system (Ahissar and Arieli, 2001; Rucci, 2008). This idea is supported, at least in part, by the strong similarity between the dominant frequency bands of the LFP recorded from primary visual cortex and those of fixational eye movements [also note that oscillatory fixational eye movements are found in species ranging from turtle to humans (Greschner et al., 2002; Martinez-Conde et al., 2004)]. Work that combines electrophysiology, measurements of eye-movements, and psychophysics should help to test this idea.
A third potential role for retinal oscillations involves computational analysis of visual stimuli. To explore this possibility, one must address two questions at the same time. First, one must determine which features of the stimulus are encoded by the oscillations. To address this question, it is helpful to recall that retinal oscillations are formed by distributed networks, and thus might be sensitive to spatially extensive features and/or context. Second, one must identify which particular attributes of the oscillations are used to encode information. Reasonable guesses include frequency, phase, relative phase, modulation of coherence among cells, or combinations of these parameters. The possible answers to these two questions are numerous, and this uncertainty precludes the design of physiological experiments. Thus, for the time being, the most promising approach to studying the computational roles of retinal oscillations in vision is to use computational models constrained by biological and psychophysical results. In fact, there are many models of oscillatory neural networks that are able to transform spatial structure of visual input into temporal structure of neural activity. These models, which were originally developed to simulate cortical computations, are built with phase-coupled oscillatory neurons (e.g., Baldi and Meir, 1990; Sompolinsky et al., 1991; Sporns et al., 1991; von der Malsburg and Buhmann, 1992; Schillen and Koenig, 1994; Wang and Terman, 1997; Ursino et al., 2006). It would be useful to develop such models to explore retinal and thalamic function.
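As a minimal example of such a model, a handful of phase-coupled (Kuramoto-type) oscillators with similarity-dependent coupling will settle into distinct phases for distinct image segments. The toy one-dimensional "image", the coupling rule and all parameters below are ours, not taken from any of the cited models:

```python
import numpy as np

def kuramoto_step(theta, K, W, dt):
    """One Euler step of phase-coupled oscillators:
    dtheta_i/dt = (K/N) * sum_j W_ij * sin(theta_j - theta_i)."""
    diff = theta[None, :] - theta[:, None]   # diff[i, j] = theta_j - theta_i
    return theta + dt * K * (W * np.sin(diff)).mean(axis=1)

# Toy "image": two homogeneous segments of three pixels each.
pixels = np.array([0, 0, 0, 1, 1, 1])
# Couple similar pixels positively, dissimilar pixels negatively.
W = np.where(pixels[:, None] == pixels[None, :], 1.0, -1.0)

rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, pixels.size)
for _ in range(2000):
    theta = kuramoto_step(theta, 2.0, W, 0.05)

def coherence(ph):
    return np.abs(np.mean(np.exp(1j * ph)))

print(coherence(theta[:3]), coherence(theta[3:]))   # each near 1
```

After the network relaxes, the oscillators within each segment are tightly synchronized while the two segments sit at different phases, so segment membership is encoded in oscillation phase, exactly the kind of spatial-to-temporal transformation discussed above.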
Other families of models combine psychophysical results with approaches used in computer vision to determine how various image operations are able to reproduce visual perception and behavior. For example, recent work compared how well segmentation algorithms (Pal and Pal, 1993) matched the human ability to outline objects in images (Martin et al., 2004). State-of-the-art algorithms for edge detection and image segmentation approached human performance by combining local with non-local features of the image; algorithms based solely on local contrast were not successful.
Motivated by this work, we implemented the normalized cut algorithm for the computation of non-local features (Shi and Malik, 2000) in a network of oscillating neurons (Figure 5). Our preliminary results indicate that, over time, information about homogeneous image segments is encoded by different oscillation phases (Figure 5B). Further, oscillatory spike trains are able to transmit this information as well as that about local contrast. In addition, synchronized spikes in the oscillating network provide information about edges (Figure 5C). These results suggest that features like edge continuation, orientation, and border ownership – known to be represented by cortical firing rates – might already be available in the temporal structure of retinal activity.
Unlike the case for mammalian vision, in which the function of oscillations remains a subject of debate, the behavioral role of gamma oscillations has been clearly established in the frog. Specifically, looming stimuli designed to simulate shadows (cast by predators) evoke synchronous oscillatory discharges in neural “dimming detectors”. By contrast, small dark spots that mimic prey fail to induce such activity (Ishikane et al., 1999). The consequence of the synchronous oscillations among retinal dimming detectors is important for an animal's survival – it triggers escape behavior (Arai et al., 2004). Further strengthening the link between synchronous retinal activity and behavior, pharmacological suppression of gamma oscillations abolishes escape responses, but spares the slower modulation of spike rate evoked by small objects (Ishikane et al., 2005). Thus, in the frog, information about different types of visual signals seems to be multiplexed in different frequency bands of neural spike trains.
This review focused on research that explores the functional role of neural oscillations in the early stages of sensory pathways. We described neural oscillations that carry information about diverse sensory modalities, including olfaction, vision, audition and somatosensation. In particular, we discussed gamma oscillations in the early visual system, and showed how these form a novel channel that conveys information from the retina, via the thalamus, to the cortex. As well, we surveyed current techniques that are used to quantify the amount of information that oscillatory spike trains encode. Further, we summarized potential functions of oscillation-based channels in the periphery that are being actively explored by the community.
The work we have reviewed also bears on cortex (Eckhorn et al., 1988; Gray and Singer, 1989; Young et al., 1992; de Oliveira et al., 1997; Thiele and Stoner, 2003), where oscillations are generated by two sources: sensory afferents and intracortical networks. That is, we not only discussed the types of sensory information that oscillations carry downstream, but also described theoretical frameworks that can be applied to diverse cortical regions.
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
We are grateful to Tony Bell, Bruno Olshausen and the other members of the former Redwood Neuroscience Institute for helpful discussions. We thank Tim Blanche, Charles Cadieu, Jack Culpepper, and Christian Wehrhahn for valuable comments on the manuscript. Vishal Vaingankar, Yichun Wei, Qingbo Wang, Daniel Rathbun, and W. Martin Usrey contributed to the experiments we described. This work was supported by NIH grant EY09593 (Judith A. Hirsch), NSF grant IIS-0713657 (Friedrich T. Sommer), NSF grant IIS-0917342 (Kilian Koepsell), The Swartz Foundation (Kilian Koepsell), and the Redwood Neuroscience Institute (Friedrich T. Sommer, Kilian Koepsell). The data analysis, simulations and figures were produced using IPython (Pérez and Granger, 2007), NumPy/SciPy (Oliphant, 2007), and Matplotlib (Barrett et al., 2005).
Kilian Koepsell is Principal Investigator at the Redwood Center for Theoretical Neuroscience and at the Helen Wills Neuroscience Institute at University of California, Berkeley. He is working on functional models of information processing in biological and artificial neural networks and on statistical models of natural sensory stimuli and neural activity. Kilian obtained his Ph.D. in Physics at Hamburg University, Germany. He received postdoctoral training at the Max-Planck Institute for Gravitational Physics, Potsdam, at King’s College, London, at the Redwood Neuroscience Institute, Menlo Park, and at University of California, Berkeley.
Friedrich T. Sommer is Associate Adjunct Professor at the Redwood Center for Theoretical Neuroscience and at the Helen Wills Neuroscience Institute at University of California, Berkeley. His research interests include models of memory and studies of the computation performed by networks of sensory neurons. Dr. Sommer holds a Habilitation degree in Computer Science and Ph.D./Diploma degrees in Physics from the universities of Ulm, Düsseldorf and Tübingen. He conducted postdoctoral research at the Massachusetts Institute of Technology and the University of Tübingen.