Results 1-9 (9)

author:("Teki, sandeep")
1.  A brain basis for musical hallucinations
The physiological basis for musical hallucinations (MH) is not understood. One obstacle to understanding has been the lack of a method to manipulate the intensity of hallucination during the course of an experiment. Residual inhibition, transient suppression of a phantom percept after the offset of a masking stimulus, has been used in the study of tinnitus. We report here a human subject whose MH were residually inhibited by short periods of music. Magnetoencephalography (MEG) allowed us to examine variation in the underlying oscillatory brain activity in different states. Source-space analysis capable of single-subject inference defined left-lateralised power increases, associated with stronger hallucinations, in the gamma band in left anterior superior temporal gyrus, and in the beta band in motor cortex and posteromedial cortex. The data indicate that these areas form a crucial network in the generation of MH, and are consistent with a model in which MH are generated by persistent reciprocal communication in a predictive coding hierarchy.
PMCID: PMC3969291  PMID: 24445167
Musical hallucinations; Magnetoencephalography; Auditory cortex; Gamma oscillations; Beta oscillations; Predictive coding
2.  Segregation of complex acoustic scenes based on temporal coherence 
eLife  2013;2:e00699.
In contrast to the complex acoustic environments we encounter every day, most studies of auditory segregation have used relatively simple signals. Here, we synthesized a new stimulus to examine the detection of coherent patterns (‘figures’) from overlapping ‘background’ signals. In a series of experiments, we demonstrate that human listeners are remarkably sensitive to the emergence of such figures and can tolerate a variety of spectral and temporal perturbations. This robust behavior is consistent with the existence of automatic auditory segregation mechanisms that are highly sensitive to correlations across frequency and time. The observed behavior cannot be explained purely on the basis of adaptation-based models used to explain the segregation of deterministic narrowband signals. We show that the present results are consistent with the predictions of a model of auditory perceptual organization based on temporal coherence. Our data thus support a role for temporal coherence as an organizational principle underlying auditory segregation.
eLife digest
Even when seated in the middle of a crowded restaurant, we are still able to distinguish the speech of the person sitting opposite us from the conversations of fellow diners and a host of other background noise. While we generally perform this task almost effortlessly, it is unclear how the brain solves what is in reality a complex information processing problem.
In the 1970s, researchers began to address this question using stimuli consisting of simple tones. When subjects are played a sequence of alternating high and low frequency tones, they perceive them as two independent streams of sound. Similar experiments in macaque monkeys reveal that each stream activates a different area of auditory cortex, suggesting that the brain may distinguish acoustic stimuli on the basis of their frequency.
However, the simple tones that are used in laboratory experiments bear little resemblance to the complex sounds we encounter in everyday life. These are often made up of multiple frequencies, and overlap—both in frequency and in time—with other sounds in the environment. Moreover, recent experiments have shown that if a subject hears two tones simultaneously, he or she perceives them as belonging to a single stream of sound even if they have different frequencies: models that assume that we distinguish stimuli from noise on the basis of frequency alone struggle to explain this observation.
Now, Teki, Chait, et al. have used more complex sounds, in which frequency components of the target stimuli overlap with those of background signals, to obtain new insights into how the brain solves this problem. Subjects were extremely good at discriminating these complex target stimuli from background noise, and computational modelling confirmed that they did so via integration of both frequency and temporal information. The work of Teki, Chait, et al. thus offers the first explanation for our ability to home in on speech and other pertinent sounds, even amidst a sea of background noise.
PMCID: PMC3721234  PMID: 23898398
auditory scene analysis; temporal coherence; psychophysics; segregation; Human
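The figure-ground stimulus described above can be illustrated with a minimal sketch: sequences of chords built from randomly drawn pure tones, where the 'figure' consists of a fixed subset of frequencies that repeats across consecutive chords and so can only be detected by integrating across both frequency and time. All parameter names and values below are illustrative assumptions, not those of the published stimulus.

```python
import numpy as np

def sfg_stimulus(n_chords=20, chord_dur=0.05, fs=16000,
                 n_background=10, n_figure=4, figure_onset=10, seed=0):
    """Sketch of a stochastic figure-ground stimulus.

    Each chord contains pure tones at random frequencies; from chord
    `figure_onset` onwards, `n_figure` fixed frequencies repeat in every
    chord, forming a temporally coherent 'figure' embedded in the
    random 'background'.
    """
    rng = np.random.default_rng(seed)
    freq_pool = np.logspace(np.log10(200), np.log10(7000), 120)
    n = int(chord_dur * fs)
    t = np.arange(n) / fs
    figure_freqs = rng.choice(freq_pool, n_figure, replace=False)
    chords = []
    for i in range(n_chords):
        freqs = list(rng.choice(freq_pool, n_background, replace=False))
        if i >= figure_onset:              # figure present: add the fixed set
            freqs += list(figure_freqs)
        chord = sum(np.sin(2 * np.pi * f * t) for f in freqs)
        chords.append(chord / len(freqs))  # rough level normalisation
    return np.concatenate(chords)

stim = sfg_stimulus()
```

Because background frequencies are redrawn on every chord while the figure frequencies repeat, no single chord distinguishes figure from background; only the cross-chord correlation does, which is the point of the temporal coherence account.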
3.  Navigating the auditory scene: an expert role for the hippocampus 
Over a typical career, piano tuners spend tens of thousands of hours exploring a specialized acoustic environment. Tuning requires accurate perception and adjustment of beats in two-note chords that serve as a navigational device to move between points in previously learned acoustic scenes. It is a two-stage process that depends on: firstly, selective listening to beats within frequency windows and, secondly, the subsequent use of those beats to navigate through a complex soundscape. The neuroanatomical substrates underlying brain specialization for such fundamental organization of sound scenes are unknown.
Here, we demonstrate that professional piano tuners are significantly better than controls matched for age and musical ability on a psychophysical task, based on amplitude modulation rate discrimination, that simulates active listening to beats within frequency windows. Tuners show a categorical increase in grey matter (GM) volume in the right frontal operculum and right superior temporal lobe. Tuners also show a striking enhancement of grey matter volume in the anterior hippocampus, parahippocampal gyrus, and superior temporal gyrus, and an increase in white matter volume in the posterior hippocampus as a function of years of tuning experience. The relationship with GM volume is sensitive to years of tuning experience and starting age but not actual age or level of musicality.
Our findings support a role for a core set of regions in the hippocampus and superior temporal cortex in skilled exploration of complex sound scenes in which precise sound ‘templates’ are encoded and consolidated into memory over time in an experience-dependent manner.
PMCID: PMC3448926  PMID: 22933806
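The psychophysical task above rests on amplitude modulation (AM) rate discrimination. A minimal sketch of the kind of stimuli such a task might use: two tones sharing a carrier but differing in modulation rate. The carrier frequency, rates, and durations below are illustrative assumptions, not the published task parameters.

```python
import numpy as np

def am_tone(carrier_hz, mod_hz, dur=0.5, fs=16000, depth=1.0):
    """Sinusoidally amplitude-modulated tone: a stand-in for the
    modulated stimuli used in AM-rate discrimination tasks."""
    t = np.arange(int(dur * fs)) / fs
    envelope = 1.0 + depth * np.sin(2 * np.pi * mod_hz * t)  # slow amplitude 'beating'
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

# One discrimination trial: same carrier, two modulation rates;
# the listener judges which interval beats faster.
standard = am_tone(440.0, mod_hz=4.0)
comparison = am_tone(440.0, mod_hz=5.0)
```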
4.  Single-subject oscillatory gamma responses in tinnitus 
Brain  2012;135(10):3089-3100.
This study used magnetoencephalography to record oscillatory activity in a group of 17 patients with chronic tinnitus. Two methods, residual inhibition and residual excitation, were used to bring about transient changes in spontaneous tinnitus intensity in order to measure dynamic tinnitus correlates in individual patients. In residual inhibition, a positive correlation was seen between tinnitus intensity and both delta/theta (6/14 patients) and gamma band (8/14 patients) oscillations in auditory cortex, suggesting an increased thalamocortical input and cortical gamma response, respectively, associated with higher tinnitus states. Conversely, 4/4 patients exhibiting residual excitation demonstrated an inverse correlation between perceived tinnitus intensity and auditory cortex gamma oscillations (with no delta/theta changes) that cannot be explained by existing models. Significant oscillatory power changes were also identified in a variety of cortical regions, most commonly midline lobar regions in the default mode network, cerebellum, insula and anterior temporal lobe. These were highly variable across patients in terms of areas and frequency bands involved, and in direction of power change. We suggest a model based on a local circuit function of cortical gamma-band oscillations as a process of mutual inhibition that might suppress abnormal cortical activity in tinnitus. The work implicates auditory cortex gamma-band oscillations as a fundamental intrinsic mechanism for attenuating phantom auditory perception.
PMCID: PMC3470708  PMID: 22975389
tinnitus; gamma oscillations; mutual inhibition; auditory cortex; magnetoencephalography
5.  Slow GABA Transient and Receptor Desensitization Shape Synaptic Responses Evoked by Hippocampal Neurogliaform Cells 
The kinetics of GABAergic synaptic currents can vary by an order of magnitude depending on the cell type. The neurogliaform cell (NGFC) has recently been identified as a key generator of slow GABAA receptor-mediated volume transmission in the isocortex. However, the mechanisms underlying slow GABAA receptor-mediated IPSCs and their use-dependent plasticity remain unknown. Here, we provide experimental and modeling data showing that hippocampal NGFCs generate an unusually prolonged (tens of milliseconds) but low-concentration (micromolar range) GABA transient, which is responsible for the slow response kinetics and which leads to a robust desensitization of postsynaptic GABAA receptors. This strongly contributes to the use-dependent synaptic depression elicited by various patterns of NGFC activity including the one detected during theta network oscillations in vivo. Synaptic depression mediated by NGFCs is likely to play an important modulatory role in the feedforward inhibition of CA1 pyramidal cells provided by the entorhinal cortex.
PMCID: PMC3377669  PMID: 20660272
6.  Gamma band pitch responses in human auditory cortex measured with magnetoencephalography 
Neuroimage  2012;59(2-5):1904-1911.
We have previously used direct electrode recordings in two human subjects to identify neural correlates of the perception of pitch (Griffiths, Kumar, Sedley et al., Direct recordings of pitch responses from human auditory cortex, Curr. Biol. 20 (2010), pp. 1128–1132). The present study was carried out to assess virtual-electrode measures of pitch perception based on non-invasive magnetoencephalography (MEG). We recorded pitch responses in 13 healthy volunteers using a passive listening paradigm and the same pitch-evoking stimuli (regular interval noise; RIN) as in the previous study. Source activity was reconstructed using a beamformer approach, which was used to place virtual electrodes in auditory cortex. Time-frequency decomposition of these data revealed oscillatory responses to pitch in the gamma frequency band to occur, in Heschl's gyrus, from 60 Hz upwards. Direct comparison of these pitch responses to the previous depth electrode recordings shows a striking congruence in terms of spectrotemporal profile and anatomical distribution. These findings provide further evidence that auditory high gamma oscillations occur in association with RIN pitch stimuli, and validate the use of MEG to assess neural correlates of normal and abnormal pitch perception.
► High gamma-band correlates of pitch perception identified with MEG beamforming.
► Results correlate strongly with invasive electrode recordings of same responses.
► Validation of accuracy of MEG beamformer approach.
PMCID: PMC3236996  PMID: 21925281
Pitch; Auditory; Magnetoencephalography; Gamma; Beamformer; Perception
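Regular interval noise of the kind used above is conventionally generated by a delay-and-add procedure: broadband noise is repeatedly summed with a copy of itself delayed by 1/f0, introducing temporal regularity heard as a pitch at f0. The sketch below illustrates that procedure; the iteration count, duration, and pitch value are illustrative assumptions, not the published stimulus parameters.

```python
import numpy as np

def regular_interval_noise(pitch_hz=100.0, n_iter=16, dur=0.5,
                           fs=16000, gain=1.0, seed=0):
    """Delay-and-add sketch of regular interval noise (RIN).

    Each iteration adds a copy of the signal delayed by 1/pitch_hz,
    concentrating energy at multiples of pitch_hz and producing a
    pitch percept at that rate.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(int(dur * fs))
    d = int(round(fs / pitch_hz))        # delay in samples
    for _ in range(n_iter):
        delayed = np.concatenate([np.zeros(d), x[:-d]])
        x = x + gain * delayed
    return x / np.abs(x).max()           # normalise to unit peak

rin = regular_interval_noise()
```

The regularity is visible in the signal's autocorrelation, which shows a strong peak at the delay lag even though the waveform remains noise-like.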
7.  Distinct Neural Substrates of Duration-Based and Beat-Based Auditory Timing 
Research on interval timing strongly implicates the cerebellum and the basal ganglia as part of the timing network of the brain. Here we tested the hypothesis that the brain uses differential timing mechanisms and networks—specifically, that the cerebellum subserves the perception of the absolute duration of time intervals, whereas the basal ganglia mediate perception of time intervals relative to a regular beat. In a functional magnetic resonance imaging experiment, we asked human subjects to judge the difference in duration of two successive time intervals as a function of the preceding context of an irregular sequence of clicks (where the task relies on encoding the absolute duration of time intervals) or a regular sequence of clicks (where the regular beat provides an extra cue for relative timing). We found significant activations in an olivocerebellar network comprising the inferior olive, vermis, and deep cerebellar nuclei including the dentate nucleus during absolute, duration-based timing and a striato-thalamo-cortical network comprising the putamen, caudate nucleus, thalamus, supplementary motor area, premotor cortex, and dorsolateral prefrontal cortex during relative, beat-based timing. Our results support two distinct timing mechanisms and underlying subsystems: first, a network comprising the inferior olive and the cerebellum that acts as a precision clock to mediate absolute, duration-based timing, and second, a distinct network for relative, beat-based timing incorporating a striato-thalamo-cortical network.
PMCID: PMC3074096  PMID: 21389235
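The contrast described above, between irregular click contexts (forcing absolute, duration-based timing) and regular click contexts (where a beat supports relative timing), can be sketched as follows. The base interval, jitter range, and sequence length are illustrative assumptions, not the published design.

```python
import numpy as np

def click_sequence(intervals, fs=16000, click_dur=0.001):
    """Render a click train from a list of inter-onset intervals (s)."""
    onsets = np.concatenate([[0.0], np.cumsum(intervals)])
    n = int((onsets[-1] + click_dur) * fs) + 1
    y = np.zeros(n)
    for o in onsets:
        i = int(o * fs)
        y[i:i + int(click_dur * fs)] = 1.0   # brief rectangular click
    return y

rng = np.random.default_rng(0)
base = 0.5                                              # 500 ms base interval
regular = click_sequence([base] * 6)                    # isochronous: beat-based context
irregular = click_sequence(base * rng.uniform(0.7, 1.3, 6))  # jittered: duration-based context
```

In the regular context the listener can time a subsequent interval relative to the established beat; in the jittered context each interval must be encoded in absolute terms, which is the manipulation the experiment exploits.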
8.  Brain Bases for Auditory Stimulus-Driven Figure–Ground Segregation 
Auditory figure–ground segregation, listeners’ ability to selectively hear out a sound of interest from a background of competing sounds, is a fundamental aspect of scene analysis. In contrast to the disordered acoustic environment we experience during everyday listening, most studies of auditory segregation have used relatively simple, temporally regular signals. We developed a new figure–ground stimulus that incorporates stochastic variation of the figure and background that captures the rich spectrotemporal complexity of natural acoustic scenes. Figure and background signals overlap in spectrotemporal space, but vary in the statistics of fluctuation, such that the only way to extract the figure is by integrating the patterns over time and frequency. Our behavioral results demonstrate that human listeners are remarkably sensitive to the appearance of such figures.
In a functional magnetic resonance imaging experiment, aimed at investigating preattentive, stimulus-driven, auditory segregation mechanisms, naive subjects listened to these stimuli while performing an irrelevant task. Results demonstrate significant activations in the intraparietal sulcus (IPS) and the superior temporal sulcus related to bottom-up, stimulus-driven figure–ground decomposition. We did not observe any significant activation in the primary auditory cortex. Our results support a role for automatic, bottom-up mechanisms in the IPS in mediating stimulus-driven, auditory figure–ground segregation, which is consistent with accumulating evidence implicating the IPS in structuring sensory input and perceptual organization.
PMCID: PMC3059575  PMID: 21209201
9.  A Unified Model of Time Perception Accounts for Duration-Based and Beat-Based Timing Mechanisms 
Accurate timing is an integral aspect of sensory and motor processes such as the perception of speech and music and the execution of skilled movement. Neuropsychological studies of time perception in patient groups and functional neuroimaging studies of timing in normal participants suggest common neural substrates for perceptual and motor timing. A timing system is implicated in core regions of the motor network such as the cerebellum, inferior olive, basal ganglia, pre-supplementary and supplementary motor areas, and premotor cortex, as well as higher-level areas such as the prefrontal cortex. In this article, we assess how distinct parts of the timing system subserve different aspects of perceptual timing. We previously established brain bases for absolute, duration-based timing and relative, beat-based timing in the olivocerebellar and striato-thalamo-cortical circuits respectively (Teki et al., 2011). However, neurophysiological and neuroanatomical studies provide a basis to suggest that timing functions of these circuits may not be independent. Here, we propose a unified model of time perception based on coordinated activity in the core striatal and olivocerebellar networks that are interconnected with each other and the cerebral cortex through multiple synaptic pathways. Timing in this unified model is proposed to involve serial beat-based striatal activation followed by absolute olivocerebellar timing mechanisms.
PMCID: PMC3249611  PMID: 22319477
interval timing; time perception; timing mechanisms; cerebellum; striatum
