Results 1-7 (7)

Clipboard (0)

Select a Filter Below

more »
Year of Publication
Document Types
1.  Atypical audio-visual speech perception and McGurk effects in children with specific language impairment 
Audiovisual speech perception of children with specific language impairment (SLI) and children with typical language development (TLD) was compared in two experiments using /aCa/ syllables presented in the context of a masking release paradigm. Children had to repeat syllables presented in auditory-alone, visual-alone (speechreading), and audiovisual congruent and incongruent (McGurk) conditions. Stimuli were masked by either stationary (ST) or amplitude-modulated (AM) noise. Although children with SLI were less accurate in auditory and audiovisual speech perception, they showed an auditory masking release effect similar to that of children with TLD. Children with SLI also gave fewer correct responses in speechreading than children with TLD, indicating impaired phonemic processing of visual speech information. In response to McGurk stimuli, children with TLD showed more fusions in AM noise than in ST noise, a consequence of the auditory masking release effect and of the influence of visual information. Children with SLI did not show this effect systematically, suggesting that they were less influenced by visual speech. However, when the visual cues were easily identified, the profile of responses to McGurk stimuli was similar in both groups, suggesting that children with SLI do not suffer from an impairment of audiovisual integration. An analysis of the percentage of information transmitted revealed a deficit in the children with SLI, particularly for the place-of-articulation feature. Taken together, the data support the hypothesis of intact peripheral processing of auditory speech information, coupled with a supramodal deficit of phonemic categorization, in children with SLI. Clinical implications are discussed.
PMCID: PMC4033223  PMID: 24904454
multisensory speech perception; specific language impairment; McGurk effects; audio-visual speech integration; masking release
2.  Enhancement of Visual Motion Detection Thresholds in Early Deaf People 
PLoS ONE  2014;9(2):e90498.
In deaf people, the auditory cortex can reorganize to support visual motion processing. Although this cross-modal reorganization has long been thought to subserve enhanced visual abilities, previous research has been unsuccessful at identifying behavioural enhancements specific to motion processing. Recently, research with congenitally deaf cats has uncovered an enhancement for visual motion detection. Our goal was to test for a similar difference between deaf and hearing people. We tested 16 early and profoundly deaf participants and 20 hearing controls. Participants completed a visual motion detection task, in which they were asked to determine which of two sinusoidal gratings was moving. The speed of the moving grating varied according to an adaptive staircase procedure, allowing us to determine the lowest speed at which participants could detect motion. Consistent with previous research in deaf cats, the deaf group had lower motion detection thresholds than the hearing group. This finding supports the proposal that cross-modal reorganization after sensory deprivation occurs for supramodal sensory features and preserves the output functions.
PMCID: PMC3938732  PMID: 24587381
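The adaptive staircase used to estimate motion detection thresholds can be sketched in miniature. The 1-up/2-down rule, step size, starting speed, and simulated observer below are illustrative assumptions, not the study's actual parameters:

```python
def staircase_threshold(detects, start=2.0, step=0.1, floor=0.05,
                        n_reversals=8):
    """Hypothetical 1-up/2-down adaptive staircase.

    `detects(speed)` returns True if the observer reports motion at the
    given grating speed. Speed decreases after two consecutive correct
    detections (making the task harder) and increases after each miss;
    the threshold estimate is the mean speed at the reversal points.
    """
    speed = start
    correct_streak = 0
    last_dir = None          # direction of the previous step: 'down' or 'up'
    reversals = []
    while len(reversals) < n_reversals:
        if detects(speed):
            correct_streak += 1
            if correct_streak == 2:        # 2-down rule: lower the speed
                correct_streak = 0
                if last_dir == 'up':       # direction changed: a reversal
                    reversals.append(speed)
                last_dir = 'down'
                speed = max(floor, speed - step)
        else:
            correct_streak = 0             # 1-up rule: raise the speed
            if last_dir == 'down':
                reversals.append(speed)
            last_dir = 'up'
            speed += step
    return sum(reversals) / len(reversals)

# Deterministic simulated observer with a true threshold of 0.5:
# the staircase should converge near that value.
threshold = staircase_threshold(lambda s: s >= 0.5)
```

With a real participant, `detects` would be replaced by the trial loop that presents the two gratings and records the response.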
3.  Audiotactile interaction can change over time in cochlear implant users 
Recent results suggest that audiotactile interactions are disturbed in cochlear implant (CI) users. However, further exploration of the factors responsible for such abnormal sensory processing is still required. Given the temporal nature of a previously used multisensory task, it remains unclear whether the aberrant results were caused by the specificity of the interaction studied or whether they reflect an overall abnormal interaction. Moreover, although duration of experience with a CI has often been linked with the recovery of auditory functions, its impact on multisensory performance remains uncertain. In the present study, we used the parchment-skin illusion, a robust illustration of sound-biased perception of touch based on changes in auditory frequencies, to investigate the specificities of audiotactile interactions in CI users. Whereas individuals with relatively little experience with the CI performed similarly to the control group, experienced CI users showed a significantly greater illusory percept. The overall results suggest that despite being able to ignore auditory distractors in a temporal audiotactile task, CI users come to be strongly influenced by auditory input in a spectral audiotactile task. Considered alongside the existing body of research, these results confirm that normal sensory interaction processing can be compromised in CI users.
PMCID: PMC4033126  PMID: 24904359
audiotactile interaction; multisensory interactions; cochlear implant; parchment-skin illusion; sensory deprivation; cross-modal plasticity; deafness; hearing loss
4.  Reduced procedural motor learning in deaf individuals 
Studies in the deaf suggest that cross-modal neuroplastic changes may vary across modalities. Only a handful of studies have examined motor capacities in the profoundly deaf. These studies suggest the presence of deficits in manual dexterity and delays in movement production. As yet, the ability to learn complex sequential motor patterns has not been explored in deaf populations. The aim of the present study was to investigate the procedural learning skills of deaf adults. A serial reaction-time task (SRTT) was performed by 18 deaf subjects and 18 matched controls to investigate possible motor alterations subsequent to auditory deprivation. Deaf participants had various degrees of hearing loss: half of the experimental group were early-deaf adults mostly using hearing aids, and the remaining half were late-deaf adults using a cochlear implant (CI). Participants carried out a repeating 12-item sequence of key presses along with random blocks containing no repeating sequence. Non-specific and sequence-specific learning were analyzed in relation to individual features of the hearing loss. The results revealed significant differences between groups in sequence-specific learning, with deaf subjects being less efficient than controls at acquiring sequence-specific knowledge. We interpret the results in light of cross-modal plasticity and the auditory scaffolding hypothesis.
PMCID: PMC4033194  PMID: 24904381
deafness; cochlear implant; hearing loss; motor learning; plasticity; sensory deprivation; serial reaction time task
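In an SRTT, sequence-specific learning is typically quantified as the reaction-time cost of switching from the repeating sequence to a random block. A minimal sketch of that scoring, with made-up RT values for illustration:

```python
def sequence_specific_learning(sequence_rts, random_rts):
    """Hypothetical SRTT scoring: mean reaction time (ms) on random
    blocks minus mean reaction time on repeating-sequence blocks.
    Larger positive scores indicate more sequence-specific knowledge.
    """
    mean = lambda xs: sum(xs) / len(xs)
    return mean(random_rts) - mean(sequence_rts)

# Illustrative RTs (ms): late sequence blocks vs. an interleaved random block
score = sequence_specific_learning([412, 398, 405], [455, 460, 470])
```

A score near zero, as reported here for the deaf group relative to controls, would indicate that responses were no faster on the learned sequence than on random material.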
5.  Audiovisual Segregation in Cochlear Implant Users 
PLoS ONE  2012;7(3):e33113.
It has traditionally been assumed that cochlear implant users de facto perform atypically in audiovisual tasks. However, a recent study that combined an auditory task with visual distractors suggests that only those cochlear implant users who are not proficient at recognizing speech sounds show abnormal audiovisual interactions. The present study aims to reinforce this notion by investigating the audiovisual segregation abilities of cochlear implant users in a visual task with auditory distractors. Speechreading was assessed in two groups of cochlear implant users (proficient and non-proficient at sound recognition), as well as in normal controls. A visual speech recognition task (i.e., speechreading) was administered either in silence or in combination with three types of auditory distractors: (i) noise, (ii) reversed speech sounds, and (iii) unaltered speech sounds. Cochlear implant users proficient at speech recognition performed like normal controls in all conditions, whereas non-proficient users showed significantly different audiovisual segregation patterns in both speech conditions. These results confirm that normal-like audiovisual segregation is possible in highly skilled cochlear implant users and, consequently, that proficient and non-proficient CI users cannot be lumped into a single group. This important feature must be taken into account in further studies of audiovisual interactions in cochlear implant users.
PMCID: PMC3299746  PMID: 22427963
6.  Feel What You Say: An Auditory Effect on Somatosensory Perception 
PLoS ONE  2011;6(8):e22829.
In the present study, we demonstrate an audiotactile effect in which amplitude modulation of auditory feedback during voiced speech induces a throbbing sensation over the lip and laryngeal regions. Control tasks coupled with the examination of speech acoustic parameters allow us to rule out the possibility that the effect may have been due to cognitive factors or motor compensatory effects. We interpret the effect as reflecting the tight interplay between auditory and tactile modalities during vocal production.
PMCID: PMC3152559  PMID: 21857955
7.  Speech and Non-Speech Audio-Visual Illusions: A Developmental Study 
PLoS ONE  2007;2(8):e742.
It is well known that simultaneous presentation of incongruent audio and visual stimuli can lead to illusory percepts. Recent data suggest that distinct processes underlie intersensory speech perception as opposed to non-speech perception. However, the development of both speech and non-speech intersensory perception across childhood and adolescence remains poorly defined. Thirty-eight observers aged 5 to 19 were tested on the McGurk effect (an audio-visual illusion involving speech), and on the Illusory Flash effect and the Fusion effect (two audio-visual illusions not involving speech) to investigate the development of audio-visual interactions and contrast speech vs. non-speech developmental patterns. Whereas the strength of audio-visual speech illusions varied as a direct function of maturational level, performance on the non-speech illusory tasks was homogeneous across all ages. These data support the existence of independent maturational processes underlying speech and non-speech audio-visual illusory effects.
PMCID: PMC1937019  PMID: 17710142
