Results 1-3 of 3
Altered integration of speech and gesture in children with autism spectrum disorders
Hubbard, Amy L.; Scott-Van Zeeland, Ashley A.; Bookheimer, Susan Y.
Brain and Behavior
The presence of gesture during speech has been shown to impact perception, comprehension, learning, and memory in normal adults and typically developing children. In neurotypical individuals, the impact of viewing co-speech gestures representing an object and/or action (i.e., iconic gesture) or speech rhythm (i.e., beat gesture) has also been observed at the neural level. Yet, despite growing evidence of delayed gesture development in children with autism spectrum disorders (ASD), few studies have examined how the brain processes multimodal communicative cues occurring during everyday communication in individuals with ASD. Here, we used a previously validated functional magnetic resonance imaging (fMRI) paradigm to examine the neural processing of co-speech beat gesture in children with ASD and matched controls. Consistent with prior observations in adults, typically developing children showed increased responses in right superior temporal gyrus and sulcus while listening to speech accompanied by beat gesture. Children with ASD, however, exhibited no significant modulatory effects in secondary auditory cortices for the presence of co-speech beat gesture. Rather, relative to their typically developing counterparts, children with ASD showed significantly greater activity in visual cortex while listening to speech accompanied by beat gesture. Importantly, the severity of their socio-communicative impairments correlated with activity in this region, such that the more impaired children demonstrated the greatest activity in visual areas while viewing co-speech beat gesture. These findings suggest that although the typically developing brain recognizes beat gesture as communicative and successfully integrates it with co-occurring speech, information from multiple sensory modalities is not effectively integrated during social communication in the autistic brain.
Keywords: Autism spectrum disorders; fMRI; gesture; language; superior temporal gyrus
Dynamic Visuomotor Transformation Involved with Remote Flying of a Plane Utilizes the ‘Mirror Neuron’ System
Cassel, Daniel B.
PLoS ONE
Brain regions involved in processing dynamic visuomotor representational transformation were investigated using fMRI. The perceptual-motor task involved flying (or observing) a plane through a simulated Red Bull Air Race course in a first-person perspective or a third-person chase perspective; the third-person perspective is akin to remote operation of a vehicle. The human ability to operate vehicles remotely likely has its roots in neural processes related to imitation, in which visuomotor transformation is necessary to interpret action goals in an egocentric manner suitable for execution. In this experiment, for the third-person perspective the visuomotor transformation changes dynamically in accordance with the orientation of the plane. It was predicted that third-person remote flying, relative to first-person flying, would recruit brain regions composing the ‘Mirror Neuron’ system, which is thought to be intimately involved in imitation during both execution and observation tasks. Consistent with this prediction, differential brain activity was present for the third-person over the first-person perspective, for both execution and observation tasks, in left ventral premotor cortex, right dorsal premotor cortex, and the inferior parietal lobule bilaterally (the Mirror Neuron System) (behaviorally: first person > third person). These regions additionally showed greater activity for flying (execution) than for watching (observation) conditions. Even though the visual and motor aspects of the tasks were controlled for, differential activity was also found in brain regions involved in tool use, motion perception, and body perspective, including left cerebellum, temporo-occipital regions, lateral occipital cortex, medial temporal region, and the extrastriate body area. This experiment demonstrates that a complex real-world perceptual-motor task can be used to investigate visuomotor processing. This approach (Aviation Cerebral Experimental Sciences, ACES), which focuses on direct application to lab and field settings, contrasts with standard methodology in which tasks and conditions are reduced to their simplest forms, remote from daily-life experience.
Giving Speech a Hand: Gesture Modulates Activity in Auditory Cortex During Speech Perception
Hubbard, Amy L.; Wilson, Stephen M.
Human Brain Mapping
Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture, a fundamental type of hand gesture that marks speech prosody, might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body, all presented without speech. Validating behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., here activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions.
Keywords: gestures; speech perception; auditory cortex; magnetic resonance imaging; nonverbal communication
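The superadditivity criterion this abstract uses to flag the right planum temporale as a multisensory integration site (the response to speech accompanied by beat gesture exceeding the summed responses to speech alone and gesture alone) can be sketched as a simple check on per-condition response estimates. The numbers below are illustrative placeholders, not values from the study.

```python
# Superadditivity check for multisensory integration: a region is flagged
# when its response to the combined condition (speech + beat gesture)
# exceeds the sum of the responses to each unimodal condition alone.
# All values here are hypothetical response estimates (e.g., GLM betas).

def is_superadditive(av: float, a: float, v: float) -> bool:
    """Return True if the audiovisual response exceeds the unimodal sum."""
    return av > a + v

speech_with_gesture = 1.9  # speech accompanied by beat gesture (illustrative)
speech_alone = 0.8         # speech with a still body (illustrative)
gesture_alone = 0.6        # beat gesture without speech (illustrative)

print(is_superadditive(speech_with_gesture, speech_alone, gesture_alone))  # → True
```

A merely additive region (e.g., a combined response of 1.4 against unimodal responses of 0.8 and 0.6) would fail this check, which is what distinguishes genuine integration from the two signals simply co-occurring.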
PubMed Central Canada is a service of the Canadian Institutes of Health Research (CIHR), working in partnership with the National Research Council's Canada Institute for Scientific and Technical Information, in cooperation with the National Center for Biotechnology Information of the U.S. National Library of Medicine (NCBI/NLM). It includes content provided to the PubMed Central International archive by participating publishers.