Results 1-25 (1082374)

1.  Giving Speech a Hand: Gesture Modulates Activity in Auditory Cortex During Speech Perception 
Human brain mapping  2009;30(3):1028-1037.
Viewing hand gestures during face-to-face communication affects speech perception and comprehension. Despite the visible role played by gesture in social interactions, relatively little is known about how the brain integrates hand gestures with co-occurring speech. Here we used functional magnetic resonance imaging (fMRI) and an ecologically valid paradigm to investigate how beat gesture – a fundamental type of hand gesture that marks speech prosody – might impact speech perception at the neural level. Subjects underwent fMRI while listening to spontaneously produced speech accompanied by beat gesture, nonsense hand movement, or a still body; as additional control conditions, subjects also viewed beat gesture, nonsense hand movement, or a still body, all presented without speech. Validating behavioral evidence that gesture affects speech perception, bilateral nonprimary auditory cortex showed greater activity when speech was accompanied by beat gesture than when speech was presented alone. Further, the left superior temporal gyrus/sulcus showed stronger activity when speech was accompanied by beat gesture than when speech was accompanied by nonsense hand movement. Finally, the right planum temporale was identified as a putative multisensory integration site for beat gesture and speech (i.e., activity in response to speech accompanied by beat gesture was greater than the summed responses to speech alone and beat gesture alone), indicating that this area may be pivotally involved in synthesizing the rhythmic aspects of both speech and gesture. Taken together, these findings suggest a common neural substrate for processing speech and gesture, likely reflecting their joint communicative role in social interactions.
doi:10.1002/hbm.20565
PMCID: PMC2644740  PMID: 18412134
gestures; speech perception; auditory cortex; magnetic resonance imaging; nonverbal communication
2.  The impact of impaired semantic knowledge on spontaneous iconic gesture production 
Aphasiology  2013;27(9):1050-1069.
Background
Previous research has found that people with aphasia produce more spontaneous iconic gesture than control participants, especially during word-finding difficulties. There is some evidence that impaired semantic knowledge impacts on the diversity of gestural handshapes, as well as the frequency of gesture production. However, no previous research has explored how impaired semantic knowledge impacts on the frequency and type of iconic gestures produced during fluent speech compared with those produced during word-finding difficulties.
Aims
To explore the impact of impaired semantic knowledge on the frequency and type of iconic gestures produced during fluent speech and those produced during word-finding difficulties.
Methods & Procedures
A group of 29 participants with aphasia and 29 control participants were video recorded describing a cartoon they had just watched. All iconic gestures were tagged and coded as either “manner,” “path only,” “shape outline” or “other”. These gestures were then separated into either those occurring during fluent speech or those occurring during a word-finding difficulty. The relationships between semantic knowledge and gesture frequency and form were then investigated in the two different conditions.
Outcomes & Results
As expected, the participants with aphasia produced a higher frequency of iconic gestures than the control participants, but when the iconic gestures produced during word-finding difficulties were removed from the analysis, the frequency of iconic gesture was not significantly different between the groups. While there was no significant relationship between semantic knowledge and the frequency of iconic gestures produced during fluent speech, there was a significant positive correlation between semantic knowledge and the proportion of word-finding difficulties that contained gesture. There was also a significant positive correlation between the speakers' semantic knowledge and the proportion of gestures produced during fluent speech that were classified as "manner". Finally, while not significant, there was a positive trend between semantic knowledge of objects and the production of "shape outline" gestures during word-finding difficulties for objects.
Conclusions
The results indicate that impaired semantic knowledge in aphasia impacts on both the iconic gestures produced during fluent speech and those produced during word-finding difficulties but in different ways. These results shed new light on the relationship between impaired language and iconic co-speech gesture production and also suggest that analysis of iconic gesture may be a useful addition to clinical assessment.
doi:10.1080/02687038.2013.770816
PMCID: PMC3778580  PMID: 24058228
Gesture; Aphasia; Semantic knowledge
3.  Audiovisual speech integration in autism spectrum disorder: ERP evidence for atypicalities in lexical-semantic processing 
Lay Abstract
Language and communicative impairments are among the primary characteristics of autism spectrum disorders (ASD). Previous studies have examined auditory language processing in ASD. However, during face-to-face conversation, auditory and visual speech inputs provide complementary information, and little is known about audiovisual (AV) speech processing in ASD. It is possible to elucidate the neural correlates of AV integration by examining the effects of seeing the lip movements accompanying the speech (visual speech) on electrophysiological event-related potentials (ERPs) to spoken words. Moreover, electrophysiological techniques have a high temporal resolution and thus enable us to track the time-course of spoken word processing in ASD and typical development (TD). The present study examined the ERP correlates of AV effects in three time windows that are indicative of hierarchical stages of word processing. We studied a group of TD adolescent boys (n=14) and a group of high-functioning boys with ASD (n=14). Significant group differences were found in AV integration of spoken words in the 200–300 ms time window, when spoken words start to be processed for meaning. These results suggest that the neural facilitation by visual speech of spoken word processing is reduced in individuals with ASD.
Scientific Abstract
In typically developing (TD) individuals, behavioural and event-related potential (ERP) studies suggest that audiovisual (AV) integration enables faster and more efficient processing of speech. However, little is known about AV speech processing in individuals with autism spectrum disorder (ASD). The present study examined ERP responses to spoken words to elucidate the effects of visual speech (the lip movements accompanying a spoken word) on the range of auditory speech processing stages from sound onset detection to semantic integration. The study also included an AV condition which paired spoken words with a dynamic scrambled face in order to highlight AV effects specific to visual speech. Fourteen adolescent boys with ASD (15–17 years old) and 14 age- and verbal IQ-matched TD boys participated. The ERPs of the TD group showed a pattern and topography of AV interaction effects consistent with activity within the superior temporal plane, with two dissociable effects over fronto-central and centro-parietal regions. The posterior effect (200–300 ms interval) was specifically sensitive to lip movements in TD boys, and no AV modulation was observed in this region for the ASD group. Moreover, the magnitude of the posterior AV effect to visual speech correlated inversely with ASD symptomatology. In addition, the ASD boys showed an unexpected effect (P2 time window) over the fronto-central region (pooled electrodes F3, Fz, F4, FC1, FC2, FC3, FC4) which was sensitive to scrambled face stimuli. These results suggest that the neural networks facilitating processing of spoken words by visual speech are altered in individuals with ASD.
doi:10.1002/aur.231
PMCID: PMC3586407  PMID: 22162387
Auditory; ASD; ERP; Language; Multisensory; Visual
4.  Frontal and temporal contributions to understanding the iconic co-speech gestures that accompany speech 
Human brain mapping  2014;35(3):900-917.
In everyday conversation, listeners often rely on a speaker’s gestures to clarify any ambiguities in the verbal message. Using fMRI during naturalistic story comprehension, we examined which brain regions in the listener are sensitive to speakers’ iconic gestures. We focused on iconic gestures that contribute information not found in the speaker’s talk, compared to those that convey information redundant with the speaker’s talk. We found that three regions—left inferior frontal gyrus triangular (IFGTr) and opercular (IFGOp) portions, and left posterior middle temporal gyrus (MTGp)—responded more strongly when gestures added information to non-specific language, compared to when they conveyed the same information in more specific language; in other words, when gesture disambiguated speech rather than reinforcing it. An increased BOLD response was not found in these regions when the non-specific language was produced without gesture, suggesting that IFGTr, IFGOp, and MTGp are involved in integrating semantic information across gesture and speech. In addition, we found that activity in the posterior superior temporal sulcus (STSp), previously thought to be involved in gesture-speech integration, was not sensitive to the gesture-speech relation. Together, these findings clarify the neurobiology of gesture-speech integration and contribute to an emerging picture of how listeners glean meaning from gestures that accompany speech.
doi:10.1002/hbm.22222
PMCID: PMC3797208  PMID: 23238964
gestures; semantic; language; inferior frontal gyrus; posterior superior temporal sulcus; posterior middle temporal gyrus
5.  Co-speech gestures influence neural activity in brain regions associated with processing semantic information 
Human brain mapping  2009;30(11):3509-3526.
Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker’s message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, she made semantically unrelated hand movements. In the third, she kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was accompanied by hand movements, regardless of their semantic relation to the speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech.
doi:10.1002/hbm.20774
PMCID: PMC2896896  PMID: 19384890
discourse comprehension; fMRI; gestures; semantic processing; inferior frontal gyrus
6.  White matter impairment in the speech network of individuals with autism spectrum disorder 
NeuroImage: Clinical  2013;3:234-241.
Impairments in language and communication are core features of Autism Spectrum Disorder (ASD), and a substantial percentage of children with ASD do not develop speech. ASD is often characterized as a disorder of brain connectivity, and a number of studies have identified white matter impairments in affected individuals. The current study investigated white matter integrity in the speech network of high-functioning adults with ASD. Diffusion tensor imaging (DTI) scans were collected from 18 participants with ASD and 18 neurotypical participants. Probabilistic tractography was used to estimate the connection strength between ventral premotor cortex (vPMC), a cortical region responsible for speech motor planning, and five other cortical regions in the network of areas involved in speech production. We found a weaker connection between the left vPMC and the supplementary motor area in the ASD group. This pathway has been hypothesized to underlie the initiation of speech motor programs. Our results indicate that a key pathway in the speech production network is impaired in ASD, and that this impairment can occur even in the presence of normal language abilities. Therapies that result in normalization of this pathway may hold particular promise for improving speech output in ASD.
Highlights
• We used diffusion tensor imaging to measure white matter (WM) tracts in autism.
• Autistic participants were high-functioning individuals with normal language skills.
• WM between left supplementary motor and premotor areas is impaired in autism.
• This tract is believed to be involved in the initiation of speech articulation.
• Speech production may be impaired in the absence of language deficits in autism.
doi:10.1016/j.nicl.2013.08.011
PMCID: PMC3815014  PMID: 24273708
Autism; ASD; Speech; Diffusion tensor imaging; Tractography; Communication
7.  Gesture Facilitates the Syntactic Analysis of Speech 
Recent research suggests that the brain routinely binds together information from gesture and speech. However, most of this research has focused on the integration of representational gestures with the semantic content of speech. Much less is known about how other aspects of gesture, such as emphasis, influence the interpretation of the syntactic relations in a spoken message. Here, we investigated whether beat gestures alter which syntactic structure is assigned to ambiguous spoken German sentences. The P600 component of the event-related brain potential indicated that the more complex syntactic structure is easier to process when the speaker emphasizes the subject of a sentence with a beat. Thus, a simple flick of the hand can change our interpretation of who has been doing what to whom in a spoken sentence. We conclude that gestures and speech are integrated systems. Whereas previous studies have shown that the brain effortlessly integrates semantic information from gesture and speech, our study is the first to demonstrate that this integration also occurs for syntactic information. Moreover, the effect appears to be gesture-specific: it was not found for other stimuli that draw attention to certain parts of a sentence, such as prosodic emphasis or a moving visual stimulus with the same trajectory as the gesture. This suggests that only visual emphasis produced with a communicative intention in mind (that is, beat gestures) influences language comprehension, but not a simple visual movement lacking such an intention.
doi:10.3389/fpsyg.2012.00074
PMCID: PMC3307377  PMID: 22457657
language; syntax; audiovisual; P600; ambiguity
8.  A Cross-Species Study of Gesture and Its Role in Symbolic Development: Implications for the Gestural Theory of Language Evolution 
Using a naturalistic video database, we examined whether gestures scaffold the symbolic development of a language-enculturated chimpanzee, a language-enculturated bonobo, and a human child during the second year of life. These three species constitute a complete clade: species possessing a common immediate ancestor. A basic finding was the functional and formal similarity of many gestures between chimpanzee, bonobo, and human child. The child’s symbols were spoken words; the apes’ symbols were lexigrams – non-iconic visual signifiers. A developmental pattern in which gestural representation of a referent preceded symbolic representation of the same referent appeared in all three species (but was statistically significant only for the child). Nonetheless, across species, the ratio of symbol to gesture increased significantly with age. But even though their symbol production increased, the apes continued to communicate more frequently by gesture than by symbol. In contrast, by 15–18 months of age, the child used symbols more frequently than gestures. This ontogenetic sequence from gesture to symbol, present across the clade but more pronounced in child than ape, provides support for the role of gesture in language evolution. In all three species, the overwhelming majority of gestures were communicative (i.e., paired with eye contact, vocalization, and/or persistence). However, vocalization was rare for the apes, but accompanied the majority of the child’s communicative gestures. This species difference suggests the co-evolution of speech and gesture after the evolutionary divergence of the hominid line. Multimodal expressions of communicative intent (e.g., vocalization plus persistence) were normative for the child, but less common for the apes. This species difference suggests that multimodal expression of communicative intent was also strengthened after hominids diverged from apes.
doi:10.3389/fpsyg.2013.00160
PMCID: PMC3674957  PMID: 23750140
gestural theory of language evolution; language-enculturated apes; symbolic development; cross-species comparisons; gesture; communication development; language development
9.  Talking hands: tongue motor excitability during observation of hand gestures associated with words 
Perception of speech and gestures engages common brain areas. Neural regions involved in speech perception overlap with those involved in speech production in an articulator-specific manner. Yet, it is unclear whether motor cortex also has a role in processing communicative actions like gesture and sign language. We asked whether the mere observation of hand gestures, paired or not paired with words, results in changes in the excitability of the hand and tongue areas of motor cortex. Using single-pulse transcranial magnetic stimulation (TMS), we measured motor excitability in the tongue and hand areas of the left primary motor cortex while participants viewed video sequences of bimanual hand movements associated or not associated with nouns. We found higher motor excitability in the tongue area during the presentation of meaningful gestures (noun-associated) as opposed to meaningless ones, while the excitability of the hand motor area was not differentially affected by gesture observation. Our results suggest that the observation of gestures associated with a word activates the articulatory motor network that accompanies speech production.
doi:10.3389/fnhum.2014.00767
PMCID: PMC4179693  PMID: 25324761
transcranial magnetic stimulation; tongue motor excitability; speech perception; gesture perception; sign language
10.  Distinguishing the Processing of Gestures from Signs in Deaf Individuals: An fMRI Study 
Brain research  2009;1276:140-150.
Manual gestures occur on a continuum from co-speech gesticulations to conventionalized emblems to language signs. Our goal in the present study was to understand the neural bases of the processing of gestures along such a continuum. We studied four types of gestures, varying along linguistic and semantic dimensions: linguistic and meaningful American Sign Language (ASL), non-meaningful pseudo-ASL, meaningful emblematic, and nonlinguistic, non-meaningful made-up gestures. Pre-lingually deaf, native signers of ASL participated in the fMRI study and performed two tasks while viewing videos of the gestures: a visuo-spatial (identity) discrimination task and a category discrimination task. We found that the categorization task activated the left ventral middle and inferior frontal gyrus, among other regions, to a greater extent than the visual discrimination task, supporting the idea of semantic-level processing of the gestures. The reverse contrast resulted in enhanced activity of the bilateral intraparietal sulcus, supporting the idea of featural-level processing (analogous to phonological-level processing of speech sounds) of the gestures. Regardless of the task, we found that the brain activation patterns for the nonlinguistic, non-meaningful gestures differed most from those for the ASL gestures. The activation patterns for the emblems were most similar to those of the ASL gestures, and those of the pseudo-ASL were most similar to the nonlinguistic, non-meaningful gestures. The fMRI results provide partial support for the conceptualization of different gestures as belonging to a continuum, and the variance in the fMRI results was best explained by differences in the processing of gestures along the semantic dimension.
doi:10.1016/j.brainres.2009.04.034
PMCID: PMC2693477  PMID: 19397900
American Sign Language; gestures; Deaf; visual processing; categorization; linguistic; brain; fMRI
11.  A Supramodal Neural Network for Speech and Gesture Semantics: An fMRI Study 
PLoS ONE  2012;7(11):e51207.
In a natural setting, speech is often accompanied by gestures. Like language, speech-accompanying iconic gestures convey semantic information to some extent. However, whether comprehension of the information carried in the auditory and visual modalities depends on the same or on different brain networks is largely unknown. In this fMRI study, we aimed to identify the cortical areas engaged in supramodal processing of semantic information. BOLD changes were recorded in 18 healthy right-handed male subjects watching video clips showing an actor who either performed speech (S, acoustic) or gestures (G, visual) in more (+) or less (−) meaningful varieties. In the experimental conditions, familiar speech or isolated iconic gestures were presented; during the visual control condition the volunteers watched meaningless gestures (G−), while during the acoustic control condition a foreign language was presented (S−). The conjunction of visual and acoustic semantic processing revealed activations extending from the left inferior frontal gyrus to the precentral gyrus, and included bilateral posterior temporal regions. We conclude that proclaiming this frontotemporal network the brain's core language system is to take too narrow a view. Our results rather indicate that these regions constitute a supramodal semantic processing network.
doi:10.1371/journal.pone.0051207
PMCID: PMC3511386  PMID: 23226488
12.  Assessing Gestures in Young Children with Autism Spectrum Disorders 
Purpose
To determine whether scoring of the gestures point, give, and show was correlated across measurement tools used to assess gesture production in children with an Autism Spectrum Disorder (ASD).
Method
Seventy-eight children with an ASD between the ages of 23 and 37 months participated. Correlational analyses were conducted to determine whether performance of three key gestures related to joint attention and behavior regulation (point, give, show) was correlated across three different measurement tools: the Autism Diagnostic Observation Schedule, the Early Social Communication Scale, and the MacArthur-Bates Communicative Developmental Inventory: Words and Gestures. To establish whether different measures were related at different points in development, children were subdivided into two groups based on their expressive language levels.
Results
The scoring of gesture performance was not entirely consistent across assessment methods. The score that a child received appeared to be influenced by theoretical perspective, gesture definition, and assessment methodology, as well as developmental level.
Conclusion
When assessing the gestures of children with ASD, clinicians should determine what aspects of gesture they are interested in profiling, gather data from multiple sources, and consider performance in light of the measurement tool.
doi:10.1044/2013_JSLHR-L-12-0244
PMCID: PMC4106481  PMID: 24129012
Assessment; autism; language; gestures
13.  Gesture in the developing brain 
Developmental science  2012;15(2):165-180.
Speakers convey meaning not only through words, but also through gestures. Although children are exposed to co-speech gestures from birth, we do not know how the developing brain comes to connect meaning conveyed in gesture with speech. We used functional magnetic resonance imaging (fMRI) to address this question and scanned 8- to 11-year-old children and adults listening to stories accompanied by hand movements, either meaningful co-speech gestures or meaningless self-adaptors. When listening to stories accompanied by both types of hand movements, both children and adults recruited inferior frontal, inferior parietal, and posterior temporal brain regions known to be involved in processing language not accompanied by hand movements. There were, however, age-related differences in activity in posterior superior temporal sulcus (STSp), inferior frontal gyrus, pars triangularis (IFGTr), and posterior middle temporal gyrus (MTGp) regions previously implicated in processing gesture. Both children and adults showed sensitivity to the meaning of hand movements in IFGTr and MTGp, but in different ways. Finally, we found that hand movement meaning modulates interactions between STSp and other posterior temporal and inferior parietal regions for adults, but not for children. These results shed light on the developing neural substrate for understanding meaning contributed by co-speech gesture.
doi:10.1111/j.1467-7687.2011.01100.x
PMCID: PMC3515080  PMID: 22356173
14.  Supramodal neural processing of abstract information conveyed by speech and gesture 
Abstractness and modality of interpersonal communication have a considerable impact on comprehension. They are relevant for determining thoughts and constituting internal models of the environment. Whereas concrete object-related information can be represented in mind irrespective of language, abstract concepts require a representation in speech. Consequently, modality-independent processing of abstract information can be expected. Here we investigated the neural correlates of abstractness (abstract vs. concrete) and modality (speech vs. gestures), to identify an abstractness-specific supramodal neural network. During fMRI data acquisition 20 participants were presented with videos of an actor either speaking sentences with an abstract-social [AS] or concrete-object-related content [CS], or performing meaningful abstract-social emblematic [AG] or concrete-object-related tool-use gestures [CG]. Gestures were accompanied by a foreign language to increase the comparability between conditions and to frame the communication context of the gesture videos. Participants performed a content judgment task referring to the person vs. object-relatedness of the utterances. The behavioral data suggest a comparable comprehension of contents communicated by speech or gesture. Furthermore, we found common neural processing for abstract information independent of modality (AS > CS ∩ AG > CG) in a left hemispheric network including the left inferior frontal gyrus (IFG), temporal pole, and medial frontal cortex. Modality specific activations were found in bilateral occipital, parietal, and temporal as well as right inferior frontal brain regions for gesture (G > S) and in left anterior temporal regions and the left angular gyrus for the processing of speech semantics (S > G). These data support the idea that abstract concepts are represented in a supramodal manner. Consequently, gestures referring to abstract concepts are processed in a predominantly left hemispheric language related neural network.
doi:10.3389/fnbeh.2013.00120
PMCID: PMC3772311  PMID: 24062652
gesture; speech; fMRI; abstract semantics; emblematic gestures; tool-use gestures
15.  Specificity of Dyspraxia in Children with Autism 
Neuropsychology  2012;26(2):165-171.
Objective
To explore the specificity of impaired praxis and postural knowledge to autism by examining three samples of children, including those with autism spectrum disorder (ASD), attention-deficit hyperactivity disorder (ADHD), and typically developing (TD) children.
Method
Twenty-four children with ASD, 24 children with ADHD, and 24 TD children, ages 8–13, completed measures assessing basic motor control (the Physical and Neurological Exam for Subtle Signs; PANESS), praxis (performance of skilled gestures to command, with imitation, and tool use) and the ability to recognize correct hand postures necessary to perform these skilled gestures (the Postural Knowledge Test; PKT).
Results
Children with ASD performed significantly worse than TD children on all three assessments. In contrast, children with ADHD performed significantly worse than TD controls on PANESS but not on the praxis examination or PKT. Furthermore, children with ASD performed significantly worse than children with ADHD on both the praxis examination and PKT, but not on the PANESS.
Conclusions
Whereas both children with ADHD and children with ASD show impairments in basic motor control, impairments in the performance and recognition of skilled motor gestures, consistent with dyspraxia, appear to be specific to autism. The findings suggest that impaired formation of the perceptual-motor action models necessary for the development of skilled gestures and other goal-directed behavior is specific to autism, whereas impaired basic motor control may be a more generalized finding.
doi:10.1037/a0026955
PMCID: PMC3312580  PMID: 22288405
imitation; motor learning; procedural learning; premotor cortex; inferior parietal lobe
16.  Communicative Acts of Children with Autism Spectrum Disorders in the Second Year of Life 
Purpose
This study examined the communicative profiles of children with autism spectrum disorders (ASD) in the second year of life.
Method
Communicative acts were examined in 125 children 18 to 24 months of age: 50 later diagnosed with ASD; 25 with developmental delays (DD); and 50 with typical development (TD). Precise measures of rate, functions, and means of communication were obtained through systematic observation of videotaped Behavior Samples from the Communication and Symbolic Behavior Scales Developmental Profile (Wetherby & Prizant, 2002).
Results
Children with ASD communicated at a significantly lower rate than children with DD and TD. The ASD group used a significantly lower proportion of acts for joint attention and a significantly lower proportion of deictic gestures, relying instead on more primitive gestures, compared to the DD and TD groups. Children with ASD who did communicate for joint attention were as likely as other children to coordinate vocalizations, eye gaze, and gestures. Rate of communicative acts and joint attention were the strongest predictors of verbal outcome at age 3.
Conclusions
By 18 to 24 months of age, children later diagnosed with ASD showed a unique profile of communication, with core deficits in communication rate, joint attention, and communicative gestures.
doi:10.1044/1092-4388(2009/07-0280)
PMCID: PMC2756334  PMID: 19635941
17.  No Neural Evidence of Statistical Learning During Exposure to Artificial Languages in Children with Autism Spectrum Disorders 
Biological psychiatry  2010;68(4):345-351.
Background
Language delay is a hallmark feature of autism spectrum disorders (ASD). The identification of word boundaries in continuous speech is a critical first step in language acquisition that can be accomplished via statistical learning and reliance on speech cues. Importantly, early word segmentation skills have been shown to predict later language development in typically developing (TD) children.
Methods
Here we investigated the neural correlates of online word segmentation in children with and without ASD with a well-established behavioral paradigm previously validated for functional magnetic resonance imaging. Eighteen high-functioning boys with ASD and 18 age- and IQ-matched TD boys underwent functional magnetic resonance imaging while listening to two artificial languages (containing statistical or statistical + prosodic cues to word boundaries) and a random speech stream.
Results
Consistent with prior findings, in TD control subjects, activity in fronto-temporal-parietal networks decreased as the number of cues to word boundaries increased. The ASD children, however, did not show this facilitatory effect. Furthermore, statistical contrasts modeling changes in activity over time identified significant learning-related signal increases for both artificial languages in basal ganglia and left temporo-parietal cortex only in TD children. Finally, the level of communicative impairment in ASD children was inversely correlated with signal increases in these same regions during exposure to the artificial languages.
Conclusions
This is the first study to demonstrate significant abnormalities in the neural architecture subserving language-related learning in ASD children and to link the communicative impairments observed in this population to decreased sensitivity to the statistical and speech cues available in the language input.
doi:10.1016/j.biopsych.2010.01.011
PMCID: PMC3229830  PMID: 20303070
Autism; implicit learning; language; neuroimaging; speech perception
18.  Non-Specialist Psychosocial Interventions for Children and Adolescents with Intellectual Disability or Lower-Functioning Autism Spectrum Disorders: A Systematic Review 
PLoS Medicine  2013;10(12):e1001572.
In a systematic review, Brian Reichow and colleagues assess the evidence that non-specialist care providers in community settings can provide effective interventions for children and adolescents with intellectual disabilities or lower-functioning autism spectrum disorders.
Please see later in the article for the Editors' Summary
Background
The development of effective treatments for use by non-specialists is listed among the top research priorities for improving the lives of people with mental illness worldwide. The purpose of this review is to appraise which interventions for children with intellectual disabilities or lower-functioning autism spectrum disorders delivered by non-specialist care providers in community settings produce benefits when compared to either a no-treatment control group or a treatment-as-usual comparator.
Methods and Findings
We systematically searched electronic databases through 24 June 2013 to locate prospective controlled studies of psychosocial interventions delivered by non-specialist providers to children with intellectual disabilities or lower-functioning autism spectrum disorders. We screened 234 full papers, of which 34 articles describing 29 studies involving 1,305 participants were included. A majority of the studies included children exclusively with a diagnosis of lower-functioning autism spectrum disorders (15 of 29, 52%). Fifteen of twenty-nine studies (52%) were randomized controlled trials and just under half of all effect sizes (29 of 59, 49%) were greater than 0.50, of which 18 (62%) were statistically significant. For behavior analytic interventions, the best outcomes were shown for development and daily skills; cognitive rehabilitation, training, and support interventions were found to be most effective for improving developmental outcomes, and parent training interventions to be most effective for improving developmental, behavioral, and family outcomes. We also conducted additional subgroup analyses using harvest plots. Limitations include the studies' potential for performance bias and that few were conducted in lower- and middle-income countries.
Conclusions
The findings of this review support the delivery of psychosocial interventions by non-specialist providers to children who have intellectual disabilities or lower-functioning autism spectrum disorders. Given the scarcity of specialists in many low-resource settings, including many lower- and middle-income countries, these findings may provide guidance for scale-up efforts for improving outcomes for children with developmental disorders or lower-functioning autism spectrum disorders.
Protocol Registration
PROSPERO CRD42012002641
Editors' Summary
Background
Newborn babies are helpless, but over the first few years of life, they acquire motor (movement) skills, language (communication) skills, cognitive (thinking) skills, and social (interpersonal interaction) skills. Individual aspects of these skills are usually acquired at specific ages, but children with a developmental disorder such as an autism spectrum disorder (ASD) or intellectual disability (mental retardation) fail to reach these “milestones” because of impaired or delayed brain maturation. Autism, Asperger syndrome, and other ASDs (also called pervasive developmental disorders) affect about 1% of the UK and US populations and are characterized by abnormalities in interactions and communication with other people (reciprocal socio-communicative interactions; for example, some children with autism reject physical affection and fail to develop useful speech) and a restricted, stereotyped, repetitive repertoire of interests (for example, obsessive accumulation of facts about unusual topics). About half of individuals with an ASD also have an intellectual disability—a reduced overall level of intelligence characterized by impairment of the skills that are normally acquired during early life. Such individuals have what is called lower-functioning ASD.
Why Was This Study Done?
Most of the children affected by developmental disorders live in low- and middle-income countries where there are few services available to help them achieve their full potential and where little research has been done to identify the most effective treatments. The development of effective treatments for use by non-specialists (for example, teachers and parents) is necessary to improve the lives of people with mental illnesses worldwide, but particularly in resource-limited settings where psychiatrists, psychologists, and other specialists are scarce. In this systematic review, the researchers investigated which psychosocial interventions for children and adolescents with intellectual disabilities or lower-functioning ASDs delivered by non-specialist providers in community settings produce improvements in development, daily skills, school performance, behavior, or family outcomes when compared to usual care (the control condition). A systematic review identifies all the research on a given topic using predefined criteria; psychosocial interventions are defined as therapy, education, training, or support aimed at improving behavior, overall development, or specific life skills without the use of drugs.
What Did the Researchers Do and Find?
The researchers identified 29 controlled studies (investigations with an intervention group and a control group) that examined the effects of various psychosocial interventions delivered by non-specialist providers to children (under 18 years old) who had a lower-functioning ASD or intellectual disability. The researchers retrieved information on the participants, design and methods, findings, and intervention characteristics for each study, and calculated effect sizes—a measure of the effectiveness of a test intervention relative to a control intervention—for several outcomes for each intervention. Across the studies, three-quarters of the effect size estimates were positive, and nearly half were greater than 0.50; effect sizes of less than 0.2, 0.2–0.5, and greater than 0.5 indicate that an intervention has no, a small, or a medium-to-large effect, respectively. For behavior analytic interventions (which aim to improve socially significant behavior by systematically analyzing behavior), the largest effect sizes were seen for development and daily skills. Cognitive rehabilitation, training, and support (interventions that facilitate the relearning of lost or altered cognitive skills) produced good improvements in developmental outcomes such as scores on standardized IQ tests in children aged 6–11 years. Finally, parental training interventions (which teach parents how to provide therapy services for their child) had strong effects on developmental, behavioral, and family outcomes.
What Do These Findings Mean?
Because few of the studies included in this systematic review were undertaken in low- and middle-income countries, the review's findings may not be generalizable to children living in resource-limited settings. Moreover, other characteristics of the included studies may limit the accuracy of these findings. Nevertheless, these findings support the delivery of psychosocial interventions by non-specialist providers to children who have intellectual disabilities or a lower-functioning ASD, and indicate which interventions are likely to produce the largest improvements in developmental, behavioral, and family outcomes. Further studies are needed, particularly in low- and middle-income countries, to confirm these findings, but given that specialists are scarce in many resource-limited settings, these findings may help to inform the implementation of programs to improve outcomes for children with intellectual disabilities or lower-functioning ASDs in low- and middle-income countries.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001572.
This study is further discussed in a PLOS Medicine Perspective by Bello-Mojeed and Bakare
The US Centers for Disease Control and Prevention provides information (in English and Spanish) on developmental disabilities, including autism spectrum disorders and intellectual disability
The US National Institute of Mental Health also provides detailed information about autism spectrum disorders, including the publication “A Parent's Guide to Autism Spectrum Disorder”
Autism Speaks, a US non-profit organization, provides information about all aspects of autism spectrum disorders and includes information on the Autism Speaks Global Autism Public Health Initiative
The National Autistic Society, a UK charity, provides information about all aspects of autism spectrum disorders and includes personal stories about living with these conditions
The UK National Health Service Choices website has an interactive guide to child development and information about autism and Asperger syndrome, including personal stories, and about learning disabilities
The UK National Institute for Health and Care Excellence provides clinical guidelines for the management and support of children with autism spectrum disorders
The World Health Organization provides information on its Mental Health Gap Action Programme (mhGAP), which includes recommendations on the management of developmental disorders by non-specialist providers; the mhGAP Evidence Resource Center provides evidence reviews for parent skills training for management of children with intellectual disabilities and pervasive developmental disorders and interventions for management of children with intellectual disabilities
PROSPERO, an international prospective register of systematic reviews, provides more information about this systematic review
doi:10.1371/journal.pmed.1001572
PMCID: PMC3866092  PMID: 24358029
19.  Gesture’s Neural Language 
When people talk to each other, they often make arm and hand movements that accompany what they say. These manual movements, called “co-speech gestures,” can convey meaning by way of their interaction with the oral message. Another class of manual gestures, called “emblematic gestures” or “emblems,” also conveys meaning, but in contrast to co-speech gestures, they can do so directly and independent of speech. There is currently significant interest in the behavioral and biological relationships between action and language. Since co-speech gestures are actions that rely on spoken language, and emblems convey meaning to the effect that they can sometimes substitute for speech, these actions may be important, and potentially informative, examples of language–motor interactions. Researchers have recently been examining how the brain processes these actions. The current results of this work do not yet give a clear understanding of gesture processing at the neural level. For the most part, however, it seems that two complementary sets of brain areas respond when people see gestures, reflecting their role in disambiguating meaning. These include areas thought to be important for understanding actions and areas ordinarily related to processing language. The shared and distinct responses across these two sets of areas during communication are just beginning to emerge. In this review, we talk about the ways that the brain responds when people see gestures, how these responses relate to brain activity when people process language, and how these might relate in normal, everyday communication.
doi:10.3389/fpsyg.2012.00099
PMCID: PMC3317265  PMID: 22485103
gesture; language; brain; meaning; action understanding; fMRI
20.  Iconic gestures prime words: comparison of priming effects when gestures are presented alone and when they are accompanying speech 
Previous studies have shown that iconic gestures presented in an isolated manner prime visually presented semantically related words. Since gestures and speech are almost always produced together, this study examined whether iconic gestures accompanying speech would prime words and compared the priming effect of iconic gestures with speech to that of iconic gestures presented alone. Adult participants (N = 180) were randomly assigned to one of three conditions in a lexical decision task: Gestures-Only (the primes were iconic gestures presented alone); Speech-Only (the primes were auditory tokens conveying the same meaning as the iconic gestures); Gestures-Accompanying-Speech (the primes were the simultaneous coupling of iconic gestures and their corresponding auditory tokens). Our findings revealed significant priming effects in all three conditions. However, the priming effect in the Gestures-Accompanying-Speech condition was comparable to that in the Speech-Only condition and was significantly weaker than that in the Gestures-Only condition, suggesting that the facilitatory effect of iconic gestures accompanying speech may be constrained by the level of language processing required in the lexical decision task, where linguistic processing of word forms is more dominant than semantic processing. Hence, the priming effect afforded by the co-speech iconic gestures was weakened.
doi:10.3389/fpsyg.2013.00779
PMCID: PMC3800814  PMID: 24155738
co-speech gestures; cross-modal priming; lexical decision; language processing
21.  Peculiarities in the gestural repertoire: An early marker for Rett syndrome? 
Highlights
► The emergence of first gestures in girls with RTT is not necessarily delayed.
► The repertoire of communicative gestures, however, is restricted.
► Although girls with RTT have difficulties in their verbal communicative domain, gestures do not constitute a compensatory mechanism.
► A limited repertoire of gestures and qualitative peculiarities in other speech-language domains might be characteristic for a severe neurodevelopmental disorder like RTT.
We studied the gestures used by children with classic Rett syndrome (RTT) to provide evidence as to how this essential aspect of communicative functions develops. Seven participants with RTT were longitudinally observed between 9 and 18 months of life. The gestures used by these participants were transcribed and coded from a retrospective analysis of video footage. Gestures were classified as deictic gestures, play schemes, and representational gestures. Results of the analysis showed that the majority of gestures observed were of deictic character. There were no gestures that could be classified as play schemes and only two (head nodding and waving bye-bye) that were coded as representational or symbolic gestures. The overall repertoire of gestures, even though not necessarily delayed in its onset, was characterized by little variability and a restricted pragmatic functionality. We conclude that the gestural abilities of girls with RTT appear to remain limited and do not constitute a compensatory mechanism for the verbal language modality.
doi:10.1016/j.ridd.2012.05.014
PMCID: PMC3445810  PMID: 22699245
Communication; Gesture; Interaction; Language; Language impairment; Pointing; Rett; Speech; Video analysis
22.  Multimodality in infancy: vocal-motor and speech-gesture coordinations in typical and atypical development 
From very early in life, expressive behavior is multimodal, with early behavioral coordinations being refined and strengthened over time as they become used for the communication of meaning. Of these communicative coordinations, those that involve gesture and speech have received perhaps the greatest empirical attention, but little is known about the developmental origins of the gesture-speech link. One possibility is that the origins of speech-gesture coordinations lie in hand-mouth linkages that are observed in the everyday sensorimotor activity of very young infants who do not yet use the hand or mouth to communicate meaning. In this article, I review evidence suggesting that the study of gesture-speech links and developmentally prior couplings between the vocal and motor systems in infancy can provide valuable insight into a number of later developments that reflect the cognitive interdependence of gesture and speech. These include aspects of language development and delay, the infant origins of the adult speech-gesture system, and early signs of autism spectrum disorder. Implications of these findings for studying the development of multimodal communication are considered.
doi:10.4074/S0013754510003046
PMCID: PMC3074363  PMID: 21494413
gesture; language development; vocalization; motor development
23.  Playing Charades in the fMRI: Are Mirror and/or Mentalizing Areas Involved in Gestural Communication? 
PLoS ONE  2009;4(8):e6801.
Communication is an important aspect of human life, allowing us to powerfully coordinate our behaviour with that of others. Boiled down to its mere essentials, communication entails transferring a mental content from one brain to another. Spoken language obviously plays an important role in communication between human individuals. Manual gestures, however, often aid the semantic interpretation of the spoken message, and gestures may have played a central role in the earlier evolution of communication. Here we used the social game of charades to investigate the neural basis of gestural communication by having participants produce and interpret meaningful gestures while their brain activity was measured using functional magnetic resonance imaging. While participants decoded observed gestures, the putative mirror neuron system (pMNS: premotor, parietal and posterior mid-temporal cortex), associated with motor simulation, and the temporo-parietal junction (TPJ), associated with mentalizing and agency attribution, were significantly recruited. Of these areas, only the pMNS was recruited during the production of gestures. This suggests that gestural communication relies on a combination of simulation and, during decoding, mentalizing/agency attribution brain areas. Comparing the decoding of gestures with a condition in which participants viewed the same gestures with an instruction not to interpret them showed that although parts of the pMNS responded more strongly during active decoding, most of the pMNS and the TPJ did not show such significant task effects. This suggests that the mere observation of gestures recruits most of the system involved in voluntary interpretation.
doi:10.1371/journal.pone.0006801
PMCID: PMC2728843  PMID: 19710923
24.  Gestures make memories, but what kind? Patients with impaired procedural memory display disruptions in gesture production and comprehension 
Hand gesture, a ubiquitous feature of human interaction, facilitates communication. Gesture also facilitates new learning, benefiting speakers and listeners alike. Thus, gestures must impact cognition beyond simply supporting the expression of already-formed ideas. However, the cognitive and neural mechanisms supporting the effects of gesture on learning and memory are largely unknown. We hypothesized that gesture's ability to drive new learning is supported by procedural memory and that procedural memory deficits will disrupt gesture production and comprehension. We tested this proposal in patients with intact declarative memory, but impaired procedural memory as a consequence of Parkinson's disease (PD), and healthy comparison participants with intact declarative and procedural memory. In separate experiments, we manipulated the gestures participants saw and produced in a Tower of Hanoi (TOH) paradigm. In the first experiment, participants solved the task either on a physical board, requiring high arching movements to manipulate the discs from peg to peg, or on a computer, requiring only flat, sideways movements of the mouse. When explaining the task, healthy participants with intact procedural memory displayed evidence of their previous experience in their gestures, producing higher, more arching hand gestures after solving on a physical board, and smaller, flatter gestures after solving on a computer. In the second experiment, healthy participants who saw high arching hand gestures in an explanation prior to solving the task subsequently moved the mouse with significantly higher curvature than those who saw smaller, flatter gestures prior to solving the task. These patterns were absent in both gesture production and comprehension experiments in patients with procedural memory impairment. These findings suggest that the procedural memory system supports the ability of gesture to drive new learning.
doi:10.3389/fnhum.2014.01054
PMCID: PMC4292316  PMID: 25628556
hand gesture; procedural memory; declarative memory; Parkinson's disease; communication; learning; memory systems
25.  Multisensory Temporal Integration in Autism Spectrum Disorders 
The Journal of Neuroscience  2014;34(3):691-697.
The new DSM-5 diagnostic criteria for autism spectrum disorders (ASDs) include sensory disturbances in addition to the well-established language, communication, and social deficits. One sensory disturbance seen in ASD is an impaired ability to integrate multisensory information into a unified percept. This may arise from an underlying impairment in which individuals with ASD have difficulty perceiving the temporal relationship between cross-modal inputs, an important cue for multisensory integration. Such impairments in multisensory processing may cascade into higher-level deficits, impairing day-to-day functioning on tasks such as speech perception. To investigate multisensory temporal processing deficits in ASD and their links to speech processing, the current study mapped performance on a number of multisensory temporal tasks (with both simple and complex stimuli) onto the ability of individuals with ASD to perceptually bind audiovisual speech signals. High-functioning children with ASD were compared with a group of typically developing children. Performance on the multisensory temporal tasks varied with stimulus complexity for both groups; less precise temporal processing was observed with increasing stimulus complexity. Notably, individuals with ASD showed a speech-specific deficit in multisensory temporal processing. Most importantly, the strength of perceptual binding of audiovisual speech observed in individuals with ASD was strongly related to their low-level multisensory temporal processing abilities. Collectively, these results are the first to illustrate links between multisensory temporal function and speech processing in ASD, strongly suggesting that deficits in low-level sensory processing may cascade into higher-order domains, such as language and communication.
doi:10.1523/JNEUROSCI.3615-13.2014
PMCID: PMC3891950  PMID: 24431427
audiovisual; autism spectrum disorders; multisensory integration; sensory processing; speech perception; temporal processing
