The multisensory perception of co-speech gestures: a review and meta-analysis of neuroimaging studies

Marstaller, Lars and Burianová, Hana (2014) The multisensory perception of co-speech gestures: a review and meta-analysis of neuroimaging studies. Journal of Neurolinguistics, 30(1): 69-77. doi:10.1016/j.jneuroling.2014.04.003

Author Marstaller, Lars
Burianová, Hana
Title The multisensory perception of co-speech gestures: a review and meta-analysis of neuroimaging studies
Journal name Journal of Neurolinguistics
ISSN 0911-6044
Publication date 2014-07
Sub-type Article (original research)
DOI 10.1016/j.jneuroling.2014.04.003
Volume 30
Issue 1
Start page 69
End page 77
Total pages 9
Place of publication Kidlington, Oxford, United Kingdom
Publisher Pergamon
Collection year 2015
Language eng
Formatted abstract
• We investigate neural correlates of co-speech gesture perception.
• We focus on the neural overlap between different gesture types.
• We find two core systems for multisensory perception and action understanding.

Co-speech gestures constitute a unique form of multimodal communication because the hand movements are temporally synchronized and semantically integrated with speech. Recent neuroimaging studies indicate that the perception of co-speech gestures might engage a core set of frontal, temporal, and parietal areas. However, no study has compared the neural processes during perception of different types of co-speech gestures, such as beat, deictic, iconic, and metaphoric co-speech gestures. The purpose of this study was to review the existing literature on the neural correlates of co-speech gesture perception and to test whether different types of co-speech gestures elicit a common pattern of brain activity in the listener. To this end, we conducted a meta-analysis of neuroimaging studies that used different types of co-speech gestures to investigate the perception of multimodal stimuli (co-speech gestures) in contrast to unimodal stimuli (speech or gestures alone). The results show that co-speech gesture perception consistently engages temporal regions related to auditory and movement perception as well as frontal-parietal regions associated with action understanding. The results of this study suggest that brain regions involved in multisensory processing and action understanding constitute the general core of co-speech gesture perception.
Keyword Co-speech gestures
Multisensory perception
Action understanding
Q-Index Code C1
Q-Index Status Confirmed Code
Institutional Status UQ

Document type: Journal Article
Sub-type: Article (original research)
Collections: Official 2015 Collection
Centre for Advanced Imaging Publications
Citation counts: Cited 3 times in Thomson Reuters Web of Science
Cited 3 times in Scopus
Created: Mon, 12 May 2014, 00:58:01 EST by Lars Marstaller on behalf of Centre for Advanced Imaging