Dataset Information

Aero-tactile integration in speech perception.

ABSTRACT: Visual information from a speaker's face can enhance or interfere with accurate auditory perception. This integration of information across auditory and visual streams has been observed in functional imaging studies, and has typically been attributed to the frequency and robustness with which perceivers jointly encounter event-specific information from these two modalities. Adding the tactile modality has long been considered a crucial next step in understanding multisensory integration. However, previous studies have found an influence of tactile input on speech perception only under limited circumstances, either where perceivers were aware of the task or where they had received training to establish a cross-modal mapping. Here we show that perceivers integrate naturalistic tactile information during auditory speech perception without previous training. Drawing on the observation that some speech sounds produce tiny bursts of aspiration (such as English 'p'), we applied slight, inaudible air puffs on participants' skin at one of two locations: the right hand or the neck. Syllables heard simultaneously with cutaneous air puffs were more likely to be heard as aspirated (for example, causing participants to mishear 'b' as 'p'). These results demonstrate that perceivers integrate event-relevant tactile information in auditory perception in much the same way as they do visual information.

PROVIDER: S-EPMC3662541 | BioStudies | 2009-01-01T00:00:00Z

REPOSITORIES: biostudies

Similar Datasets

2013-01-01 | S-EPMC3701087 | BioStudies
2019-01-01 | S-EPMC6634411 | BioStudies
2013-01-01 | S-EPMC3741276 | BioStudies
2016-01-01 | S-EPMC4980767 | BioStudies
2019-01-01 | S-EPMC6716985 | BioStudies
2016-01-01 | S-EPMC4744562 | BioStudies
2020-01-01 | S-EPMC6289876 | BioStudies
1000-01-01 | S-EPMC1317952 | BioStudies
2020-01-01 | S-EPMC7055540 | BioStudies
2018-01-01 | S-EPMC5828660 | BioStudies