Abstract
Daily life involves bodily contact with objects and people. While such physical contact occurs naturally, it can also pose a risk of bodily harm—for example, when objects are sharp or people have harmful intentions. It is therefore imperative to have a mechanism that predicts the consequences of bodily contact before it occurs, so as to guide our interactions appropriately. Evidence from a range of studies suggests a neurofunctional coupling between external visual or auditory stimuli near the body and tactile stimuli on the body. While these multisensory peripersonal representations have been linked to spatial attention, motor control, and social behaviour, a discussion of whether these functions share a common mechanism has been missing. Here we suggest that prediction is central to this multimodal coding: visual or auditory stimuli near the body predict the tactile consequences of bodily contact. This predictive mechanism is based on learned visuo-tactile associations and is modulated by higher-order visual contextual information.
Original language | English |
---|---|
Title of host publication | The world at our fingertips |
Subtitle of host publication | a multidisciplinary exploration of peripersonal space |
Editors | Frédérique de Vignemont, Andrea Serino, Hong Yu Wong, Alessandro Farnè |
Publisher | Oxford University Press |
Chapter | 5 |
Pages | 81-100 |
Number of pages | 19 |
ISBN (Print) | 978-0-19-885173-8 |
Publication status | Published - 2021 |
Keywords
- cross-modal
- somatosensory
- visual
- social
- predictive mechanism
- peripersonal reference frames