Vision automatically exerts online and offline influences on bimanual tactile spatial perception
Introduction
We live in chaotic multisensory environments, and the nervous system combines information across multiple sensory cues to support perception (Fetsch et al., 2013, Stein and Stanford, 2008, Yau et al., 2015). Interactions between sensory signals can result in more reliable sensory estimates (Ernst and Bülthoff, 2004, Green and Angelaki, 2010), more accurate perceptual decisions (Odegaard et al., 2015), and faster behavioral responses (Hecht et al., 2008, Otto et al., 2013). Commonly, the nervous system combines sensory signals that convey redundant or correlated information. Importantly, interactions between multisensory cues not only modulate immediate behavioral responses; they can also induce persistent behavioral changes (Ernst, 2007, Navarra et al., 2007, Senna et al., 2014, Shams et al., 2011, Zilber et al., 2014). Accordingly, there has been longstanding interest in characterizing multisensory interactions across perceptual domains and in relating these multisensory effects to behavioral adaptation and learning (Shams & Seitz, 2008).
For spatial perception, we rely extensively on vision and touch, and these modalities exhibit robust interactions in the perception of space (Farne et al., 2003, Ladavas et al., 1998, Maravita et al., 2000, Ro et al., 2004), size (Ernst & Banks, 2002), shape (Bisiach et al., 2004, Hadjikhani and Roland, 1998, Helbig et al., 2012, Streri and Molina, 1993), and motion (Bensmaia et al., 2006, Konkle et al., 2009). At the level of feature-specific processing, visuo-tactile interactions have been attributed to analogous coding mechanisms (Maunsell et al., 1991, Yau et al., 2016, Yau et al., 2009, Zhou and Fuster, 2000) and shared representations (Amedi et al., 2002, Konkle et al., 2009, Lacey and Sathian, 2011, Mancini et al., 2011, Van Der Groen et al., 2013). Visuo-tactile processing, particularly involving simple sensory cues, has also been linked to spatial attention interactions (Driver and Spence, 1998, Spence et al., 2000a), and there is evidence that shared attentional resources may support both vision and touch as well as their multisensory engagement (Lakatos et al., 2009, Macaluso et al., 2002, Spence and Driver, 1996).
In addition to visual influences on touch experienced on a single hand, there is also extensive evidence that vision modulates tactile perception in bimanual contexts (Heed and Azañón, 2014, Soto-Faraco et al., 2004, Spence, Pavani, Maravita et al., 2004). Understanding bimanual touch is critical because many of our routine behaviors involve sensorimotor coordination between the hands (Swinnen & Wenderoth, 2004), and tactile cues experienced on one hand can influence tactile perception on the other (Craig and Qian, 1997, Kuroki et al., 2017, Rahman and Yau, 2019, Sherrick, 1964, Tamè et al., 2011, Verrillo et al., 1983). Studies exploiting the visuo-tactile crossmodal congruency effect (Shore et al., 2006, Spence, Pavani and Driver, 2004) have shown that ignored visual cues presented near the hands automatically exert strong influences on performance of a bimanual task requiring subjects to localize peri-threshold tactile cues that differ in elevation. While these perceptual effects reveal the attentional, spatial, and temporal constraints on visuo-tactile interactions in a bimanual context, it is unclear how these interactions can be understood according to signal detection theory. Conceivably, vision could influence bimanual touch by modulating either the sensitivity or the decision criterion parameters of the tactile signal detection processes. Indeed, visual distractors have been reported to increase tactile detection rates on a single hand through criterion reductions rather than sensitivity increases (Lloyd et al., 2008, Mirams et al., 2017). Furthermore, to the extent that visual or multisensory experience results in immediate changes in how tactile cues are subsequently perceived, it would also be important to relate these learning or adaptation effects to changes in sensory processing or decision making.
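Within signal detection theory, these two hypotheses are dissociable: a visual distractor could raise the hit rate either by increasing sensitivity (d′) or by lowering the decision criterion (c), and only a criterion reduction also inflates the false-alarm rate. As a minimal illustration under the standard equal-variance Gaussian model (the function name and the example rates below are ours for illustration, not values from the study), both parameters can be estimated from hit and false-alarm rates:

```python
from statistics import NormalDist

def sdt_params(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Equal-variance Gaussian SDT: return (d_prime, criterion)."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF (z-transform)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# A pure criterion shift raises hits AND false alarms together;
# a sensitivity gain raises hits without raising false alarms.
d_base, c_base = sdt_params(hit_rate=0.70, fa_rate=0.20)
d_shift, c_shift = sdt_params(hit_rate=0.80, fa_rate=0.32)
print(f"baseline: d'={d_base:.2f}, c={c_base:.2f}")
print(f"shifted:  d'={d_shift:.2f}, c={c_shift:.2f}")
```

In this invented example, d′ barely changes while c drops, the signature of a criterion reduction of the kind reported by Lloyd et al. (2008).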
Here, we sought to understand how online and offline visuo-tactile interactions in a bimanual context can be understood according to signal detection theory. In psychophysical experiments, we characterized the effects of brief light flashes (distractors) on the performance of a simple bimanual spatial task that required healthy human subjects to detect and localize faint taps delivered to one hand or to both hands simultaneously. By pairing bright distractors with one hand and dim distractors with the other, we established the dependence of the visuo-tactile interactions on distractor brightness. Although we explicitly instructed participants to ignore the visual distractors, we hypothesized that these non-informative visual stimuli would nevertheless exert spatially-specific influences on tactile spatial perception. We measured tactile localization behavior prior to exposing participants to multisensory experiences, and we compared performance during this baseline block to performance during subsequent visuo-tactile blocks that comprised both visuo-tactile and tactile-only trials. This design allowed us to quantify the influence of visual distractors on the detection and localization of simultaneously experienced taps (online effects) as well as changes in performance that occurred even in the absence of visual distractors (offline effects). Using a modeling framework that assumed separate signal detection processes for the two hands, we evaluated how visual distractor effects related to changes in either sensitivity or decision criterion. By dissociating online and offline effects, we compared how each related to the signal detection parameters and evaluated how they interacted.
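Concretely, with separate detection parameters per hand, the online effect can be quantified as the change on visuo-tactile trials relative to baseline, and the offline effect as the change on tactile-only trials within the visuo-tactile blocks relative to baseline. The following sketch illustrates this logic for the criterion parameter of one hand; the block labels and hit/false-alarm rates are hypothetical, not data from the study:

```python
from statistics import NormalDist

def criterion(hit_rate: float, fa_rate: float) -> float:
    """Decision criterion c under the equal-variance Gaussian SDT model."""
    z = NormalDist().inv_cdf
    return -0.5 * (z(hit_rate) + z(fa_rate))

# Hypothetical per-block (hit rate, false-alarm rate) for one hand.
rates = {
    "baseline":        (0.70, 0.20),  # tactile-only block, before any distractors
    "VT_visuotactile": (0.82, 0.35),  # taps paired with a visual distractor
    "VT_tactile_only": (0.76, 0.28),  # no distractor, but within VT blocks
}

c = {block: criterion(h, f) for block, (h, f) in rates.items()}
online_effect = c["VT_visuotactile"] - c["baseline"]   # distractor present
offline_effect = c["VT_tactile_only"] - c["baseline"]  # distractor absent
print(f"online dc = {online_effect:+.2f}, offline dc = {offline_effect:+.2f}")
```

With these invented rates both effects are criterion reductions (negative shifts), with the online effect larger than the offline one; the same comparison can be run on d′ to test the sensitivity account instead.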
Participants
Sixteen healthy individuals (10 females; mean ± SD age: 23 ± 4.5 years; range: 18–32 years) participated in the experiment. All participants were right-handed according to the Edinburgh Handedness Inventory (Oldfield, 1971). All participants reported normal tactile sensitivity and normal or corrected-to-normal vision. No participant reported a neurological or psychiatric history. All testing procedures were conducted in compliance with the policies and procedures of the Baylor College of Medicine
Tactile detection and localization in the absence of visual distractors
Participants performed a 4AFC tactile detection task (LBRN task) in which they reported the perception of peri-threshold taps delivered to the hand associated with the bright LED, the hand associated with the dim LED, both hands, or no touch in the absence of visual cues (Fig. 2A). Across all conditions, group-averaged accuracy was nearly 80% (mean ± SEM; touch on the hand associated with the bright LED: 0.77 ± 0.02; touch on the hand associated with the dim LED: 0.79 ± 0.02; touch on both
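Group-averaged accuracies of this form are computed per condition from the per-subject proportions correct. A brief sketch with invented per-subject values (not the study's data) shows the mean ± SEM convention used above:

```python
import statistics

# Hypothetical per-subject proportions correct for two of the four
# response conditions in the 4AFC task; values are illustrative only.
acc = {
    "bright_hand": [0.74, 0.80, 0.77, 0.75, 0.79, 0.78],
    "dim_hand":    [0.81, 0.77, 0.79, 0.80, 0.78, 0.79],
}

for condition, values in acc.items():
    mean = statistics.mean(values)
    sem = statistics.stdev(values) / len(values) ** 0.5  # SEM = SD / sqrt(n)
    print(f"{condition}: {mean:.2f} +/- {sem:.2f} (mean +/- SEM)")
```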
Discussion
We set out to characterize the influence of non-informative visual cues on the detection of peri-threshold taps on the two hands. We found that detection performance on the left and right hands was unbiased during the baseline block, prior to exposure to visual distractors and multisensory trials. During the visuo-tactile blocks, tactile performance could be strongly influenced by the visual distractors. On trials comprising unilateral LED illumination, responses were significantly elevated on
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgments
This work was supported in part by funding from the Sloan Foundation, BCM seed funds, and NSF grant 2019959. We thank Yau Lab members for helpful discussions. We are especially grateful to Lucy Lai for initial guidance on the modeling work. This work was performed in the Neuromodulation and Behavioral Testing Facility of BCM’s Core for Advanced MRI (CAMRI). Data available at https://github.com/YauLab/VTLRBN2020.
References (86)
- et al. (2008). Changing reference frames during the encoding of tactile events. Current Biology.
- et al. (2010). The posterior parietal cortex remaps touch into external space. Current Biology.
- et al. (2015). Flexibly weighted integration of tactile reference frames. Neuropsychologia.
- et al. (2004). Visual and tactile length matching in spatial neglect. Cortex.
- et al. (2007). Proprioceptive alignment of visual and somatosensory maps in the posterior parietal cortex. Current Biology.
- et al. (1998). Attention and the crossmodal construction of space. Trends in Cognitive Sciences.
- et al. (2004). Merging the senses into a robust percept. Trends in Cognitive Sciences.
- et al. (2003). Beyond the window: Multisensory representation of peripersonal space across a transparent barrier. International Journal of Psychophysiology.
- et al. (2010). Multisensory integration: Resolving sensory ambiguities to build novel representations. Current Opinion in Neurobiology.
- et al. (2015). Tactile remapping: From coordinate transformation to integration in sensorimotor processing. Trends in Cognitive Sciences.