Vision automatically exerts online and offline influences on bimanual tactile spatial perception

https://doi.org/10.1016/j.jmp.2020.102480

Highlights

  • Tactile detection and localization performance differs for uni- and bimanual touch.

  • Tactile performance is biased by concurrently experienced visual distractors.

  • Performance is biased toward the brighter distractor in both online and offline effects.

  • A signal detection model can account for online and offline visual influences.

Abstract

Vision and touch interact in spatial perception. How vision exerts online influences on tactile spatial perception is well-appreciated, but far less is known regarding how recent visual experiences modulate tactile perception offline, particularly in a bimanual context. Here, we investigated how visual cues exert both online and offline biases in bimanual tactile spatial perception. In a series of experiments, participants performed a 4-alternative forced-choice tactile detection task in which they reported the perception of peri-threshold taps on the left hand, right hand, both hands, or no touch (LRBN). Participants initially performed the LRBN task in the absence of visual cues. Subsequently, participants performed the LRBN task in blocks comprising non-informative visual cues that were presented on the left and right hands. To explore the effect of distractor salience on the visuo-tactile spatial interactions, we varied the brightness of the visual cues such that visual stimuli associated with one hand were consistently brighter than visual stimuli associated with the other hand. We found that participants performed the tactile detection task in an unbiased manner prior to experiencing visual distractors. Concurrent visual cues biased tactile performance, despite an instruction to ignore vision, and these online effects tended to be larger with brighter distractors. Moreover, tactile performance was biased toward the side of the brighter visual cues even on trials when no visual cues were presented during the visuo-tactile block. Using a modeling framework based on signal detection theory, we compared a number of alternative models to recapitulate the behavioral results and to link the visual influences on touch to sensitivity and criterion reductions. Our collective results imply that recent visual experiences alter the sensitivity of tactile signal detection processes while concurrent visual cues induce more liberal perceptual decisions in the context of bimanual touch.

Introduction

We live in chaotic multisensory environments, and the nervous system combines information from multiple sensory cues to support perception (Fetsch et al., 2013, Stein and Stanford, 2008, Yau et al., 2015). Interactions between sensory signals can result in more reliable sensory estimates (Ernst and Bulthoff, 2004, Green and Angelaki, 2010), more accurate perceptual decisions (Odegaard et al., 2015), and faster behavioral responses (Hecht et al., 2008, Otto et al., 2013). Commonly, the nervous system combines sensory signals that convey redundant or correlated information. Importantly, interactions between multisensory cues not only modulate immediate behavioral responses, but they can also induce persistent behavioral changes (Ernst, 2007, Navarra et al., 2007, Senna et al., 2014, Shams et al., 2011, Zilber et al., 2014). Accordingly, there has been longstanding interest in characterizing multisensory interactions in different perceptual domains and relating these multisensory effects to behavioral adaptation and learning (Shams & Seitz, 2008).

For spatial perception, we rely extensively on vision and touch, and these modalities exhibit robust interactions in the perception of space (Farne et al., 2003, Ladavas et al., 1998, Maravita et al., 2000, Ro et al., 2004), size (Ernst & Banks, 2002), shape (Bisiach et al., 2004, Hadjikhani and Roland, 1998, Helbig et al., 2012, Streri and Molina, 1993), and motion (Bensmaia et al., 2006, Konkle et al., 2009). With feature-specific processing, visuo-tactile interactions have been related to analogous coding mechanisms (Maunsell et al., 1991, Yau et al., 2016, Yau et al., 2009, Zhou and Fuster, 2000) and shared representations (Amedi et al., 2002, Konkle et al., 2009, Lacey and Sathian, 2011, Mancini et al., 2011, Van Der Groen et al., 2013). Visuo-tactile processing, particularly involving simple sensory cues, has also been linked to spatial attention interactions (Driver and Spence, 1998, Spence et al., 2000a) and there is evidence that shared attention processing resources may support both vision and touch as well as their multisensory engagement (Lakatos et al., 2009, Macaluso et al., 2002, Spence and Driver, 1996).

In addition to visual influences on touch experienced on a single hand, there is also extensive evidence that vision modulates tactile perception in bimanual contexts (Heed and Azañón, 2014, Soto-Faraco et al., 2004, Spence, Pavani, Maravita et al., 2004). Understanding bimanual touch is critical as many of our routine behaviors involve sensorimotor coordination between the hands (Swinnen & Wenderoth, 2004) and tactile cues experienced on one hand can influence tactile perception on the other (Craig and Qian, 1997, Kuroki et al., 2017, Rahman and Yau, 2019, Sherrick, 1964, Tamè et al., 2011, Verrillo et al., 1983). Studies exploiting the visuo-tactile crossmodal congruency effect (Shore et al., 2006, Spence, Pavani and Driver, 2004) have shown that ignored visual cues presented near the hands automatically exert strong influences on the performance of a bimanual task requiring subjects to localize peri-threshold tactile cues that differ in elevation. While these perceptual effects reveal the attentional, spatial, and temporal constraints for visuo-tactile interactions in a bimanual context, it is unclear how these interactions can be understood according to signal detection theory. Conceivably, vision could influence bimanual touch by modulating either the sensitivity or decision criterion parameters in the tactile signal detection processes. Indeed, visual distractors have been reported to increase the tactile detection rates on a single hand through criterion reductions rather than sensitivity increases (Lloyd et al., 2008, Mirams et al., 2017). Furthermore, to the extent that visual or multisensory experience results in immediate changes in how tactile cues are subsequently perceived, it would also be important to relate these learning or adaptation effects to changes in sensory processing or decision making.
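
The distinction between criterion and sensitivity accounts can be made concrete with an equal-variance Gaussian signal detection calculation. The Python sketch below uses hypothetical hit and false-alarm rates (not data from the cited studies) to show why the two accounts are separable: a criterion reduction raises hits and false alarms together, whereas a sensitivity increase raises hits without a corresponding rise in false alarms.

    import numpy as np
    from scipy.stats import norm

    def dprime_criterion(hit_rate, fa_rate):
        # Equal-variance Gaussian SDT: recover sensitivity (d') and criterion (c)
        # from hit and false-alarm rates.
        z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
        d_prime = z_hit - z_fa             # separation of signal and noise distributions
        criterion = -0.5 * (z_hit + z_fa)  # 0 = unbiased; negative = liberal
        return d_prime, criterion

    # Hypothetical rates: the same rise in hits (0.60 -> 0.75) is consistent with
    # either a more liberal criterion (false alarms rise too) ...
    print(dprime_criterion(0.60, 0.10))  # baseline
    print(dprime_criterion(0.75, 0.22))  # criterion reduction, similar d'
    # ... or a genuine sensitivity gain (false alarms held constant).
    print(dprime_criterion(0.75, 0.10))  # larger d'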

Here, we sought to understand online and offline visuo-tactile interactions in a bimanual context according to signal detection theory. In psychophysical experiments, we characterized the effects of brief light flashes (distractors) on the performance of a simple bimanual spatial task that required healthy human subjects to detect and localize faint taps that were delivered to one hand or both hands simultaneously. By pairing bright distractors with one hand and dim distractors with the other hand, we established the dependence of the visuo-tactile interactions on distractor brightness. Although we explicitly instructed participants to ignore the visual distractors, we hypothesized that the non-informative visual stimuli would nevertheless exert spatially-specific influences on tactile spatial perception. We measured tactile localization behavior prior to exposing participants to multisensory experiences, and we compared performance during this baseline block to performance during subsequent visuo-tactile blocks that comprised both visuo-tactile and tactile-only trials. This design allowed us to quantify the influence of visual distractors on the detection and localization of simultaneously experienced taps (online effects) as well as changes in performance that occurred even in the absence of visual distractors (offline effects). Using a modeling framework that assumed separate signal detection processes for the two hands, we evaluated how visual distractor effects related to changes in either sensitivity or decision criterion. By dissociating online and offline effects, we compared how each was separately related to the signal detection parameters and evaluated how they interacted.
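
To make this framework concrete, the sketch below implements one simple instantiation in Python: independent equal-variance Gaussian detection processes for the left and right hands, each with its own sensitivity (d') and criterion, whose joint decisions map onto the four LRBN response categories. The parameter values, and the choice to illustrate an online visual bias as a more liberal criterion on one hand, are assumptions for illustration only and do not correspond to the fitted models reported here.

    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_lrbn(n_trials, d_left, d_right, c_left, c_right,
                      touch_left, touch_right):
        # Independent signal detection processes for the two hands: each hand
        # draws an internal response from a unit-variance Gaussian centered on
        # d' when touched (0 otherwise) and reports "touch" whenever the sample
        # exceeds that hand's criterion.
        x_left = rng.normal(d_left if touch_left else 0.0, 1.0, n_trials)
        x_right = rng.normal(d_right if touch_right else 0.0, 1.0, n_trials)
        rep_left = x_left > c_left
        rep_right = x_right > c_right
        # Combine the two binary decisions into the four LRBN categories.
        labels = np.full(n_trials, "N")
        labels[rep_left & ~rep_right] = "L"
        labels[~rep_left & rep_right] = "R"
        labels[rep_left & rep_right] = "B"
        return labels

    # Illustrative online bias toward one hand, modeled as a lower (more liberal)
    # criterion for that hand on bilateral-touch trials.
    resp = simulate_lrbn(10_000, d_left=1.5, d_right=1.5,
                         c_left=0.75, c_right=0.25,
                         touch_left=True, touch_right=True)
    print({k: float(np.mean(resp == k)) for k in "LRBN"})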

Section snippets

Participants

Sixteen healthy individuals (10 females; mean age ± SD: 23 ± 4.5 years; range: 18–32 years) participated in the experiment. All participants were right-handed according to the Edinburgh Handedness Inventory (Oldfield, 1971). All participants reported normal tactile sensitivity and normal or corrected-to-normal vision. No participant reported a neurological or psychiatric history. All testing procedures were conducted in compliance with the policies and procedures of the Baylor College of Medicine

Tactile detection and localization in the absence of visual distractors

Participants performed a 4AFC tactile detection task (LRBN task) in which they reported the perception of peri-threshold taps delivered to the hand associated with the bright LED, the hand associated with the dim LED, both hands, or no touch in the absence of visual cues (Fig. 2A). Across all conditions, group-averaged accuracy was nearly 80% (mean ± SEM; touch on the hand associated with the bright LED: 0.77 ± 0.02; touch on the hand associated with the dim LED: 0.79 ± 0.02; touch on both
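
For reference, the group statistics quoted here are means ± SEM computed across participants. A minimal sketch of that computation, using hypothetical per-participant accuracies rather than the reported data, is:

    import numpy as np

    # Hypothetical per-participant accuracies for one condition (n = 16).
    acc = np.array([0.74, 0.81, 0.77, 0.79, 0.72, 0.83, 0.76, 0.80,
                    0.78, 0.75, 0.82, 0.79, 0.73, 0.77, 0.81, 0.76])
    mean = acc.mean()
    sem = acc.std(ddof=1) / np.sqrt(acc.size)  # SEM uses the sample standard deviation
    print(f"{mean:.2f} ± {sem:.2f}")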

Discussion

We set out to characterize the influence of non-informative visual cues on the detection of peri-threshold taps on the two hands. We found that detection performance on the left and right hands was unbiased during the baseline block, prior to exposure to visual distractors and multisensory trials. During the visuo-tactile blocks, tactile performance could be strongly influenced by the visual distractors. On trials comprising unilateral LED illumination, responses were significantly elevated on

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Acknowledgments

This work was supported in part by funding from the Sloan Foundation, BCM seed funds, and NSF grant 2019959. We thank Yau Lab members for helpful discussions. We are especially grateful to Lucy Lai for initial guidance on the modeling work. This work was performed in the Neuromodulation and Behavioral Testing Facility of BCM’s Core for Advanced MRI (CAMRI). Data available at https://github.com/YauLab/VTLRBN2020.

References (86)

  • Helbig, H. B., et al. (2012). The neural mechanisms of reliability weighted integration of shape information from vision and touch. NeuroImage.

  • Konkle, T., et al. (2009). Motion aftereffects transfer between touch and vision. Current Biology.

  • Lacey, S., et al. (2011). Multisensory object representation: Insights from studies of vision and touch. Progress in Brain Research.

  • Lakatos, P., et al. (2009). The leading sense: Supramodal control of neurophysiological context by attention. Neuron.

  • Lloyd, D. M., et al. (2008). Development of a paradigm for measuring somatic disturbance in clinical populations with medically unexplained symptoms. Journal of Psychosomatic Research.

  • Macaluso, E., et al. (2005). Multisensory spatial interactions: A window onto functional integration in the human brain. Trends in Neuroscience.

  • Maravita, A., et al. (2004). Tools for the body (schema). Trends in Cognitive Sciences.

  • Navarra, J., et al. (2007). Adaptation to audiotactile asynchrony. Neuroscience Letters.

  • Oldfield, R. C. (1971). The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia.

  • Shams, L., et al. (2008). Benefits of multisensory learning. Trends in Cognitive Sciences.

  • Shore, D. I., et al. (2006). Temporal aspects of the visuotactile congruency effect. Neuroscience Letters.

  • Spence, C., et al. (2004). Multisensory contributions to the 3-D representation of visuotactile peripersonal space in humans: Evidence from the crossmodal congruency task. Journal of Physiology, Paris.

  • Swinnen, S. P., et al. (2004). Two hands, one brain: Cognitive neuroscience of bimanual skill. Trends in Cognitive Sciences.

  • Tamè, L., et al. (2011). Spatial coding of touch at the fingers: Insights from double simultaneous stimulation within and between hands. Neuroscience Letters.

  • Zilber, N., et al. (2014). Supramodal processing optimizes visual perceptual learning and plasticity. NeuroImage.

  • Acerbi, L., et al. (2017). Practical Bayesian optimization for model fitting with Bayesian adaptive direct search. Advances in Neural Information Processing Systems.

  • Amedi, A., et al. (2002). Convergence of visual and tactile shape processing in the human lateral occipital complex. Cerebral Cortex.

  • Bensmaia, S. J., et al. (2006). The influence of visual motion on tactile motion perception. Journal of Neurophysiology.

  • Braun, C., et al. (2005). The right hand knows what the left hand is feeling. Experimental Brain Research.

  • Brouwer, G. J., et al. (2015). Normalization in human somatosensory cortex. Journal of Neurophysiology.

  • Burnham, K. P., et al. (2004). Multimodel inference: Understanding AIC and BIC in model selection. Sociological Methods & Research.

  • Convento, S., et al. (2018). Selective attention gates the interactive crossmodal coupling between perceptual systems. Current Biology.

  • Craig, J. C., et al. (1997). Tactile pattern perception by two fingers: Temporal interference and response competition. Perception & Psychophysics.

  • Duhamel, J. R., et al. (1998). Ventral intraparietal area of the macaque: Congruent visual and somatic response properties. Journal of Neurophysiology.

  • Ernst, M. O. (2007). Learning to integrate arbitrary signals from vision and touch. 7,...

  • Ernst, M. O., et al. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature.

  • Farne, A., et al. Investigating multisensory spatial cognition through the phenomenon of extinction.

  • Fetsch, C. R., et al. (2013). Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nature Reviews Neuroscience.

  • Gepshtein, S., et al. (2005). The combination of vision and touch depends on spatial proximity. Journal of Visualization.

  • Graziano, M. S., et al. (1993). A bimodal map of space: Somatosensory receptive fields in the macaque putamen with corresponding visual receptive fields. Experimental Brain Research.

  • Green, D. M., et al. (1988). Signal detection theory and psychophysics.

  • Hadjikhani, N., et al. (1998). Cross-modal transfer of information between the tactile and the visual representations in the human brain: A positron emission tomographic study. The Journal of Neuroscience.

  • Hecht, D., et al. (2008). Enhancement of response times to bi- and tri-modal sensory stimuli during active movements. Experimental Brain Research.