Addressing the Association Between Action Video Game Playing Experience and Visual Search in Naturalistic Multisensory Scenes Multisensory Research (IF 1.6) Pub Date : 2024-02-13 Mohammad Hamzeloo, Daria Kvasova, Salvador Soto-Faraco
Prior studies investigating the effects of routine action video game play have demonstrated improvements in a variety of cognitive processes, including improvements in attentional tasks. However, there is little evidence indicating that the cognitive benefits of playing action video games generalize from simplified unisensory stimuli to multisensory scenes — a fundamental characteristic of natural
-
Spatial Sensory References for Vestibular Self-Motion Perception Multisensory Research (IF 1.6) Pub Date : 2023-12-20 Silvia Zanchi, Luigi F. Cuturi, Giulio Sandini, Monica Gori, Elisa R. Ferrè
While navigating through the surroundings, we constantly rely on inertial vestibular signals for self-motion along with visual and acoustic spatial references from the environment. However, the interaction between inertial cues and environmental spatial references is not yet fully understood. Here we investigated whether vestibular self-motion sensitivity is influenced by sensory spatial references
-
Cross-Modal Contributions to Episodic Memory for Voices Multisensory Research (IF 1.6) Pub Date : 2023-12-20 Joshua R. Tatz, Zehra F. Peynircioğlu
Multisensory context often facilitates perception and memory. In fact, encoding items within a multisensory context can improve memory even on strictly unisensory tests (i.e., when the multisensory context is absent). Prior studies that have consistently found these multisensory facilitation effects have largely employed multisensory contexts in which the stimuli were meaningfully related to the items
-
Stationary Haptic Stimuli Do Not Produce Ocular Accommodation in Most Individuals Multisensory Research (IF 1.6) Pub Date : 2023-11-28 Lawrence R. Stark, Kim Shiraishi, Tyler Sommerfeld
This study aimed to determine the extent to which haptic stimuli can influence ocular accommodation, either alone or in combination with vision. Accommodation was measured objectively in 15 young adults as they read stationary targets containing Braille letters. These cards were presented at four distances in the range 20–50 cm. In the Touch condition, the participant read by touch with their dominant
-
Reflections on Cross-Modal Correspondences: Current Understanding and Issues for Future Research Multisensory Research (IF 1.6) Pub Date : 2023-11-10 Kosuke Motoki, Lawrence E. Marks, Carlos Velasco
The past two decades have seen an explosion of research on cross-modal correspondences. Broadly speaking, this term has been used to encompass associations between and among features, dimensions, or attributes across the senses. There has been an increasing interest in this topic amongst researchers from multiple fields (psychology, neuroscience, music, art, environmental design, etc.) and, importantly
-
Beyond the Eye: Multisensory Contributions to the Sensation of Illusory Self-Motion (Vection) Multisensory Research (IF 1.6) Pub Date : 2023-10-27 Bernhard E. Riecke, Brandy Murovec, Jennifer L. Campos, Behrang Keshavarz
Vection is typically defined as the embodied illusion of self-motion in the absence of real physical movement through space. Vection can occur in real-life situations (e.g., ‘train illusion’) and in virtual environments and simulators. The vast majority of vection research focuses on vection caused by visual stimulation. Even though visually induced vection is arguably the most compelling type of vection
-
Joint Contributions of Auditory, Proprioceptive and Visual Cues on Human Balance Multisensory Research (IF 1.6) Pub Date : 2023-10-27 Max Teaford, Zachary J. Mularczyk, Alannah Gernon, Shauntelle Cannon, Megan Kobel, Daniel M. Merfeld
One’s ability to maintain their center of mass within their base of support (i.e., balance) is believed to be the result of multisensory integration. Much of the research in this literature has focused on integration of visual, vestibular, and proprioceptive cues. However, several recent studies have found evidence that auditory cues can impact balance control metrics. In the present study, we sought
-
Investigating the Role of Leading Sensory Modality and Autistic Traits in the Visual–Tactile Temporal Binding Window Multisensory Research (IF 1.6) Pub Date : 2023-10-18 Michelle K. Huntley, An Nguyen, Matthew A. Albrecht, Welber Marinovic
Our ability to integrate multisensory information depends on processes occurring during the temporal binding window. There is limited research investigating the temporal binding window for visual–tactile integration and its relationship with autistic traits, sensory sensitivity, and unusual sensory experiences. We measured the temporal binding window for visual–tactile integration in 27 neurotypical
-
Motor Signals Mediate Stationarity Perception Multisensory Research (IF 1.6) Pub Date : 2023-10-13 Savannah Halow, James Liu, Eelke Folmer, Paul R. MacNeilage
Head movement relative to the stationary environment gives rise to congruent vestibular and visual optic-flow signals. The resulting perception of a stationary visual environment, referred to herein as stationarity perception, depends on mechanisms that compare visual and vestibular signals to evaluate their congruence. Here we investigate the functioning of these mechanisms and their dependence on
-
Subjective Audibility Modulates the Susceptibility to Sound-Induced Flash Illusion: Effect of Loudness and Auditory Masking Multisensory Research (IF 1.6) Pub Date : 2023-09-29 Yuki Ito, Hanaka Matsumoto, Kohta I. Kobayasi
When a brief flash is presented along with two brief sounds, the single flash is often perceived as two flashes. This phenomenon is called a sound-induced flash illusion, in which the auditory sense, with its relatively higher reliability in providing temporal information, modifies the visual perception. Decline of audibility due to hearing impairment is known to make subjects less susceptible to the
-
From the Outside in: ASMR Is Characterised by Reduced Interoceptive Accuracy but Higher Sensation Seeking Multisensory Research (IF 1.6) Pub Date : 2023-09-27 Giulia L. Poerio, Fatimah Osman, Jennifer Todd, Jasmeen Kaur, Lovell Jones, Flavia Cardini
Autonomous Sensory Meridian Response (ASMR) is a complex sensory-perceptual phenomenon characterised by relaxing and pleasurable scalp-tingling sensations. The ASMR trait is nonuniversal, is thought to have developmental origins, and has a prevalence rate of 20%. Previous theory and research suggest that trait ASMR may be underlined by atypical multisensory perception from both interoceptive and exteroceptive
-
Exploring Crossmodal Associations Between Sound and the Chemical Senses: A Systematic Review Including Interactive Visualizations Multisensory Research (IF 1.6) Pub Date : 2023-09-21 Brayan Rodríguez, Luis H. Reyes, Felipe Reinoso-Carvalho
This is the first systematic review that focuses on the influence of product-intrinsic and extrinsic sounds on the chemical senses involving both food and aroma stimuli. This review has a particular focus on all methodological details (stimuli, experimental design, dependent variables, and data analysis techniques) of 95 experiments, published in 83 publications from 2012 to 2023. 329 distinct crossmodal
-
The Audiovisual Mismatch Negativity in Predictive and Non-Predictive Speech Stimuli in Older Adults With and Without Hearing Loss Multisensory Research (IF 1.6) Pub Date : 2023-09-06 Melissa Randazzo, Paul J. Smith, Ryan Priefer, Deborah R. Senzer, Karen Froud
Adults with aging-related hearing loss (ARHL) experience adaptive neural changes to optimize their sensory experiences; for example, enhanced audiovisual (AV) and predictive processing during speech perception. The mismatch negativity (MMN) event-related potential is an index of central auditory processing; however, it has not been explored as an index of AV and predictive processing in adults with
-
Synergistic Combination of Visual Features in Vision–Taste Crossmodal Correspondences Multisensory Research (IF 1.6) Pub Date : 2023-08-14 Byron P. Lee, Charles Spence
There has been a rapid recent growth in academic attempts to summarise, understand, and predict the taste profile matching complex images that incorporate multiple visual design features. While there is now ample research to document the patterns of vision–taste correspondences involving individual visual features (such as colour and shape curvilinearity in isolation), little is known about the taste
-
Motion-Binding Property Contributes to Accurate Temporal-Order Perception in Audiovisual Synchrony Multisensory Research (IF 1.6) Pub Date : 2023-08-03 Jinhwan Kwon, Yoshihiro Miyake
Temporal perception in multisensory processing is important for an accurate and efficient understanding of the physical world. In general, it is executed in a dynamic environment in our daily lives. In particular, the motion-binding property is important for correctly identifying moving objects in the external environment. However, how this property affects multisensory temporal perception remains
-
Visuo-Tactile Congruence Leads to Stronger Illusion Than Visuo-Proprioceptive Congruence: a Quantitative and Qualitative Approach to Explore the Rubber Hand Illusion Multisensory Research (IF 1.6) Pub Date : 2023-06-20 Roxane L. Bartoletti, Ambre Denis-Noël, Séraphin Boulvert, Marie Lopez, Sylvane Faure, Xavier Corveleyn
The Rubber Hand Illusion (RHI) arises through multisensory congruence and informative cues from the most relevant sensory channels. Some studies have explored the RHI phenomenon on the fingers, but none of them modulated the congruence of visuo-tactile and visuo-proprioceptive information by changing the posture of the fingers. This study hypothesizes that RHI induction is possible despite a partial
-
What Makes the Detection of Movement Different Within the Autistic Traits Spectrum? Evidence From the Audiovisual Depth Paradigm Multisensory Research (IF 1.6) Pub Date : 2023-06-06 Rachel Poulain, Magali Batty, Céline Cappe
Atypical sensory processing is now considered a diagnostic feature of autism. Although multisensory integration (MSI) may have cascading effects on the development of higher-level skills such as socio-communicative functioning, there is a clear lack of understanding of how autistic individuals integrate multiple sensory inputs. Multisensory dynamic information is a more ecological construct than static
-
Does Task-Irrelevant Brightness Modulation Affect Auditory Contrast Processing? Exploring the Interplay Between Temporal Synchrony and Stimulus Salience Multisensory Research (IF 1.6) Pub Date : 2023-05-31 Hiu Mei Chow, Danielle Briggs, Vivian M. Ciaramitaro
Stimulus factors such as timing, spatial location, and stimulus effectiveness affect whether and how information across the senses is integrated. Extending recent work highlighting interactions between stimulus factors, here we investigated the influence of visual information on auditory processing, complementing previous studies on the influence of auditory information on visual processing. We hypothesized
-
Linking Auditory-Induced Bouncing and Auditory-Induced Illusory Crescents: an Individual-Differences Approach Multisensory Research (IF 1.6) Pub Date : 2023-05-16 Hauke S. Meyerhoff, Marlena J. Stegemann, Christian Frings
When two disks move toward each other, overlap, and then move apart, the visual system can resolve the ambiguity either as two disks streaming past each other or two disks bouncing off each other. Presenting a brief beep at the moment of overlap has been observed to increase the proportion of reported bouncing impressions (i.e., auditory-induced bouncing) as well as to reduce the perceived overlap
-
Colour–Touch Cross-Modal Correspondence and Its Impact on Single-Modal Judgement in Multimodal Perception Multisensory Research (IF 1.6) Pub Date : 2023-05-11 Tianyi Yuan, Pei-Luen Patrick Rau, Jingyu Zhao, Jian Zheng
This study explored the colour–touch cross-modal correspondence and its impact on colour–touch multisensory perception. Two laboratory experiments were conducted based on a pre-experiment. In the first experiment, participants chose the colour according to their tactile sense against the vibration generated by the smartphone simulator. A positive cross-modal correspondence was obtained between the
-
Audio–Visual Cross-Modal Correspondences of Perceived Urgency: Examination through a Speeded Discrimination Task Multisensory Research (IF 1.6) Pub Date : 2023-05-10 Kiichi Naka, Katsuya Yamauchi
When presenting information in vehicle cockpits, it is essential to convey an appropriate urgency to the drivers. Perceived urgency has been investigated over the years for each modality, particularly audition and vision. However, the interaction between the modalities of perceived urgency has rarely been examined. To expand the insight into the design application of information presentation, we investigated
-
Association Between Body Tilt and Egocentric Estimates Near Upright Multisensory Research (IF 1.6) Pub Date : 2023-04-07 Keisuke Tani, Shintaro Uehara, Satoshi Tanaka
The mechanisms underlying geocentric (orientations of an object or the body relative to ‘gravity’) and egocentric estimates (object orientation relative to the ‘body’) have each been examined; however, little is known regarding the association between these estimates, especially when the body is nearly upright. To address this, we conducted two psychophysical experiments. In Experiment 1, participants
-
Explaining Visual Shape–Taste Crossmodal Correspondences Multisensory Research (IF 1.6) Pub Date : 2023-03-20 Charles Spence
A growing body of experimental research now demonstrates that neurologically normal individuals associate different taste qualities with design features such as curvature, symmetry, orientation, texture and movement. The form of everything from the food itself through to the curvature of the plateware on which it happens to be served, and from glassware to typeface, not to mention the shapes of/on
-
Off-Vertical Body Orientation Delays the Perceived Onset of Visual Motion Multisensory Research (IF 1.6) Pub Date : 2023-03-08 William Chung, Michael Barnett-Cowan
The integration of vestibular, visual and body cues is a fundamental process in the perception of self-motion and is commonly experienced in an upright posture. However, when the body is tilted in an off-vertical orientation these signals are no longer aligned relative to the influence of gravity. In this study, the perceived timing of visual motion was examined in the presence of sensory conflict
-
Metacognition and Causal Inference in Audiovisual Speech Multisensory Research (IF 1.6) Pub Date : 2023-02-23 Faith Kimmet, Samantha Pedersen, Victoria Cardenas, Camila Rubiera, Grey Johnson, Addison Sans, Matthew Baldwin, Brian Odegaard
In multisensory environments, our brains perform causal inference to estimate which sources produce specific sensory signals. Decades of research have revealed the dynamics which underlie this process of causal inference for multisensory (audiovisual) signals, including how temporal, spatial, and semantic relationships between stimuli influence the brain’s decision about whether to integrate or segregate
-
Assessing the Effects of Exercise, Cognitive Demand, and Rest on Audiovisual Multisensory Processing in Older Adults: A Pilot Study Multisensory Research (IF 1.6) Pub Date : 2023-01-24 Aysha Basharat, Michael Barnett-Cowan
A single bout of aerobic exercise is related to positive changes in higher-order cognitive function among older adults; however, the impact of aerobic exercise on multisensory processing remains unclear. Here we assessed the effects of a single bout of aerobic exercise on commonly utilized tasks that measure audiovisual multisensory processing: response time (RT), simultaneity judgements (SJ), and
-
Neural Correlates of Audiovisual Speech Processing in Autistic and Non-Autistic Youth Multisensory Research (IF 1.6) Pub Date : 2023-01-19 Kacie Dunham, Alisa Zoltowski, Jacob I. Feldman, Samona Davis, Baxter Rogers, Michelle D. Failla, Mark T. Wallace, Carissa J. Cascio, Tiffany G. Woynaroski
Autistic youth demonstrate differences in processing multisensory information, particularly in temporal processing of multisensory speech. Extensive research has identified several key brain regions for multisensory speech processing in non-autistic adults, including the superior temporal sulcus (STS) and insula, but it is unclear to what extent these regions are involved in temporal processing of
-
Audio-Visual Interference During Motion Discrimination in Starlings Multisensory Research (IF 1.6) Pub Date : 2023-01-17 Gesa Feenders, Georg M. Klump
Motion discrimination is essential for animals to avoid collisions, to escape from predators, to catch prey or to communicate. Although most terrestrial vertebrates can benefit by combining concurrent stimuli from sound and vision to obtain a most salient percept of the moving object, there is little research on the mechanisms involved in such cross-modal motion discrimination. We used European starlings
-
Can We Train Multisensory Integration in Adults? A Systematic Review Multisensory Research (IF 1.6) Pub Date : 2023-01-13 Jessica O’Brien, Amy Mason, Jason Chan, Annalisa Setti
The ability to efficiently combine information from different senses is an important perceptual process that underpins much of our daily activities. This process, known as multisensory integration, varies from individual to individual, and is affected by the ageing process, with impaired processing associated with age-related conditions, including balance difficulties, mild cognitive impairment and
-
‘Tasting Imagination’: What Role Chemosensory Mental Imagery in Multisensory Flavour Perception? Multisensory Research (IF 1.6) Pub Date : 2022-12-30 Charles Spence
A number of perplexing phenomena in the area of olfactory/flavour perception may fruitfully be explained by the suggestion that chemosensory mental imagery can be triggered automatically by perceptual inputs. In particular, the disconnect between the seemingly limited ability of participants in chemosensory psychophysics studies to distinguish more than two or three odorants in mixtures and the rich
-
The Impact of Singing on Visual and Multisensory Speech Perception in Children on the Autism Spectrum Multisensory Research (IF 1.6) Pub Date : 2022-12-30 Jacob I. Feldman, Alexander Tu, Julie G. Conrad, Wayne Kuang, Pooja Santapuram, Tiffany G. Woynaroski
Autistic children show reduced multisensory integration of audiovisual speech stimuli in response to the McGurk illusion. Previously, it has been shown that adults can integrate sung McGurk tokens. These sung speech tokens offer more salient visual and auditory cues, in comparison to the spoken tokens, which may increase the identification and integration of visual speech cues in autistic children
-
Crossmodal Texture Perception Is Illumination-Dependent Multisensory Research (IF 1.6) Pub Date : 2022-12-28 Karina Kangur, Martin Giesel, Julie M. Harris, Constanze Hesse
Visually perceived roughness of 3D textures varies with illumination direction. Surfaces appear rougher when the illumination angle is lowered, resulting in a lack of roughness constancy. Here we aimed to investigate whether the visual system also relies on illumination-dependent features when judging roughness in a crossmodal matching task or whether it can access illumination-invariant surface features
-
Dynamic Weighting of Time-Varying Visual and Auditory Evidence During Multisensory Decision Making Multisensory Research (IF 1.6) Pub Date : 2022-12-01 Rosanne R. M. Tuip, Wessel van der Ham, Jeannette A. M. Lorteije, Filip Van Opstal
Perceptual decision-making in a dynamic environment requires two integration processes: integration of sensory evidence from multiple modalities to form a coherent representation of the environment, and integration of evidence across time to accurately make a decision. Only recently studies started to unravel how evidence from two modalities is accumulated across time to form a perceptual decision
-
Prior Exposure to Dynamic Visual Displays Reduces Vection Onset Latency Multisensory Research (IF 1.6) Pub Date : 2022-11-16 Jing Ni, Hiroyuki Ito, Masaki Ogawa, Shoji Sunaga, Stephen Palmisano
While compelling illusions of self-motion (vection) can be induced purely by visual motion, they are rarely experienced immediately. This vection onset latency is thought to represent the time required to resolve sensory conflicts between the stationary observer’s visual and nonvisual information about self-motion. In this study, we investigated whether manipulations designed to increase the weightings
-
Can the Perceived Timing of Multisensory Events Predict Cybersickness? Multisensory Research (IF 1.6) Pub Date : 2022-10-24 Ogai Sadiq, Michael Barnett-Cowan
Humans are constantly presented with rich sensory information that the central nervous system (CNS) must process to form a coherent perception of the self and its relation to its surroundings. While the CNS is efficient in processing multisensory information in natural environments, virtual reality (VR) poses challenges of temporal discrepancies that the CNS must solve. These temporal discrepancies
-
Relating Sound and Sight in Simulated Environments Multisensory Research (IF 1.6) Pub Date : 2022-09-08 Kevin Y. Tsang, Damien J. Mannion
The auditory signals at the ear can be affected by components arriving both directly from a sound source and indirectly via environmental reverberation. Previous studies have suggested that the perceptual separation of these contributions can be aided by expectations of likely reverberant qualities. Here, we investigated whether vision can provide information about the auditory properties of physical
-
Something in the Sway: Effects of the Shepard–Risset Glissando on Postural Activity and Vection Multisensory Research (IF 1.6) Pub Date : 2022-09-01 Rebecca A. Mursic, Stephen Palmisano
This study investigated claims of disrupted equilibrium when listening to the Shepard–Risset glissando (which creates an auditory illusion of perpetually ascending/descending pitch). During each trial, 23 participants stood quietly on a force plate for 90 s with their eyes either open or closed (30 s pre-sound, 30 s of sound and 30 s post-sound). Their centre of foot pressure (CoP) was continuously
-
Odor-Induced Taste Enhancement Is Specific to Naturally Occurring Temporal Order and the Respiration Phase Multisensory Research (IF 1.6) Pub Date : 2022-08-23 Shogo Amano, Takuji Narumi, Tatsu Kobayakawa, Masayoshi Kobayashi, Masahiko Tamura, Yuko Kusakabe, Yuji Wada
Interaction between odor and taste information creates flavor perception. There are many possible determinants of the interaction between odor and taste, one of which may be the somatic sensations associated with breathing. We assumed that a smell stimulus accompanied by inhaling or exhaling enhances taste intensity if the order is congruent with natural drinking. To present an olfactory stimulus from
-
Exploring Group Differences in the Crossmodal Correspondences Multisensory Research (IF 1.6) Pub Date : 2022-08-09 Charles Spence
There has been a rapid growth of interest amongst researchers in the cross-modal correspondences in recent years. In part, this has resulted from the emerging realization of the important role that the correspondences can sometimes play in multisensory integration. In turn, this has led to an interest in the nature of any differences between individuals, or rather, between groups of individuals, in
-
Size and Quality of Drawings Made by Adults Under Visual and Haptic Control Multisensory Research (IF 1.6) Pub Date : 2022-07-01 Magdalena Szubielska, Paweł Augustynowicz, Delphine Picard
The aim of this study was twofold. First, our objective was to test the influence of an object’s actual size (size rank) on the drawn size of the depicted object. We tested the canonical size effect (i.e., drawing objects that are larger in the physical world as larger) in four drawing conditions — two perceptual conditions (blindfolded or sighted) crossed with two materials (paper or special foil for producing
-
Crossmodal Correspondence between Music and Ambient Color Is Mediated by Emotion Multisensory Research (IF 1.6) Pub Date : 2022-06-08 Pia Hauck, Christoph von Castell, Heiko Hecht
The quality of a concert hall primarily depends on its acoustics. But does visual input also have an impact on musical enjoyment? Does the color of ambient lighting modulate the perceived music quality? And are certain colors perceived to fit better than others with a given music piece? To address these questions, we performed three within-subjects experiments. We carried out two pretests to select
-
Investigating the Crossmodal Influence of Odour on the Visual Perception of Facial Attractiveness and Age Multisensory Research (IF 1.6) Pub Date : 2022-05-31 Yi-Chuan Chen, Charles Spence
We report two experiments designed to investigate whether the presentation of a range of pleasant fragrances, containing both floral and fruity notes, would modulate people’s judgements of the facial attractiveness (Experiment 1) and age (Experiment 2) of a selection of typical female faces varying in age in the range 20–69 years. In Experiment 1, male participants rated the female faces as less attractive
-
Do Congruent Auditory Stimuli Facilitate Visual Search in Dynamic Environments? An Experimental Study Based on Multisensory Interaction Multisensory Research (IF 1.6) Pub Date : 2022-05-06 Xiaofang Sun, Pin-Hsuan Chen, Pei-Luen Patrick Rau
The purpose of this study was to investigate the cue congruency effect of auditory stimuli during visual search in dynamic environments. Twenty-eight participants were recruited to conduct a visual search experiment. The experiment applied auditory stimuli to understand whether they could facilitate visual search in different types of background. Additionally, target location and target orientation
-
Multisensory Perception and Learning: Linking Pedagogy, Psychophysics, and Human–Computer Interaction Multisensory Research (IF 1.6) Pub Date : 2022-04-19 Monica Gori, Sara Price, Fiona N. Newell, Nadia Berthouze, Gualtiero Volpe
In this review, we discuss how specific sensory channels can mediate the learning of properties of the environment. In recent years, schools have increasingly been using multisensory technology for teaching. However, it still needs to be sufficiently grounded in neuroscientific and pedagogical evidence. Researchers have recently renewed understanding around the role of communication between sensory
-
Evaluating the Effect of Semantic Congruency and Valence on Multisensory Integration Multisensory Research (IF 1.6) Pub Date : 2022-04-07 Elyse Letts, Aysha Basharat, Michael Barnett-Cowan
Previous studies have found that semantics, the higher-level meaning of stimuli, can impact multisensory integration; however, less is known about the effect of valence, an affective response to stimuli. This study investigated the effects of both semantic congruency and valence of non-speech audiovisual stimuli on multisensory integration via response time (RT) and temporal-order judgement (TOJ) tasks
-
Influence of Sensory Conflict on Perceived Timing of Passive Rotation in Virtual Reality Multisensory Research (IF 1.6) Pub Date : 2022-04-05 William Chung, Michael Barnett-Cowan
Integration of incoming sensory signals from multiple modalities is central in the determination of self-motion perception. With the emergence of consumer virtual reality (VR), it is becoming increasingly common to experience a mismatch in sensory feedback regarding motion when using immersive displays. In this study, we explored whether introducing various discrepancies between the vestibular and
-
Influence of Tactile Flow on Visual Heading Perception Multisensory Research (IF 1.6) Pub Date : 2022-03-09 Lisa Rosenblum, Elisa Grewe, Jan Churan, Frank Bremmer
The integration of information from different sensory modalities is crucial for successful navigation through an environment. Among others, self-motion induces distinct optic flow patterns on the retina, vestibular signals and tactile flow, which contribute to determine traveled distance (path integration) or movement direction (heading). While the processing of combined visual–vestibular information
-
Impacts of Rotation Axis and Frequency on Vestibular Perceptual Thresholds Multisensory Research (IF 1.6) Pub Date : 2022-01-05 Andrew R. Wagner, Megan J. Kobel, Daniel M. Merfeld
In an effort to characterize the factors influencing the perception of self-motion rotational cues, vestibular self-motion perceptual thresholds were measured in 14 subjects for rotations in the roll and pitch planes, as well as in the planes aligned with the anatomic orientation of the vertical semicircular canals (i.e., left anterior, right posterior; LARP, and right anterior, left posterior; RALP)
-
The Effects of Mandarin Chinese Lexical Tones in Sound–Shape and Sound–Size Correspondences Multisensory Research (IF 1.6) Pub Date : 2021-12-30 Yen-Han Chang, Mingxue Zhao, Yi-Chuan Chen, Pi-Chun Huang
Crossmodal correspondences refer to when specific domains of features in different sensory modalities are mapped. We investigated how vowels and lexical tones drive sound–shape (rounded or angular) and sound–size (large or small) mappings among native Mandarin Chinese speakers. We used three vowels (/i/, /u/, and /a/), and each vowel was articulated in four lexical tones. In the sound–shape matching
-
Crossmodal Correspondence Between Auditory Timbre and Visual Shape Multisensory Research (IF 1.6) Pub Date : 2021-12-30 Daniel Gurman, Colin R. McCormick, Raymond M. Klein
Crossmodal correspondences are defined as associations between crossmodal stimuli based on seemingly irrelevant stimulus features (i.e., bright shapes being associated with high-pitched sounds). There is a large body of research describing auditory crossmodal correspondences involving pitch and volume, but not so much involving auditory timbre, the character or quality of a sound. Adeli and colleagues
-
Reducing Cybersickness in 360-Degree Virtual Reality Multisensory Research (IF 1.6) Pub Date : 2021-12-16 Iqra Arshad, Paulo De Mello, Martin Ender, Jason D. McEwen, Elisa R. Ferré
Despite the technological advancements in Virtual Reality (VR), users are constantly combating feelings of nausea and disorientation, the so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real
-
Imagine Your Crossed Hands as Uncrossed: Visual Imagery Impacts the Crossed-Hands Deficit Multisensory Research (IF 1.6) Pub Date : 2021-10-22 Lisa Lorentz, Kaian Unwalla, David I. Shore
Successful interaction with our environment requires accurate tactile localization. Although we seem to localize tactile stimuli effortlessly, the processes underlying this ability are complex. This is evidenced by the crossed-hands deficit, in which tactile localization performance suffers when the hands are crossed. The deficit results from the conflict between an internal reference frame, based
-
Temporal Alignment but not Complexity of Audiovisual Stimuli Influences Crossmodal Duration Percepts Multisensory Research (IF 1.6) Pub Date : 2021-10-08 Alexandra N. Scurry, Daniela M. Lemus, Fang Jiang
Reliable duration perception is an integral aspect of daily life that impacts everyday perception, motor coordination, and subjective passage of time. The Scalar Expectancy Theory (SET) is a common model that explains how an internal pacemaker, gated by an external stimulus-driven switch, accumulates pulses during sensory events and compares these accumulated pulses to a reference memory duration for
-
Serial Dependence of Emotion Within and Between Stimulus Sensory Modalities Multisensory Research (IF 1.6) Pub Date : 2021-09-29 Erik Van der Burg, Alexander Toet, Anne-Marie Brouwer, Jan B. F. Van Erp
How we perceive the world is not solely determined by what we sense at a given moment in time, but also by what we processed recently. Here we investigated whether such serial dependencies for emotional stimuli transfer from one modality to another. Participants were presented with a random sequence of emotional sounds and images and instructed to rate the valence and arousal of each stimulus (Experiment
-
Developmental Changes in Gaze Behavior and the Effects of Auditory Emotion Word Priming in Emotional Face Categorization Multisensory Research (IF 1.6) Pub Date : 2021-09-16 Michael Vesker, Daniela Bahn, Christina Kauschke, Gudrun Schwarzer
Social interactions often require the simultaneous processing of emotions from facial expressions and speech. However, the development of the gaze behavior used for emotion recognition, and the effects of speech perception on the visual encoding of facial expressions, are less well understood. We therefore conducted a word-primed face categorization experiment, where participants from multiple age groups
-
Multisensory Information Facilitates the Categorization of Untrained Stimuli Multisensory Research (IF 1.6) Pub Date : 2021-08-12 Jie Wu, Qitian Li, Qiufang Fu, Michael Rose, Liping Jing
Although it has been demonstrated that multisensory information can facilitate object recognition and object memory, it remains unclear whether such a facilitation effect exists in category learning. To address this issue, comparable car images and sounds were first selected by a discrimination task in Experiment 1. Then, those selected images and sounds were utilized in a prototype category learning
-
Neural Basis of the Sound-Symbolic Crossmodal Correspondence Between Auditory Pseudowords and Visual Shapes Multisensory Research (IF 1.6) Pub Date : 2021-08-11 Kelly McCormick, Simon Lacey, Randall Stilla, Lynne C. Nygaard, K. Sathian
Sound symbolism refers to the association between the sounds of words and their meanings, often studied using the crossmodal correspondence between auditory pseudowords, e.g., ‘takete’ or ‘maluma’, and pointed or rounded visual shapes, respectively. In a functional magnetic resonance imaging study, participants were presented with pseudoword–shape pairs that were sound-symbolically congruent or incongruent
-
Multisensory Effects on Illusory Self-Motion (Vection): the Role of Visual, Auditory, and Tactile Cues Multisensory Research (IF 1.6) Pub Date : 2021-08-11 Brandy Murovec, Julia Spaniol, Jennifer L. Campos, Behrang Keshavarz
A critical component of many immersive experiences in virtual reality (VR) is vection, defined as the illusion of self-motion. Traditionally, vection has been described as a visual phenomenon, but more recent research suggests that vection can be influenced by a variety of senses. The goal of the present study was to investigate the role of multisensory cues in vection by manipulating the availability
-
Orienting Auditory Attention through Vision: the Impact of Monaural Listening Multisensory Research (IF 1.6) Pub Date : 2021-08-11 Silvia Turri, Mehdi Rizvi, Giuseppe Rabini, Alessandra Melonio, Rosella Gennari, Francesco Pavani
The understanding of linguistic messages can be made extremely complex by the simultaneous presence of interfering sounds, especially when they are also linguistic in nature. In two experiments, we tested whether visual cues directing attention to spatial or temporal components of speech in noise can improve its identification. The hearing-in-noise task required identification of a five-digit sequence (target)
-
Exploring Reference Frame Integration Using Response Demands in a Tactile Temporal-Order Judgement Task Multisensory Research (IF 1.6) Pub Date : 2021-07-23 Kaian Unwalla, Daniel Goldreich, David I. Shore
Exploring the world through touch requires the integration of internal (e.g., anatomical) and external (e.g., spatial) reference frames — you only know what you touch when you know where your hands are in space. The deficit observed in tactile temporal-order judgements when the hands are crossed over the midline provides one tool to explore this integration. We used foot pedals and required participants