Using visual and auditory cues to locate out-of-view objects in head-mounted augmented reality
Displays (IF 3.7) Pub Date: 2021-06-18, DOI: 10.1016/j.displa.2021.102032
Nicola Binetti, Luyan Wu, Shiping Chen, Ernst Kruijff, Simon Julier, Duncan P. Brumby

When looking for an object in a complex visual scene, Augmented Reality (AR) can assist search with visual cues persistently pointing in the target’s direction. The effectiveness of these visual cues can be reduced if they are placed at a different visual depth plane to the target they are indicating. To overcome this visual-depth problem, we test the effectiveness of adding simultaneous spatialized auditory cues that are fixed at the target’s location. In an experiment we manipulated which cue(s) were available (visual-only vs. visual + auditory), and which disparity plane relative to the target the visual cue was displayed on. Results show that participants were slower at finding targets when the visual cue was placed on a different disparity plane to the target. However, this slowdown in search performance could be substantially reduced with auditory cueing. These results demonstrate the importance of AR cross-modal cueing under conditions of visual uncertainty and show that designers should consider augmenting visual cues with auditory ones.
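As a concrete illustration of the two cue types compared in the study, the sketch below shows how an out-of-view indicator might be driven in Python: a visual arrow derived from the bearing between the head pose and the target, and a crude stereo pan standing in for a spatialized auditory cue anchored at the target's position. This is a minimal sketch of the general idea only; the names (Pose, bearing_to_target, visual_cue_direction, audio_pan) and the 60-degree field of view are hypothetical and are not taken from the paper's implementation.

import math
from dataclasses import dataclass


@dataclass
class Pose:
    """Head pose in the horizontal plane: position plus yaw in radians
    (yaw = 0 means facing +x, positive yaw rotates counter-clockwise)."""
    x: float
    y: float
    yaw: float


def bearing_to_target(head: Pose, target: tuple) -> float:
    """Signed angle (radians) from the head's facing direction to the target.
    Positive means the target lies to the left, negative to the right."""
    dx, dy = target[0] - head.x, target[1] - head.y
    world_angle = math.atan2(dy, dx)
    # Wrap the relative angle into [-pi, pi).
    return (world_angle - head.yaw + math.pi) % (2 * math.pi) - math.pi


def visual_cue_direction(head: Pose, target: tuple, fov: float = math.radians(60)):
    """Return None when the target is inside the horizontal field of view,
    otherwise -1.0 (arrow pointing left) or +1.0 (arrow pointing right)."""
    rel = bearing_to_target(head, target)
    if abs(rel) <= fov / 2:
        return None  # target already visible: no out-of-view arrow needed
    return -1.0 if rel > 0 else 1.0


def audio_pan(head: Pose, target: tuple) -> float:
    """Crude stereo pan in [-1, +1] standing in for a spatialized cue
    anchored at the target's position (-1 = hard left, +1 = hard right)."""
    return -math.sin(bearing_to_target(head, target))


if __name__ == "__main__":
    head = Pose(x=0.0, y=0.0, yaw=0.0)        # wearer facing along +x
    target = (0.0, 2.0)                        # target 90 degrees to the left
    print(visual_cue_direction(head, target))  # -1.0 -> arrow points left
    print(round(audio_pan(head, target), 2))   # -1.0 -> sound panned hard left

In a real AR engine the auditory cue would be rendered as a 3D audio source attached to the target, so its apparent location does not depend on which disparity plane the visual arrow is drawn on, which is consistent with why the study pairs the two cues.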




Updated: 2021-06-23