Understanding visually impaired people’s experiences of social signal perception in face-to-face communication
Universal Access in the Information Society (IF 2.4), Pub Date: 2019-11-04, DOI: 10.1007/s10209-019-00698-3
Shi Qiu, Pengcheng An, Jun Hu, Ting Han, Matthias Rauterberg

Social signals (e.g., facial expressions, gestures) are important in social interactions. Most of them are visual cues that are hardly accessible to visually impaired people, causing difficulties in their daily lives. In human–computer interaction (HCI), assistive systems for social interaction are receiving increasing attention due to related technological advances. Yet there is still a lack of a comprehensive and vivid understanding of visually impaired people’s social signal perception that would allow their needs in face-to-face communication to be broadly identified. To fill this gap, we conducted in-depth interviews to study the lived experiences of 20 visually impaired participants. We analyzed a rich set of qualitative empirical data, structured by a comprehensive taxonomy of social signals, using a standard qualitative content analysis method. Our results provide vivid examples and an overview of visually impaired people’s lived experiences with social signals, covering both their capabilities and their limitations. Participants reported perceiving social signals through compensatory modalities such as hearing, touch, smell, and obstacle sense. However, this perception is generally of low resolution and is constrained by environmental factors (e.g., crowdedness or the surrounding noise level). Interestingly, low-vision participants still relied substantially on their remaining sight to perceive social signals (e.g., rough postures and gestures). In addition, participants had difficulty sensing others’ subtle emotional states, which are often revealed by nuanced behaviors (e.g., a smile). Based on these empirical findings, we propose a set of design implications to inform future HCI work aimed at supporting visually impaired users’ social signal perception.



