Cognitive Wearable Robotics for Autism Perception Enhancement
ACM Transactions on Internet Technology (IF 5.3) · Pub Date: 2021-07-22 · DOI: 10.1145/3450630
Min Chen, Wenjing Xiao, Long Hu, Yujun Ma, Yin Zhang, Guangming Tao

Autism spectrum disorder (ASD) seriously harms the physical and mental health of children, limiting patients' social activities throughout their lives and placing a heavy burden on families and society. Advances in communication techniques and artificial intelligence (AI) have opened new potential avenues for the treatment of autism. Existing AI-based treatment systems for children with ASD focus on monitoring health status and developing social skills. However, the gap between the interaction capability of current terminals and their practical availability cannot meet the needs of real application scenarios, and the lack of diverse data makes individualized care for autistic children impossible. To explore a robot-based approach, this article proposes a novel AI-based first-view-robot architecture. By providing care from the first-person perspective, the proposed wearable robot overcomes the lack of cognitive grounding in the third-person view of traditional robotics and improves the social interaction ability of children with ASD. The first-view-robot architecture meets the requirements of dynamic, individualized, and highly immersive interaction services for autistic children. First, the multi-modal, multi-scene data collection processes for the standard, static, and dynamic datasets are introduced in detail. Then, to comprehensively evaluate the learning ability of children with ASD through both mental states and external performance, a learning assessment model with emotion correction is proposed. In addition, a wearable robot-assisted environment perception and expression enhancement mechanism for children with ASD is realized through reinforcement learning, which adapts to interactive environments by learning optimal action policies. An interactive testbed for ASD treatment is demonstrated, and experimental cases for test subjects are presented. Last, three open issues are discussed from the perspectives of data processing, robot design, and service response.
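The abstract states that the robot's perception and expression enhancement mechanism adapts to interactive environments by learning optimal action policies via reinforcement learning. The paper's actual state space, action set, and reward design are not given here, so the following is only a minimal illustrative sketch of the general idea, using tabular Q-learning over invented placeholder states (child engagement levels) and actions (assistive cues); all names and the toy reward model are assumptions, not the authors' formulation.

```python
import random

# Placeholder child-engagement states and robot assistive actions (assumed).
STATES = ["disengaged", "attending", "responding"]
ACTIONS = ["visual_cue", "audio_prompt", "wait"]

def step(state, action):
    """Toy environment model: returns (next_state, reward).

    Purely illustrative: an audio prompt is assumed to help a disengaged
    child start attending, and waiting is assumed to let an attending
    child produce a response.
    """
    if state == "disengaged":
        return ("attending", 1.0) if action == "audio_prompt" else ("disengaged", 0.0)
    if state == "attending":
        return ("responding", 2.0) if action == "wait" else ("attending", 0.0)
    return ("disengaged", 0.0)  # after a response, the loop restarts

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Standard tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        state = "disengaged"
        for _ in range(10):  # one short interaction episode
            if rng.random() < eps:
                action = rng.choice(ACTIONS)                         # explore
            else:
                action = max(ACTIONS, key=lambda a: q[(state, a)])   # exploit
            nxt, reward = step(state, action)
            best_next = max(q[(nxt, a)] for a in ACTIONS)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = nxt
    return q

def policy(q):
    """Greedy action policy extracted from the learned Q-table."""
    return {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
```

Under this toy model, the learned policy selects the assumed-helpful action in each state (prompting a disengaged child, then waiting for a response), which is the sense in which the abstract's mechanism "adapts to interactive environments with optimal action policies".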
