Head Motion Modeling for Human Behavior Analysis in Dyadic Interaction
IEEE Transactions on Multimedia (IF 8.4), Pub Date: 2015-07-01, DOI: 10.1109/tmm.2015.2432671
Bo Xiao¹, Panayiotis Georgiou¹, Brian Baucom², Shrikanth S. Narayanan¹

This paper presents a computational study of head motion in human interaction, notably of its role in conveying interlocutors' behavioral characteristics. Head motion is physically complex and carries rich information; current modeling approaches based on visual signals, however, are still limited in their ability to adequately capture these important properties. Guided by the methodology of kinesics, we propose a data-driven approach to identify typical head motion patterns. The approach follows the steps of first segmenting motion events, then parametrically representing the motion by linear predictive features, and finally generalizing the motion types using Gaussian mixture models. The proposed approach is experimentally validated using video recordings of communication sessions from real couples involved in a couples therapy study. In particular, we use the head motion model to classify binarized expert judgments of the interactants' specific behavioral characteristics where entrainment in head motion is hypothesized to play a role: Acceptance, Blame, Positive, and Negative behavior. We achieve accuracies in the range of 60% to 70% for the various experimental settings and conditions. In addition, we describe a measure of motion similarity between the interaction partners based on the proposed model. We show that the relative change of head motion similarity during the interaction significantly correlates with the expert judgments of the interactants' behavioral characteristics. These findings demonstrate the effectiveness of the proposed head motion model, and underscore the promise of analyzing human behavioral characteristics through signal processing methods.
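The pipeline described above (segment motion events, represent each event with linear predictive features, generalize event types with a Gaussian mixture model) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the synthetic head-pitch signal, the fixed 2-second windowing (the paper segments events data-adaptively), the LPC order, and the number of mixture components are all assumptions chosen for the example.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def lpc(frame, order):
    """LPC coefficients via the autocorrelation method (Levinson-Durbin)."""
    n = len(frame)
    r = np.array([frame[: n - k] @ frame[k:] for k in range(order + 1)])
    a = np.zeros(order + 1)
    a[0], err = 1.0, r[0]
    for i in range(1, order + 1):
        # Reflection coefficient from the current prediction error.
        acc = r[i] + a[1:i] @ r[1:i][::-1]
        k = -acc / err
        prev = a.copy()
        a[1:i] = prev[1:i] + k * prev[i - 1:0:-1]
        a[i] = k
        err *= 1.0 - k * k
    return a[1:]  # drop the leading 1

rng = np.random.default_rng(0)
# Synthetic head-pitch trajectory at 30 fps: slow nodding plus noise,
# standing in for head angles tracked from video.
t = np.arange(2000) / 30.0
pitch = np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.standard_normal(t.size)

# Step 1 (simplified): segment the trajectory into fixed-length "motion events".
win = 60  # 2-second windows
events = pitch[: pitch.size // win * win].reshape(-1, win)

# Step 2: parametrize each event with linear predictive features.
feats = np.array([lpc(e - e.mean(), order=4) for e in events])

# Step 3: generalize motion types with a GMM; per-event posteriors yield a
# soft "motion vocabulary" usage histogram for a speaker.
gmm = GaussianMixture(n_components=3, random_state=0).fit(feats)
hist = gmm.predict_proba(feats).mean(axis=0)
print(feats.shape, np.round(hist, 2))
```

Given two such usage histograms, one per interaction partner, a simple similarity score (e.g., their cosine similarity, tracked over successive portions of the session) would capture relative changes in motion similarity in the spirit of the measure the abstract describes, though the paper's actual measure may differ.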
