Student behavior analysis to measure engagement levels in online learning environments
Signal, Image and Video Processing (IF 2.3) Pub Date: 2021-05-14, DOI: 10.1007/s11760-021-01869-7
Khawlah Altuwairqi, Salma Kammoun Jarraya, Arwa Allinjawi, Mohamed Hammami

After the COVID-19 pandemic, no one disputes the importance of smart online learning systems in the educational process. Measuring student engagement is a crucial step toward smart online learning systems. A smart online learning system can automatically adapt to learners' emotions and provide feedback about their motivations. In the last few decades, online learning environments have generated tremendous interest among researchers in computer-based education. The challenge researchers face is how to measure student engagement based on students' emotions. There has been increasing interest in computer vision and camera-based solutions as technologies that overcome the limits of both human observation and the expensive equipment otherwise used to measure student engagement. Several solutions have been proposed to measure student engagement, but few are behavior-based approaches. In response to these issues, in this paper we propose a new automatic multimodal approach to measure student engagement levels in real time. To offer robust and accurate engagement measures, we combine and analyze three modalities representing students' behaviors: emotions from facial expressions, keyboard keystrokes, and mouse movements. The solution operates in real time while providing the exact level of engagement and using the least expensive equipment possible. We validate the proposed multimodal approach through three main experiments, covering single-, dual-, and multimodal configurations on novel engagement datasets. In fact, we build new, realistic student engagement datasets to validate our contributions. The multimodal approach records the highest accuracy (95.23%) and the lowest mean square error (MSE) of 0.04.
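
The abstract describes combining the three behavioral modalities into a single engagement estimate but does not specify the fusion scheme. The following Python sketch illustrates one plausible reading, a simple decision-level (late) fusion of per-modality scores; the score names, weights, and thresholds are illustrative assumptions, not the authors' published model.

from dataclasses import dataclass

@dataclass
class ModalityScores:
    """Per-modality engagement estimates, each normalized to [0, 1]."""
    facial_emotion: float  # e.g., from a facial-expression classifier
    keystrokes: float      # e.g., from typing-rate and pause statistics
    mouse: float           # e.g., from movement and click activity

def fuse_engagement(scores: ModalityScores,
                    weights=(0.5, 0.25, 0.25)) -> str:
    """Weighted late fusion followed by thresholding into engagement levels."""
    fused = (weights[0] * scores.facial_emotion
             + weights[1] * scores.keystrokes
             + weights[2] * scores.mouse)
    if fused >= 0.66:
        return "high engagement"
    if fused >= 0.33:
        return "medium engagement"
    return "low engagement"

# Example: a student showing positive facial affect and steady typing
sample = ModalityScores(facial_emotion=0.9, keystrokes=0.7, mouse=0.5)
print(fuse_engagement(sample))  # -> high engagement

Late fusion is used here only because it keeps the three signals independent until the final decision; the paper itself may combine features earlier in the pipeline.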


