Human Activity Recognition-Oriented Incremental Learning with Knowledge Distillation
Journal of Circuits, Systems and Computers (IF 1.5). Pub Date: 2020-10-17. DOI: 10.1142/s0218126621500961
Caijuan Chen 1 , Kaoru Ota 2 , Mianxiong Dong 2 , Chen Yu 1 , Hai Jin 1
Recently, a variety of machine learning methods have improved the applicability of activity recognition systems across different scenarios. Many current activity recognition models assume that all data are prepared in advance and that the device has no storage limitation. However, sensor data collection changes dynamically over time, the set of activity categories may keep growing, and devices have limited storage space. Therefore, in this study, we propose a novel class-incremental learning solution for activity recognition based on knowledge distillation. In addition, we develop a representative sample selection method to select and update a fixed number of preserved old samples. When samples of new activity classes arrive, only the new-class samples and the representative old samples are needed to preserve the network's performance on the old classes while identifying the new classes. Finally, we carry out experiments on two different public datasets, which show good accuracy for both old and new categories. Moreover, the method significantly reduces the space required to store old-class samples.
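The two ingredients the abstract names — a distillation term that keeps the new network close to the old one on old classes, and a representative-sample (exemplar) selection step — can be sketched as follows. This is a minimal illustration under common assumptions (temperature-softened distillation as in Hinton et al., and a herding-style mean-matching heuristic for exemplar selection); the paper's exact loss weighting and selection criterion may differ.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax; higher T produces softer targets."""
    z = z / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(old_logits, new_logits, T=2.0):
    """Cross-entropy between the frozen old model's softened outputs
    (soft targets) and the new model's outputs on the old classes."""
    p_old = softmax(old_logits, T)   # soft targets from the old network
    p_new = softmax(new_logits, T)   # new network's predictions, same classes
    return float(-(p_old * np.log(p_new + 1e-12)).sum(axis=-1).mean())

def select_exemplars(features, m):
    """Herding-style selection: greedily pick m samples whose running mean
    best approximates the class-mean feature, so a small exemplar set
    remains representative of the full class."""
    mu = features.mean(axis=0)
    chosen, chosen_sum = [], np.zeros_like(mu)
    for k in range(1, m + 1):
        # distance of each candidate's running mean to the class mean
        d = np.linalg.norm(mu - (chosen_sum + features) / k, axis=1)
        d[chosen] = np.inf           # never pick the same sample twice
        i = int(d.argmin())
        chosen.append(i)
        chosen_sum += features[i]
    return chosen
```

During an incremental step, the new-class samples would be trained with an ordinary classification loss while the stored exemplars additionally contribute the distillation term, so old-class behaviour is preserved without keeping the full old dataset.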
