Knowledge Distillation for Lightweight 2D Single-Person Pose Estimation
Journal of Circuits, Systems and Computers ( IF 0.9 ) Pub Date : 2022-09-09 , DOI: 10.1142/s0218126623500500
Shihao Zhang 1, 2 , Baohua Qiang 1 , Xianyi Yang 1 , Mingliang Zhou 3 , Ruidong Chen 1

Current state-of-the-art single-person pose estimation methods require heavily parameterized models for accurate predictions. Knowledge distillation is a promising technique for achieving accurate yet lightweight pose estimation. However, existing pose knowledge distillation methods rely on large generic building blocks and a complex multi-branch architecture. In this study, we propose a Single-branch Lightweight Knowledge Distillation method for 2D Single-person pose estimation, termed SLKD2S, to increase pose distillation efficiency. First, we design a novel single-branch pose knowledge distillation framework composed of connected lightweight pose estimation stages. Second, we employ a dedicated pose distillation loss based on the joint confidence map. Finally, we keep only the initial stage and the first refinement stage, which is sufficient for good performance. Extensive experiments on two standard benchmark datasets show the superiority of the proposed SLKD2S in terms of both cost and accuracy, with average detection accuracy improved by 1.43% and 2.74%, respectively, compared with the top-performing pose distillation method.
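The abstract does not give the exact form of the confidence-map-based pose distillation loss. A common formulation in heatmap-based pose distillation combines a supervised term against the ground-truth joint confidence maps with a mimicking term against the teacher's maps; the sketch below assumes that formulation, and the weighting factor `alpha` is a hypothetical hyperparameter, not taken from the paper.

```python
import numpy as np

def pose_distillation_loss(student_maps, teacher_maps, gt_maps, alpha=0.5):
    """Sketch of a joint-confidence-map distillation loss (assumed form).

    student_maps, teacher_maps, gt_maps: arrays of shape
    (num_joints, H, W) holding per-joint confidence heatmaps.
    alpha: hypothetical weight balancing ground-truth supervision
    against mimicking the teacher.
    """
    # Supervised term: student vs. ground-truth confidence maps.
    supervised = np.mean((student_maps - gt_maps) ** 2)
    # Distillation term: student vs. teacher confidence maps.
    distill = np.mean((student_maps - teacher_maps) ** 2)
    return alpha * supervised + (1.0 - alpha) * distill
```

A perfectly matching student (equal to both teacher and ground truth) yields zero loss; increasing `alpha` shifts weight from the teacher's soft targets toward the ground-truth maps.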




Updated: 2022-09-09