Hierarchical distillation learning for scalable person search
Pattern Recognition (IF 8), Pub Date: 2021-02-01, DOI: 10.1016/j.patcog.2021.107862
Wei Li , Shaogang Gong , Xiatian Zhu

Existing person search methods typically focus on improving person detection accuracy, while ignoring model inference efficiency, which is fundamentally important for real-world applications. In this work, we address this limitation by investigating the scalability of person search, considering model accuracy and inference efficiency jointly. Specifically, we formulate a Hierarchical Distillation Learning (HDL) approach that comprehensively distils the knowledge of a strong teacher model into a lightweight student model with weaker learning capacity. To facilitate the HDL process, we design a simple yet powerful teacher model for joint learning of person detection and person re-identification matching in unconstrained scene images. Extensive experiments on three large person search benchmarks (CUHK-SYSU, PRW, and DukeMTMC-PS) show the modelling advantages and superior cost-effectiveness of HDL over state-of-the-art person search methods.
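The abstract describes transferring knowledge from a strong teacher to a lightweight student. As a rough illustration only (the paper's specific hierarchical losses are not given here), the sketch below implements the generic Hinton-style teacher–student distillation objective — a hard-label cross-entropy term plus a temperature-softened KL term between teacher and student predictions; all function names and hyperparameter values are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer distributions."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic KD objective (NOT the paper's HDL formulation):
    alpha * hard-label CE + (1 - alpha) * T^2 * KL(teacher || student)."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    # Soft-target term: KL divergence between softened distributions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1).mean()
    # Hard-target term: cross-entropy of the student against ground-truth labels
    probs = softmax(student_logits)
    hard = -np.log(probs[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * hard + (1 - alpha) * (T ** 2) * kl
```

When student and teacher logits coincide, the KL term vanishes and only the hard-label term remains; the `T ** 2` factor keeps soft-target gradients comparable in scale across temperatures.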




Updated: 2021-02-08