Acceleration of multi-task cascaded convolutional networks
IET Image Processing (IF 2.3), Pub Date: 2020-09-07, DOI: 10.1049/iet-ipr.2019.0141
Long-Hua Ma, Hang-Yu Fan, Zhe-Ming Lu, Dong Tian

Multi-task cascaded convolutional neural network (MTCNN) is a face detection architecture that uses a cascaded structure with three stages (P-Net, R-Net and O-Net). The authors aim to reduce the computation time of the whole MTCNN pipeline. They find that the non-maximum suppression (NMS) steps after P-Net occupy over half of the total computation time. Therefore, the authors propose a self-fine-tuning method that makes the computation time of the NMS step easier to control. Self-fine-tuning is a training trick that uses hard samples generated by P-Net to retrain P-Net. After self-fine-tuning, the distribution of face probabilities produced by P-Net changes and its tail becomes thinner. With a thinner-tailed distribution, the number of NMS input boxes is easier to control, and choosing a suitable threshold to filter the candidate face boxes yields fewer boxes, so the computation time is reduced. To preserve the detection performance of MTCNN, the authors also propose a landmark data-set augmentation that enhances the performance of the self-fine-tuned MTCNN. Experiments show that the proposed scheme significantly reduces the computation time of MTCNN.
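The NMS step the abstract refers to is typically a greedy, IoU-based suppression over the candidate boxes emitted by P-Net. The sketch below is illustrative only and is not the authors' implementation; the `score_threshold` and IoU values are assumed, and the synthetic scores merely stand in for P-Net face probabilities. It shows why filtering candidates with a probability threshold before NMS directly controls the NMS workload, which is the quantity the paper's self-fine-tuning makes easier to manage.

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression over candidate face boxes.

    boxes:  (N, 4) array of [x1, y1, x2, y2] corners.
    scores: (N,) face probabilities from a detector such as P-Net.
    Returns the indices of the boxes that are kept.
    """
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]  # process highest-probability boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Overlap of the current winner with every remaining candidate
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Discard candidates that overlap the winner too strongly
        order = order[1:][iou < iou_threshold]
    return keep

# Illustration: a probability threshold applied before NMS controls how many
# boxes NMS must process; a thinner-tailed score distribution (the effect the
# paper attributes to self-fine-tuning) makes this count easier to predict.
rng = np.random.default_rng(0)
corners = rng.uniform(0, 100, size=(1000, 2))
boxes = np.hstack([corners, corners + rng.uniform(5, 20, size=(1000, 2))])
scores = rng.beta(2, 5, size=1000)   # stand-in for P-Net face probabilities
score_threshold = 0.6                # illustrative value, not from the paper
mask = scores > score_threshold
kept = nms(boxes[mask], scores[mask], iou_threshold=0.5)
print(f"{mask.sum()} boxes passed the threshold, {len(kept)} survived NMS")
```

Because the cost of the greedy loop grows with the number of surviving candidates, any change that lets a single threshold cut that number more predictably (such as the thinner-tailed P-Net score distribution described above) reduces the NMS time.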

Updated: 2020-09-08