Class-Variant Margin Normalized Softmax Loss for Deep Face Recognition
IEEE Transactions on Neural Networks and Learning Systems (IF 10.4), Pub Date: 2020-08-28, DOI: 10.1109/tnnls.2020.3017528
Wanping Zhang, Yongru Chen, Wenming Yang, Guijin Wang, Jing-Hao Xue, Qingmin Liao

In deep face recognition, the commonly used softmax loss and its recently proposed variants are not yet sufficiently effective at handling the class imbalance and softmax saturation issues that arise during training while extracting discriminative features. In this brief, to address both issues, we propose a class-variant margin (CVM) normalized softmax loss, which introduces a true-class margin and a false-class margin into the cosine space of the angle between the feature vector and the class-weight vector. The true-class margin alleviates the class imbalance problem, and the false-class margin postpones the early individual saturation of softmax. With a negligible increase in computational complexity during training, the new loss function is easy to implement in common deep learning frameworks. Comprehensive experiments on the LFW, YTF, and MegaFace protocols demonstrate the effectiveness of the proposed CVM loss function.
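
The abstract gives the recipe only at a high level: normalize features and class weights, work in the cosine of the angle between them, subtract a margin on the true-class logit, and add a margin on the false-class logits. One plausible reading of that recipe (not necessarily the authors' exact formulation) is

    L_i = -log( exp(s(cos θ_{y_i} - m_{y_i})) / ( exp(s(cos θ_{y_i} - m_{y_i})) + Σ_{j≠y_i} exp(s(cos θ_j + m_j)) ) )

where s is a scale factor, m_{y_i} is the per-class true-class margin, and m_j is the false-class margin. Below is a minimal PyTorch sketch of this reading; the module name, the CosFace-style additive margins, and the default values of s, m_true, and m_false are all illustrative assumptions, not the paper's settings.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ClassVariantMarginSoftmax(nn.Module):
        # Illustrative sketch of a class-variant margin normalized softmax
        # loss. Assumes additive margins in cosine space: a per-class
        # true-class margin m_true subtracted from the target logit, and a
        # false-class margin m_false added to the non-target logits.
        def __init__(self, feat_dim, num_classes, s=30.0, m_true=None, m_false=0.1):
            super().__init__()
            self.weight = nn.Parameter(torch.empty(num_classes, feat_dim))
            nn.init.xavier_uniform_(self.weight)
            self.s = s
            # Hypothetical default: one margin per class; in practice the
            # margins could be set larger for under-represented classes.
            if m_true is None:
                m_true = torch.full((num_classes,), 0.35)
            self.register_buffer("m_true", m_true)
            self.m_false = m_false

        def forward(self, feats, labels):
            # Cosine of the angle between each (normalized) feature and
            # each (normalized) class-weight vector: shape (B, C).
            cos = F.linear(F.normalize(feats), F.normalize(self.weight))
            one_hot = F.one_hot(labels, cos.size(1)).to(cos.dtype)
            # Subtract the true-class margin on the target logit and add
            # the false-class margin on all non-target logits.
            logits = (cos
                      - one_hot * self.m_true[labels].unsqueeze(1)
                      + (1.0 - one_hot) * self.m_false)
            return F.cross_entropy(self.s * logits, labels)

    # Usage: a batch of 8 embeddings of dimension 512 over 1000 identities.
    criterion = ClassVariantMarginSoftmax(feat_dim=512, num_classes=1000)
    loss = criterion(torch.randn(8, 512), torch.randint(0, 1000, (8,)))

Intuitively, subtracting m_true tightens the decision boundary for the target class (so rare classes can be given larger margins to counter imbalance), while adding m_false keeps non-target logits competitive for longer, postponing the early individual saturation of the softmax.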

Updated: 2020-08-28