SVMax: A Feature Embedding Regularizer
arXiv - CS - Computer Vision and Pattern Recognition. Pub Date: 2021-03-04, arXiv:2103.02770
Ahmed Taha, Alex Hanson, Abhinav Shrivastava, Larry Davis

A neural network regularizer (e.g., weight decay) boosts performance by explicitly penalizing the complexity of a network. In this paper, we penalize inferior network activations -- feature embeddings -- which in turn regularize the network's weights implicitly. We propose singular value maximization (SVMax) to learn a more uniform feature embedding. The SVMax regularizer supports both supervised and unsupervised learning. Our formulation mitigates model collapse and enables larger learning rates. We evaluate the SVMax regularizer using both retrieval and generative adversarial networks. We leverage a synthetic mixture of Gaussians dataset to evaluate SVMax in an unsupervised setting. For retrieval networks, SVMax achieves significant improvement margins across various ranking losses. Code available at https://bit.ly/3jNkgDt
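As a rough illustration of the idea in the abstract, the penalty below computes the singular values of a (row-normalized) batch of embeddings and rewards a larger mean singular value, which discourages the collapsed, low-rank embeddings the paper targets. This is a minimal sketch under assumptions, not the authors' implementation: the function name `svmax_penalty`, the per-row L2 normalization, and the use of the plain mean singular value are illustrative choices here; consult the linked code for the exact formulation.

```python
import numpy as np

def svmax_penalty(embeddings):
    """Hedged sketch of an SVMax-style term for a batch of embeddings.

    A collapsed batch (all rows nearly identical) is rank-1, so most
    singular values are ~0 and the mean singular value is small.
    A uniformly spread batch has larger singular values across the board.
    Returning the *negative* mean lets a minimizer maximize spread.
    """
    # L2-normalize each embedding row (assumption: unit-norm embeddings).
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    normed = embeddings / norms
    # Singular values of the batch matrix; no need for U and V here.
    s = np.linalg.svd(normed, compute_uv=False)
    # Negative mean singular value: lower loss = more uniform embedding.
    return -s.mean()
```

In training, this term would be added to the ranking or adversarial loss with a weighting coefficient; a fully collapsed batch yields a penalty closer to zero than a well-spread one, so gradient descent pushes the embeddings apart.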

Updated: 2021-03-05