OSLNet: Deep Small-Sample Classification with an Orthogonal Softmax Layer.
IEEE Transactions on Image Processing (IF 10.6) Pub Date: 2020-05-06, DOI: 10.1109/tip.2020.2990277
Xiaoxu Li , Dongliang Chang , Zhanyu Ma , Zheng-Hua Tan , Jing-Hao Xue , Jie Cao , Jingyi Yu , Jun Guo

A deep neural network with multiple nonlinear layers forms a large function space, which can easily lead to overfitting when trained on small-sample data. To mitigate overfitting in small-sample classification, learning more discriminative features from small-sample data has become a new trend. To this end, this paper aims to find a subspace of neural networks that facilitates a large decision margin. Specifically, we propose the Orthogonal Softmax Layer (OSL), which keeps the weight vectors of the classification layer orthogonal during both training and testing. The Rademacher complexity of a network using the OSL is only $\frac{1}{K}$, where $K$ is the number of classes, of that of a network using a fully connected classification layer, leading to a tighter generalization error bound. Experimental results demonstrate that the proposed OSL outperforms the compared methods on four small-sample benchmark datasets, and that it is also applicable to large-sample datasets. Code is available at: https://github.com/dongliangchang/OSLNet .
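The paper's exact implementation lives in the linked repository, but the key property (classifier weight vectors that are orthogonal by construction, throughout training and at test time) is easy to illustrate. One simple way to guarantee this is to give each class weight vector a disjoint support through a fixed binary mask, so the dot product of any two class vectors is zero at every training step without any extra constraint or penalty. The following is a minimal PyTorch sketch of that idea; the class name, the even feature split, and the block-diagonal masking scheme are illustrative assumptions, not necessarily the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class OrthogonalSoftmaxLayer(nn.Module):
    """Classification layer whose per-class weight vectors stay orthogonal.

    Sketch only: orthogonality is enforced by giving each class a disjoint
    slice of the input features via a fixed 0/1 mask, so w_i . w_j = 0 for
    i != j at every step of training and at test time. This is one way to
    realize the OSL property, not necessarily the paper's exact code.
    """

    def __init__(self, in_features: int, num_classes: int):
        super().__init__()
        # Assumed for simplicity: features split evenly across classes.
        assert in_features % num_classes == 0
        self.weight = nn.Parameter(torch.empty(num_classes, in_features))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)
        # Block-diagonal mask: class k may use only its own feature slice.
        block = in_features // num_classes
        mask = torch.zeros(num_classes, in_features)
        for k in range(num_classes):
            mask[k, k * block:(k + 1) * block] = 1.0
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Masked weight rows have disjoint supports, hence orthogonal rows;
        # the mask is fixed, so orthogonality also holds at test time.
        return F.linear(x, self.weight * self.mask)
```

A quick check that the masked rows really are mutually orthogonal:

```python
layer = OrthogonalSoftmaxLayer(in_features=512, num_classes=4)
logits = layer(torch.randn(8, 512))  # shape (8, 4); feed to cross-entropy
w = layer.weight * layer.mask
off_diag = (w @ w.t()) * (1 - torch.eye(4))
print(torch.allclose(off_diag, torch.zeros(4, 4)))  # True
```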

Updated: 2020-07-03