Neurodynamical classifiers with low model complexity
Neural Networks (IF 6.0) Pub Date: 2020-08-27, DOI: 10.1016/j.neunet.2020.08.013
Himanshu Pant , Sumit Soman , Jayadeva , Amit Bhaya

The recently proposed Minimal Complexity Machine (MCM) finds a hyperplane classifier by minimizing an upper bound on the Vapnik–Chervonenkis (VC) dimension. The VC dimension measures the capacity, or model complexity, of a learning machine. Vapnik's risk formula indicates that models with smaller VC dimension are expected to generalize better. On many benchmark datasets, the MCM generalizes better than SVMs and uses far fewer support vectors than SVMs do. In this paper, we describe a neural network that converges to the MCM solution. We employ the MCM neurodynamical system as the final layer of a neural network architecture. Our approach also optimizes the weights of all layers in order to minimize the objective, which combines a bound on the VC dimension with the classification error. We illustrate the use of this model for robust binary and multi-class classification. Numerical experiments on benchmark datasets from the UCI repository show that the proposed approach is scalable and accurate, and learns models with higher accuracy and fewer support vectors.
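The paper embeds the MCM dynamics as a network's final layer; as background, the plain linear MCM can be posed as a small linear program that minimizes a bound `h` on the ratio of the largest to smallest functional margin (following Jayadeva's 2015 LP formulation). The sketch below is our own illustration, not the authors' code: the function name `mcm_fit`, the constant `C`, and the toy two-blob dataset are all invented for the example.

```python
import numpy as np
from scipy.optimize import linprog

def mcm_fit(X, y, C=1.0):
    """Fit a linear MCM by solving its linear-programming formulation.

    Minimize  h + C * sum(q)  over (w, b, h, q), subject to
        y_i (w . x_i + b) + q_i >= 1      (soft margin constraints)
        h  >= y_i (w . x_i + b) + q_i     (h caps the functional margins)
        q_i >= 0
    Keeping h small keeps the margin ratio small, which bounds the
    VC dimension of the resulting hyperplane classifier.
    """
    n, d = X.shape
    yX = y[:, None] * X
    # variable layout: [w (d entries), b, h, q (n entries)]
    c = np.concatenate([np.zeros(d + 1), [1.0], C * np.ones(n)])
    # constraint 1:  -(y_i (w.x_i + b)) - q_i <= -1
    A1 = np.hstack([-yX, -y[:, None], np.zeros((n, 1)), -np.eye(n)])
    # constraint 2:  y_i (w.x_i + b) + q_i - h <= 0
    A2 = np.hstack([yX, y[:, None], -np.ones((n, 1)), np.eye(n)])
    bounds = [(None, None)] * (d + 2) + [(0, None)] * n
    res = linprog(c, A_ub=np.vstack([A1, A2]),
                  b_ub=np.concatenate([-np.ones(n), np.zeros(n)]),
                  bounds=bounds, method="highs")
    return res.x[:d], res.x[d], res.x[d + 1]

# two well-separated Gaussian blobs as toy data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)),
               rng.normal(2.0, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
w, b, h = mcm_fit(X, y)
pred = np.sign(X @ w + b)
print("train accuracy:", (pred == y).mean())
```

Because the objective, constraints, and variables are all linear, off-the-shelf LP solvers recover the global optimum; the paper's contribution is replacing this batch solve with a neurodynamical system whose state converges to the same solution, so it can sit as a trainable final layer.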




Updated: 2020-10-02