Minimum class variance multiple kernel learning
Knowledge-Based Systems (IF 7.2) Pub Date: 2020-09-16, DOI: 10.1016/j.knosys.2020.106469
Xiaoming Wang, Shitong Wang, Yajun Du, Zengxi Huang

The purpose of multiple kernel learning (MKL) is to learn an appropriate kernel from a set of predefined base kernels. Most MKL methods follow the basic idea of the support vector machine (SVM) to learn the optimal weights of the base kernels and build the resulting classifier. However, SVM is a local method that ignores the structure information of the data, in that its solution is determined exclusively by the so-called support vectors. In this paper, we propose an improved SVM-based MKL method called minimum class variance multiple kernel learning (MCVMKL). The key characteristic of MCVMKL is that it exploits the ellipsoidal structure of the data when learning the optimal weights and building the classifier. Moreover, its formulation is invariant to scaling of the base kernel weights. We develop two optimization strategies to handle the optimization model of MCVMKL. Further, we derive a rough upper bound on the objective function of MCVMKL and, using the trace of the within-class scatter matrix, propose a variant called trace-constrained multiple kernel learning (TCMKL). TCMKL enlarges the margin between different classes while simultaneously shrinking the region covering the data as much as possible. It can also tune the regularization parameter automatically, saving training time by avoiding the time-consuming cross-validation otherwise needed to select an appropriate value. Finally, comprehensive experiments demonstrate that the proposed methods are effective and achieve better performance than the competing methods.
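
To make the two central quantities concrete, the sketch below is a rough illustration, not the authors' actual algorithm: it combines predefined base kernel matrices with nonnegative weights, as in standard MKL, and evaluates the trace of the within-class scatter matrix in the induced feature space via the kernel trick. The function names and the toy RBF base kernels are our own assumptions.

import numpy as np

def combine_kernels(kernels, weights):
    # Weighted combination of precomputed base kernel matrices:
    # K = sum_m mu_m * K_m. MKL methods such as MCVMKL learn the
    # weights mu; here they are simply given.
    return sum(mu * K for mu, K in zip(weights, kernels))

def within_class_scatter_trace(K, y):
    # tr(S_w) in the feature space induced by kernel matrix K, via the
    # kernel trick:
    # tr(S_w) = sum_c [ sum_{i in c} k(x_i, x_i)
    #                   - (1/n_c) * sum_{i,j in c} k(x_i, x_j) ].
    # TCMKL constrains this quantity to shrink the region covering the data.
    total = 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[np.ix_(idx, idx)]
        total += np.trace(Kc) - Kc.sum() / len(idx)
    return total

# Toy usage: two RBF base kernels with different widths on random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
y = np.array([0] * 10 + [1] * 10)
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
kernels = [np.exp(-sq / (2.0 * s * s)) for s in (0.5, 2.0)]
K = combine_kernels(kernels, [0.3, 0.7])
print(within_class_scatter_trace(K, y))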




Updated: 2020-09-20