An accelerator for online SVM based on the fixed-size KKT window
Engineering Applications of Artificial Intelligence (IF 8), Pub Date: 2020-04-10, DOI: 10.1016/j.engappai.2020.103637
Husheng Guo , Aijuan Zhang , Wenjian Wang

Support vector machine (SVM), as a general and useful supervised learning tool, faces several challenges, such as low learning efficiency, poor generalization performance, and noise sensitivity, when applied to online learning tasks. To overcome these limitations, this paper proposes an accelerator model for online SVM learning based on window technology and the KKT conditions. The proposed model is not an independent online algorithm but can be regarded as an accelerator for other online SVM learning algorithms; it constructs the working set of the SVM through a fixed-size window of samples that violate the KKT conditions. The relationship between the Lagrangian multipliers in the dual problem of the SVM and the KKT conditions is analyzed in the online learning setting. On this basis, a fixed-size KKT window can be constructed according to whether or not samples violate the KKT conditions. The samples that violate the KKT conditions are then taken as the training window, which not only keeps the training set the same size at each update but also ensures that every sample in the window contributes to updating the hyperplane, so the classifier can be updated more smoothly. Two typical online SVM algorithms are used as baselines, and the corresponding accelerated online SVM learning algorithms in the "X + accelerator" form are proposed to test the performance of the proposed accelerator. Comprehensive experiments clearly show that the proposed model can accelerate the online learning process effectively and has good robustness and generalization performance.
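The abstract does not include code, so the sketch below is only a rough illustration of the idea it describes: a fixed-size window filled with KKT-violating samples that is handed to a base online SVM learner when full. The class name `KKTWindowAccelerator`, the callback `base_update`, and parameters such as `window_size` and `tol` are assumptions for this sketch, not the authors' implementation or naming.

```python
import numpy as np


class KKTWindowAccelerator:
    """Minimal sketch (assumed design, not the paper's code): collect samples
    that violate the soft-margin KKT conditions into a fixed-size window and
    trigger the base online SVM learner's update once the window is full."""

    def __init__(self, base_update, window_size=50, C=1.0, tol=1e-3):
        self.base_update = base_update    # callable: base_update(X_window, y_window)
        self.window_size = window_size    # fixed number of KKT violators per update
        self.C = C                        # SVM box constraint
        self.tol = tol                    # numerical tolerance for the KKT checks
        self.win_X, self.win_y = [], []

    def violates_kkt(self, f_x, y, alpha=0.0):
        """Standard soft-margin KKT conditions for one sample:
           alpha = 0        -> y * f(x) >= 1
           0 < alpha < C    -> y * f(x) == 1
           alpha = C        -> y * f(x) <= 1
        A newly arriving sample is treated as alpha = 0."""
        margin = y * f_x
        if alpha <= self.tol:
            return margin < 1.0 - self.tol
        if alpha >= self.C - self.tol:
            return margin > 1.0 + self.tol
        return abs(margin - 1.0) > self.tol

    def receive(self, x, y, decision_value, alpha=0.0):
        """Add the sample to the window only if it violates the KKT
        conditions; hand the full window to the base learner and clear it."""
        if self.violates_kkt(decision_value, y, alpha):
            self.win_X.append(x)
            self.win_y.append(y)
        if len(self.win_y) == self.window_size:
            self.base_update(np.asarray(self.win_X), np.asarray(self.win_y))
            self.win_X, self.win_y = [], []
```

In this reading, any existing online SVM algorithm plays the role of `base_update`, which is what the "X + accelerator" pairing in the abstract suggests: the accelerator filters the stream so that each update sees a fixed-size batch of informative (KKT-violating) samples.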




Updated: 2020-04-10