Fast and Exact Leave-One-Out Analysis of Large-Margin Classifiers
Technometrics (IF 2.3), Pub Date: 2021-09-22, DOI: 10.1080/00401706.2021.1967199
Boxiang Wang, Hui Zou

Abstract

Motivated by the Golub–Heath–Wahba formula for ridge regression, we first present a new leave-one-out lemma for the kernel support vector machine (SVM) and related large-margin classifiers. We then use the lemma to design a novel and efficient algorithm, named "magicsvm," for training the kernel SVM and related large-margin classifiers and computing the exact leave-one-out cross-validation error. With "magicsvm," the computational cost of the leave-one-out analysis is of the same order as fitting a single SVM on the training data. Based on extensive simulations and benchmark examples, we show that "magicsvm" is much faster than state-of-the-art SVM solvers. The same idea is also used to speed up V-fold cross-validation of kernel classifiers.
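The Golub–Heath–Wahba result that motivates the paper says that, for ridge regression, the exact leave-one-out residuals can be recovered from a single fit: with hat matrix H = X(XᵀX + λI)⁻¹Xᵀ, the i-th leave-one-out residual equals (yᵢ − ŷᵢ)/(1 − Hᵢᵢ). The sketch below (not the paper's "magicsvm" algorithm, just the classical ridge identity that inspires it) checks this shortcut against brute-force refitting on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 50, 5, 1.0
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + 0.1 * rng.standard_normal(n)

# Ridge hat matrix: H = X (X'X + lam*I)^{-1} X'
H = X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T)
y_hat = H @ y

# Golub–Heath–Wahba shortcut: exact LOO residuals from one fit
loo_shortcut = (y - y_hat) / (1.0 - np.diag(H))

# Brute-force leave-one-out: refit the ridge model n times
loo_brute = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    Xi, yi = X[mask], y[mask]
    beta = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
    loo_brute[i] = y[i] - X[i] @ beta

# The two computations agree to machine precision
assert np.allclose(loo_shortcut, loo_brute)
```

The shortcut turns an O(n) sequence of refits into a single fit plus a diagonal correction; the paper's contribution is an analogous exact identity for kernel SVMs and other large-margin classifiers.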


