Sparse SVM for Sufficient Data Reduction
IEEE Transactions on Pattern Analysis and Machine Intelligence ( IF 20.8 ) Pub Date : 2021-04-23 , DOI: 10.1109/tpami.2021.3075339
Shenglong Zhou

Kernel-based methods for support vector machines (SVM) have shown highly advantageous performance in various applications. However, they may incur prohibitive computational costs on large-scale datasets. Data reduction (reducing the number of support vectors) therefore becomes necessary, giving rise to the topic of the sparse SVM. Motivated by this, this paper considers sparsity-constrained kernel SVM optimization in order to control the number of support vectors. Based on the established optimality conditions associated with the stationary equations, a Newton-type method is developed to handle the sparsity-constrained optimization. The method enjoys a one-step convergence property when the starting point lies in a local region of a stationary point, yielding very fast computation. Numerical comparisons with several powerful solvers demonstrate that the proposed method performs exceptionally well, particularly on large-scale datasets, producing far fewer support vectors in shorter computational time.
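The abstract does not spell out the algorithm, so the sketch below is only a rough illustration of what a sparsity-constrained kernel fit looks like. It uses iterative hard thresholding (a simpler first-order relative of the paper's Newton-type method, not the method itself) on a least-squares kernel loss; the loss choice, function names, and toy data are all assumptions. The nonzero entries of the solution play the role of the retained support vectors.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sparse_kernel_fit(K, y, s, steps=200, lr=None):
    """Projected-gradient (hard-thresholding) sketch for
        min_a 0.5 * ||K a - y||^2   s.t.   ||a||_0 <= s.
    NOTE: this is a hypothetical stand-in, not the paper's Newton-type solver."""
    n = K.shape[0]
    if lr is None:
        lr = 1.0 / np.linalg.norm(K, 2) ** 2   # step size from a Lipschitz bound
    a = np.zeros(n)
    for _ in range(steps):
        g = K.T @ (K @ a - y)                  # gradient of the least-squares loss
        a = a - lr * g                         # gradient step
        keep = np.argsort(np.abs(a))[-s:]      # indices of the s largest magnitudes
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        a[~mask] = 0.0                         # project onto the sparsity constraint
    return a

# Toy binary classification data: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (30, 2)), rng.normal(1, 0.5, (30, 2))])
y = np.hstack([-np.ones(30), np.ones(30)])
K = rbf_kernel(X, X)
alpha = sparse_kernel_fit(K, y, s=8)           # at most 8 "support vectors"
pred = np.sign(K @ alpha)
print("support vectors:", np.count_nonzero(alpha),
      "accuracy:", (pred == y).mean())
```

With the step size at most the reciprocal of the Lipschitz constant, each projected-gradient iteration cannot increase the loss, so the sparse solution always fits `y` at least as well as the all-zero vector while touching at most `s` samples.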

Updated: 2021-04-23