Variable Selection for Support Vector Machines in Moderately High Dimensions.
Journal of the Royal Statistical Society: Series B (Statistical Methodology) (IF 3.1). Pub Date: 2016-01-19. DOI: 10.1111/rssb.12100
Xiang Zhang, Yichao Wu, Lan Wang, Runze Li

The support vector machine (SVM) is a powerful binary classification tool with high accuracy and great flexibility. It has achieved great success, but its performance can be seriously impaired when many redundant covariates are included. Some efforts have been devoted to studying variable selection for SVMs, but asymptotic properties, such as variable selection consistency, are largely unknown when the number of predictors diverges to infinity. In this work, we establish a unified theory for a general class of nonconvex penalized SVMs. We first prove that, in ultra-high dimensions, the objective function of a nonconvex penalized SVM admits a local minimizer that possesses the desired oracle property. We further address the problem of nonunique local minimizers by showing that the local linear approximation algorithm is guaranteed to converge to the oracle estimator, even in the ultra-high dimensional setting, provided an appropriate initial estimator is available. This condition on the initial estimator is verified to hold automatically when the dimension is moderately high. Numerical examples provide supportive evidence.
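The two ingredients the abstract combines, a nonconvex (e.g. SCAD) penalty on the coefficients and the local linear approximation (LLA) algorithm that repeatedly linearizes that penalty into a weighted-L1 problem, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the SCAD parameter a = 3.7, and the plain subgradient inner solver are all assumptions made here for exposition.

```python
import numpy as np

def scad_deriv(t, lam, a=3.7):
    # Derivative p'_lambda(|t|) of the SCAD penalty (a = 3.7 is the
    # conventional choice; an assumption here, not from the paper's text).
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def weighted_l1_svm_subgrad(X, y, w, beta0, lr=0.01, iters=2000):
    # Subgradient descent for the convex weighted-L1 hinge-loss problem
    #   (1/n) sum_i max(0, 1 - y_i x_i' beta) + sum_j w_j |beta_j|,
    # which is the surrogate the LLA step produces.
    beta = beta0.copy()
    n = X.shape[0]
    for _ in range(iters):
        margin = y * (X @ beta)
        g = -(X * y[:, None])[margin < 1].sum(axis=0) / n  # hinge subgradient
        g = g + w * np.sign(beta)                          # penalty subgradient
        beta = beta - lr * g
    return beta

def lla_penalized_svm(X, y, lam, beta_init, steps=3):
    # Local linear approximation: linearize the nonconvex SCAD penalty
    # around the current iterate, then solve the resulting weighted-L1
    # (convex) SVM problem; repeat a few times.
    beta = beta_init.copy()
    for _ in range(steps):
        w = scad_deriv(beta, lam)   # per-coordinate weights from SCAD slope
        beta = weighted_l1_svm_subgrad(X, y, w, beta)
    return beta
```

With a zero initial estimator, the first LLA step reduces to a plain L1-penalized SVM (all weights equal lambda); the paper's point is that the quality of this initial estimator governs whether the iterations reach the oracle estimator.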

Updated: 2019-11-01