Universal consistency of twin support vector machines
International Journal of Machine Learning and Cybernetics (IF 5.6), Pub Date: 2021-03-14, DOI: 10.1007/s13042-021-01281-0
Weixia Xu, Dingjiang Huang, Shuigeng Zhou

A classification problem aims at constructing the best classifier, i.e., the one with the smallest risk. When the sample size approaches infinity, learning algorithms for a classification problem are characterized by an asymptotic property, namely universal consistency. It plays a crucial role in evaluating the construction of classification rules. A universally consistent algorithm ensures that the larger the sample size, the more accurately the distribution of the samples can be reconstructed. Support vector machines (SVMs) are regarded as one of the most important models for binary classification problems. How to effectively extend SVMs to twin support vector machines (TWSVMs) so as to improve classification performance has recently gained increasing interest in many research areas. Many variants of TWSVMs have been proposed and used in practice. In this paper, we therefore focus on the universal consistency of TWSVMs in a binary classification setting. We first give a general framework for TWSVM classifiers that unifies most variants of TWSVMs for binary classification problems. Based on it, we then investigate the universal consistency of TWSVMs. To do this, we give definitions of risk, Bayes risk, and universal consistency for TWSVMs. Theoretical results indicate that universal consistency holds for various TWSVM classifiers under certain conditions involving the covering number, the localized covering number, and stability. As applications of our general framework, several variants of TWSVMs are considered.
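As background to the abstract above, the following sketch records the standard textbook definitions of risk, Bayes risk, and universal consistency, together with the classical linear TWSVM formulation of Jayadeva et al.; the paper's own TWSVM-specific definitions may differ in detail.

```latex
% Risk of a classifier f for a random pair (X, Y), Y in {-1, +1},
% and the Bayes risk (the smallest achievable risk):
R(f) = \mathbb{P}\{ f(X) \neq Y \},
\qquad
R^{*} = \inf_{f\ \text{measurable}} R(f).

% A learning rule producing f_n from n i.i.d. samples is universally
% consistent if, for EVERY distribution of (X, Y),
R(f_n) \xrightarrow[\; n \to \infty \;]{\mathbb{P}} R^{*}.

% Classical linear TWSVM: with A and B the matrices of positive and
% negative samples, e_1, e_2 vectors of ones, and c_1 > 0, the first
% of the two nonparallel hyperplanes is obtained from
\min_{w_1, b_1, \xi}\;
  \tfrac{1}{2}\,\lVert A w_1 + e_1 b_1 \rVert^{2} + c_1\, e_2^{\top}\xi
\quad \text{s.t.} \quad
  -(B w_1 + e_2 b_1) + \xi \geq e_2,\ \ \xi \geq 0,

% the second plane swaps the roles of A and B, and a new point x is
% assigned to the class whose plane lies nearer:
\operatorname{class}(x)
  = \arg\min_{i \in \{1, 2\}}
    \frac{\lvert w_i^{\top} x + b_i \rvert}{\lVert w_i \rVert}.
```

Universal consistency of a TWSVM rule thus amounts to showing that the risk of the classifier built from these two planes converges in probability to the Bayes risk, for every underlying distribution.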




Updated: 2021-03-15