Information criteria in classification: new divergence-based classifiers
Journal of Statistical Computation and Simulation ( IF 1.2 ) Pub Date : 2020-07-28 , DOI: 10.1080/00949655.2020.1798445
William D. A. Rodríguez, Getúlio J. A. Amaral, Abraão D. C. Nascimento, Jodavid A. Ferreira

Efficient classification methods are often required when common conditions (such as additivity and normal stochastic behaviour) are not satisfied. Three classical classifiers are Linear Discriminant Analysis (LDA), K-Nearest Neighbours (KNN) and Quadratic Discriminant Analysis (QDA). The performance of these techniques is known to be strongly affected by the absence of linearity in the separation between two or more multivariate data classes. In this paper, we propose semiparametric classification methods that can be less sensitive to this phenomenon. Our classifiers are based on the LDA and KNN methods combined with Rényi and Kullback–Leibler stochastic divergences. The performance of the various classifiers is evaluated on both normal and non-normal simulated data. Further, they are applied to classify multidimensional features of synthetic aperture radar images, which have a multiplicative and non-normal nature due to the presence of speckle noise. Results from simulated and real data furnish evidence that the proposed methods can achieve smaller classification error rates than the classical LDA, KNN and QDA procedures. In particular, optimum performance occurs when the order parameter tends to 0.5 for simulated data, while the best classification is achieved at an order of around 0.95 for real data.
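As an illustration of the general idea described above (not the authors' exact estimator), a KNN-style classifier can replace the usual Euclidean distance with a Rényi divergence of order α between feature vectors treated as discrete probability distributions; α → 1 recovers the Kullback–Leibler divergence. The function names and the symmetrisation step below are assumptions made for this sketch:

```python
import numpy as np

def renyi_divergence(p, q, alpha=0.5, eps=1e-12):
    """Rényi divergence of order alpha between discrete distributions p and q.

    For alpha -> 1 this reduces to the Kullback-Leibler divergence.
    A small eps guards against log(0) and division by zero.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    if abs(alpha - 1.0) < 1e-9:
        # Limiting case: Kullback-Leibler divergence
        return float(np.sum(p * np.log(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def knn_divergence_classify(train_X, train_y, x, k=3, alpha=0.5):
    """Classify x by majority vote among the k training points with the
    smallest symmetrised Rényi divergence to x (an illustrative choice,
    since the raw divergence is not symmetric)."""
    d = [renyi_divergence(x, t, alpha) + renyi_divergence(t, x, alpha)
         for t in train_X]
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(np.asarray(train_y)[nearest], return_counts=True)
    return int(labels[np.argmax(counts)])
```

Sweeping `alpha` over a grid on a validation set would mirror the role of the order parameter discussed in the abstract (e.g. values near 0.5 for the simulated settings).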

Updated: 2020-07-28