Classifier ensemble methods in feature selection
Neurocomputing (IF 6) Pub Date: 2021-01-01, DOI: 10.1016/j.neucom.2020.07.113
Hakan Ezgi Kiziloz

Abstract Feature selection has become an indispensable preprocessing step in an expert system. Improving feature selection performance can guide such a system to make better decisions. Classifier ensembles are known to improve performance compared to a single classifier. In this study, we aim to perform a formal comparison of different classifier ensemble methods in the feature selection domain. For this purpose, we compare the performance of six classifier ensemble methods: a greedy approach, two average-based approaches, two majority-voting approaches, and a meta-classifier approach. In our study, the classifier ensemble involves five machine learning techniques: Logistic Regression, Support Vector Machines, Extreme Learning Machine, Naive Bayes, and Decision Tree. Experiments are carried out on 12 well-known datasets, and results with statistical tests are provided. The results indicate that ensemble methods perform better than single classifiers, yet they require a longer execution time. Moreover, they minimize the number of features better than existing ensemble algorithms, namely Random Forest, AdaBoost, and Gradient Boosting, and in less time. Among the ensemble methods, the greedy-based method performs well in terms of both classification accuracy and execution time.
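As a rough illustration of one of the compared methods, the sketch below builds a hard majority-voting ensemble over four of the five base learners named in the abstract, using scikit-learn. This is not the paper's implementation: the dataset, hyperparameters, and train/test split are placeholders, and Extreme Learning Machine is omitted because scikit-learn ships no implementation of it.

```python
# Minimal sketch of a hard majority-voting classifier ensemble,
# assuming scikit-learn and a stand-in dataset (breast cancer).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Four of the five base learners from the abstract (ELM omitted).
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=5000)),
        ("svm", SVC()),
        ("nb", GaussianNB()),
        ("dt", DecisionTreeClassifier(random_state=0)),
    ],
    voting="hard",  # "hard" = majority vote over predicted class labels
)
ensemble.fit(X_tr, y_tr)
print(f"ensemble test accuracy: {ensemble.score(X_te, y_te):.3f}")
```

In a wrapper-style feature-selection setting, such an ensemble would be refit on candidate feature subsets and its accuracy used as the subset's fitness score.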

Updated: 2021-01-01