Feature selection based on regularization of sparsity based regression models by hesitant fuzzy correlation
Applied Soft Computing (IF 8.7) Pub Date: 2020-03-24, DOI: 10.1016/j.asoc.2020.106255
Mahla Mokhtia , Mahdi Eftekhari , Farid Saberi-Movahed

In this paper, the Ridge, LASSO and Elastic Net regression methods are adapted for the task of feature selection. To enhance the feature-selection performance of these methods, a Hesitant Fuzzy Correlation Matrix (HFCM) is added to their objective functions to enforce minimum redundancy among the selected features. To this end, fuzzy C-means clustering is applied, and the resulting fuzzy clusters are projected onto the features so that the number of fuzzy Membership Functions (MFs) for each feature equals the number of clusters. The MFs projected onto each feature are then treated as a Hesitant Fuzzy Set (HFS), from which the hesitant fuzzy correlation between features is computed. The resulting HFCM is then employed in the regression methods to secure minimum feature redundancy. Finally, the quality of the feature subsets selected by these methods is assessed by the classification accuracy of three models: Naive Bayes, SVM and Decision Tree. Extensive experiments on twenty-four classification datasets demonstrate the efficiency and applicability of incorporating the HFCM into these classical regression methods.
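The pipeline described in the abstract can be sketched in a few NumPy steps. The following is a minimal, illustrative sketch, not the authors' implementation: it assumes Gaussian MFs obtained by projecting the fuzzy C-means cluster centres onto each feature axis, and a common normalized-product formulation of the hesitant fuzzy correlation coefficient; the papers' exact projection and correlation formulas may differ. All function names (`fuzzy_cmeans`, `project_mfs`, `hesitant_corr`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, tol=1e-5):
    """Minimal fuzzy C-means: returns cluster centres (c x p) and the
    fuzzy membership matrix U (n x c)."""
    n = X.shape[0]
    U = rng.dirichlet(np.ones(c), size=n)            # random fuzzy partition
    for _ in range(iters):
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2.0 / (m - 1.0)))       # standard FCM update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centres, U_new
        U = U_new
    return centres, U

def project_mfs(X, centres):
    """Project each cluster onto each feature axis as a 1-D Gaussian MF
    (an assumption for this sketch). Returns H of shape (n, p, c): the
    hesitant set of feature j at sample i is the c values H[i, j, :]."""
    sig = X.std(axis=0) + 1e-12
    diff = X[:, :, None] - centres.T[None, :, :]     # (n, p, c)
    return np.exp(-0.5 * (diff / sig[None, :, None]) ** 2)

def hesitant_corr(h_a, h_b):
    """Hesitant fuzzy correlation between two features, each given as an
    (n x c) array of hesitant membership values. Values are sorted within
    each hesitant set, then a normalized informational-energy cross
    product is taken (one common HFS correlation formulation)."""
    a, b = np.sort(h_a, axis=1), np.sort(h_b, axis=1)
    return (a * b).mean() / np.sqrt((a * a).mean() * (b * b).mean())

# Toy data: 100 samples, 4 features; features 0 and 1 are nearly redundant.
X = rng.normal(size=(100, 4))
X[:, 1] = X[:, 0] + 0.05 * rng.normal(size=100)

centres, U = fuzzy_cmeans(X, c=3)
H = project_mfs(X, centres)                          # (100, 4, 3)

p = X.shape[1]
R = np.array([[hesitant_corr(H[:, i], H[:, j]) for j in range(p)]
              for i in range(p)])                    # the HFCM (p x p)
print(np.round(R, 3))
```

A redundancy matrix like `R` would then enter the regression objectives as an extra penalty on the feature weights (for example a quadratic term of the form `w.T @ R @ w` alongside the Ridge/LASSO/Elastic Net terms); the precise way the HFCM is combined with each objective is defined in the paper itself.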



