Class-specific cost-sensitive boosting weighted ELM for class imbalance learning
Memetic Computing (IF 3.3), Pub Date: 2018-06-28, DOI: 10.1007/s12293-018-0267-4
Bhagat Singh Raghuwanshi, Sanyam Shukla

The class imbalance problem arises when the training dataset contains significantly fewer instances of one class than of another. Traditional classification algorithms such as the extreme learning machine (ELM) and the support vector machine (SVM) are biased towards the majority class: because they minimize the unweighted least-squares error, minority-class instances are often misclassified. Algorithms such as weighted ELM (WELM) and weighted SVM instead minimize a weighted least-squares error, which addresses the class imbalance problem more effectively. Variants of WELM, such as boosting WELM and ensemble WELM, further improve its performance by employing ensemble methods. This work proposes class-specific AdaC1, class-specific AdaC2, and class-specific AdaC3 algorithms to address the class imbalance problem more effectively, using kernelized WELM as the component classifier of the proposed ensembles. The proposed classifier ensembles are variants of the AdaC1, AdaC2, and AdaC3 algorithms. The original cost-sensitive boosting classifiers AdaC1, AdaC2, and AdaC3 assign initial weights to the training instances without considering class skewness; this work assigns the initial weights based on the class skewness. Moreover, the proposed ensembles rescale the instance weights of each class after each iteration so that the total weight assigned to the instances of each class remains equal. The proposed algorithms are evaluated on benchmark real-world imbalanced datasets downloaded from the KEEL dataset repository. The experimental results show the superiority of the proposed work over other state-of-the-art ensemble methods for class imbalance learning.
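The two weighting ideas described above can be sketched in a few lines of Python. This is not the authors' implementation, only an illustrative sketch with hypothetical function names: initial weights are set per class so that each class receives equal total weight regardless of skewness, and after each boosting round the weights are renormalized per class so the class totals stay equal.

```python
import numpy as np

def class_specific_initial_weights(y):
    """Assign initial weights so each class receives equal total weight.

    Instances of the minority class therefore get larger individual
    weights, compensating for class skewness (sketch of the paper's
    class-specific initialization).
    """
    classes, counts = np.unique(y, return_counts=True)
    w = np.empty(len(y), dtype=float)
    for c, n in zip(classes, counts):
        # Each class's total weight is 1 / (number of classes).
        w[y == c] = 1.0 / (len(classes) * n)
    return w

def rescale_per_class(w, y):
    """Renormalize weights after a boosting round.

    Within-class weight ratios are preserved, but each class's total
    is rescaled back to 1 / (number of classes), so no class's overall
    influence drifts across iterations.
    """
    classes = np.unique(y)
    w = w.copy()
    for c in classes:
        mask = y == c
        w[mask] *= (1.0 / len(classes)) / w[mask].sum()
    return w
```

For example, with labels `[0, 0, 0, 0, 1]`, each majority instance starts at weight 0.125 and the single minority instance at 0.5, so both classes contribute a total weight of 0.5; after any multiplicative boosting update, `rescale_per_class` restores those per-class totals.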
