Simultaneous two-sample learning to address binary class imbalance problem in low-resource scenarios
Sādhanā (IF 1.6) Pub Date: 2020-07-04, DOI: 10.1007/s12046-020-01411-4
Sri Harsha Dumpala, Rupayan Chakraborty, Sunil Kumar Kopparapu

The binary class imbalance problem refers to the scenario where the number of training samples in one class is much lower than the number of samples in the other class. This imbalance hinders the ability of conventional machine learning algorithms to classify accurately. Moreover, many real-world training datasets are not only imbalanced but also low-resourced. In this paper we introduce a novel technique to handle the class imbalance problem even in low-resource scenarios. In our approach, instead of learning from one sample at a time, as is common, two samples are considered simultaneously to train the classifier. This simultaneous two-sample learning appears to help the classifier learn both intra- and inter-class properties. Experiments conducted on a large number of benchmark datasets demonstrate that our technique outperforms existing state-of-the-art techniques.
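The abstract does not specify how the two samples are combined or how a lone test sample is scored, so the Python sketch below is only a hypothetical illustration of the pairing idea, not the authors' formulation. Training instances are formed by concatenating two samples and labelling the pair by the classes of its members, which exposes the classifier to both intra-class pairs (same class) and inter-class pairs (different classes); the helpers make_pairs and predict_single, the pair-label encoding, and all parameter choices are assumptions made for this sketch.

# Hypothetical sketch of simultaneous two-sample learning (not the paper's
# exact formulation): the classifier is trained on pairs of samples rather
# than on single samples.
import numpy as np
from itertools import product
from sklearn.linear_model import LogisticRegression

def make_pairs(X, y, max_pairs=5000, seed=0):
    # Build pair-level training data: each instance is the concatenation of
    # two samples, labelled by the ordered pair of their class labels,
    # (0,0)->0, (0,1)->1, (1,0)->2, (1,1)->3, so the model sees both
    # intra-class and inter-class pairs.
    rng = np.random.default_rng(seed)
    idx = list(product(range(len(X)), repeat=2))
    rng.shuffle(idx)
    idx = idx[:max_pairs]
    Xp = np.array([np.concatenate([X[i], X[j]]) for i, j in idx])
    yp = np.array([2 * y[i] + y[j] for i, j in idx])
    return Xp, yp

def predict_single(clf, X_ref, x, n_ref=20, seed=0):
    # Classify one test sample by pairing it with reference training samples
    # and voting over the predicted pair labels; label // 2 recovers the
    # class of the first element of each pair.
    rng = np.random.default_rng(seed)
    ref = rng.choice(len(X_ref), size=min(n_ref, len(X_ref)), replace=False)
    votes = [clf.predict(np.concatenate([x, X_ref[j]]).reshape(1, -1))[0] // 2
             for j in ref]
    return int(round(np.mean(votes)))

if __name__ == "__main__":
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=60, weights=[0.9, 0.1], random_state=0)
    Xp, yp = make_pairs(X, y)
    clf = LogisticRegression(max_iter=1000).fit(Xp, yp)
    print(predict_single(clf, X, X[0]), y[0])

One reason such a pairing scheme can help when data is both imbalanced and scarce is that every minority-class sample participates in many pairs, so the number of training instances involving the minority class is multiplied rather than fixed by its raw count.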



Updated: 2020-07-05