Empirically Estimable Classification Bounds Based on a Nonparametric Divergence Measure
IEEE Transactions on Signal Processing (IF 5.4). Pub Date: 2016-02-01. DOI: 10.1109/tsp.2015.2477805
Visar Berisha, Alan Wisler, Alfred O. Hero, Andreas Spanias

Information divergence functions play a critical role in statistics and information theory. In this paper, we show that a nonparametric f-divergence measure can be used to provide improved bounds on the minimum binary classification probability of error, both for the case in which the training and test data are drawn from the same distribution and for the case in which there is some mismatch between the training and test distributions. We confirm these theoretical results by designing feature selection algorithms using the criteria from these bounds and by evaluating the algorithms on a series of pathological speech classification tasks.
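A key property of the divergence in this line of work is that it can be estimated directly from samples, with no density estimation. A minimal sketch of one such estimator, assuming the Friedman-Rafsky minimal-spanning-tree statistic (the function name and the numpy/scipy implementation details are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.spatial import distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def dp_divergence_estimate(X, Y):
    """MST-based (Friedman-Rafsky) estimate of a nonparametric divergence
    between samples X ~ f and Y ~ g. Returns a value in [0, 1]:
    near 0 when the samples are indistinguishable, near 1 when the
    two distributions are well separated."""
    m, n = len(X), len(Y)
    pooled = np.vstack([X, Y])
    labels = np.concatenate([np.zeros(m), np.ones(n)])
    # Euclidean minimal spanning tree over the pooled sample
    mst = minimum_spanning_tree(distance_matrix(pooled, pooled))
    rows, cols = mst.nonzero()
    # C = number of MST edges joining a point of X to a point of Y;
    # many cross-edges indicate heavily overlapping distributions
    C = int(np.sum(labels[rows] != labels[cols]))
    # Estimate of the divergence; clipped at 0 for small samples
    return max(1.0 - C * (m + n) / (2.0 * m * n), 0.0)
```

Because the statistic is a simple edge count, an estimate like this can be plugged into error bounds or used directly as a feature selection criterion, which is the use the abstract describes.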
