Tradeoff Relation between Mutual Information and Error Probability in Data Classification Problem
Computational Mathematics and Mathematical Physics (IF 0.7), Pub Date: 2021-08-22, DOI: 10.1134/s0965542521070113
A. M. Lange, M. M. Lange, S. V. Paramonov
Abstract

A data classification model is investigated in which the average mutual information between source objects and the decisions made is a function of the error probability. Optimizing the model consists of finding the tradeoff “mutual information–error probability” (MIEP) relation between the minimal average mutual information and the error probability; this relation is analogous to the well-known rate distortion function for source coding with a given fidelity in the case of a noisy observation channel. A lower bound on the MIEP relation is constructed, which yields a lower bound on the classification error probability over a given set of objects for any fixed value of the average mutual information. The MIEP relation and its lower bound are generalized to ensembles of sources. The resulting bounds are useful for estimating the error-probability redundancy of decision algorithms with given sets of discriminant functions.
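The flavor of such a bound can be illustrated with the classical Fano inequality, which is not the paper's MIEP bound but a standard, generally weaker relation of the same kind: for an M-class source observed as X and classified as a decision Y, Fano's inequality gives I(X; Y) ≥ H(X) − h(e) − e·log₂(M − 1), where e is the error probability and h(·) is the binary entropy. For a uniform source (an illustrative assumption, not from the paper), this can be inverted numerically to obtain the smallest error probability consistent with a given amount of mutual information; the function name below is hypothetical.

```python
import math

def h2(p):
    # Binary entropy in bits; h2(0) = h2(1) = 0 by convention.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def fano_error_lower_bound(mutual_info, num_classes):
    """Smallest error probability e consistent with Fano's inequality
    h2(e) + e * log2(M - 1) >= log2(M) - I for a uniform M-class source.

    This is an illustrative Fano-style bound, not the MIEP bound of the paper.
    """
    target = math.log2(num_classes) - mutual_info
    if target <= 0.0:
        return 0.0  # enough information to classify perfectly, in principle
    # f(e) = h2(e) + e * log2(M - 1) is increasing on [0, 1 - 1/M]
    # and reaches log2(M) at e = 1 - 1/M, so the root exists; bisect.
    lo, hi = 0.0, 1.0 - 1.0 / num_classes
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if h2(mid) + mid * math.log2(num_classes - 1) < target:
            lo = mid
        else:
            hi = mid
    return hi
```

For a binary uniform source with zero mutual information the bound is the trivial e ≥ 1/2, and the bound decreases monotonically as the available mutual information grows, which is exactly the qualitative shape of the tradeoff studied in the paper.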




Updated: 2021-08-23