
Tradeoff Relation between Mutual Information and Error Probability in Data Classification Problem

  • INFORMATION SCIENCE
  • Published in Computational Mathematics and Mathematical Physics

Abstract

We investigate a data classification model in which the average mutual information between the source objects and the decisions made is a function of the error probability. Optimizing the model amounts to finding a tradeoff "mutual information–error probability" (MIEP) relation between the minimal average mutual information and the error probability; this relation is analogous to the well-known rate-distortion function for source coding with a given fidelity in the case of a noisy observation channel. A lower bound on the MIEP relation is constructed, which yields a lower bound on the classification error probability over a given set of objects for any fixed value of the average mutual information. The MIEP relation and its lower bound are generalized to ensembles of sources. The resulting bounds are useful for estimating the redundancy in error probability of decision algorithms with given sets of discriminant functions.
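To illustrate the kind of bound the abstract describes, the sketch below computes the classical Fano-type lower bound on the error probability of any classifier, given the average mutual information between the (equiprobable) class label and the decision. This is a standard information-theoretic bound, not the paper's exact MIEP relation; the function name and the equiprobable-class assumption are ours.

```python
import math

def binary_entropy(p):
    """Binary entropy H_b(p) in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def fano_error_lower_bound(mutual_info_bits, num_classes, tol=1e-12):
    """Smallest P_e consistent with Fano's inequality for M equiprobable classes:
        H_b(P_e) + P_e * log2(M - 1) >= log2(M) - I(X; Y).
    The left-hand side is increasing in P_e on [0, (M-1)/M], so the smallest
    admissible P_e is found by bisection."""
    M = num_classes
    deficit = math.log2(M) - mutual_info_bits  # residual uncertainty H(X|Y)
    if deficit <= 0.0:
        return 0.0  # enough information: the bound is vacuous
    lo, hi = 0.0, (M - 1) / M
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if binary_entropy(mid) + mid * math.log2(M - 1) >= deficit:
            hi = mid
        else:
            lo = mid
    return hi
```

For example, with M = 4 classes and zero mutual information, the bound gives P_e ≥ 3/4 (random guessing); as the mutual information approaches log2 M = 2 bits, the bound decreases to zero. Any monotone tradeoff of this shape is what the MIEP relation refines for concrete sets of objects and discriminant functions.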



Funding

This work was supported in part by the Russian Foundation for Basic Research, project no. 18-07-01231.

Author information


Correspondence to A. M. Lange, M. M. Lange or S. V. Paramonov.

Additional information

Translated by I. Ruzanova


About this article


Cite this article

Lange, A.M., Lange, M.M. & Paramonov, S.V. Tradeoff Relation between Mutual Information and Error Probability in Data Classification Problem. Comput. Math. and Math. Phys. 61, 1181–1193 (2021). https://doi.org/10.1134/S0965542521070113

