Entropy and Compression: A Simple Proof of an Inequality of Khinchin-Ornstein-Shields

Abstract

This paper concerns the folklore statement that “entropy is a lower bound for compression.” More precisely, we derive from the entropy theorem a simple proof of a pointwise inequality first stated by Ornstein and Shields, which is the almost-sure version of an average inequality first stated by Khinchin in 1953. We further give an elementary proof of the original Khinchin inequality, which can serve as an exercise for information theory students, and we conclude with historical and technical notes on this inequality.
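
In rough terms, the pointwise inequality in question says that for a stationary ergodic source with entropy rate h, no lossless (one-to-one) code can do better than h bits per symbol in the limit: lim inf_n ℓ(x_1^n)/n ≥ h almost surely. As an illustrative aside, not drawn from the paper itself, the following minimal Python sketch checks the average-case flavor of this bound empirically, using zlib as a stand-in for an arbitrary lossless compressor: the compressed rate of an i.i.d. Bernoulli(p) sequence stays above the binary entropy H(p).

```python
# Illustrative sketch only (not the paper's construction): empirically
# checking that a real compressor never beats the source entropy on
# average. zlib stands in for an arbitrary lossless (one-to-one) code.
import math
import random
import zlib

def binary_entropy(p: float) -> float:
    """Entropy H(p) in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def compressed_bits_per_symbol(p: float, n: int, seed: int = 0) -> float:
    """Draw n i.i.d. Bernoulli(p) bits, pack them into bytes, compress
    with zlib, and return the compressed length in bits per symbol."""
    rng = random.Random(seed)
    bits = [rng.random() < p for _ in range(n)]
    packed = bytes(
        sum(bit << k for k, bit in enumerate(bits[i:i + 8]))
        for i in range(0, n, 8)
    )
    return 8 * len(zlib.compress(packed, level=9)) / n

if __name__ == "__main__":
    n = 400_000  # a multiple of 8, large enough for the rate to stabilize
    for p in (0.05, 0.1, 0.3, 0.5):
        rate = compressed_bits_per_symbol(p, n)
        # The observed rate should stay above H(p), up to o(1) terms,
        # in line with the Khinchin-type lower bound.
        print(f"p={p:.2f}  H(p)={binary_entropy(p):.3f}  "
              f"compressed rate={rate:.3f} bits/symbol")
```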

References

  1. Ornstein, D. and Shields, P.C., Universal Almost Sure Data Compression, Ann. Probab., 1990, vol. 18, no. 2, pp. 441–452.

  2. Shields, P.C., The Ergodic Theory of Discrete Sample Paths, Providence, R.I.: Amer. Math. Soc., 1996.

  3. Shannon, C.E., A Mathematical Theory of Communication, Bell Syst. Tech. J., 1948, vol. 27, no. 3, pp. 379–423.

  4. Cover, T.M. and Thomas, J.A., Elements of Information Theory, Hoboken, NJ: Wiley, 2006, 2nd ed.

  5. McMillan, B., The Basic Theorems of Information Theory, Ann. Math. Statist., 1953, vol. 24, no. 2, pp. 196–219.

  6. Breiman, L., The Individual Ergodic Theorem of Information Theory, Ann. Math. Statist., 1957, vol. 28, no. 3, pp. 809–811.

  7. Breiman, L., Correction Notes: Correction to “The Individual Ergodic Theorem of Information Theory,” Ann. Math. Statist., 1960, vol. 31, no. 3, pp. 809–810.

  8. Chung, K.L., A Note on the Ergodic Theorem of Information Theory, Ann. Math. Statist., 1961, vol. 32, no. 2, pp. 612–614.

  9. Chung, K.L., The Ergodic Theorem of Information Theory, Recent Developments in Information and Decision Processes (Proc. 3rd Sympos. on Information and Decision Processes, Purdue Univ., Lafayette, IN, USA, April 12–13, 1961), Machol, R.E. and Gray, P.E., Eds., New York: Macmillan, 1962, pp. 141–148.

  10. Blundo, C. and De Prisco, R., New Bounds on the Expected Length of One-to-One Codes, IEEE Trans. Inform. Theory, 1996, vol. 42, no. 1, pp. 246–250.

  11. Kontoyiannis, I., Second-Order Noiseless Source Coding Theorems, IEEE Trans. Inform. Theory, 1997, vol. 43, no. 4, pp. 1339–1341.

  12. Kontoyiannis, I. and Verdú, S., Optimal Lossless Data Compression: Non-asymptotics and Asymptotics, IEEE Trans. Inform. Theory, 2014, vol. 60, no. 2, pp. 777–795.

  13. Khinchin, A.Ya., The Entropy Concept in Probability Theory, Uspekhi Mat. Nauk, 1953, vol. 8, no. 3 (55), pp. 3–20.

  14. Yeung, R.W., Information Theory and Network Coding, Boston: Springer, 2008.

  15. Csiszár, I. and Körner, J., Information Theory: Coding Theorems for Discrete Memoryless Systems, Cambridge, UK: Cambridge Univ. Press, 2011, 2nd ed.

  16. MacKay, D.J.C., Information Theory, Inference and Learning Algorithms, New York: Cambridge Univ. Press, 2003.

  17. Barron, A.R., Logically Smooth Density Estimation, PhD Thesis, Stanford Univ., CA, USA, 1985.

  18. Kieffer, J.C., Sample Converses in Source Coding Theory, IEEE Trans. Inform. Theory, 1991, vol. 37, no. 2, pp. 263–268.

  19. Algoet, P.H., Log-Optimum Investment, PhD Thesis, Stanford Univ., CA, USA, 1985.

  20. Khinchin, A.I., Mathematical Foundations of Information Theory, New York: Dover, 1957.

  21. Kieffer, J.C., A Simple Proof of the Moy-Perez Generalization of the Shannon-McMillan Theorem, Pacific J. Math., 1974, vol. 51, no. 1, pp. 203–206.

  22. Gallager, R.G., Information Theory and Reliable Communication, New York: Wiley, 1968.

  23. Algoet, P.H. and Cover, T.M., A Sandwich Proof of the Shannon-McMillan-Breiman Theorem, Ann. Probab., 1988, vol. 16, no. 2, pp. 899–909.

  24. Barron, A.R., The Strong Ergodic Theorem for Densities: Generalized Shannon-McMillan-Breiman Theorem, Ann. Probab., 1985, vol. 13, no. 4, pp. 1292–1303.

  25. Bjelaković, I., Krüger, T., Siegmund-Schultze, R., and Szkoła, A., The Shannon-McMillan Theorem for Ergodic Quantum Lattice Systems, Invent. Math., 2004, vol. 155, no. 1, pp. 203–222.

  26. Longo, G. and Sgarro, A., The Source Coding Theorem Revisited: A Combinatorial Approach, IEEE Trans. Inform. Theory, 1979, vol. 25, no. 5, pp. 544–548.

  27. Hansel, G., Perrin, D., and Simon, I., Compression and Entropy, Proc. 9th Annual Sympos. on Theoretical Aspects of Computer Science (STACS’92), Cachan, France, Feb. 13–15, 1992, Finkel, A. and Jantzen, M., Eds., Lect. Notes Comp. Sci., vol. 577, Berlin: Springer, 1992, pp. 515–528.

  28. Elias, P., Universal Codeword Sets and Representations of the Integers, IEEE Trans. Inform. Theory, 1975, vol. 21, no. 2, pp. 194–203.

  29. Kraft, L.G., A Device for Quantizing, Grouping, and Coding Amplitude-Modulated Pulses, MS Thesis, MIT, Cambridge, MA, USA, 1949.

  30. de Luca, A., On the Entropy of a Formal Language, Automata Theory and Formal Languages (Proc. 2nd GI Conf., Kaiserslautern, Germany, May 20–23, 1975), Brakhage, H., Ed., Lect. Notes Comp. Sci., vol. 33, Berlin: Springer, 1975, pp. 103–109.

Acknowledgements

The authors thank Professor D. Perrin for pointing out reference [2] during a conference held in Rome, July 11–12, 2019, in memory of Professor de Luca, at which they presented a preliminary version of the above results. The authors are also grateful to the referees for their suggestions.

Author information

Corresponding authors

Correspondence to R. Aragona, F. Marzi, F. Mignosi or M. Spezialetti.

Additional information

In memory of Professor Aldo de Luca

R. Aragona is a member of INdAM-GNSAGA, Italy.

Russian Text © The Author(s), 2020, published in Problemy Peredachi Informatsii, 2020, Vol. 56, No. 1, pp. 15–25.

About this article

Cite this article

Aragona, R., Marzi, F., Mignosi, F. et al. Entropy and Compression: A Simple Proof of an Inequality of Khinchin-Ornstein-Shields. Probl Inf Transm 56, 13–22 (2020). https://doi.org/10.1134/S0032946020010020
