Information entropy and temperature of binary Markov chains
Physical Review E (IF 2.2), Pub Date: 2022-09-19, DOI: 10.1103/physreve.106.034127
O. V. Usatenko, S. S. Melnyk, G. M. Pritula, V. A. Yampol'skii

We propose two different approaches for introducing the information temperature of binary Nth-order Markov chains. The first approach is based on a comparison of Markov sequences with equilibrium Ising chains at given temperatures. The second approach uses the probabilities of occurrence of finite-length subsequences of symbols, which determine their entropies. The derivative of the entropy with respect to the energy gives the information temperature, measured on the scale of the introduced energy. For the case of a nearest-neighbor spin-symbol interaction, both approaches give similar results. However, the method based on the correspondence between N-step Markov chains and Ising chains becomes very cumbersome for N > 3. We also introduce the information temperature for weakly correlated one-parametric Markov chains and present results for stepwise and power memory functions. An application of the developed method to estimating the information temperature of some literary texts is also given.
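
To make the second, entropy-based definition concrete: for a binary first-order Markov chain one can map symbols {0,1} to Ising spins {-1,+1}, measure a nearest-neighbor energy per bond, estimate the entropy per symbol from the statistics of short words, and read off 1/T = dS/dE numerically. The Python sketch below illustrates this under simplifying assumptions that are ours, not the paper's: a single persistence parameter p_same, coupling J = 1, the conditional entropy H(2) - H(1) as the entropy per symbol, and a finite difference in p_same to form dS/dE; all function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def markov_chain(p_same, n):
    """Binary first-order Markov chain: each symbol repeats the
    previous one with probability p_same, flips otherwise."""
    flips = (rng.random(n - 1) >= p_same).astype(np.int8)
    start = np.int8(rng.integers(2))
    # s[i] is the parity of the initial symbol plus all flips up to i
    return np.cumsum(np.concatenate(([start], flips))) % 2

def word_entropy(s, L):
    """Shannon entropy (nats) of the empirical distribution of
    length-L words read from the chain."""
    words = np.lib.stride_tricks.sliding_window_view(s, L)
    codes = words @ (1 << np.arange(L))  # encode each word as an integer
    p = np.bincount(codes, minlength=1 << L) / len(codes)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def energy_per_bond(s):
    """Nearest-neighbor Ising-like energy with J = 1 after
    mapping symbols {0,1} to spins {-1,+1}."""
    spins = 2 * s.astype(np.int64) - 1
    return float(-np.mean(spins[:-1] * spins[1:]))

# Finite-difference estimate of 1/T = dS/dE around a working point p0,
# using the conditional entropy per symbol h = H(2) - H(1).
p0, dp, n = 0.7, 0.02, 400_000
S, E = [], []
for p in (p0 - dp, p0 + dp):
    s = markov_chain(p, n)
    S.append(word_entropy(s, 2) - word_entropy(s, 1))
    E.append(energy_per_bond(s))
T_info = (E[1] - E[0]) / (S[1] - S[0])

print(f"finite-difference estimate: T = {T_info:.3f}")
# For this first-order chain the Ising correspondence gives the
# closed form T = 2 / ln(p0 / (1 - p0)), about 2.36 at p0 = 0.7,
# which serves as a sanity check on the numerical estimate.
print(f"closed-form value:          T = {2 / np.log(p0 / (1 - p0)):.3f}")
```

For this nearest-neighbor case the two routes named in the abstract coincide, which is why the finite-difference estimate can be checked against the exact Ising expression in the final line.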
