An explicit derivation of the time–information uncertainty relation in thermodynamics

Published 4 March 2021. © 2021 The Author(s). Published by IOP Publishing Ltd.

Citation: Katsuaki Tanabe 2021 IOPSciNotes 2 015202. DOI: 10.1088/2633-1357/abe99f

Abstract

A direct, explicit derivation of the recently discovered time–information uncertainty relation in thermodynamics [S. B. Nicholson et al (2020), Nat. Phys. 16, 1211] is presented.


Original content from this work may be used under the terms of the Creative Commons Attribution 4.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.

The evolution of entropy and related uncertainty relations are of great importance in nonequilibrium thermodynamics and statistical mechanics [1–3]. Nicholson et al have recently discovered a time–information uncertainty relation in thermodynamics [4]:

$$\frac{\left|\dot{S}\right|}{k_B} \leqslant \Delta\dot{I}\,\Delta I \tag{1}$$

$S$ is the Shannon entropy:

$$S = -k_B \sum p_x \ln p_x \tag{2}$$

where $k_B$ is the Boltzmann constant, $p_x$ is the probability of the state $x$ $(= 1, 2, \ldots, N)$, and the simplified summation symbol used throughout this note denotes:

$$\sum a_x \equiv \sum_{x=1}^{N} a_x \tag{3}$$

where $a_x$ is a general variable. $\dot{S}$ is the evolution rate, or time derivative, of the entropy:

$$\dot{S} \equiv \frac{dS}{dt} \tag{4}$$

$\Delta I$ is the standard deviation of the surprisal, or information content, $I_x$:

$$\Delta I \equiv \sqrt{\left\langle \left(I_x - \left\langle I_x \right\rangle\right)^2 \right\rangle}, \qquad I_x \equiv -\ln p_x \tag{5}$$

and $\Delta\dot{I}$ is the standard deviation of the evolution rate of the surprisal [5–7]. Because $p_x$ is a stochastic variable, other statistical variables such as $S$, $I_x$, and their derivatives are also stochastic. The relation of equation (1) provides an upper bound on the entropy evolution rate of a system, and is thus positioned as a milestone in multiple fields, including informatics, nonequilibrium thermodynamics, and energy science and engineering. The time–information uncertainty relation and the associated speed limit for flows of heat and entropy were validated with a number of examples [4]. This uncertainty relation applies to systems ranging from energy transducers [8–10] to consciousness neuroinformatics [11–13]. In this short note, we present a more explicit derivation of the time–information uncertainty relation of equation (1), using solely the most primitive variable, $p_x$, as a supplement to the original article by Nicholson et al [4], for the convenience of the community.
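
As a minimal numerical illustration (not part of the original derivation; the relaxation rate, initial distribution, and step size below are arbitrary choices), the following Python sketch evolves a three-state master equation and checks that inequality (1) holds along the whole trajectory:

```python
import numpy as np

# Three-state relaxation toward the uniform distribution:
# dp_x/dt = w (1/N - p_x), which conserves total probability.
# The rate w, initial distribution, and step size are illustrative choices.
w, dt = 1.0, 1e-3
p = np.array([0.7, 0.2, 0.1])

for _ in range(5000):
    pdot = w * (1.0 / p.size - p)           # sums to zero since sum(p) = 1
    I = -np.log(p)                          # surprisal I_x = -ln p_x
    S_dot = np.sum(pdot * I)                # entropy rate over k_B, cf. eq. (10)
    dI = np.sqrt(np.sum(p * I**2) - np.sum(p * I)**2)  # Delta I, cf. eq. (6)
    dIdot = np.sqrt(np.sum(pdot**2 / p))               # Delta I-dot, cf. eq. (8)
    assert abs(S_dot) <= dIdot * dI + 1e-12            # inequality (1)
    p = p + pdot * dt                       # forward-Euler update

print("inequality (1) held along the trajectory")
```

Three states are used deliberately: with only two states the bound is saturated identically, because both vectors entering the Cauchy–Schwarz step below are then orthogonal to $(\sqrt{p_1}, \sqrt{p_2})$ in a two-dimensional space and hence collinear.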

Let us start with the formulation of the standard deviation of the surprisal and of the surprisal evolution rate. From equation (5),

$$\left(\Delta I\right)^2 = \left\langle I_x^2 \right\rangle - \left\langle I_x \right\rangle^2 = \sum p_x \left(\ln p_x\right)^2 - \left(\sum p_x \ln p_x\right)^2 \tag{6}$$

where the expectation symbol denotes:

$$\left\langle a_x \right\rangle \equiv \sum p_x a_x \tag{7}$$

Likewise, for the surprisal evolution rate,

$$\left(\Delta\dot{I}\right)^2 = \left\langle \dot{I}_x^2 \right\rangle - \left\langle \dot{I}_x \right\rangle^2 = \sum \frac{\dot{p}_x^2}{p_x} - \left(\sum \dot{p}_x\right)^2 \tag{8}$$

since

$$\dot{I}_x = -\frac{d}{dt}\ln p_x = -\frac{\dot{p}_x}{p_x} \tag{9}$$
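
The closed forms of equations (6) and (8) can be checked against the definitional standard deviations; the short Python sketch below does so for a randomly chosen distribution $p_x$ and a random flux $\dot{p}_x$ constrained to $\sum \dot{p}_x = 0$ (all numerical choices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.random(5)
p /= p.sum()                    # normalize so that sum(p) = 1
pdot = rng.standard_normal(5)
pdot -= pdot.mean()             # enforce sum(pdot) = 0

I = -np.log(p)                  # surprisal
Idot = -pdot / p                # eq. (9)

def std(a):                     # standard deviation under the expectation (7)
    return np.sqrt(np.sum(p * (a - np.sum(p * a))**2))

dI_closed = np.sqrt(np.sum(p * np.log(p)**2) - np.sum(p * np.log(p))**2)   # eq. (6)
dIdot_closed = np.sqrt(np.sum(pdot**2 / p) - np.sum(pdot)**2)              # eq. (8)

print(np.isclose(std(I), dI_closed), np.isclose(std(Idot), dIdot_closed))  # True True
```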

Next, let us move on to the entropy evolution rate. From equation (2),

$$\frac{\dot{S}}{k_B} = -\frac{d}{dt}\sum p_x \ln p_x = -\sum \dot{p}_x \ln p_x - \sum \dot{p}_x \tag{10}$$

Because $\sum p_x = 1$, $\sum \dot{p}_x = \frac{d}{dt}\sum p_x = 0$. This fact allows for a trick: the second term of equation (10), being zero, can be multiplied by $-\sum p_x \ln p_x$ ($= S/k_B$) without changing the equality:

$$\frac{\dot{S}}{k_B} = -\sum \dot{p}_x \ln p_x + \left(\sum \dot{p}_x\right)\left(\sum p_x \ln p_x\right) \tag{11}$$
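
Explicitly, the multiplication is harmless because the multiplied term is itself zero:

$$\left(\sum \dot{p}_x\right)\left(\sum p_x \ln p_x\right) = 0 \cdot \left(\sum p_x \ln p_x\right) = 0 = -\sum \dot{p}_x,$$

so the right-hand sides of equations (10) and (11) are equal. The inserted factor $-\sum p_x \ln p_x = S/k_B = \left\langle I_x \right\rangle$ serves to center $\ln p_x$ on its mean in the next step.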

From equation (11), by further tricky transformations,

$$\frac{\dot{S}}{k_B} = -\sum \dot{p}_x \ln p_x + \left(\sum \dot{p}_x\right)\left(\sum p_x \ln p_x\right) = -\sum \dot{p}_x \left(\ln p_x - \sum p_{x^{\prime}}\,\ln p_{x^{\prime}}\right) = -\sum \frac{\dot{p}_x}{\sqrt{p_x}}\,\sqrt{p_x}\left(\ln p_x - \sum p_{x^{\prime}}\,\ln p_{x^{\prime}}\right) \tag{12}$$

For the transformation across the second equality sign in equation (12), a general property $A = \sum p_x A$, where $A$ is a constant, was used: the constant $A = \sum p_{x^{\prime}}\,\ln p_{x^{\prime}}$ (the primed dummy index runs over the same states $1, \ldots, N$) is independent of $x$ and can be carried inside the summation over $x$. Using the Cauchy–Schwarz inequality $\left(\sum a_x b_x\right)^2 \leqslant \left(\sum a_x^2\right)\left(\sum b_x^2\right)$, where $a_x$ and $b_x$ are general variables, from equation (12),

$$\left(\frac{\dot{S}}{k_B}\right)^2 \leqslant \left(\sum \frac{\dot{p}_x^2}{p_x}\right)\left[\sum p_x \left(\ln p_x - \sum p_{x^{\prime}}\,\ln p_{x^{\prime}}\right)^2\right] = \left(\Delta\dot{I}\right)^2 \left(\Delta I\right)^2 \tag{13}$$

via equations (6) and (8), together with $\sum \dot{p}_x = 0$ and $\sum p_x = 1$. Hence, $\left|\dot{S}\right|/k_B \leqslant \Delta\dot{I}\,\Delta I$, which is equation (1).
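
For completeness, one natural reading of the Cauchy–Schwarz step, consistent with the last form of equation (12), is the assignment

$$a_x = \frac{\dot{p}_x}{\sqrt{p_x}}, \qquad b_x = \sqrt{p_x}\left(\ln p_x - \sum p_{x^{\prime}}\,\ln p_{x^{\prime}}\right),$$

for which $\sum a_x^2 = \sum \dot{p}_x^2/p_x = \left(\Delta\dot{I}\right)^2$ by equation (8) with $\sum \dot{p}_x = 0$, and $\sum b_x^2 = \left(\Delta I\right)^2$ by expanding the square and applying equation (6) together with $\sum p_x = 1$.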

Data availability statement

All data that support the findings of this study are included within the article (and any supplementary files).
