gptejms
From what I have roughly understood, the memory entropy is at least as large as the decrease of the system's entropy, so 'information about the system (memory entropy)' + 'entropy of the system' is constant or increasing -- right?
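A minimal way to write the balance being asked about (the notation ##S_{\text{sys}}##, ##S_{\text{mem}}## is mine, assuming the standard Landauer/Bennett accounting for the demon's memory):
$$\Delta S_{\text{sys}} + \Delta S_{\text{mem}} \geq 0 \quad\Longleftrightarrow\quad \Delta S_{\text{mem}} \geq -\Delta S_{\text{sys}},$$
i.e. any decrease of the system's entropy is compensated by at least as large an increase of the memory's (information-theoretic) entropy, so the sum is non-decreasing, and constant only in the reversible limit.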
vanhees71 said:
Of course, I'm talking about information in the information-theoretical sense, which has nothing to do with consciousness or vague philosophical ideas of this kind. Maxwell's demon (particularly its quantum realization in cavity QED in recent works) is for me the prime example of the necessity of introducing information-theoretical methods into a full understanding of (quantum) statistical physics. Among other things, these investigations clearly show the correctness of the concept of entropy in the sense of the Shannon-Jaynes-von Neumann entropy of statistical physics. Recently it has been proven that the maximum entropy of a qubit is indeed ##k_{\text{B}} \ln 2##.
http://www.pnas.org/content/114/29/7561
https://arxiv.org/abs/1702.05161
https://www.nature.com/articles/s41567-018-0250-5
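As a minimal check of the quoted ##k_{\text{B}} \ln 2## figure (a textbook computation, not taken from the linked papers): for a qubit whose density matrix ##\hat{\rho}## has eigenvalues ##p## and ##1-p##, the von Neumann entropy is
$$S(\hat{\rho}) = -k_{\text{B}} \operatorname{Tr}(\hat{\rho} \ln \hat{\rho}) = -k_{\text{B}} \left[ p \ln p + (1-p) \ln (1-p) \right],$$
which is maximized at ##p = 1/2## (the maximally mixed state), giving ##S_{\max} = k_{\text{B}} \ln 2##.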