# Is there a generalized second law of thermodynamics?

• A
Heidi
Hi PFs,
There are different kinds of entropies.
I discovered free entropy:
https://arxiv.org/pdf/math/0304341.pdf
The second law says that the total entropy cannot decrease as time goes by.
Is it always the same "time" for the different entropies?
The author, Voiculescu, wrote articles about Alain Connes' factors.
Does his entropy obey a second law of thermodynamics for the one-parameter group Connes discovered?

Homework Helper
Shannon's "entropy" and thermodynamic "entropy" are mathematically similar but they have quite different meanings and very different relationships with time.

In thermodynamics, one is just comparing different equilibrium states. The time lapse between them is not material so long as one follows from the other as the result of some process. What is important in comparing two states is the order in which they occurred. The second law, which in the real world requires entropy to increase overall in successive states of the universe, implies a single direction for time ie. time flows in the direction of increasing entropy of the universe.

In Shannon entropy, messages contain information about something that existed or has occurred. Entropy is a measure of the usefulness of the information contained in a message. But there is no requirement that entropy must increase with successive messages.

It is important to distinguish thermodynamic entropy from information entropy. They are completely different in nature.
• In thermodynamics, entropy is a statistical concept applying to large numbers of molecules in thermodynamic equilibrium. It is a measure of the number of possible ways for a system to exist at the molecular level while exhibiting the same macroscopic state. The concept of equilibrium does not apply to a single atom or molecule, or even a small number of molecules. It is meaningless, for example, to talk about the entropy of a single molecule.
• Shannon entropy deals with information involving countable units of data. In information theory, entropy is a measure of the information content of a message. A message communicating the result of a single coin flip contains less information than a message communicating the results of 10 coin flips, so the latter message is assigned a higher entropy value. If a two-headed coin is flipped, we know that it will come up heads, so the result of such a flip conveys no information: entropy = 0.

As far as the superficial mathematical similarities are concerned:
• Shannon entropy varies as the logarithm of the number of possible ways to compose the message to convey different information. So if you send a message conveying the results of 10 head/tail coin flips, there are 2^10 = 1024 possible ways that message could be configured to convey different information. Such a message has an entropy that is 10 times greater than a message conveying the result of one flip (2^1 = 2 possibilities).
• In statistical thermodynamics, entropy varies as the logarithm of the number of possible states (position and velocity) of the molecules of a system that would result in the same macroscopic equilibrium state for that system.
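The logarithmic counting described in the first bullet can be sketched numerically. Here is a minimal Python sketch (the function name `shannon_entropy` is my own, not from the thread) computing H = -Σ p·log2(p) for the coin-flip messages discussed above:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# One fair coin flip: 2 equally likely outcomes -> 1 bit.
print(shannon_entropy([0.5, 0.5]))

# Ten fair flips: 2^10 = 1024 equally likely messages -> 10 bits,
# i.e. ten times the entropy of a single flip.
print(shannon_entropy([1 / 1024] * 1024))

# A two-headed coin: the outcome is certain, so the message carries no information.
print(shannon_entropy([1.0]))
```

This matches the "10 times greater" claim above: doubling the number of equally likely messages adds exactly one bit, since the measure is logarithmic.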

AM

• dRic2 and Delta2
Gold Member
Landauer's principle relates those two kinds of entropies. According to this principle, when you erase information entropy, you produce thermodynamic entropy by an amount equal to or larger than the erased information entropy.
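As a rough numerical illustration of the bound (a sketch, assuming the standard form E ≥ k_B·T·ln 2 of heat dissipated per erased bit; the function name is mine, not from the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact by the 2019 SI definition)

def landauer_limit(temperature_kelvin, bits_erased=1):
    """Minimum heat dissipated when erasing information, per Landauer's bound:
    E >= k_B * T * ln(2) per erased bit."""
    return K_B * temperature_kelvin * math.log(2) * bits_erased

# At room temperature (300 K), erasing one bit costs at least ~2.9e-21 J.
print(landauer_limit(300.0))
```

The bound is tiny compared to the energy dissipated by real logic gates, which is why it only matters in principle (and in experiments designed to approach it).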

• • dRic2, protonsarecool and vanhees71
Homework Helper
Landauer's principle relates those two kinds of entropies. According to this principle, when you erase information entropy, you produce thermodynamic entropy by an amount equal to or larger than the erased information entropy.

As I understand the principle, it says that no matter how small the physical means of carrying information, changing the smallest amount of information will still require a non-zero expenditure of energy. This will result in some increase in thermodynamic entropy. But this is more about thermodynamics than information theory. I really don't see how it relates to Shannon entropy.

Feynman, in his Lectures (no. 46), alludes to a similar principle. He suggests using a microscopic ratchet-and-pawl mechanism to select only molecules with high energy to pass through an aperture in a partition, separating a gas into hot and cold compartments without using energy and thereby evading the second law of thermodynamics. But then he shows that without the expenditure of a minimum amount of energy in the form of mechanical work, the ratchet and pawl would not function. As a result, there would be no separation of the faster and slower molecules. The Second Law prevails.

AM

• Delta2
Gold Member
As I understand the principle, it says that no matter how small the physical means of carrying information, changing the smallest amount of information will still require a non-zero expenditure of energy.
Erasing, not necessarily changing. Reversible logical gates don't necessarily need to produce thermodynamic entropy.

This will result in some increase in thermodynamic entropy. But this is more about thermodynamics than information theory. I really don't see how it relates to Shannon entropy.
It's more about thermodynamic entropy, I agree, but that doesn't mean it's not about Shannon entropy at all. The physical carrier of Shannon entropy is related to abstract Shannon entropy just as a physical computer is related to abstract computing.

Fra
The information content in physics is presumably encoded in the distinguishable microstates of matter; it's just a generalization of binary code. But in the general case, the information can even be encoded in several different microstructures (or probability spaces), not just one, because the microstate of an observer can of course also encode information about motion. Different observers may not agree on what is distinguishable to start with, which leads to the well-known issues of unitarity in QM and QG foundations. These spaces may not even commute or be independent. If you add not just stationary states but also states of motion, then the entropic principle is effectively elevated to a kind of action principle, as the "entropy measure" is generalized to an "action measure" or information divergence/relative entropy, like the Kullback-Leibler divergence. It's given different names, but the abstractions are IMO quite similar and have common roots. These things are what come to my mind if one starts to ask about generalized second laws, and whether "time" (the arrow of time?) is the same for different intrinsic statistical flows.
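For readers unfamiliar with it, the Kullback-Leibler divergence (relative entropy) mentioned above can be sketched in a few lines of Python; the names here are illustrative, not from the thread:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in bits: sum of p_i * log2(p_i / q_i).
    Assumes q_i > 0 wherever p_i > 0, so every term is well defined."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Relative entropy of a biased coin (P) with respect to a fair coin (Q):
# positive, since P is statistically distinguishable from Q.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))

# A distribution has zero divergence from itself.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))
```

Unlike Shannon entropy, this is a relative measure between two distributions, which is the sense in which it generalizes the "entropy measure" discussed above.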

I think entropy is as relative a notion as probability, but it depends on the perspective and "interpretation" of probability as well, especially in the foundational context of QM where one wants to include gravity. The only way "entropy" is special is that it's just a log measure of probability, to make multiplicative combinations appear additive. All other philosophical questions about entropy are the same as with probabilities, or "transition probabilities", or "actions".

/Fredrik

• Delta2
Heidi
I have a different question about entropy and the second law.
The entropy cannot decrease: at a time t > 0 it is greater than or equal to what it was at t = 0.
Would a CPT observer also have a second law in its reversed time?

Gold Member
I have a different question about entropy and the second law.
The entropy cannot decrease: at a time t > 0 it is greater than or equal to what it was at t = 0.
Would a CPT observer also have a second law in its reversed time?
CPT invariance is a property of the microscopic laws. The second law, on the other hand, is only a macroscopic law; at the microscopic level, it is not valid. In fact, from a microscopic point of view, the second "law" is not a law but a property of a specific solution (of the equations of motion) in which our universe happens to be. The second "law" is in fact a consequence of special initial conditions, namely small initial entropy. So if you apply the CPT transformation to this solution, you get another solution in which entropy decreases with time.

*now*
This detailed account looks relevant -
https://quantum-journal.org/papers/q-2021-08-09-520/

The arrow of time in operational formulations of quantum theory
“The operational formulations of quantum theory are drastically time oriented. However, to the best of our knowledge, microscopic physics is time-symmetric. We address this tension by showing that the asymmetry of the operational formulations does not reflect a fundamental time-orientation of physics…”

• vanhees71
Fra
I think there is also a complementary perspective to Rovelli's: to turn the argument around, time-symmetric physics may be the result of the inferential perspective of collecting statistics over repeats of subatomic processes from a massive lab frame. It may not be a coincidence that such an inference scheme produces timeless laws, as that is implicitly what you seek when doing statistics. But whether these inferred frozen laws are the true fundamentals is not obvious, I think, and the more you introduce gravity and cosmological models, the less obvious it gets IMHO.

/Fredrik

• Delta2
Heidi
I am not sure to understand these lines in the Rovelli's paper:
Decoherence requires information loss and an increase in entropy. Hence RQM is a time-symmetric formulation of quantum theory, but the dynamics of relative facts is time symmetric while the dynamics of stable facts is time oriented.

Does this time symmetry in the decoherence process imply that entropy increases in both of the two time-symmetric measurements or processes?

Homework Helper
In thermodynamics, the arrow of time has to do with changes from one equilibrium state (of a closed, isolated system) to another. The equilibrium state itself is time-symmetric. That is to say, a change from one particular microstate of an equilibrium state to another microstate of that same equilibrium state is time-symmetric - there is no requirement that one follow the other. Similarly, changes between quantum states are time-symmetric. It is only when the quantum states decohere into a macroscopic state that time reversal is not possible.

AM

• *now* and DrChinese
Heidi
Rovelli writes:
the dynamics of stable facts is time oriented.
Could you give examples of such stable facts?

*now*
If I understand it, examples of stable facts can be systems that can be manipulated and measured macroscopically.

*now*
I came across a more recent paper that might update some finer details -

"Information is Physical: Cross-Perspective Links in Relational Quantum Mechanics"

https://arxiv.org/pdf/2203.13342.pdf

*now*
Further, as an inference was drawn (in some alternative case), this paper considers possible such inferences (although this may not apply in the alternative case):

https://arxiv.org/abs/1910.02474

### Neither Presentism nor Eternalism

Carlo Rovelli
Is reality three-dimensional and becoming real (Presentism), or is reality four-dimensional and becoming illusory (Eternalism)? Both options raise difficulties. I argue that we do not need to be trapped by this dilemma. There is a third possibility: reality has a more complex temporal structure than either of these two naive options. Fundamental becoming is real, but local and unoriented. A notion of present is well defined, but only locally and in the context of approximations.