Entropy in information theory vs. thermodynamics

SUMMARY

The discussion centers on the relationship between entropy in information theory and thermodynamics, specifically whether the second law of thermodynamics is contradicted by the information-theoretic fact that the entropy of a function of a random variable satisfies H(g(X)) ≤ H(X). It is argued that no contradiction arises: when the physical evolution includes an additional source of randomness R, only the joint inequality H(X, R) ≥ H(g(X, R)) applies, and quantum randomness can account for the entropy increase. The conversation also highlights that the definitions of entropy in physics and in information theory differ significantly, so physical (coarse-grained) entropy can increase even while information-theoretic entropy stays constant.

PREREQUISITES
  • Understanding of entropy in information theory
  • Familiarity with the second law of thermodynamics
  • Basic knowledge of quantum mechanics
  • Concept of functions in mathematical terms
NEXT STEPS
  • Explore the implications of the data processing inequality in information theory
  • Study the definitions and calculations of entropy in thermodynamics
  • Investigate the role of quantum randomness in entropy and information conservation
  • Learn about the relationship between macroscopic states and entropy in physical systems
USEFUL FOR

Researchers, physicists, and information theorists interested in the interplay between entropy in different domains, particularly those exploring quantum mechanics and thermodynamic principles.

Angella
We know from information theory that the entropy of a function of a random variable X is less than or equal to the entropy of X, i.e. H(g(X)) ≤ H(X).


Does this break the second law of thermodynamics?
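As a concrete illustration of that inequality, here is a minimal numerical sketch; the four-valued distribution for X and the many-to-one function g are arbitrary choices for illustration, not taken from the attached derivation.

```python
# Minimal numerical check of the data-processing fact H(g(X)) <= H(X)
# for a deterministic function g of a discrete random variable X.
import math

def shannon_entropy(pmf):
    """Shannon entropy in bits of a pmf given as a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# An arbitrary example distribution for X.
p_x = {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}

# g maps several outcomes of X to the same value, so information is lost.
g = lambda x: x % 2

# Push the distribution of X forward through g to obtain the distribution of g(X).
p_gx = {}
for x, p in p_x.items():
    p_gx[g(x)] = p_gx.get(g(x), 0.0) + p

print(f"H(X)    = {shannon_entropy(p_x):.4f} bits")   # ~1.8464
print(f"H(g(X)) = {shannon_entropy(p_gx):.4f} bits")  # ~0.9710, never larger than H(X)
```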
 
Unless you add energy to a system, its entropy will increase. So you have it backwards: H[g(x)] > H(x). Entropy is a measure of disorder, which usually increases unless work is being done or energy is being added.
 
That law only applies if x is the only argument of the function.
Let's assume x is the state of the universe and g is the laws of physics applied over a specific amount of time. Then g(x) will be the future state of the universe.
However, the increase in entropy happens because of quantum randomness, which means our function g needs a second argument R that is a source of randomness. So you then have g(x, R).
That's if x is the state you measure with some instruments, i.e. a classical state.
If, on the other hand, you look at a pure quantum system without any measurements or wave function collapse, then there is no randomness and you just have g(x). But in that case there is also no change in entropy/information. Conservation of information is a basic law of quantum mechanics.
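A minimal sketch of this point, under assumed toy choices (a low-entropy two-valued state x, a fair random bit for R, and x XOR r as the illustrative g): the entropy of the output can exceed H(x), but never the joint entropy H(x, R).

```python
# Toy check: a deterministic g cannot raise entropy, but once a random second
# argument R enters, the output entropy can exceed H(X) -- though never H(X, R).
import math

def H(pmf):
    """Shannon entropy in bits of a pmf given as a dict {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

p_x = {0: 0.9, 1: 0.1}   # low-entropy "initial state" X
p_r = {0: 0.5, 1: 0.5}   # fair random bit: the "source of randomness" R

# Joint distribution of (X, R), assuming X and R are independent.
p_xr = {(x, r): px * pr for x, px in p_x.items() for r, pr in p_r.items()}

# g(x, r) = x XOR r: deterministic in its two arguments, random only through R.
p_g = {}
for (x, r), p in p_xr.items():
    y = x ^ r
    p_g[y] = p_g.get(y, 0.0) + p

print(f"H(X)       = {H(p_x):.4f} bits")   # ~0.4690
print(f"H(X, R)    = {H(p_xr):.4f} bits")  # ~1.4690
print(f"H(g(X, R)) = {H(p_g):.4f} bits")   # 1.0000: above H(X), but below H(X, R)
```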
 

Thanks a lot DrZoidberg, but I still have a bit of ambiguity.

Do you mean that the second law applies only in situations where our function has more than one argument? Because in the data processing inequality g has just one argument, and there we see that the second law does not apply.

And can you please explain more about R and what the source of randomness is?

And why does quantum randomness cause an increase in entropy?

Thank you again
 
Actually my previous answer was not completely correct. Of course the entropy law in information theory also works for more than one argument, but if you have two arguments then you have to look at the combined entropy of both.
So H(x, R) >= H(g(x, R)).
However, even if there were no randomness in physics, the 2nd law of thermodynamics would probably still be there, because the definition of entropy in physics is different from the one in information theory. In physics the entropy/information content of a system is the log of the number of possible states that have the same macroscopic description, which is of course somewhat arbitrary, since any physicist can decide what qualifies as macroscopic in any particular case.
That means it's possible for entropy in a physical system to increase even if, from an information theory point of view, it is staying constant.
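A minimal sketch of that distinction, using an assumed toy system of n two-state coins whose "macroscopic description" is just the number of heads: a deterministic, reversible update of the exact microstate gains or loses no information, yet the entropy assigned to the occupied macrostate, log2 of the number of compatible microstates, grows as the system drifts from an atypical macrostate toward a typical one.

```python
# Toy coarse-graining example: "physical" entropy as the log of the number of
# microstates sharing the same macroscopic description (here, the head count).
from math import comb, log2

n = 100  # number of two-state "particles" (coins)

def macrostate_entropy(num_heads):
    """log2 of the number of microstates compatible with the macrostate 'num_heads heads'."""
    return log2(comb(n, num_heads))

# As the occupied macrostate drifts from an atypical one (0 heads) toward a
# typical one (~n/2 heads), the coarse-grained entropy log2(W) increases, even
# though a reversible dynamics of the underlying microstate preserves information.
for num_heads in [0, 10, 25, 50]:
    print(f"macrostate {num_heads:>3} heads: S = log2(W) = {macrostate_entropy(num_heads):7.2f} bits")
```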
 
