Entropy in Information Theory vs. Thermodynamics

In summary: the data processing inequality of information theory does not break the second law of thermodynamics. Physical dynamics effectively take a second, random argument (e.g. quantum randomness), and thermodynamic entropy is defined by coarse-graining over macrostates, so it can increase even when information-theoretic entropy is conserved.
  • #1
Angella
We know from information theory that the entropy of a function of a random variable X is less than or equal to the entropy of X.


[Attached image: the data processing inequality, H(g(X)) ≤ H(X)]


Does this break the second law of thermodynamics?
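One way to see the inequality concretely is to push a distribution through a deterministic function and compare entropies. The sketch below is purely illustrative; the four-outcome uniform distribution and the merging function g(x) = x mod 2 are hypothetical choices, not anything from the post:

```python
import math
from collections import defaultdict

def shannon_entropy(dist):
    """Shannon entropy in bits of a probability distribution {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical distribution over X = {1, 2, 3, 4}
p_x = {1: 0.25, 2: 0.25, 3: 0.25, 4: 0.25}

# A deterministic function of X; g(x) = x mod 2 merges outcomes pairwise
def g(x):
    return x % 2

# Push the distribution of X forward through g to get the distribution of g(X)
p_gx = defaultdict(float)
for x, p in p_x.items():
    p_gx[g(x)] += p

print(shannon_entropy(p_x))         # 2.0 bits
print(shannon_entropy(dict(p_gx)))  # 1.0 bit <= H(X), as the inequality predicts
```

Because g can only merge outcomes (never split them), the pushforward distribution is at most as spread out as the original, which is why H(g(X)) ≤ H(X).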
 
  • #2
Unless you add energy to a system, its entropy will increase. So you have it backwards:
H(g(x)) > H(x).
Entropy is a measure of disorder, which usually increases unless work is being done or energy is being added.
 
  • #3
That law only applies if x is the only argument of that function.
Let's assume x is the state of the universe and g is the laws of physics applied over a specific amount of time. Then g(x) will be the future state of the universe.
However, the increase in entropy happens because of quantum randomness, which means our function g needs a second argument R, a source of randomness. So you then have g(x, R).
That's if x is the state you measure with some instruments, i.e. a classical state.
If, on the other hand, you look at a pure quantum system without any measurements or wave function collapse, then there is no randomness and you just have g(x). But in such a case there is also no change in entropy/information; conservation of information is a basic law of quantum mechanics.
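A minimal numerical sketch of this point, under toy assumptions (a one-bit state, a fair random bit R, and XOR as the "dynamics"; none of these come from the post): once g consumes a random argument, the output entropy can exceed H(x), though it never exceeds the joint input entropy H(x, R):

```python
import math
from collections import defaultdict

def shannon_entropy(dist):
    """Shannon entropy in bits of a probability distribution {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# X starts in a known (zero-entropy) state; R is a fair coin the dynamics consume
p_x = {0: 1.0}
p_r = {0: 0.5, 1: 0.5}

def g(x, r):
    return x ^ r  # toy "dynamics": XOR the state with the random bit

# Distribution of g(X, R), assuming X and R are independent
p_out = defaultdict(float)
for x, px in p_x.items():
    for r, pr in p_r.items():
        p_out[g(x, r)] += px * pr

print(shannon_entropy(p_x))          # 0.0 bits
print(shannon_entropy(dict(p_out)))  # 1.0 bit: H(g(X, R)) > H(X)
# But the joint input entropy is not exceeded: H(X, R) = 1.0 >= H(g(X, R))
```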
 
  • #4
DrZoidberg said:
That law only applies if x is the only argument of that function.
Let's assume x is the state of the universe and g is the laws of physics applied over a specific amount of time. Then g(x) will be the future state of the universe.
However, the increase in entropy happens because of quantum randomness, which means our function g needs a second argument R, a source of randomness. So you then have g(x, R).
That's if x is the state you measure with some instruments, i.e. a classical state.
If, on the other hand, you look at a pure quantum system without any measurements or wave function collapse, then there is no randomness and you just have g(x). But in such a case there is also no change in entropy/information; conservation of information is a basic law of quantum mechanics.

Thanks a lot DrZoidberg, but I still have a bit of ambiguity.

Do you mean that the second law applies only in situations where our function has more than one argument? Because in the data processing inequality g has just one argument, and there we see that the second law does not hold!

And can you please explain more about R? What is the source of the randomness?

And why does quantum randomness cause an increase in entropy?

Thank you again
 
  • #5
Actually, my previous answer was not completely correct. Of course the entropy law in information theory also works for more than one argument; but if you have two arguments, you have to look at the combined entropy of both:
H(x, R) ≥ H(g(x, R)).
However, even if there were no randomness in physics, the second law of thermodynamics would probably still hold, because the definition of entropy in physics is different from the one in information theory. In physics, the entropy/information content of a system is the log of the number of possible states that have the same macroscopic description, which is of course somewhat arbitrary, since any physicist can decide what qualifies as macroscopic in any particular case.
That means it's possible for entropy in a physical system to increase even if, from an information theory point of view, it is staying constant.
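Here is a small sketch of that distinction, with illustrative assumptions (four microstates, a permutation as the reversible dynamics, and a two-macrostate coarse-graining; all hypothetical): the fine-grained Shannon entropy is exactly conserved, while the coarse-grained "physical" entropy still grows:

```python
import math
from collections import defaultdict

def shannon_entropy(dist):
    """Shannon entropy in bits of a probability distribution {outcome: prob}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Four microstates; the system starts spread over {0, 1} only
micro = {0: 0.5, 1: 0.5, 2: 0.0, 3: 0.0}

# Reversible "dynamics": a permutation of microstates (information-conserving)
perm = {0: 0, 1: 2, 2: 3, 3: 1}
micro_later = defaultdict(float)
for s, p in micro.items():
    micro_later[perm[s]] += p

# Coarse-graining: macrostate "L" = {0, 1}, macrostate "R" = {2, 3}
def macro_dist(micro_dist):
    out = defaultdict(float)
    for s, p in micro_dist.items():
        out["L" if s in (0, 1) else "R"] += p
    return dict(out)

print(shannon_entropy(micro), shannon_entropy(micro_later))  # 1.0 1.0 (conserved)
print(shannon_entropy(macro_dist(micro)))        # 0.0: all mass in macrostate L
print(shannon_entropy(macro_dist(micro_later)))  # 1.0: macro entropy grew
```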
 

1. What is the difference between entropy in information theory and thermodynamics?

In information theory, entropy is a measure of the uncertainty or randomness of a random variable, while in thermodynamics it is a measure of the disorder or energy dispersal of a physical system.

2. How is entropy related to disorder?

In both information theory and thermodynamics, entropy is related to disorder. In information theory, higher entropy means more uncertainty and randomness in a system, while in thermodynamics, higher entropy means more disorder and less available energy.

3. Can entropy be decreased?

In information theory, entropy can be decreased by reducing the uncertainty or randomness in a system. In thermodynamics, however, the total entropy of an isolated system cannot decrease; the entropy of a subsystem can only be lowered by exporting at least as much entropy to its surroundings, for example by doing work on it.

4. How is entropy calculated in information theory and thermodynamics?

In information theory, entropy is calculated using the Shannon entropy formula, which takes into account the probabilities of the different outcomes. In thermodynamics, entropy is calculated using the Boltzmann formula, which takes into account the number of microstates consistent with the system's macrostate.
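For reference, the standard forms of the two formulas named above, with p_i the outcome probabilities, k_B Boltzmann's constant, and W the number of microstates:

```latex
% Shannon entropy of a discrete random variable X (in bits)
H(X) = -\sum_i p_i \log_2 p_i

% Boltzmann entropy of a macrostate realizable by W microstates
S = k_B \ln W
```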

5. What is the relationship between information and entropy?

In information theory, information and entropy are inversely related: as the amount of information you have about a system increases, your remaining uncertainty (entropy) about it decreases. In thermodynamics there is also a relationship between information and entropy; by Landauer's principle, erasing information carries a minimum thermodynamic cost, so handling information contributes to the entropy balance of a system.
