Entropy in information theory vs. thermodynamics

AI Thread Summary
In information theory, the entropy of a function of a random variable is less than or equal to the entropy of the variable itself (the data processing inequality), which raises the question of how this relates to the second law of thermodynamics. The second law says that the entropy of a system increases unless energy is added, and the thread attributes this increase to quantum randomness. When the function takes multiple arguments, the combined entropy of all arguments must be evaluated, so the inequality still holds once the joint input is considered. The distinction between entropy in physics and in information theory is crucial: physical entropy is the log of the number of possible microstates sharing the same macroscopic description. Thus physical entropy may increase even while information-theoretic entropy remains constant.
Angella
We know from information theory that the entropy of a function of a random variable X is less than or equal to the entropy of X:


H(g(X)) ≤ H(X)


Does this break the second law of thermodynamics?
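As a quick illustration of the inequality (a sketch added for clarity, not part of the original post, with an arbitrarily chosen distribution): a non-injective g merges outcomes of X, which can only pool probability mass and therefore cannot raise the entropy.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

# X takes 4 values with these (illustrative) probabilities.
p_x = [0.5, 0.25, 0.125, 0.125]

# g maps the 4 values onto 2 by merging outcomes: g(0)=g(1)=0, g(2)=g(3)=1.
# Merging outcomes pools their probabilities.
p_gx = [p_x[0] + p_x[1], p_x[2] + p_x[3]]

print(f"H(X)    = {entropy(p_x):.3f} bits")   # 1.750
print(f"H(g(X)) = {entropy(p_gx):.3f} bits")  # 0.811 <= H(X)
```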
 
Unless you add energy to a system, its entropy will increase. So you have it backwards:
H(g(x)) > H(x)
Entropy is a measure of disorder, which usually increases unless work is being done or energy is being added.
 
That law only applies if x is the only argument of the function.
Let's assume x is the state of the universe and g is the laws of physics applied over a specific amount of time. Then g(x) will be the future state of the universe.
However, the increase in entropy happens because of quantum randomness, which means our function g needs a second argument R, a source of randomness. So you then have g(x, R).
That's if x is the state you measure with some instruments, i.e. a classical state.
If on the other hand you look at a pure quantum system, without any measurements or wave function collapse, then there is no randomness and you just have g(x). But in that case there is also no change in entropy/information. Conservation of information is a basic law of quantum mechanics.
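To illustrate the role of R (again a sketch added for clarity, using an arbitrarily chosen XOR example): once g takes an independent random argument, the output entropy can exceed H(x) alone.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# X is a biased bit: P(X=1) = 0.1, so H(X) < 1 bit.
p_x = [0.9, 0.1]

# R is a fair coin, independent of X; g(x, r) = x XOR r.
# Then g(X, R) is a fair bit: P(g=1) = 0.9*0.5 + 0.1*0.5 = 0.5.
p_g = [0.5, 0.5]

print(f"H(X)       = {entropy(p_x):.3f} bits")  # 0.469
print(f"H(g(X, R)) = {entropy(p_g):.3f} bits")  # 1.000 > H(X)
```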
 
DrZoidberg said:
That law only applies if x is the only argument of the function. [...]

Thanks a lot DrZoidberg, but I still have some ambiguity.

Do you mean that the second law applies only in situations where our function has more than one argument? Because in the data processing inequality g has just one argument, and there we see that the second law does not apply!

And can you please explain more about R? What is the source of randomness?

And why does quantum randomness cause an increase in entropy?

Thank you again.
 
Actually my previous answer was not completely correct. Of course the entropy inequality in information theory also works for more than one argument, but if you have two arguments you have to look at the combined entropy of both.
So H(x, R) ≥ H(g(x, R)).
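A quick numerical check of this combined-entropy form, reusing the XOR example from above (an illustrative sketch, not from the original post):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Same XOR example: X biased with P(X=1) = 0.1, R a fair bit, g = XOR.
# Joint distribution of (X, R), with X and R independent:
# (0,0): 0.45, (0,1): 0.45, (1,0): 0.05, (1,1): 0.05
p_xr = [0.45, 0.45, 0.05, 0.05]
p_g = [0.5, 0.5]  # g(X, R) is a fair bit

print(f"H(X, R)    = {entropy(p_xr):.3f} bits")  # 1.469
print(f"H(g(X, R)) = {entropy(p_g):.3f} bits")   # 1.000 <= H(X, R)
```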
However, even if there were no randomness in physics, the 2nd law of thermodynamics would probably still hold, because the definition of entropy in physics differs from the one in information theory. In physics the entropy/information content of a system is the log of the number of possible microstates that share the same macroscopic description, which is of course somewhat arbitrary, since any physicist can decide what qualifies as macroscopic in any particular case.
That means it's possible for the entropy of a physical system to increase even though from an information theory point of view it stays constant.
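To make the coarse-graining point concrete, here is a toy sketch (illustrative numbers, not from the thread): deterministic dynamics leave our information-theoretic uncertainty about the microstate unchanged, while the Boltzmann entropy S = log W of the macrostate can still grow.

```python
import math

# Toy picture: n gas particles, each in the left (0) or right (1) half
# of a box. A microstate is the full assignment; the macrostate only
# records k = number of particles on the right.
n = 50

def boltzmann_entropy(k, n=n):
    """S = log W in nats, where W = microstates with k particles right."""
    return math.log(math.comb(n, k))

# Deterministic, reversible dynamics map one microstate to exactly one
# other, so if we know the microstate now we know it later: the
# information-theoretic entropy of our description stays 0 throughout.
# But the macrostate typically drifts from a special one (all left)
# toward the overwhelmingly more numerous balanced ones:
print(f"S(all left, k=0)  = {boltzmann_entropy(0):.2f}")   # 0.00
print(f"S(balanced, k=25) = {boltzmann_entropy(25):.2f}")  # ~32.47
```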
 