Is there a deeper connection between thermal and information entropy?

In summary, there are multiple interpretations of entropy, including thermodynamic entropy and information entropy. The popular identification of entropy with disorder is an oversimplification rather than an outright error: entropy is better thought of as a measure of uncertainty, or lack of information, about a system's microstate given its macrostate. Thermodynamic entropy, however, carries additional physical implications and is not completely equivalent to information entropy. The concepts of macrostates and microstates are central to understanding what entropy really means.
  • #1
revo74
The idea that entropy is the same as disorder is false. This is a misconception many people make or have made, including myself.

A person wrote this in response to the statement "Entropy does NOT mean disorder."
Is this correct? Please elaborate on this. Additionally, isn't thermodynamic entropy different than information entropy?

“Technically you are right. The entropy S is:

S = – Σ P(i) log P(i)

where P(i) is the probability of a particle in a state i, and Σ means the sum. It was Boltzmann who advocated the idea that entropy was related to disorder. In Boltzmann's mind, the more ways a system could move internally, the more disorderly the system was. A system in "perfect order" was one in which all the molecules were locked in perfect array without any freedom of movement whatsoever, hence S would be at a minimum. An example of low entropy (high order) would be ice, while water vapor would be high disorder, high entropy.

That was in the 19th century, and this concept prevailed until the mid 20th century. It was Shannon in the 1940s who revolutionized the field of thermodynamics. He defined entropy in terms of information. If my knowledge of the system is high, entropy is low. If my lack of information is high, entropy is high. This is the definition currently accepted. It's the one that Susskind used to derive his groundbreaking concept of the holographic principle. Instead of order/disorder you have entropy as the amount of information. The bigger the system, that is, the greater the number of degrees of freedom the system has, the greater my lack of information about which state each particle is in. A gas is an example of a system with high entropy: the molecules occupy a greater number of states with greater velocities, my lack of information is therefore very high, hence entropy is high. In Boltzmann's terms, "a system in 'perfect order' was one in which all the molecules were locked in perfect array without any freedom of movement"; that would mean I know where the particles are in the case of perfect order, my knowledge is high, my lack of information is low, hence entropy is low. So you can see that order/disorder and amount of information are equivalent. They don't contradict each other.

Now take the case of the Big Bang. Initially, the universe occupied a tiny volume, on the order of the Planck length. My knowledge of where all the particles are is high (low entropy). As the universe expands, that is, it occupies more and more volume, it's harder for me to keep track of each particle, my lack of knowledge grows, so entropy increases. I can still look at this scenario in terms of order/disorder. Initially, the universe has low entropy (high order); as it expands, it becomes more disordered (entropy increases). In either description, there is no contradiction.

Also see : Entropy_(information_theory)”
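A minimal Python sketch of the formula quoted above (the four-state distributions are made up purely for illustration): the more sharply peaked the distribution, the lower the entropy, and the uniform distribution gives the maximum.

[code]
import math

def shannon_entropy(p):
    # S = -sum_i p_i ln(p_i), ignoring zero-probability states
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Hypothetical 4-state system: spreading the probability out raises the entropy.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0    -> state known exactly
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~0.94  -> mostly known
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.386 -> ln(4), maximum uncertainty
[/code]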
 
  • #2
I would appreciate even one response.
 
  • #3
Entropy has more than one meaning. In thermodynamics it refers to the amount of energy in a system not available for useful work. For information, it is a little more complicated, but the person you quoted was pretty much correct. There is also Statistical Thermodynamics Entropy. See here: http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
 
  • #4
revo74 said:
I would appreciate even one response.

Response here :wink:.

I had subscribed to your thread to see if anyone would give some insights, because I'm afraid I'm out of my depth here, and I am interested!
 
  • #5
The idea that entropy is disorder is an attempt to give an intuitive meaning to entropy, and it's not too bad, but not perfect. The description given in the quote is rather good. To understand it better, you should understand the concept of macrostate and microstate. If you have a homogeneous gas in equilibrium, its macrostate is given by, for example, its volume, pressure, and the number of molecules in the volume. Its microstate is given by the microscopic description of each particle of the gas. In quantum mechanics, there are only a limited number of ways that you can have those particles in that volume, at that pressure, but the number is huge. This number is the number of microstates. The thermodynamic entropy is proportional to the logarithm of the number of possible microstates you could have that would give you the same macrostate.

An analogy that is usually made is if you have a messy desk, there are many different ways to have a messy desk, but only a few ways to have an ordered desk. If you think of disorder as having the quality that there are many ways to have disorder, fewer ways to have order, then the disorder analogy works. If you ever find a case where the analogy doesn't work, throw it out, because it's the microstate/macrostate idea that works. Entropy is best thought of as a measure of uncertainty about the microstate, given the macrostate. However, there is more to thermodynamic entropy. Thermodynamic entropy is proportional to the natural logarithm of the number of microstates for a given macrostate, and the constant of proportionality is Boltzmann's constant. Thermodynamic entropy has many physical implications regarding temperature, internal energy, etc., and these physical implications are mostly missing in the idea of information entropy. The natural logarithm of the number of microstates for a given macrostate is the "information" part of thermodynamic entropy; the Boltzmann constant multiplying it is what turns it into thermodynamic entropy, with all of the physics implications. Many people object to using the word "entropy" to mean anything but the full thermodynamic entropy, with all of its physical implications. I don't.

Another aspect of the disagreement is that thermodynamic entropy was defined in terms of macroscopic measurements long before Boltzmann gave an explanation of entropy in terms of microstates and macrostates. This means that, strictly speaking, thermodynamic entropy is not defined in terms of information entropy; it is only when you consider Boltzmann's analysis that you see that thermodynamic entropy can be explained (but not defined!) in terms of an information entropy idea. As long as you understand all of these disagreements, you see that the whole disagreement is about words, not about how things work. In other words, it's a semantic argument, and not a "physics" argument.

It is interesting to note that, given a macrostate, the number of yes/no questions you would have to ask to determine the microstate is, for very large numbers of microstates, equal to the logarithm base 2 of the number of microstates. The logarithm base 2 of a number is proportional to the natural log of the number, so you can see from a different approach how information entropy enters into the definition of thermodynamic entropy.
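A toy numerical illustration of the microstate counting and the yes/no-question remark above (a sketch with made-up numbers: the "gas" is just N two-state molecules, each sitting in the left or right half of a box, and the macrostate is how many are on the left):

[code]
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 100              # toy system: 100 molecules, each in the left or right half of a box

def num_microstates(n_left):
    # number of ways to choose which n_left of the N molecules sit on the left
    return math.comb(N, n_left)

for n_left in (100, 90, 50):           # from "ordered" to "disordered" macrostates
    W = num_microstates(n_left)
    S = k_B * math.log(W)              # Boltzmann: S = k ln(Omega)
    questions = math.log2(W)           # ~ yes/no questions needed to pin down the microstate
    print(f"n_left={n_left}: Omega={W:.3e}, S={S:.2e} J/K, ~{questions:.0f} questions")

# All 100 on one side: Omega = 1, S = 0, no questions needed.
# A 50/50 split: Omega ~ 1e29, S ~ 9e-22 J/K, ~96 yes/no questions.
[/code]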
 
  • #6
Drakkith said:
Entropy has more than one meaning. In thermodynamics it refers to the amount of energy in a system not available for useful work. For information, it is a little more complicated, but the person you quoted was pretty much correct. There is also Statistical Thermodynamics Entropy. See here: http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)

Isn't that quote incorrectly lumping entropy in thermodynamics and information together?
 
  • #7
Drakkith said:
In thermodynamics it refers to the amount of energy in a system not available for useful work.
This is not quite correct. Change in entropy or entropy difference between two states is a measure of the amount of potentially available useful work that is not realized in a thermodynamic process occurring between those two states. It is a measure of how far from "reversible" the actual thermodynamic process between the initial and final state was. In other words, if you stored all the work from the actual process as potential energy (eg. used the work to lift a weight) and then used that energy to drive the process backward, the entropy difference tells you how close to the initial state you would end up. The greater the entropy difference, the further from the initial state you would be.

Note: Incidentally, be very careful about relying on Wikipedia to understand entropy. There is a lot of incorrect information in the article on entropy. The first statement in that article is quite wrong:

Wikipedia>Entropy said:
Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work entropy accumulates in the system, but has to be removed by dissipation in the form of waste heat.
For example, a (theoretical) Carnot cycle operating between two reservoirs is a reversible cycle, so there is no increase in entropy. According to this statement, there should be 100% efficiency: 0 energy not available for useful work. This, of course, is incorrect.

Other errors: "Convertible energy" is not defined and at any rate, is not correctly used. If the theoretical maximum efficiency is achieved, there is no change in entropy so entropy would not "accumulate" in the system. Entropy is not "waste heat".

AM
 
  • #8
Classical thermodynamics recognises certain properties which were originally thought to be infinitely divisible.

Later work introduced mechanisms as to how some of these properties might arise in a discrete universe.

So, for instance, the kinetic theory offers a pretty accurate insight into how the actions of many small particles might combine to display the macroscopic property known as pressure.

Another very simple property, that unfortunately had been elevated to mystical proportions to frighten schoolboys, was then shown to be derivable from probability considerations.

In both cases the introduction of the later work has greatly extended our knowledge and predictive capability.

No, of course they are not the same; they describe fundamentally different views of reality, but when addressing the same question they produce the same answer, and sometimes one or the other has a calculative advantage.
 
  • #9
Andrew Mason said:
This is not quite correct. Change in entropy or entropy difference between two states is a measure of the amount of potentially available useful work that is not realized in a thermodynamic process occurring between those two states. It is a measure of how far from "reversible" the actual thermodynamic process between the initial and final state was. In other words, if you stored all the work from the actual process as potential energy (eg. used the work to lift a weight) and then used that energy to drive the process backward, the entropy difference tells you how close to the initial state you would end up. The greater the entropy difference, the further from the initial state you would be.

Note: Incidentally, be very careful about relying on Wikipedia to understand entropy. There is a lot of incorrect information in the article on entropy. The first statement in that article is quite wrong:

For example, a (theoretical) Carnot cycle operating between two reservoirs is a reversible cycle, so there is no increase in entropy. According to this statement, there should be 100% efficiency: 0 energy not available for useful work. This, of course, is incorrect.

Other errors: "Convertible energy" is not defined and at any rate, is not correctly used. If the theoretical maximum efficiency is achieved, there is no change in entropy so entropy would not "accumulate" in the system. Entropy is not "waste heat".

AM

Thank you for your reply.

Isn't it correct that entropy in thermodynamics is different from entropy in information theory – and if so, wouldn't that mean the post I quoted in my original post is partially incorrect? If true, can you elaborate how? Thank you.
 
  • #10
revo74 said:
Thank you for your reply.

Isn't it correct that entropy in thermodynamics is different from entropy in information theory – and if so, wouldn't that mean the post I quoted in my original post is partially incorrect? If true, can you elaborate how? Thank you.
In thermodynamics the change or difference in entropy between two states (ie states of the system and surroundings) is the integral of dQ/T of the system and surroundings over the reversible path(s) between those two states (a non-reversible process will change the states of the system and surroundings in such a way that two different reversible paths will be required to determine the change in entropy - one for the system and another for the surroundings). The concept was developed before anyone understood about atoms and molecules.
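A minimal worked example of the Clausius definition just given, for the entropy change of the system alone (illustrative numbers; a constant specific heat is assumed so the reversible integral has a closed form):

[code]
import math

# Reversibly heat 1 kg of water from 300 K to 350 K at constant pressure.
m = 1.0       # kg (illustrative)
c = 4186.0    # J/(kg K), specific heat of water, taken as constant over this range
T1, T2 = 300.0, 350.0

# Along the reversible path dQ = m c dT, so
#   Delta S = integral of dQ/T = m c ln(T2/T1)
delta_S = m * c * math.log(T2 / T1)
print(delta_S)   # ~645 J/K
[/code]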

Boltzmann found that the underlying basis for the second law of thermodynamics was statistical. The second law is essentially a statistical law. In thermodynamics, particles in equilibrium follow Maxwell-Boltzmann statistics (ie. temperature).

When a similar concept is applied to information, the populations (ie of whatever it is you are doing statistics on) don't follow Maxwell-Boltzmann statistics (ie. there is no temperature of information). So the concept is quite different and really should not be confused or compared to thermodynamic entropy.

AM
 
  • #11
Andrew Mason said:
When a similar concept is applied to information, the populations (ie of whatever it is you are doing statistics on) don't follow Maxwell-Boltzmann statistics (ie. there is no temperature of information). So the concept is quite different and really should not be confused or compared to thermodynamic entropy.
AM

This is not correct. The thermodynamic entropy is the product of the Boltzmann constant and the information entropy. It is not an analogy which fails at some point, it is an identity. Multiplication of the dimensionless information entropy by the Boltzmann constant (which has units of thermodynamic entropy) produces a physical variable which then is involved in the richer theory of thermodynamics.

For example, from an information point of view, if you have a gas at a constant volume with a set of discrete energy levels, the energies form the "alphabet" of a "message". The "message" is a string of such "characters" and each position in the "message" corresponds to a particular particle of the gas, the "character" specifying the energy of that particle. If there are N particles, there are N "characters" in the "message". The probability of a given "character" being energy [itex]\varepsilon_i[/itex] is given by the Boltzmann distribution:

[tex]p_i=\frac{e^{-\varepsilon_i/kT}}{Z}[/tex]

where k is the Boltzmann constant, T is temperature, and Z is the partition function:

[tex]Z=\sum_i e^{-\varepsilon_i/kT}[/tex]

These probabilities are identically and independently distributed. The dimensionless information entropy is:

[tex]H=\sum_i p_i \ln(1/p_i) = \frac{U}{kT}+\ln(Z)[/tex]

where U is the internal energy:

[tex]U=\sum_i \varepsilon_i p_i[/tex]

Multiplying by kT, and recognizing that kH=S (the thermodynamic entropy) and that the Helmholtz free energy is A=-kT ln(Z), you get the Euler-integrated form of the first law for constant volume:

[tex]U=TS+A[/tex]

Shannon's source coding theorem applied to thermodynamics is not very interesting (I guess) - it states that the minimum number of particles needed to yield the same thermodynamic entropy is that entropy divided by k times the log of the number of energy levels, which is limited by the finite internal energy U.
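A numerical check of the identities above (a sketch with arbitrary, made-up energy levels; everything is per particle, so multiply by N for the whole gas):

[code]
import math

k = 1.380649e-23                               # Boltzmann constant, J/K
T = 300.0                                      # K
levels = [0.0, 1.0e-21, 2.5e-21, 4.0e-21]      # arbitrary energy levels, J (illustrative)

Z = sum(math.exp(-e / (k * T)) for e in levels)       # partition function
p = [math.exp(-e / (k * T)) / Z for e in levels]      # Boltzmann probabilities

U = sum(e * pi for e, pi in zip(levels, p))           # internal energy (per particle)
H = sum(pi * math.log(1.0 / pi) for pi in p)          # dimensionless information entropy
S = k * H                                             # thermodynamic entropy
A = -k * T * math.log(Z)                              # Helmholtz free energy

print(H, U / (k * T) + math.log(Z))   # both sides of H = U/kT + ln(Z) agree
print(U, T * S + A)                   # and U = TS + A holds
[/code]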
 
  • #12
Rap said:
This is not correct. The thermodynamic entropy is the product of the Boltzmann constant and the information entropy. It is not an analogy which fails at some point, it is an identity. Multiplication of the dimensionless information entropy by the Boltzmann constant (which has units of thermodynamic entropy) produces a physical variable which then is involved in the richer theory of thermodynamics.
I just said that they are two quite different things, conceptually and practically. There may be a similarity in the mathematical expressions but that is about it, as far as I can see.

One is related to, and depends upon, temperature. The other is related to information theory, which has no concept of temperature. They are both statistical concepts. In thermodynamics, entropy is a statistical notion based on the logarithm of the number of microstates that a system of particles can have in phase space (momentum and position each with 3 degrees of freedom). In information theory, it is the logarithm of a set of event probabilities. You can define entropy for a small number of events - eg. two coin flips. You cannot do that in thermodynamics. You need sufficient numbers of particles to have a temperature. Two particles cannot have a temperature.

AM
 
  • #13
Andrew Mason said:
I just said that they are two quite different things, conceptually and practically. There may be a similarity in the mathematical expressions but that is about it, as far as I can see.

One is related to, and depends upon, temperature. The other is related to information theory, which has no concept of temperature. They are both statistical concepts. In thermodynamics, entropy is a statistical notion based on the logarithm of the number of microstates that a system of particles can have in phase space (momentum and position each with 3 degrees of freedom). In information theory, it is the logarithm of a set of event probabilities. You can define entropy for a small number of events - eg. two coin flips. You cannot do that in thermodynamics. You need sufficient numbers of particles to have a temperature. Two particles cannot have a temperature.

AM

Do you find any fault with the above example? Do you agree that the thermodynamic entropy is equal to the Boltzmann constant times the information entropy? If you do, then we only have a semantic argument. I mean, as long as we agree on how things work, then the name tags we put on our concepts are not important, except that they interfere with our ability to communicate with each other. I'm not too interested in semantic arguments.

Is there anything in our disagreement that would cause us to come up with conflicting predictions? Is there any way to test the validity of our viewpoints? If not, let's agree to disagree.

Regarding the information analog of temperature, in thermodynamics we have dU=T dS (for constant volume and number of particles). So, in order to have an information analog of temperature, we will also need an information analog of internal energy. Conservation of energy, when applied to the "message" of the energies of the individual particles, implies that the sum of all the "characters" in the "message" is a constant. I would say that if we impose such a conservation law on any information message, in which the probabilities are of the form [itex]p_i \propto \exp(-\varepsilon_i\beta)[/itex] where [itex]\beta[/itex] is some constant, then we will have an information analog of temperature ([itex]T \propto 1/\beta[/itex]).
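A numerical check of that last claim (the "character" values ε_i below are arbitrary and purely illustrative): if p_i ∝ exp(-ε_i β), then the ratio of a small change in the mean value U to the corresponding change in the entropy H comes out as 1/β, the information analog of kT.

[code]
import math

eps = [0.0, 1.0, 2.5, 4.0]     # arbitrary "character" values (illustrative)

def U_and_H(beta):
    # mean value U and dimensionless entropy H for p_i proportional to exp(-eps_i * beta)
    Z = sum(math.exp(-e * beta) for e in eps)
    p = [math.exp(-e * beta) / Z for e in eps]
    U = sum(e * pi for e, pi in zip(eps, p))
    H = sum(pi * math.log(1.0 / pi) for pi in p)
    return U, H

beta = 0.7
U1, H1 = U_and_H(beta)
U2, H2 = U_and_H(beta + 1e-6)
print((U2 - U1) / (H2 - H1), 1.0 / beta)   # the two numbers agree: dU/dH = 1/beta
[/code]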
 
  • #14
Rap said:
Do you find any fault with the above example? Do you agree that the thermodynamic entropy is equal to the Boltzmann constant times the information entropy?
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, dealing with probability starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.

AM
 
  • #15
But your argument doesn't persuade me.

I agree

There are many pairs of 'processes' in physics that possess the same mathematical form, but are not connected.

For example those which obey an inverse square law.

It does not follow that just because they have the same formula that say, gravitational and electrostatic attraction are one and the same.

Further, as I pointed out earlier, discrete mathematics can never exactly replicate continuous mathematics.
So we should not lose a sense of wonder at how many apparently continuous natural processes now seem to be the aggregate result of discrete actions.
 
  • #16
Andrew Mason said:
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, dealing with probability starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.
AM

I think we have an untestable difference of opinion - a semantic argument, which, as I said, is not very interesting to me.

Just to be sure, though, what would you say would qualify as a "real and substantial connection"?
 
  • #17
Rap said:
Just to be sure, though, what would you say would qualify as a "real and substantial connection"?
A connection that is real and is one of substance.

To use Studiot's example, the fact that g = GM/r^2 and E = kQ/r^2 does not demonstrate that there is a real physical connection between g and E. The connection is one of form, not substance.

AM
 
  • #18
Andrew Mason said:
To use Studiot's example, the fact that g = GM/r^2 and E = kQ/r^2 does not demonstrate that there is a real physical connection between g and E. The connection is one of form, not substance.
AM

Another thing - the intensity of sound from a point source falls off as 1/r^2. Also, the area on a sphere encompassed by a cone with a vertex at the origin is proportional to the square of the radius of the sphere. These examples illustrate an underlying property of 3-dimensional Euclidean space. I would not say there is a physical connection between g and E and sound, but I would say that the similarity in the laws is not entirely coincidental, but says a lot about the nature of Euclidean 3 space. In other words, the connection is more than one of form. There is a substantial connection between Newton's law of gravity, Coulomb's law, and the theory of acoustic waves, and this connection is via the geometry of Euclidean 3-space.

Similarly, there is a substantial connection between the transmission and encoding of electronic messages and thermodynamics via information theory.

Trying to find the information equivalent of temperature, internal energy, etc. is like trying to find the analog of charge in the study of solid geometry. The 1/r^2 law follows from the properties of space, but you cannot say that Coulomb's law has nothing to do with the properties of space simply because there is no analog of charge in the study of solid geometry.

The connection between thermodynamic and information entropy is likewise, more than one of form. Information entropy isolates certain statistical ideas, independent of any physical application. It can then be applied to the transmission and encoding of electronic messages. It can also be applied to the concept of thermodynamic entropy. Thermodynamic entropy incorporates this theory into a richer theory of thermodynamics, just as Coulomb's law incorporates Euclidean geometry into a richer theory of electrostatics.

If I am sending electronic messages on a twisted pair, can I argue that information theory has no application here, because there is no analog of a twisted pair in information theory? I say no.
 
  • #19
These examples illustrate an underlying property of 3-dimensional Euclidean space. I would not say there is a physical connection between g and E and sound, but I would say that the similarity in the laws is not entirely coincidental, but says a lot about the nature of Euclidean 3 space.

You might say that but such a deduction would be in the same realms as Sherlock Holmes deducing connections between travellers who happen to be passengers on the same NO17 Clapham Omnibus.

AM only quoted the shortened form of the gravitational and electrostatic inverse square force laws.

If you take the complete versions you will immediately notice they are not quite the same, since mass is a positive definite quantity and charge comes in two polarities.
 
  • #21
Studiot said:
You might say that but such a deduction would be in the same realms as Sherlock Holmes deducing connections between travellers who happen to be passengers on the same NO17 Clapham Omnibus.

Yes, and those connections are weaker, but they are substantial, not merely ones of form.

Studiot said:
If you take the complete versions you will immediately notice they are not quite the same, since mass is a positive definite quantity and charge comes in two polarities.

Right, and there is no twisted pair in thermodynamics.

Again - this is a semantic argument, I think we agree on how things work.
 
  • #22
Andrew Mason said:
Note: Incidentally, be very careful about relying on Wikipedia to understand entropy. There is a lot of incorrect information in the article on entropy. The first statement in that article is quite wrong:

I mentioned these two mistakes that you pointed out to the person I am having a conversation with and here is what he said:

Andrew Mason said:
For example, a (theoretical) Carnot cycle operating between two reservoirs is a reversible cycle, so there is no increase in entropy. According to this statement, there should be 100% efficiency: 0 energy not available for useful work. This, of course, is incorrect.

His response: The website doesn't say there is no increase in entropy. Secondly, "theoretical maximum efficiency" doesn't mean 100% efficiency. You're jumping to conclusions. FYI,

theoretical maximum efficiency = 1 - Tc/Th

where Tc= temperature of cold reservoir, Th = temperature of hot reservoir. Unless Tc =Th, and there would be no point in building a machine for that case, then the theoretical maximum efficiency will always be less than 100%. That is another consequence of the second law of thermodynamics.

Suppose your reservoirs are operating at Tc= 20C, Th=80C, then the theoretical maximum efficiency = .75. This means the best possible machine you could ever build would be 75% efficient in that case, not that you will build such a machine, more likely it will have considerably less efficiency. But its theoretical maximum efficiency = 75%, no matter what.
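As a quick numerical check of the formula quoted just above, note that the Carnot expression takes absolute temperatures, so Celsius reservoir temperatures need converting to kelvin first (a minimal sketch with the figures from this exchange):

[code]
def carnot_efficiency(t_cold_celsius, t_hot_celsius):
    # theoretical maximum efficiency 1 - Tc/Th, with temperatures in kelvin
    Tc = t_cold_celsius + 273.15
    Th = t_hot_celsius + 273.15
    return 1.0 - Tc / Th

print(carnot_efficiency(20.0, 80.0))      # ~0.17 for reservoirs at 20 C and 80 C
print(carnot_efficiency(26.85, 226.85))   # 0.40 for reservoirs at 300 K and 500 K
[/code]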

Andrew Mason said:
Other errors: "Convertible energy" is not defined and at any rate, is not correctly used. If the theoretical maximum efficiency is achieved, there is no change in entropy so entropy would not "accumulate" in the system. Entropy is not "waste heat".

AM

His response: This is incorrect: "theoretical maximum efficiency" does not mean 100% efficiency, waste energy is in the form of dissipation, it wasn't meant to be read as the same as entropy, and "accumulate" means increase in case you don't know. You can't reconcile yourself with the idea that entropy increases. So here you go misreading the information on this website to fit your own misconception of what entropy means.

So is he right or? I'm trying to understand entropy as fully as possible and I get different things from different people.
 
  • #23
revo74 said:
His response: The website doesn't say there is no increase in entropy. Secondly, "theoretical maximum efficiency" doesn't mean 100% efficiency. You're jumping to conclusions. FYI,
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:

Example 1: A heat engine operates between 500K and 300K with an efficiency of 10%. For each 1000 Joules of input energy, 900 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_1 = -1000/500 + 900/300 = 1.0 J/K[/tex]


Example 2: A heat engine operates between 1000K and 300K with an efficiency of 40%. For each 1000 Joules of input energy, 600 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_2 = -1000/1000 + 600/300 = 1.0 J/K[/tex]


Example 3: A Carnot engine operates between 500K and 300K with an efficiency of 1-3/5 = 40%. For 1000 Joules of input energy (heat flow) the amount not available for useful work is 600 J. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_3 = -1000/500 + 600/300 = 0 J/K[/tex]


In Examples 1 and 2 the entropy change is the same. But there is 900 Joules of energy not available for work in the first and 600 not available in the second. So how does entropy tell me how much energy is not available for work?

In Example 3, the entropy change is 0 but the same amount of heat (600J) is not available for work as in Example 1, in which the entropy change is 1 J/K. How does entropy tell me how much energy is not available for work?
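The three examples above can be reproduced in a few lines (a sketch following the post's sign convention: heat leaving the hot reservoir counts as negative, heat dumped into the cold reservoir as positive):

[code]
def delta_S(q_in, T_hot, q_waste, T_cold):
    # entropy change of the two reservoirs: -Q_in/T_hot + Q_waste/T_cold, in J/K
    return -q_in / T_hot + q_waste / T_cold

print(delta_S(1000, 500, 900, 300))     # Example 1: 10% efficient engine -> 1.0 J/K
print(delta_S(1000, 1000, 600, 300))    # Example 2: 40% efficient engine -> 1.0 J/K
print(delta_S(1000, 500, 600, 300))     # Example 3: Carnot engine, 40%   -> 0.0 J/K
[/code]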

Unless Tc =Th, and there would be no point in building a machine for that case, then the theoretical maximum efficiency will always be less than 100%.
If Tc = Th, the efficiency would be 0: 1-1=0.

His response: This is incorrect: "theoretical maximum efficiency" does not mean 100% efficiency,
True. But that is not what the article said. It said entropy is a measure of the energy not available for useful work in a thermodynamic process. If the entropy change, 0, is a measure of the energy not available for useful work, then one could not be faulted for thinking that 0 energy is not available for useful work.

So is he right or? I'm trying to understand entropy as fully as possible and I get different things from different people.
You will find that entropy is a difficult concept and not all attempts to explain it are correct. You will just have to read good physics books and stay away from Wikipedia on this topic if you want to eventually understand it.

AM
 
  • #24
So is he right or? I'm trying to understand entropy as fully as possible and I get different things from different people.

So did I waste my time, in my post#8 (to which you never replied), offering a viewpoint on your question?

I would agree with AM, and other knowledgeable people here, that Wiki sometimes 'slips up' and presents erroneous or inconsistent material. This is not only true of thermodynamics, so you always have to temper a Wiki offering with critical appraisal.
 
  • #25
Studiot said:
So did I waste my time, in my post#8 (to which you never replied), offering a viewpoint on your question?

I would agree with AM, and other knowledgeable people here, that Wiki sometimes 'slips up' and presents erroneous or inconsistent material. This is not only true of thermodynamics, so you always have to temper a Wiki offering with critical appraisal.

No you did not. I appreciate your input on the matter. What would your response be to the person who made those comments in the above post defending Wikipedia?
 
  • #26
Andrew Mason said:
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:

Example 1: A heat engine operates between 500K and 300K with an efficiency of 10%. For each 1000 Joules of input energy, 900 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_1 = -1000/500 + 900/300 = 1.0 J/K[/tex]


Example 2: A heat engine operates between 1000K and 300K with an efficiency of 40%. For each 1000 Joules of input energy, 600 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_2 = -1000/1000 + 600/300 = 1.0 J/K[/tex]


Example 3: A Carnot engine operates between 500K and 300K with an efficiency of 1-3/5 = 40%. For 1000 Joules of input energy (heat flow) the amount not available for useful work is 600 J. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_3 = -1000/500 + 600/300 = 0 J/K[/tex]


In Examples 1 and 2 the entropy change is the same. But there is 900 Joules of energy not available for work in the first and 600 not available in the second. So how does entropy tell me how much energy is not available for work?

In Example 3, the entropy change is 0 but the same amount of heat (600J) is not available for work as in Example 1, in which the entropy change is 1 J/K. How does entropy tell me how much energy is not available for work?

If Tc = Th, the efficiency would be 0: 1-1=0.

True. But that is not what the article said. It said entropy is a measure of the energy not available for useful work in a thermodynamic process. If the entropy change, 0, is a measure of the energy not available for useful work, then one could not be faulted for thinking that 0 energy is not available for useful work.

You will find that entropy is a difficult concept and not all attempts to explain it are correct. You will just have to read good physics books and stay away from Wikipedia on this topic if you want to eventually understand it.

AM

Thank you for your response. I will point out what you have said, perhaps I can help him learn entropy too, lol.
 
  • #27
Andrew Mason said:
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:

Example 1: A heat engine operates between 500K and 300K with an efficiency of 10%. For each 1000 Joules of input energy, 900 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_1 = -1000/500 + 900/300 = 1.0 J/K[/tex]

Example 2: A heat engine operates between 1000K and 300K with an efficiency of 40%. For each 1000 Joules of input energy, 600 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_2 = -1000/1000 + 600/300 = 1.0 J/K[/tex]

Example 3: A Carnot engine operates between 500K and 300K with an efficiency of 1-3/5 = 40%. For 1000 Joules of input energy (heat flow) the amount not available for useful work is 600 J. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_3 = -1000/500 + 600/300 = 0 J/K[/tex]

In Examples 1 and 2 the entropy change is the same. But there is 900 Joules of energy not available for work in the first and 600 not available in the second. So how does entropy tell me how much energy is not available for work?

In Example 3, the entropy change is 0 but the same amount of heat (600J) is not available for work as in Example 1, in which the entropy change is 1 J/K. How does entropy tell me how much energy is not available for work?

It seems that he is retreating. All he had to say in response was this:

"The 2nd law states that ΔS ≥ 0. None of those examples violates the second law."

What is your response to this?

True. But that is not what the article said. It said entropy is a measure of the energy not available for useful work in a thermodynamic process. If the entropy change, 0, is a measure of the energy not available for useful work, then one could not be faulted for thinking that 0 energy is not available for useful work.

His reply: "Entropy is a measure of the energy not available for useful work" is another way of looking at entropy. There is not just one way to look at entropy."

What say you?
 
  • #28
Andrew Mason said:
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, dealing with probability starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.

AM

This is what I was told in response to the above:

"You can watch this video. Bousso does not prove that entropy = lack of information, but in his talk, he refers to it quite frequently. He shows that entropy (amount of information) of a black hole is just its area (at time 31:05), what Hawking use in his development of black hole thermodynamics and later on Susskind in the holographic principle. I don't know what your level of knowledge in physics, but his talk is non-technical, you should be able to follow most of it. If you need help, I may be able to provide it."

 
  • #29
No you did not. I appreciate your input on the matter. What would your response be to the person who made those comments in the above post defending Wikipedia?

That was a deeply disappointing non-response.

Right at this moment you are looking at a system that embodies not only another discrete system masquerading as a continuous one, but also one that exemplifies the difference between the statistical and concrete approaches.

I mean, of course your computer screen.

As to this other person. Can he or she not join PF and speak for themselves?
 
  • #30
Andrew Mason said:
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, dealing with probability starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.
AM

Without a doubt, anyone considering the connection (or lack thereof) between thermodynamic entropy and information entropy should consider the following article by Jaynes as required reading:

http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf

It's clearly written, with concrete examples, and is, at the least, a very interesting and well-reasoned article. Two other good ones by the same author are:

http://bayes.wustl.edu/etj/articles/theory.1.pdf
http://bayes.wustl.edu/etj/articles/theory.2.pdf

Another defense for the connection - I think the name "information entropy" is unfortunate. The subject of entropy is divorced from specific applications like transmission of messages and thermodynamics. It is a subject belonging to pure probability theory. Pure probability theory does not deal with temperature, or transmission lines, yet it finds application in both of these concrete problems. This is the nature of the "connection" between the two.

(BTW, thank you for the explicit response to the idea that entropy is a measure of energy unavailable for work - I never understood that idea, and now I am starting to know why. The responses from the defender of the idea are empty and evasive.)
 
  • #31
revo74 said:
It seems that he is retreating. All he had to say in response was this:

"The 2nd law states that ΔS ≥ 0. None of those examples violates the second law."

What is your response to this?
He is quite correct. But I was not suggesting that the second law can be violated. It can't. I was merely attempting to explain why entropy is NOT a measure of the energy that is unavailable for useful work.


His reply: "Entropy is a measure of the energy not available for useful work" is another way of looking at entropy. There is not just one way to look at entropy."

What say you?
There are many incorrect ways and a few correct ways to look at entropy. "Entropy is a measure of the energy not available for useful work" is an incorrect interpretation of entropy, for the reasons I have given.

AM
 
  • #32
Andrew Mason said:
There are many incorrect ways and a few correct ways to look at entropy. "Entropy is a measure of the energy not available for useful work" is an incorrect interpretation of entropy, for the reasons I have given.

AM

His response to this was:

"I gave you a link to a video in which Bousso used the concept of entropy as information. Yet, you pursue this insane line that entropy is what you think it is. I'm wrong, wikipedia is wrong, Bousso, an outstanding physicist, is wrong."

I have come to realize that entropy in information theory and entropy in thermodynamics are different. Isn't the first paragraph of the entropy Wiki page making reference to thermodynamics? It doesn't specify. What is your response to his statement?
 
  • #33
revo74 said:
I have come to realize that entropy in information theory and entropy in thermodynamics are different. Isn't the first paragraph of the entropy Wiki page making reference to thermodynamics? It doesn't specify. What is your response to his statement?

Don't you agree that it is a matter of words, not physics? If you believe it is physics, then suggest an experiment whose outcome we will disagree on as a result of our disagreement on the definitions of information and thermodynamic entropy.

If there is none, then our disagreement is semantic, and is only resolvable by our mutual agreement on the meaning of language. I'm interested in physics, not language, which means I will not defend to the death my use of language, or attack yours. If there is no experiment to decide, what is your basic objection to the idea that entropy is a concept in the realm of pure probability which finds application in both information theory and thermodynamics?
 
  • #34
Rap said:
Don't you agree that it is a matter of words, not physics? If you believe it is physics, then suggest an experiment whose outcome we will disagree on as a result of our disagreement on the definitions of information and thermodynamic entropy.

If there is none, then our disagreement is semantic, and is only resolvable by our mutual agreement on the meaning of language. I'm interested in physics, not language, which means I will not defend to the death my use of language, or attack yours. If there is no experiment to decide, what is your basic objection to the idea that entropy is a concept in the realm of pure probability which finds application in both information theory and thermodynamics?

I understand what you are saying. I have read all of your posts. My question, then, is: is AM incorrect when he says Wikipedia is wrong? Are the examples he put forward wrong?
 
  • #35
revo74 said:
I understand what you are saying. I have read all of your posts. My question, then, is: is AM incorrect when he says Wikipedia is wrong? Are the examples he put forward wrong?

I think Wikipedia is, at worst, wrong on the subject of entropy as a measure of energy unavailable for work. I think AM's exercise shows that interpreting the statement simplistically makes that statement nonsense. I think that any defense of the statement will make use of so many extra unmentioned assumptions and conditions as to render the statement useless as a quick and easy way to understand entropy.

On the separate subject of the difference between thermodynamic and information entropy, you state:

revo74 said:
I have come to realize that entropy in information theory and entropy in thermodynamics are different.

Again - this is a question - What is your basic objection to the idea that entropy is a concept in the realm of pure probability which finds application in both information theory and thermodynamics?
 
