Entropy and thermodynamics.

  • Thread starter revo74
  • #26
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:

Example 1: A heat engine operates between 500K and 300K with an efficiency of 10%. For each 1000 Joules of input energy, 900 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_1 = -1000/500 + 900/300 = 1.0 J/K[/tex]


Example 2: A heat engine operates between 1000K and 300K with an efficiency of 40%. For each 1000 Joules of input energy, 600 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_2 = -1000/1000 + 600/300 = 1.0 J/K[/tex]


Example 3: A Carnot engine operates between 500K and 300K with an efficiency of 1 - 300/500 = 40%. For 1000 Joules of input energy (heat flow) the amount not available for useful work is 600 J. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_3 = -1000/500 + 600/300 = 0 J/K[/tex]


In Examples 1 and 2 the entropy change is the same. But there are 900 Joules of energy not available for work in the first and only 600 Joules in the second. So how does entropy tell me how much energy is not available for work?

In Example 3, the entropy change is 0 but the same amount of heat (600 J) is not available for work as in Example 2, in which the entropy change is 1 J/K. How does entropy tell me how much energy is not available for work?
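For anyone who wants to check the arithmetic in these three examples, here is a minimal Python sketch (the temperatures and heats are taken straight from the examples above; the function and variable names are my own):

[code]
def delta_S(Q_in, T_hot, Q_rejected, T_cold):
    """Total entropy change: hot reservoir loses Q_in, cold reservoir gains Q_rejected."""
    return -Q_in / T_hot + Q_rejected / T_cold

# (label, T_hot, T_cold, Q_in, heat not turned into work)
examples = [
    ("Example 1 (10% engine, 500K/300K)",  500.0,  300.0, 1000.0, 900.0),
    ("Example 2 (40% engine, 1000K/300K)", 1000.0, 300.0, 1000.0, 600.0),
    ("Example 3 (Carnot, 500K/300K)",      500.0,  300.0, 1000.0, 600.0),
]

for name, Th, Tc, Qin, Qrej in examples:
    print(f"{name}: dS = {delta_S(Qin, Th, Qrej, Tc):.1f} J/K, "
          f"heat not turned into work = {Qrej:.0f} J")
# Prints dS = 1.0, 1.0 and 0.0 J/K while the "unavailable" heat is 900, 600 and 600 J,
# which is exactly the mismatch the two questions above point out.
[/code]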

If Tc = Th, the Carnot efficiency 1 - Tc/Th would be 1 - 1 = 0.

True. But that is not what the article said. It said entropy is a measure of the energy not available for useful work in a thermodynamic process. If the entropy change, 0, is a measure of the energy not available for useful work, then one could not be faulted for thinking that 0 energy is not available for useful work.

You will find that entropy is a difficult concept and not all attempts to explain it are correct. You will just have to read good physics books and stay away from Wikipedia on this topic if you want to eventually understand it.

AM
Thank you for your response. I will pass along what you have said; perhaps I can help him learn entropy too, lol.
 
  • #27
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:

Example 1: A heat engine operates between 500K and 300K with an efficiency of 10%. For each 1000 Joules of input energy, 900 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_1 = -1000/500 + 900/300 = 1.0 J/K[/tex]


Example 2: A heat engine operates between 1000K and 300K with an efficiency of 40%. For each 1000 Joules of input energy, 600 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_2 = -1000/1000 + 600/300 = 1.0 J/K[/tex]


Example 3: A Carnot engine operates between 500K and 300K with an efficiency of 1 - 300/500 = 40%. For 1000 Joules of input energy (heat flow) the amount not available for useful work is 600 J. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_3 = -1000/500 + 600/300 = 0 J/K[/tex]


In Examples 1 and 2 the entropy change is the same. But there are 900 Joules of energy not available for work in the first and only 600 Joules in the second. So how does entropy tell me how much energy is not available for work?

In Example 3, the entropy change is 0 but the same amount of heat (600 J) is not available for work as in Example 2, in which the entropy change is 1 J/K. How does entropy tell me how much energy is not available for work?
It seems that he is retreating. All he had to say in response was this:

"The 2nd law states that ΔS ≥ 0. None of those examples violates the second law."

What is your response to this?

True. But that is not what the article said. It said entropy is a measure of the energy not available for useful work in a thermodynamic process. If the entropy change, 0, is a measure of the energy not available for useful work, then one could not be faulted for thinking that 0 energy is not available for useful work.
His reply: "Entropy is a measure of the energy not available for useful work" is another way of looking at entropy. There is not just one way to look at entropy."

What say you?
 
  • #28
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, which deals with probability, starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.

AM
This is what I was told in response to the above:

"You can watch this video. Bousso does not prove that entropy = lack of information, but in his talk, he refers to it quite frequently. He shows that entropy (amount of information) of a black hole is just its area (at time 31:05), what Hawking use in his development of black hole thermodynamics and later on Susskind in the holographic principle. I don't know what your level of knowledge in physics, but his talk is non-technical, you should be able to follow most of it. If you need help, I may be able to provide it."

 
  • #29
No you did not. I appreciate your input on the matter. What would your response be to the person who made those comments in the above post defending Wikipedia?
That was a deeply disappointing non-response.

Right at this moment you are looking at a system that embodies not only another discrete system masquerading as a continuous one, but also one that exemplifies the difference between the statistical and concrete approaches.

I mean, of course your computer screen.

As to this other person. Can he or she not join PF and speak for themselves?
 
  • #30
Rap
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, which deals with probability, starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.
AM
Without a doubt, anyone considering the connection (or lack thereof) between thermodynamic entropy and information entropy should consider the following article by Jaynes as required reading:

http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf

It's clearly written, with concrete examples, and is, at the least, a very interesting and well-reasoned article. Two other good ones by the same author are:

http://bayes.wustl.edu/etj/articles/theory.1.pdf
http://bayes.wustl.edu/etj/articles/theory.2.pdf

Another defense for the connection - I think the name "information entropy" is unfortunate. The subject of entropy is divorced from specific applications like transmission of messages and thermodynamics. It is a subject belonging to pure probability theory. Pure probability theory does not deal with temperature, or transmission lines, yet it finds application in both of these concrete problems. This is the nature of the "connection" between the two.

(BTW, thank you for the explicit response to the idea that entropy is a measure of energy unavailable for work - I never understood that idea, and now I am starting to know why. The responses from the defender of the idea are empty and evasive.)
 
  • #31
Andrew Mason
Science Advisor
Homework Helper
It seems that he is retreating. All he had to say in response was this:

"The 2nd law states that ΔS ≥ 0. None of those examples violates the second law."

What is your response to this?
He is quite correct. But I was not suggesting that the second law can be violated. It can't. I was merely attempting to explain why entropy is NOT a measure of the energy that is unavailable for useful work.


His reply: "Entropy is a measure of the energy not available for useful work" is another way of looking at entropy. There is not just one way to look at entropy."

What say you?
There are many incorrect ways and a few correct ways to look at entropy. "Entropy is a measure of the energy not available for useful work" is an incorrect interpretation of entropy, for the reasons I have given.

AM
 
  • #32
There are many incorrect ways and a few correct ways to look at entropy. "Entropy is a measure of the energy not available for useful work" is an incorrect interpretation of entropy, for the reasons I have given.

AM
His response to this was:

"I gave you a link to a video in which Bousso used the concept of entropy as information. Yet, you pursue this insane line that entropy is what you think it is. I'm wrong, wikipedia is wrong, Bousso, an outstanding physicist, is wrong."

I have come to realize that entropy in information theory and entropy in thermodynamics are different. Isn't the first paragraph of the entropy Wiki page making reference to thermodynamics? It doesn't specify. What is your response to his statement?
 
  • #33
Rap
I have come to realize that entropy in information theory and entropy in thermodynamics are different. Isn't the first paragraph of the entropy Wiki page making reference to thermodynamics? It doesn't specify. What is your response to his statement?
Don't you agree that it is a matter of words, not physics? If you believe it is physics, then suggest an experiment whose outcome we will disagree on as a result of our disagreement on the definitions of information and thermodynamic entropy.

If there is none, then our disagreement is semantic, and is only resolvable by our mutual agreement on the meaning of language. I'm interested in physics, not language, which means I will not defend to the death my use of language, or attack yours. If there is no experiment to decide, what is your basic objection to the idea that entropy is a concept in the realm of pure probability which finds application in both information theory and thermodynamics?
 
  • #34
Don't you agree that it is a matter of words, not physics? If you believe it is physics, then suggest an experiment whose outcome we will disagree on as a result of our disagreement on the definitions of information and thermodynamic entropy.

If there is none, then our disagreement is semantic, and is only resolvable by our mutual agreement on the meaning of language. I'm interested in physics, not language, which means I will not defend to the death my use of language, or attack yours. If there is no experiment to decide, what is your basic objection to the idea that entropy is a concept in the realm of pure probability which finds application in both information theory and thermodynamics?
I understand what you are saying. I have read all of your posts. My question then is, is AM incorrect when he says Wikipedia is wrong? Are the examples he put forward wrong?
 
  • #35
Rap
I understand what you are saying. I have read all of your posts. My question then is, is AM incorrect when he says Wikipedia is wrong? Are the examples he put forward wrong?
I think Wikipedia is, at worst, wrong on the subject of entropy as a measure of energy unavailable for work. I think AM's exercise shows that interpreting the statement simplistically makes that statement nonsense. I think that any defense of the statement will make use of so many extra unmentioned assumptions and conditions as to render the statement useless as a quick and easy way to understand entropy.

On the separate subject of the difference between thermodynamic and information entropy, you state:

I have come to realize that entropy in information theory and entropy in thermodynamics are different.
Again - this is a question - What is your basic objection to the idea that entropy is a concept in the realm of pure probability which finds application in both information theory and thermodynamics?
 
  • #36
Rap
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:
While I agree with AM, I found this reference which supposedly defends the idea that entropy is a measure of energy unavailable for work:

T. V. Marcella, "Entropy production and the second law of thermodynamics: An introduction to second law analysis," Am. J. Phys. 60, 888-895 (1992).

I cannot access it; if anyone can, I would be interested in what it says.
 
  • #37
I think Wikipedia is, at worst, wrong on the subject of entropy as a measure of energy unavailable for work.
Why is the definition of entropy as a measure of energy unavailable for work so common if it's wrong? Some people argue that this is just a different way of looking at entropy. Other than the exercise AM presented, what other reasons or examples can you provide that demonstrate that this explanation is in fact a bad one?
 
  • #38
Rap
Why is the definition of entropy as a measure of energy unavailable for work so common if it's wrong? Some people argue that this is just a different way of looking at entropy. Other than the exercise AM presented, what other reasons or examples can you provide that demonstrate that this explanation is in fact a bad one?
I think what is needed is a quantitative explanation of why it is true. If the explanation works, then fine; if not, then that's another indication that it is flawed.
 
  • #39
Andrew Mason
Science Advisor
Homework Helper
Why is the definition of entropy as a measure of energy unavailable for work so common if it's wrong? Some people argue that this is just a different way of looking at entropy. Other than the exercise AM presented, what other reasons or examples can you provide that demonstrate that this explanation is in fact a bad one?
You will find there are many misleading explanations or definitions of entropy. Repeating them over and over does not make them more accurate. Entropy = disorder is misleading unless you have a very specific definition of disorder. "Entropy is a measure of the energy unavailable for useful work" is also inaccurate and misleading.

An increase in entropy (of the system + surroundings) is certainly related to an increase in the amount of heat flow into a system that is not turned into useful work in a thermodynamic process. The greater the increase in total entropy the less efficient the thermodynamic process in terms of the ratio of useful work produced to input heat flow energy.

But you cannot conclude from this that entropy is a measure of the heat flow that is not available to do useful work. The change in entropy relates to only a part of the heat flow that does not produce useful work. Much of that heat flow is not available to do useful work even if there is no increase in entropy (reversible process).

AM
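One way to make the last paragraph of the post above quantitative is the standard "lost work" relation, W_lost = Tc x ΔS_total: the total entropy increase measures the work lost relative to a reversible engine running between the same reservoirs, not the total heat rejected. AM does not state that formula here, so the sketch below is offered only as an illustration of it, using the numbers from Examples 1 and 2:

[code]
def lost_work(T_hot, T_cold, Q_in, W_actual):
    Q_rejected = Q_in - W_actual
    dS_total = -Q_in / T_hot + Q_rejected / T_cold    # entropy change of both reservoirs, J/K
    W_reversible = (1.0 - T_cold / T_hot) * Q_in      # work a Carnot engine would deliver, J
    return dS_total, W_reversible - W_actual

for label, Th, Tc, Qin, W in [("Example 1", 500.0, 300.0, 1000.0, 100.0),
                              ("Example 2", 1000.0, 300.0, 1000.0, 400.0)]:
    dS, lost = lost_work(Th, Tc, Qin, W)
    print(f"{label}: dS = {dS:.1f} J/K, lost work = {lost:.0f} J, Tc*dS = {Tc * dS:.0f} J")
# Both examples give dS = 1.0 J/K and lost work = 300 J = Tc*dS, even though the rejected
# heat differs (900 J vs 600 J): Tc*dS measures the extra work lost, not the unavailable heat.
[/code]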
 
  • #40
You will find there are many misleading explanations or definitions of entropy. Repeating them over and over does not make them more accurate. Entropy = disorder is misleading unless you have a very specific definition of disorder. "Entropy is a measure of the energy unavailable for useful work" is also inaccurate and misleading.

AM
Ok, I am convinced information entropy and thermodynamic entropy don't mix well. However, even for the macroscopic, thermodynamic meaning of entropy, I still don't get what it is. I currently imagine it is a measure of energy that goes "elsewhere", to a place where it can't be changed into other forms. A system with high entropy has little "flow-able" energy, whether it flows within or outside; a low entropy system means much of the energy could (potentially) flow. An analogy for my picture would be a glass of water with a fixed number of molecules. If more of the water is in the form of ice (entropy), less water (energy) can flow inside the cup or be poured away.

Does that match what macroscopic thermodynamic entropy is?
 
  • #41
Rap
Ok, I am convinced information entropy and thermodynamic entropy don't mix well.
Why are you convinced of this?
 
  • #42
Why are you convinced of this?
Because when the term entropy was first invented, people still did not know the molecular nature of thermodynamics, and the original meaning had nothing to do with disorder. Before the concept was extended to other fields, the purpose of inventing the notion of entropy was just to understand the action of steam engines. So one should be able to understand what (thermodynamic) entropy on the macroscopic scale is without bothering with disorder or information.
 
  • #43
Rap
Because when the term entropy was first invented, people still did not know the molecular nature of thermodynamics, and the original meaning had nothing to do with disorder. Before the concept was extended to other fields, the purpose of inventing the notion of entropy was just to understand the action of steam engines. So one should be able to understand what (thermodynamic) entropy on the macroscopic scale is without bothering with disorder or information.
But then that would mean that we can understand temperature without understanding the energy of a molecule, we can understand pressure without understanding the impact of molecules on a surface. In other words, you are saying we don't need statistical mechanics or atomic theory to understand classical thermodynamics. Actually, that is true, but there is so much more understanding to be gained by understanding all of these things in terms of statistical mechanics. Once you try to gain more understanding of entropy by using statistical mechanics, then you get the connection to information entropy.
 
  • #44
It is important to understand the difference between molecules and classical particles.
 
  • #45
But then that would mean that we can understand temperature without understanding the energy of a molecule, we can understand pressure without understanding the impact of molecules on a surface. In other words, you are saying we don't need statistical mechanics or atomic theory to understand classical thermodynamics. Actually, that is true, but there is so much more understanding to be gained by understanding all of these things in terms of statistical mechanics. Once you try to gain more understanding of entropy by using statistical mechanics, then you get the connection to information entropy.
I understand how useful it is to understand phenomena in microscopic terms. But as a noob in thermodynamics, I don't even have a concept to begin my connection with (referring to your last sentence). It is like when talking about pressure: people just think of particles colliding with walls without realizing the corresponding macroscopic phenomenon is the walls being "pushed" by a force. When talking about thermodynamic entropy, people just talk about disorder without linking the microscopic phenomena to the macroscopic one. The lack of macro-micro linking is the main reason why the notion of entropy is difficult to grasp for newcomers. Although it may turn out that the microscopic aspect is more general and hence more applicable, as a stepping stone it is still useful to know the macro aspect of entropy.

Anyway, here is what I found in a site:
http://www.science20.com/train_thought/blog/entropy_not_disorder-75081

Historically, there was a problem of how to tell the thermal energy of a system from its temperature. Entropy was invented originally to state the relationship between temperature and thermal energy. Bodies with the same temperature can have different thermal energies. Common sense says higher temperature must mean more thermal energy, but that is not always true. In fact there can be a body with high thermal energy but low temperature (due to high entropy), and vice versa. This is the macroscopic phenomenon of entropy.

To explain this common-sense-violating weirdo, the microscopic explanation comes into play. If the gas molecules have more ways to move, vibrate, and oscillate, then they can gain more energy without flying faster (having higher KE). That's why bodies with the same temperature (same average KE) can have different thermal energies.
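The "more ways to move" point can be made concrete with the equipartition theorem for ideal gases (this worked example is mine, not from the linked article): at the same temperature, a diatomic gas stores more thermal energy than a monatomic one because each molecule has more degrees of freedom.

[code]
R = 8.314      # J/(mol K), ideal gas constant
T = 300.0      # K, same temperature for both samples
n = 1.0        # mol of gas in each sample

U_monatomic = (3.0 / 2.0) * n * R * T   # 3 translational degrees of freedom
U_diatomic  = (5.0 / 2.0) * n * R * T   # 3 translational + 2 rotational

print(f"Monatomic gas at {T:.0f} K: U = {U_monatomic:.0f} J")   # ~3741 J
print(f"Diatomic gas  at {T:.0f} K: U = {U_diatomic:.0f} J")    # ~6236 J
# Same temperature, more thermal energy in the diatomic gas, because its molecules
# have more ways to store energy -- the point made in the paragraph above.
[/code]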

And then somebody proposed that the more ways to move a system has, the more messy (disordered) it is. It was not until then that entropy was related to disorder, and only later to randomness in information. Finally the term entropy was extended to other fields which have nothing to do with thermal energy.

To end with an analogy: we learn atoms first as tiny balls, then as a micro-solar system with electrons orbiting the nucleus, then as an electron cloud around the nucleus in a quantum-mechanical way. Although the quantum-mechanical model has the strongest explanatory power, we still learn its earlier, less predictive models because the full model is too un-intuitive for newcomers to grasp. So similarly, it is a bad idea to just talk about the deepest meaning of entropy and hope a few geniuses can retro-deduce it back to the macroscopic aspect.
 
  • #46
Rap
I understand how useful it is to understand phenomena in microscopic terms. But as a noob in thermodynamics, I don't even have a concept to begin my connection with (referring to your last sentence). It is like when talking about pressure: people just think of particles colliding with walls without realizing the corresponding macroscopic phenomenon is the walls being "pushed" by a force. When talking about thermodynamic entropy, people just talk about disorder without linking the microscopic phenomena to the macroscopic one. The lack of macro-micro linking is the main reason why the notion of entropy is difficult to grasp for newcomers. Although it may turn out that the microscopic aspect is more general and hence more applicable, as a stepping stone it is still useful to know the macro aspect of entropy.
Entropy is not only difficult for newcomers, it is difficult for almost everyone. Forget about "disorder", that only gives you a "feel" for entropy. "disorder" is not rigorously defined, and the idea gets squishy if you push it too far. The link between thermodynamic entropy and information entropy is the beginning of the "macro-micro linking" you are looking for. The thermodynamic entropy is equal to the Boltzmann constant times the information entropy, and the information entropy is the minimum number of yes/no questions you have to ask to determine the microstate, given that you know the macrostate (temperature, pressure, etc.). Sort of like the game of twenty questions. This doesn't give you the complete link, but it goes a LONG way towards it.

Anyway, here is what I found in a site:
http://www.science20.com/train_thought/blog/entropy_not_disorder-75081
This guy uses "diversity" as giving a better feel for entropy. I think it is better. But to be exact, what is "diversity"? If I flip four fair coins but don't look at them yet, the info entropy is 4 bits. I have to ask at least 4 questions about them to know how they came out. With 8 coins, it's 8 questions. If you say that the 8 coin result is exactly twice as diverse as the 4 coin result, then yes, diversity is a good word, because it is exactly proportional to the information entropy. Call it whatever you want, and as long as what you call it is proportional to the information entropy, you will be correct. The thermo entropy is proportional to the info entropy by the Boltzmann constant, so either one is good.
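To put numbers on the coin example, and on the statement that thermodynamic entropy is the Boltzmann constant times the information entropy, here is a small sketch (the conversion factor k_B ln 2 per bit follows from S = k ln Ω with natural logs; the code itself is mine):

[code]
import math

k_B = 1.380649e-23   # J/K, Boltzmann constant

for n_coins in (4, 8):
    omega = 2 ** n_coins                  # number of equally likely outcomes
    H_bits = math.log2(omega)             # information entropy in bits (= n_coins)
    S = k_B * math.log(omega)             # thermodynamic-style entropy, S = k ln(omega)
    print(f"{n_coins} coins: H = {H_bits:.0f} bits, S = k*ln(omega) = {S:.2e} J/K")
# Doubling the number of coins doubles both H and S; the two differ only by the
# constant factor k_B * ln(2) per bit.
[/code]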

Historically, there was a problem of how to tell the thermal energy of a system from its temperature. Entropy was invented originally to state the relationship between temperature and thermal energy. Bodies with the same temperature can have different thermal energies. Common sense says higher temperature must mean more thermal energy, but that is not always true. In fact there can be a body with high thermal energy but low temperature (due to high entropy), and vice versa. This is the macroscopic phenomenon of entropy.

To explain this common-sense-violating weirdo, the microscopic explanation comes into play. If the gas molecules have more ways to move, vibrate, and oscillate, then they can gain more energy without flying faster (having higher KE). That's why bodies with the same temperature (same average KE) can have different thermal energies.

And then somebody proposed that the more ways to move a system has, the more messy (disordered) it is. It was not until then that entropy was related to disorder, and only later to randomness in information. Finally the term entropy was extended to other fields which have nothing to do with thermal energy.

To end with an analogy: we learn atoms first as tiny balls, then as a micro-solar system with electrons orbiting the nucleus, then as an electron cloud around the nucleus in a quantum-mechanical way. Although the quantum-mechanical model has the strongest explanatory power, we still learn its earlier, less predictive models because the full model is too un-intuitive for newcomers to grasp. So similarly, it is a bad idea to just talk about the deepest meaning of entropy and hope a few geniuses can retro-deduce it back to the macroscopic aspect.
That's the part of thermodynamic entropy that information entropy does not give much help on - the connection to the rest of thermodynamics. Thermodynamic entropy is Boltzmann's constant times the information entropy. The internal energy is temperature times Boltzmann's constant times the information entropy. Boltzmann's constant is historically connected to entropy rather than temperature, but it's just as good to define a new temperature T'=kT and say that internal energy is the new temperature times the information entropy. Now the real problem is to understand temperature. I think that once information entropy is understood, the real problem lies in understanding temperature. People don't worry about temperature, because it's something we intuitively "understand" - hot and cold and all that. But the real problem is the bridge from information entropy to internal energy, and that bridge is the temperature. Anyone who claims to understand thermodynamic entropy MUST understand temperature, and not in the "hot and cold" sense, but in the sense of being the bridge between internal energy and information entropy, two things we can more easily understand.
 
  • #47
That's the part of thermodynamic entropy that information entropy does not give much help on - the connection to the rest of thermodynamics. Thermodynamic entropy is Boltzmann's constant times the information entropy. The internal energy is temperature times Boltzmann's constant times the information entropy. Boltzmann's constant is historically connected to entropy rather than temperature, but it's just as good to define a new temperature T'=kT and say that internal energy is the new temperature times the information entropy. Now the real problem is to understand temperature. I think that once information entropy is understood, the real problem lies in understanding temperature. People don't worry about temperature, because it's something we intuitively "understand" - hot and cold and all that. But the real problem is the bridge from information entropy to internal energy, and that bridge is the temperature. Anyone who claims to understand thermodynamic entropy MUST understand temperature, and not in the "hot and cold" sense, but in the sense of being the bridge between internal energy and information entropy, two things we can more easily understand.
Temperature is relatively easy to understand. Both the macro meaning (ease of conducting heat away) and micro meaning (average KE in molecules) are intuitive enough to understand and linked quickly.

Entropy, macroscopically, is a (not directly proportional) measure of thermal energy that doesn't raise the temperature of a body. Microscopically, entropy is a measure of how many ways the molecules can oscillate (more available oscillations means less KE and a lower temperature), which is directly proportional to the number of ways to arrange the molecules of a body. This is how thermodynamic entropy and information entropy are related.

The missing piece of logic that makes thermodynamic entropy and information entropy seem so unrelated is the hidden relationship between molecular arrangements and available oscillations, which is expressed mathematically by Boltzmann's constant.
 
  • #48
Temperature is relatively easy to understand. Both the macro meaning (ease of conducting heat away) and micro meaning (average KE in molecules) are intuitive enough to understand and linked quickly.

Entropy, macroscopically, is a (not directly proportional) measure of thermal energy that doesn't raise the temperature of a body. Microscopically, entropy is a measure of how many ways the molecules can oscillate (more available oscillations means less KE and a lower temperature), which is directly proportional to the number of ways to arrange the molecules of a body. This is how thermodynamic entropy and information entropy are related.

The missing piece of logic that makes thermodynamic entropy and information entropy seem so unrelated is the hidden relationship between molecular arrangements and available oscillations, which is expressed mathematically by Boltzmann's constant.
So was all this a statement that a cathode ray does or does not possess a definable and measurable temperature?
 
  • #49
So was all this a statement that a cathode ray does or does not possess a definable and measurable temperature?
I don't see your logic. A cathode ray, i.e. electrons, consists of moving particles, so they have kinetic energy, and you can calculate the average kinetic energy as a temperature. Though when talking about beams we usually use power to describe the heat transfer instead of temperature, because you can't put a travelling beam of electrons next to an object like a block.
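The "average kinetic energy as a temperature" remark can be made concrete with the kinetic-temperature relation (3/2)kT = average KE; the 1 eV figure below is just an illustrative number I chose, not something from the thread:

[code]
k_B = 1.380649e-23      # J/K, Boltzmann constant
eV  = 1.602176634e-19   # J per electronvolt

mean_KE = 1.0 * eV                      # suppose the electrons average 1 eV of kinetic energy
T_kinetic = 2 * mean_KE / (3 * k_B)     # solve (3/2) k T = <KE> for T
print(f"Kinetic temperature for 1 eV average KE: {T_kinetic:.0f} K")   # roughly 7700 K
[/code]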

And I don't understand what you want to demonstrate by throwing out the question.
 
  • #50
I notice you avoided my comment, just as you did for post #44 and this thread

https://www.physicsforums.com/showthread.php?t=517005

When my comments were designed to help you tighten up on some loose terminology, which I think is hindering your understanding.

In particular, specific heat is a property of the system/material itself, not of the process.

You can only add or subtract heat from a material/system at the rate (per degree of temperature, not per unit time) set by its specific heat.

Entropy is a process property. You can add more or less heat to a system, and the process equations (first and second laws, etc.) lead to different entropy (and other variable) solutions.

go well
 
