Entropy and thermodynamics.

  • Thread starter revo74
  • #1
The idea that entropy is the same as disorder is false. This is a misconception many people have made, myself included.

A person wrote this in response to the statement "Entropy does NOT mean disorder."
Is this correct? Please elaborate on this. Additionally, isn't thermodynamic entropy different from information entropy?

“Technically you are right. The entropy S is:

S = – Σ P(i) log P(i)

where P(i) is the probability of a particle being in a state i, and Σ means the sum over the states. It was Boltzmann who advocated the idea that entropy was related to disorder. In Boltzmann's mind, the more ways a system could move internally, the more disorderly the system was. A system in "perfect order" was one in which all the molecules were locked in perfect array without any freedom of movement whatsoever, hence S would be at a minimum. An example of low entropy (high order) would be ice, while water vapor would be an example of high disorder and high entropy.

That was in the 19th century, and this concept prevailed until the mid 20th century. It was Shannon in the 1940s who revolutionized the field of thermodynamics. He defined entropy in terms of information. If my knowledge of the system is high, entropy is low. If my lack of information is high, entropy is high. This is the definition currently accepted. It's the one that Susskind used to derive his groundbreaking concept of the holographic principle. Instead of order/disorder you have entropy as the amount of information. The bigger the system, that is, the greater the number of degrees of freedom the system has, the greater is my lack of information in terms of which state the particle is in. A gas is an example of a system with high entropy: the molecules occupy a greater number of states with greater velocities, my lack of information is therefore very high, hence entropy is high. In Boltzmann's terms, "a system in "perfect order" was one in which all the molecules were locked in perfect array without any freedom of movement"; that would mean I know where the particles are in the case of perfect order, my knowledge is high, my lack of information is low, hence entropy is low. So you can see that order/disorder and amount of information are equivalent. They don't contradict each other.

Now take the case of the Big Bang. Initially, the universe occupied a tiny volume, the size of the Planck length. My knowledge of where all the particles are is high (low entropy). As the universe expands, that is, as it occupies more and more volume, it's harder for me to keep track of each particle, my lack of knowledge grows, so entropy increases. I can still look at this scenario in terms of order/disorder. Initially, the universe has low entropy (high order); as it expands, it becomes more disordered (entropy increases). In either description, there is no contradiction.

Also see: Entropy_(information_theory)”
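As a quick sanity check of that formula (a toy two-state example, using the natural log): if the system is equally likely to be in either of two states, P(1) = P(2) = 1/2, so

[tex]S = -\left(\tfrac{1}{2}\ln\tfrac{1}{2} + \tfrac{1}{2}\ln\tfrac{1}{2}\right) = \ln 2 \approx 0.69[/tex]

whereas if the state is known with certainty, one P(i) = 1 and S = 0, which matches the "more knowledge, lower entropy" reading in the quote.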
 

Answers and Replies

  • #2
revo74
I would appreciate even one response.
 
  • #3
Drakkith
Staff Emeritus
Science Advisor
Entropy has more than one meaning. In thermodynamics it refers to the amount of energy in a system not available for useful work. For information, it is a little more complicated, but the person you quoted was pretty much correct. There is also Statistical Thermodynamics Entropy. See here: http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
 
  • #4
I like Serena
Homework Helper
I would appreciate even one response.
Response here :wink:.

I had subscribed to your thread to see if anyone would give some insights, because I'm afraid I'm out of my depth here, and I am interested!
 
  • #5
Rap
The idea that entropy is disorder is an attempt to give an intuitive meaning to entropy, and it's not too bad, but not perfect. The description given in the quote is rather good. To understand it better, you should understand the concepts of macrostate and microstate. If you have a homogeneous gas in equilibrium, its macrostate is given by, for example, its volume, pressure, and the number of molecules in the volume. Its microstate is given by the microscopic description of each particle of the gas. In quantum mechanics, there are only a limited number of ways that you can have those particles in that volume, at that pressure, but the number is huge. This number is the number of microstates. The thermodynamic entropy is proportional to the logarithm of the number of possible microstates you could have that would give you the same macrostate.

An analogy that is usually made is that if you have a messy desk, there are many different ways to have a messy desk, but only a few ways to have an ordered desk. If you think of disorder as having the quality that there are many ways to have disorder and fewer ways to have order, then the disorder analogy works. If you ever find a case where the analogy doesn't work, throw it out, because it's the microstate/macrostate idea that works. Entropy is best thought of as a measure of uncertainty about the microstate, given the macrostate. However, there is more to thermodynamic entropy. Thermodynamic entropy is proportional to the natural logarithm of the number of microstates for a given macrostate, and the constant of proportionality is Boltzmann's constant. Thermodynamic entropy has many physical implications regarding temperature, internal energy, etc., and these physical implications are mostly missing in the idea of information entropy. The natural logarithm of the number of microstates for a given macrostate is the "information" part of thermodynamic entropy; the Boltzmann constant multiplying it is what turns it into thermodynamic entropy, with all of the physical implications. Many people object to using the word "entropy" to mean anything but the full thermodynamic entropy, with all of its physical implications. I don't.

Another aspect of the disagreement is that thermodynamic entropy was defined in terms of macroscopic measurements long before Boltzmann gave an explanation of entropy in terms of microstates and macrostates. This means that, strictly speaking, thermodynamic entropy is not defined in terms of information entropy; it is only when you consider Boltzmann's analysis that you see that thermodynamic entropy can be explained (but not defined!) in terms of an information-entropy idea. As long as you understand all of these disagreements, you see that the whole disagreement is about words, not about how things work. In other words, it's a semantic argument, and not a "physics" argument.

It is interesting to note that, given a macrostate, the number of yes/no questions you would have to ask to determine the microstate is, for very large numbers of microstates, equal to the logarithm base 2 of the number of microstates. The logarithm base 2 of a number is proportional to the natural log of the number, so you can see from a different approach how information entropy enters into the definition of thermodynamic entropy.
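A minimal numerical sketch of the counting idea above (a toy model chosen just for illustration: N coins instead of gas particles, with the macrostate taken to be the number of heads):

[code]
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
N = 100                     # number of coins (toy stand-in for particles)
n_heads = 50                # the macrostate: how many coins show heads

# Number of microstates compatible with this macrostate
omega = math.comb(N, n_heads)

S_thermo = k_B * math.log(omega)   # thermodynamic-style entropy, S = k ln(omega)
n_questions = math.log2(omega)     # ~ number of yes/no questions to pin down the microstate

print(f"omega = {omega:.3e}")
print(f"S = k ln(omega) = {S_thermo:.3e} J/K")
print(f"log2(omega) = {n_questions:.1f} bits")
[/code]

The maximally "disordered" macrostate (n_heads = 50) has the most microstates and hence the highest entropy; a perfectly ordered one (n_heads = 0 or 100) has only one microstate and zero entropy.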
 
  • #6
revo74
Entropy has more than one meaning. In thermodynamics it refers to the amount of energy in a system not available for useful work. For information, it is a little more complicated, but the person you quoted was pretty much correct. There is also Statistical Thermodynamics Entropy. See here: http://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
Isn't that quote incorrectly lumping entropy in thermodynamics and information together?
 
  • #7
Andrew Mason
Science Advisor
Homework Helper
In thermodynamics it refers to the amount of energy in a system not available for useful work.
This is not quite correct. Change in entropy or entropy difference between two states is a measure of the amount of potentially available useful work that is not realized in a thermodynamic process occurring between those two states. It is a measure of how far from "reversible" the actual thermodynamic process between the initial and final state was. In other words, if you stored all the work from the actual process as potential energy (e.g. used the work to lift a weight) and then used that energy to drive the process backward, the entropy difference tells you how close to the initial state you would end up. The greater the entropy difference, the further from the initial state you would be.

Note: Incidentally, be very careful about relying on Wikipedia to understand entropy. There is a lot of incorrect information in the article on entropy. The first statement in that article is quite wrong:

Wikipedia>Entropy said:
Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work entropy accumulates in the system, but has to be removed by dissipation in the form of waste heat.
For example, a (theoretical) Carnot cycle operating between two reservoirs is a reversible cycle, so there is no increase in entropy. According to this statement, there should be 100% efficiency: 0 energy not available for useful work. This, of course, is incorrect.

Other errors: "Convertible energy" is not defined and at any rate, is not correctly used. If the theoretical maximum efficiency is achieved, there is no change in entropy so entropy would not "accumulate" in the system. Entropy is not "waste heat".

AM
 
  • #8
Studiot
Classical thermodynamics recognises certain properties which were originally thought to be infinitely divisible.

Later work introduced mechanisms as to how some of these properties might arise in a discrete universe.

So, for instance, the kinetic theory offers a pretty accurate insight into how the actions of many small particles might combine to display the macroscopic property known as pressure.

Another very simple property, that unfortunately had been elevated to mystical proportions to frighten schoolboys, was then shown to be derivable from probability considerations.

In both cases the introduction of the later work has greatly extended our knowledge and predictive capability.

No, of course they are not the same; they describe fundamentally different views of reality. But when addressing the same question they produce the same answer, and sometimes one or the other has a calculational advantage.
 
  • #9
revo74
This is not quite correct. Change in entropy or entropy difference between two states is a measure of the amount of potentially available useful work that is not realized in a thermodynamic process occurring between those two states. It is a measure of how far from "reversible" the actual thermodynamic process between the initial and final state was. In other words, if you stored all the work from the actual process as potential energy (e.g. used the work to lift a weight) and then used that energy to drive the process backward, the entropy difference tells you how close to the initial state you would end up. The greater the entropy difference, the further from the initial state you would be.

Note: Incidentally, be very careful about relying on Wikipedia to understand entropy. There is a lot of incorrect information in the article on entropy. The first statement in that article is quite wrong:

For example, a (theoretical) Carnot cycle operating between two reservoirs is a reversible cycle, so there is no increase in entropy. According to this statement, there should be 100% efficiency: 0 energy not available for useful work. This, of course, is incorrect.

Other errors: "Convertible energy" is not defined and at any rate, is not correctly used. If the theoretical maximum efficiency is achieved, there is no change in entropy so entropy would not "accumulate" in the system. Entropy is not "waste heat".

AM
Thank you for your reply.

Isn't it correct that entropy in thermodynamics is different from entropy in information theory – and if so, wouldn't that mean the post I quoted in my original post is partially incorrect? If true, can you elaborate on how? Thank you.
 
  • #10
Andrew Mason
Science Advisor
Homework Helper
Thank you for your reply.

Isn't it correct that entropy in thermodynamics is different from entropy in information theory – and if so, wouldn't that mean the post I quoted in my original post is partially incorrect? If true, can you elaborate on how? Thank you.
In thermodynamics the change or difference in entropy between two states (ie states of the system and surroundings) is the integral of dQ/T of the system and surroundings over the reversible path(s) between those two states (a non-reversible process will change the states of the system and surroundings in such a way that two different reversible paths will be required to determine the change in entropy - one for the system and another for the surroundings). The concept was developed before anyone understood about atoms and molecules.

Boltzmann found that the underlying basis for the second law of thermodynamics was statistical. The second law is essentially a statistical law. In thermodynamics, particles in equilibrium follow Maxwell-Boltzmann statistics (i.e. a temperature can be defined).

When a similar concept is applied to information, the populations (ie of whatever it is you are doing statistics on) don't follow Maxwell-Boltzmann statistics (ie. there is no temperature of information). So the concept is quite different and really should not be confused or compared to thermodynamic entropy.
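Just to make the thermodynamic definition above concrete (an illustrative calculation, with the numbers chosen only for the sketch): reversibly heating 1 kg of water, with specific heat c ≈ 4186 J/(kg·K), from 300 K to 350 K gives

[tex]\Delta S = \int \frac{dQ_{rev}}{T} = \int_{300}^{350} \frac{mc\,dT}{T} = mc\,\ln\frac{350}{300} \approx 645\ \mathrm{J/K}[/tex]

No statistical or information-theoretic language is needed for this kind of calculation, which is the sense in which the classical definition stands on its own.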

AM
 
  • #11
Rap
When a similar concept is applied to information, the populations (ie of whatever it is you are doing statistics on) don't follow Maxwell-Boltzmann statistics (ie. there is no temperature of information). So the concept is quite different and really should not be confused or compared to thermodynamic entropy.
AM
This is not correct. The thermodynamic entropy is the product of the Boltzmann constant and the information entropy. It is not an analogy which fails at some point, it is an identity. Multiplication of the dimensionless information entropy by the Boltzmann constant (which has units of thermodynamic entropy) produces a physical variable which then is involved in the richer theory of thermodynamics.

For example, from an information point of view, if you have a gas at a constant volume with a set of discrete energy levels, the energies form the "alphabet" of a "message". The "message" is a string of such "characters" and each position in the "message" corresponds to a particular particle of the gas, the "character" specifying the energy of that particle. If there are N particles, there are N "characters" in the "message". The probability of a given "character" being energy [itex]\varepsilon_i[/itex] is given by the Boltzmann distribution:

[tex]p_i=\frac{e^{-\varepsilon_i/kT}}{Z}[/tex]

where k is the Boltzmann constant, T is temperature, and Z is the partition function:

[tex]Z=\sum_i e^{-\varepsilon_i/kT}[/tex]

These probabilities are independent and identically distributed. The dimensionless information entropy is:

[tex]H=\sum_i p_i \ln(1/p_i) = \frac{U}{kT}+\ln(Z)[/tex]

where U is the internal energy:

[tex]U=\sum_i \varepsilon_i p_i[/tex]

Multiplying by kT, and noting that kH = S (the thermodynamic entropy) and that the Helmholtz free energy is A = -kT ln(Z), you get the Euler-integrated form of the first law at constant volume:

[tex]U=TS+A[/tex]

Shannon's source coding theorem applied to thermodynamics is not very interesting (I guess) - it states that the minimum number of particles needed to yield the same thermodynamic entropy is that entropy divided by k times the log of the number of energy levels, which is limited by the finite internal energy U.
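A minimal numerical check of the identities above (toy energy levels and a temperature chosen arbitrarily for the sketch):

[code]
import math

k = 1.380649e-23                     # Boltzmann constant, J/K
T = 300.0                            # temperature in K (arbitrary)
energies = [0.0, 1.0e-21, 2.0e-21]   # toy single-particle energy levels in J (made up)

Z = sum(math.exp(-e / (k * T)) for e in energies)         # partition function
p = [math.exp(-e / (k * T)) / Z for e in energies]        # Boltzmann probabilities

H = sum(pi * math.log(1.0 / pi) for pi in p)   # dimensionless information entropy
U = sum(e * pi for e, pi in zip(energies, p))  # mean energy per particle
S = k * H                                      # thermodynamic entropy per particle
A = -k * T * math.log(Z)                       # Helmholtz free energy per particle

print(U, T * S + A)   # the two numbers agree, i.e. U = TS + A
[/code]

Per particle the agreement is exact (up to floating-point rounding), since H = U/kT + ln(Z) is an identity for the Boltzmann distribution.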
 
  • #12
Andrew Mason
Science Advisor
Homework Helper
This is not correct. The thermodynamic entropy is the product of the Boltzmann constant and the information entropy. It is not an analogy which fails at some point, it is an identity. Multiplication of the dimensionless information entropy by the Boltzmann constant (which has units of thermodynamic entropy) produces a physical variable which then is involved in the richer theory of thermodynamics.
I just said that they are two quite different things, conceptually and practically. There may be a similarity in the mathematical expressions but that is about it, as far as I can see.

One is related to, and depends upon, temperature. The other is related to information theory, which has no concept of temperature. They are both statistical concepts. In thermodynamics, entropy is a statistical notion based on the logarithm of the number of microstates that a system of particles can have in phase space (momentum and position each with 3 degrees of freedom). In information theory, it is the logarithm of a set of event probabilities. You can define entropy for a small number of events - eg. two coin flips. You cannot do that in thermodynamics. You need sufficient numbers of particles to have a temperature. Two particles cannot have a temperature.

AM
 
  • #13
Rap
I just said that they are two quite different things, conceptually and practically. There may be a similarity in the mathematical expressions but that is about it, as far as I can see.

One is related to, and depends upon, temperature. The other is related to information theory, which has no concept of temperature. They are both statistical concepts. In thermodynamics, entropy is a statistical notion based on the logarithm of the number of microstates that a system of particles can have in phase space (momentum and position each with 3 degrees of freedom). In information theory, it is the logarithm of a set of event probabilities. You can define entropy for a small number of events - eg. two coin flips. You cannot do that in thermodynamics. You need sufficient numbers of particles to have a temperature. Two particles cannot have a temperature.

AM
Do you find any fault with the above example? Do you agree that the thermodynamic entropy is equal to the Boltzmann constant times the information entropy? If you do, then we only have a semantic argument. I mean, as long as we agree on how things work, then the name tags we put on our concepts are not important, except that they interfere with our ability to communicate with each other. I'm not too interested in semantic arguments.

Is there anything in our disagreement that would cause us to come up with conflicting predictions? Is there any way to test the validity of our viewpoints? If not, let's agree to disagree.

Regarding the information analog of temperature, in thermodynamics we have dU = T dS (for constant volume and number of particles). So, in order to have an information analog of temperature, we will also need an information analog of internal energy. Conservation of energy, when applied to the "message" of the energies of the individual particles, implies that the sum of all the "characters" in the "message" is a constant. I would say that if we impose such a conservation law on any information message in which the probabilities are of the form [itex]p_i \propto \exp(-\varepsilon_i\beta)[/itex], where [itex]\beta[/itex] is some constant, then we will have an information analog of temperature ([itex]T \propto 1/\beta[/itex]).
 
  • #14
Andrew Mason
Science Advisor
Homework Helper
Do you find any fault with the above example? Do you agree that the thermodynamic entropy is equal to the Boltzmann constant times the information entropy?
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, dealing with probability starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.

AM
 
  • #15
Studiot
But your argument doesn't persuade me.
I agree

There are many pairs of 'processes' in physics that possess the same mathematical form, but are not connected.

For example those which obey an inverse square law.

It does not follow that just because they have the same formula that say, gravitational and electrostatic attraction are one and the same.

Further, as I pointed out earlier, discrete mathematics can never exactly replicate continuous mathematics.
So we should not lose a sense of wonder at how many apparently continuous natural processes now seem to be the aggregate result of discrete actions.
 
  • #16
Rap
I am saying that I don't see a real and substantial connection between the two concepts. Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, dealing with probability starts with a very different concept - the number of 1s and 0s required to convey an intelligible message. I don't see how the two really have much to do with each other. If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider. But your argument doesn't persuade me.
AM
I think we have an untestable difference of opinion - a semantic argument, which, as I said, is not very interesting to me.

Just to be sure, though, what would you say would qualify as a "real and substantial connection"?
 
  • #17
Andrew Mason
Science Advisor
Homework Helper
Just to be sure, though, what would you say would qualify as a "real and substantial connection"?
A connection that is real and is one of substance.

To use Studiot's example, the fact that g = GM/r^2 and E = kQ/r^2 does not demonstrate that there is a real physical connection between g and E. The connection is one of form, not substance.

AM
 
  • #18
Rap
To use Studiot's example, the fact that g = GM/r^2 and E = kQ/r^2 does not demonstrate that there is a real physical connection between g and E. The connection is one of form, not substance.
AM
Another thing - the intensity of sound from a point source falls off as 1/r^2. Also, the area on a sphere encompassed by a cone with a vertex at the origin is proportional to the square of the radius of the sphere. These examples illustrate an underlying property of 3-dimensional Euclidean space. I would not say there is a physical connection between g and E and sound, but I would say that the similarity in the laws is not entirely coincidental, but says a lot about the nature of Euclidean 3-space. In other words, the connection is more than one of form. There is a substantial connection between Newton's law of gravity, Coulomb's law, and the theory of acoustic waves, and this connection is via the geometry of Euclidean 3-space.

Similarly, there is a substantial connection between the transmission and encoding of electronic messages and thermodynamics via information theory.

Trying to find the information equivalent of temperature, internal energy, etc. is like trying to find the analog of charge in the study of solid geometry. The 1/r^2 law follows from the properties of space, but you cannot say that Coulomb's law has nothing to do with the properties of space simply because there is no analog of charge in the study of solid geometry.

The connection between thermodynamic and information entropy is likewise, more than one of form. Information entropy isolates certain statistical ideas, independent of any physical application. It can then be applied to the transmission and encoding of electronic messages. It can also be applied to the concept of thermodynamic entropy. Thermodynamic entropy incorporates this theory into a richer theory of thermodynamics, just as Coulomb's law incorporates Euclidean geometry into a richer theory of electrostatics.

If I am sending electronic messages on a twisted pair, can I argue that information theory has no application here, because there is no analog of a twisted pair in information theory? I say no.
 
  • #19
Studiot
These examples illustrate an underlying property of 3-dimensional Euclidean space. I would not say there is a physical connection between g and E and sound, but I would say that the similarity in the laws is not entirely coincidental, but says a lot about the nature of Euclidean 3-space.
You might say that, but such a deduction would be in the same realm as Sherlock Holmes deducing connections between travellers who happen to be passengers on the same No. 17 Clapham Omnibus.

AM only quoted the shortened form of the gravitational and electrostatic inverse square force laws.

If you take the complete versions you will immediately notice they are not quite the same, since mass is a positive definite quantity and charge comes in two polarities.
 
  • #21
Rap
You might say that, but such a deduction would be in the same realm as Sherlock Holmes deducing connections between travellers who happen to be passengers on the same No. 17 Clapham Omnibus.
Yes, and those connections are weaker, but substantial and not one of form.

If you take the complete versions you will immediately notice they are not quite the same, since mass is a positive definite quantity and charge comes in two polarities.
Right, and there is no twisted pair in thermodynamics.

Again - this is a semantic argument, I think we agree on how things work.
 
  • #22
revo74
Note: Incidentally, be very careful about relying on Wikipedia to understand entropy. There is a lot of incorrect information in the article on entropy. The first statement in that article is quite wrong:
I mentioned these two mistakes that you pointed out to the person I am having a conversation with, and here is what he said:

For example, a (theoretical) Carnot cycle operating between two reservoirs is a reversible cycle, so there is no increase in entropy. According to this statement, there should be 100% efficiency: 0 energy not available for useful work. This, of course, is incorrect.
His response: The website doesn't say there is no increase in entropy. Secondly, "theoretical maximum efficiency" doesn't mean 100% efficiency. You're jumping to conclusions. FYI,

theoretical maximum efficiency = 1 - Tc/Th

where Tc= temperature of cold reservoir, Th = temperature of hot reservoir. Unless Tc =Th, and there would be no point in building a machine for that case, then the theoretical maximum efficiency will always be less than 100%. That is another consequence of the second law of thermodynamics.

Suppose your reservoirs are operating at Tc= 20C, Th=80C, then the theoretical maximum efficiency = .75. This means the best possible machine you could ever build would be 75% efficient in that case, not that you will build such a machine, more likely it will have considerably less efficiency. But its theoretical maximum efficiency = 75%, no matter what.

Other errors: "Convertible energy" is not defined and at any rate, is not correctly used. If the theoretical maximum efficiency is achieved, there is no change in entropy so entropy would not "accumulate" in the system. Entropy is not "waste heat".

AM
His response: This is incorrect: "theoretical maximum efficiency" does not mean 100% efficiency, waste energy is in the form of dissipation, it wasn't meant to be read as the same as entropy, and "accumulate" means increase in case you don't know. You can't reconcile yourself with the idea that entropy increases. So here you go misreading the information on this website to fit your own misconception of what entropy means.

So is he right or not? I'm trying to understand entropy as fully as possible, and I get different things from different people.
 
  • #23
Andrew Mason
Science Advisor
Homework Helper
His response: The website doesn't say there is no increase in entropy. Secondly, "theoretical maximum efficiency" doesn't mean 100% efficiency. You're jumping to conclusions. FYI,
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:

Example 1: A heat engine operates between 500K and 300K with an efficiency of 10%. For each 1000 Joules of input energy, 900 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_1 = -1000/500 + 900/300 = 1.0 J/K[/tex]


Example 2: A heat engine operates between 1000K and 300K with an efficiency of 40%. For each 1000 Joules of input energy, 600 Joules are not turned into useful work. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_2 = -1000/1000 + 600/300 = 1.0 J/K[/tex]


Example 3: A Carnot engine operates between 500K and 300K with an efficiency of 1-3/5 = 40%. For 1000 Joules of input energy (heat flow) the amount not available for useful work is 600 J. The change in entropy for 1000 Joules of input heat is:

[tex]\Delta S_3 = -1000/500 + 600/300 = 0 J/K[/tex]


In Examples 1 and 2 the entropy change is the same. But there is 900 Joules of energy not available for work in the first and 600 not available in the second. So how does entropy tell me how much energy is not available for work?

In Example 3, the entropy change is 0 but the same amount of heat (600J) is not available for work as in Example 1, in which the entropy change is 1 J/K. How does entropy tell me how much energy is not available for work?
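For anyone who wants to redo the arithmetic, a quick sketch that simply restates the three examples above (same numbers; the helper function is just for this illustration):

[code]
def delta_S(Q_in, T_hot, Q_out, T_cold):
    """Entropy change of the two reservoirs: heat Q_in leaves the hot one, Q_out enters the cold one."""
    return -Q_in / T_hot + Q_out / T_cold

print(delta_S(1000, 500, 900, 300))    # Example 1: 1.0 J/K
print(delta_S(1000, 1000, 600, 300))   # Example 2: 1.0 J/K
print(delta_S(1000, 500, 600, 300))    # Example 3 (Carnot): 0.0 J/K
[/code]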

Unless Tc =Th, and there would be no point in building a machine for that case, then the theoretical maximum efficiency will always be less than 100%.
If Tc = Th, the efficiency would be 0: 1-1=0.

His response: This is incorrect: "theoretical maximum efficiency" does not mean 100% efficiency,
True. But that is not what the article said. It said entropy is a measure of the energy not available for useful work in a thermodynamic process. If the entropy change, 0, is a measure of the energy not available for useful work, then one could not be faulted for thinking that 0 energy is not available for useful work.

So is he right or not? I'm trying to understand entropy as fully as possible, and I get different things from different people.
You will find that entropy is a difficult concept and not all attempts to explain it are correct. You will just have to read good physics books and stay away from Wikipedia on this topic if you want to eventually understand it.

AM
 
  • #24
Studiot
So is he right or not? I'm trying to understand entropy as fully as possible, and I get different things from different people.
So did I waste my time, in my post #8 (to which you never replied), offering a viewpoint on your question?

I would agree with AM and other knowledgeable people here that Wiki sometimes 'slips up' and presents erroneous or inconsistent material. This is not only true of thermodynamics, so you always have to temper a Wiki offering with critical appraisal.
 
  • #25
revo74
So did I waste my time, in my post #8 (to which you never replied), offering a viewpoint on your question?

I would agree with AM and other knowledgeable people here that Wiki sometimes 'slips up' and presents erroneous or inconsistent material. This is not only true of thermodynamics, so you always have to temper a Wiki offering with critical appraisal.
No, you did not. I appreciate your input on the matter. What would your response be to the person who made those comments in the above post defending Wikipedia?
 
