Confusion about entropy and disorder

  • #1
techmologist
On pages 46-6 and 46-7 of Feynman Lectures, Vol 1, Feynman talks about irreversibility, order, and entropy in mechanics. He gives an example of an irreversible process:

Feynman said:
Suppose we have a box with a barrier in the middle. On the one side is neon ("black" molecules), and on the other, argon ("white" molecules). Now we take out the barrier, and let them mix. How much has the entropy changed? (p46-6)
...
Here we have a simple example of an irreversible process which is completely composed of reversible events. Every time there is a collision between any two molecules, they go off in certain directions. If we took a moving picture of a collision in reverse, there would be nothing wrong with the picture. In fact, one kind of collision is just as likely as another. So the mixing is completely reversible, and yet it is irreversible. Everyone knows that if we started with white and with black separated, we would get a mixture within a few minutes. If we sat and looked at it for several more minutes, it would not separate again but would stay mixed. So we have an irreversibility which is based on reversible situations. But we see the reason now. We started with an arrangement which is, in some sense, ordered. Due to the chaos of the collisions, it becomes disordered. It is the change from an ordered arrangement to a disordered arrangement which is the source of the irreversibility.
(p. 46-7)

He then gives the definition of entropy as the logarithm of the number of ways in which the molecules could be arranged in the box and still look the same from outside, and equates entropy with disorder. Since there are about 2^N times as many ways of arranging the black and white molecules so that they are mixed as there are arrangements that leave them separated, the separated situation corresponds to order and low entropy. The mixing process represents part of the universal change from order to disorder, an increase in entropy. So far, this makes sense to me. But then he goes on,

In the case where we reversed our motion picture of the gas mixing, there was not as much disorder as we thought. Every single atom had exactly the correct speed and direction to come out right [the initial separated condition]! The entropy was not high after all, even though it appeared so.

So which is it? Does disorder really increase, or does it just appear to increase because we don't normally have the luxury of playing the movie backwards to see just how special and improbable the mixed condition really is?
 
  • #2
techmologist said:
Does disorder really increase, or does it just appear to increase because we don't normally have the luxury of playing the movie backwards to see just how special and improbable the mixed condition really is?

Yes. :-)

Entropy is an eternally confusing issue. Entropy is a property of a macrostate, i.e. in your terminology 'the way it looks from outside.' A macrostate is a collection of microstates, the various detailed ways in which the molecules could be arranged and still 'look the same.'

In Feynman's example, the original (separated) macrostate S1 has lower entropy than the final (mixed) macrostate S2. Why? Because there are more microstates in S2. Some of them came from S1, and if reversed will turn back into S1. But some of them (the overwhelming majority of them) cannot.

So if you compare S1 to all the states in S2, entropy has increased. If you could compare S1 to just the image of those states in S2, entropy has remained the same, but for typically large systems (10^23 molecules) there is no practical way to identify that subset.
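To see the counting concretely, here is a toy version in Python (a sketch added for illustration, not something from Feynman or the posts above; the "microstate" is simply which half of the box each labeled molecule occupies, and the macrostate is how many of each color sit in each half):

from math import comb, log

n_black = n_white = 10   # 10 "black" (neon) and 10 "white" (argon) molecules

# Separated macrostate S1: all black molecules on the left, all white on the right.
W_S1 = comb(n_black, n_black) * comb(n_white, 0)            # = 1

# Mixed macrostate S2: half of each color on each side.
W_S2 = comb(n_black, n_black // 2) * comb(n_white, n_white // 2)

print("microstates in S1:", W_S1)                           # 1
print("microstates in S2:", W_S2)                           # 63504
print("entropy increase, in units of k:", log(W_S2 / W_S1)) # about 11.1

Already at 20 molecules the mixed macrostate outnumbers the separated one by tens of thousands of microstates; at 10^23 molecules the ratio is so astronomically large that the reverse evolution is never observed in practice.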
 
  • #3
Thank you for that explanation. So, when scientists talking about deterministic mechanics say things like the universe always moves from "order to disorder" or "less probable to more probable", they should add "according to our ability to distinguish macrostates", right? That is, in a Newtonian gas composed of little white and black billiard balls, there's nothing fundamentally more disordered about the particular mixed microstate that results from time evolution out of a separated condition. It is only that, from our point of view, it looks exactly like any other mixed microstate, and there are many more microstates corresponding to the mixed macrostate than to the separated one. If we hadn't been watching it evolve from a separated condition, we would have no way of knowing it had recently been that way. And further, we would have to wait a practically infinite time before we could expect to see the gas return to a separated condition.

That's still a little weird, because it seems that the increase of disorder is a subjective thing, at least in a Newtonian universe. The billiard ball molecules don't know which color they are, only the observer does.

Also, a few pages earlier (46-1), Feynman says,

Now, there are complicated mathematical demonstrations which follow from Newton's laws to demonstrate that we can get only a certain amount of work out when heat flows from one place to another, but there is great difficulty in converting this into an elementary demonstration. In short, we do not understand it, although we can follow the mathematics.

Do you know which mathematical demonstration he is referring to? Seems like that would be a derivation of the 2nd law of thermodynamics from Newtonian mechanics, which I thought had never been done.
 
  • #4
techmologist said:
Thank you for that explanation. So, when scientists talking about deterministic mechanics say things like the universe always moves from "order to disorder" or "less probable to more probable", they should add "according to our ability to distinguish macrostates", right?
Feynman uses "disorder" but he makes it clear in earlier chapters that it requires a special definition of "disorder".

It is not about order or disorder. All states are chaotic. Even in the initial state where the two kinds of gases are separated, it is completely chaotic at the microscopic level. It is about the number of equivalent microstates that can exist for a given macrostate. Each microstate is equally improbable. But there are many, many more microstates in which gas molecules of each kind are found on both sides of the container than microstates in which each kind is confined to its own side.
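In symbols, that counting is just Feynman's 2^N (a coarse-grained estimate, treating each of the N molecules as having only the binary property "left half or right half" and ignoring everything else): the separated macrostate fixes that choice for every molecule, while the mixed macrostate leaves it free, so

W_mixed / W_separated ≈ 2^N, and ΔS = k ln(2^N) = N k ln 2,

using the definition of entropy as k times the logarithm of the number of microstates. The finer details, such as positions within each half and all the velocities, contribute equally to both macrostates and cancel in the ratio.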

Do you know which mathematical demonstration he is referring to? Seems like that would be a derivation of the 2nd law of thermodynamics from Newtonian mechanics, which I thought had never been done.
I think he is referring to the Boltzmann derivation of the second law using statistical analysis.

AM
 
  • #5
Andrew Mason said:
Feynman uses "disorder" but he makes it clear in earlier chapters that it requires a special definition of "disorder".

Right. But only a paragraph or two after he gives the technical definition of entropy, or disorder, as the logarithm of the number of microstates corresponding to the macrostate, he then says,

Feynman said:
In the case where we reversed our motion picture of the gas mixing, there was not as much disorder as we thought. Every single atom had exactly the correct speed and direction to come out right! The entropy was not high after all, even though it appeared so.

This is what confused me. He seems to have reverted to a more common-sense notion of 'disorder', and even calls that 'entropy'.


Andrew Mason said:
I think he is referring to the Boltzmann derivation of the second law using statistical analysis.

AM

Cool. Thanks.
 
  • #6
techmologist said:
Feynman said:
In the case where we reversed our motion picture of the gas mixing, there was not as much disorder as we thought. Every single atom had exactly the correct speed and direction to come out right! The entropy was not high after all, even though it appeared so.

This is what confused me. He seems to have reverted to a more common-sense notion of 'disorder', and even calls that 'entropy'.
This is why we should not use the term disorder. In the example Feynman gives, the state of disorder or chaos at the molecular level is the same as the moment before we reverse the movie. The microstate in which the atoms are moving with a particular speed and direction is just as probable (or improbable) as the one in which the speeds and directions are reversed. However, if you change the speed and/or direction of just one of those molecules even slightly after the movie is reversed, the gas will not go back to a completely separated state.

For each microstate there are a gazillion different changes that will lead to microstates having the same macrostate (mixed) but only one that will lead to a particular microstate in which the molecules are not mixed.

AM
 
  • #7
khanacademy.org has several videos in their chemistry section that deal with entropy. I found them very enlightening.
 
  • #8
Andrew Mason:

You're right about disorder not being quite adequate for understanding entropy and how irreversible processes arise. And I like your explanation in terms of sensitive dependence on initial conditions. That must be the real origin of the "disorder"--that the positions and momenta of the individual particles are infinite precision real numbers, and that no matter how good our measurements were, there would be no way of predicting how the system evolves more than a microsecond or two into the future.

mrspeedybob:

The video list at khan academy is impressive. That Sal Khan knows a lot of stuff. I will be going back there. Thank you.
 
  • #9
techmologist said:
On pages 46-6 and 46-7 of Feynman Lectures, Vol 1, Feynman talks about irreversibility, order, and entropy in mechanics. He gives an example of an irreversible process:

<snip>

So which is it? Does disorder really increase, or does it just appear to increase because we don't normally have the luxury of playing the movie backwards to see just how special and improbable the mixed condition really is?

Feynman is presenting what is known as the "Gibbs paradox". It appears as a paradox, because if you did not know which atom was neon or argon, you would have to conclude that the entropy change is zero. This is the essence of the argument that entropy is a 'subjective' quantity.

The paradox was first solved (AFAIK) by Jaynes, who showed that knowing what the microstate is (i.e. that some atoms are argon, some are neon) is equivalent to an amount of free energy. This energy cannot be accessed if you do not know there are two distinct populations, so there is no paradox.

http://www.google.com/url?sa=t&sour...sg=AFQjCNG0jZg-swJso6SeZ-G01bDFPmsuww&cad=rja
 
  • #10
techmologist said:
...that the positions and momenta of the individual particles are infinite precision real numbers, and that no matter how good our measurements were, there would be no way of predicting how the system evolves more than a microsecond or two into the future.

Well, technically, that's not true. The precision we need in our measurements to predict how the system evolves to within some fixed error grows exponentially in time, so for any time, the precision we need is finite. It's just that it grows ridiculously large (but not infinite) in a short amount of time.
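A quick numerical illustration of this point (a sketch using the chaotic logistic map as a convenient stand-in for molecular dynamics, which it is not): two trajectories starting 10^-15 apart separate roughly exponentially, so each extra stretch of prediction time demands a fixed number of additional digits of initial precision.

def logistic(x):
    # Logistic map at r = 4, a standard textbook example of chaotic dynamics.
    return 4.0 * x * (1.0 - x)

x, y = 0.3, 0.3 + 1e-15     # two nearly identical initial conditions
for n in range(1, 61):
    x, y = logistic(x), logistic(y)
    if n % 10 == 0:
        # The gap roughly doubles each step until it saturates at order 1.
        print(n, abs(x - y))

The separation is finite at every step, exactly as described above, but after roughly 50 iterations the two trajectories are completely uncorrelated; the required precision is never infinite, it just becomes unattainable very quickly.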
 
  • #11
Andy Resnick said:
Feynman is presenting what is known as the "Gibbs paradox". It appears as a paradox, because if you did not know which atom was neon or argon, you would have to conclude that the entropy change is zero. This is the essence of the argument that entropy is a 'subjective' quantity.

The paradox was first solved (AFAIK) by Jaynes, who showed that knowing what the microstate is (i.e. that some atoms are argon, some are neon) is equivalent to an amount of free energy. This energy cannot be accessed if you do not know there are two distinct populations, so there is no paradox.

http://www.google.com/url?sa=t&sour...sg=AFQjCNG0jZg-swJso6SeZ-G01bDFPmsuww&cad=rja

That paper is exactly what I needed! I'll need to read it at least once more to fully get it, but that was very helpful for dissolving the "problem" of subjectivity in entropy. Thank you.

Rap said:
Well, technically, that's not true. The precision we need in our measurements to predict how the system evolves to within some fixed error grows exponentially in time, so for any time, the precision we need is finite. It's just that it grows ridiculously large (but not infinite) in a short amount of time.

You're right. That's basically what I had in mind, but I didn't say it correctly. The point is that for any fixed measurement precision, it won't be long before it's not precise enough to predict the evolution in a helpful way. That's what I meant by saying the positions and momenta are infinite precision real numbers. There's potentially infinite algorithmic information in each coordinate, but we can only measure the first 10-15 digits. So in a very short time, we will have no ability to distinguish one microstate from another.
 
  • #12
techmologist said:
That paper is exactly what I needed! I'll need to read it at least once more to fully get it, but that was very helpful for dissolving the "problem" of subjectivity in entropy. Thank you.

That paper by Jaynes is EXCELLENT. I had to read it at least three times, and I'm still not sure I get all of it. If you have any questions, please ask. Maybe I know the answer, maybe I don't and will learn something.
 
  • #13
Rap said:
That paper by Jaynes is EXCELLENT. I had to read it at least three times, and I'm still not sure I get all of it. If you have any questions, please ask. Maybe I know the answer, maybe I don't and will learn something.


Yes, it is excellent. Thanks again to Andy Resnick for sharing it with us! I am still trying to absorb the insights of it, but I can already tell that when I do, I will understand a lot more than I did. I admit that the section on Gibbs' Statistical Mechanics was somewhat over my head, but I think I will be able to get the main ideas of the paper by reading it again (a few times). Hopefully I will soon understand enough of it so that I can ask some good questions.

I feel much better now about the "subjective" or "anthropomorphic" nature of the 2nd law. Before I thought there was some hard, physical distinction between processes that are truly irreversible in a thermodynamic sense and other processes that just seem irreversible to humans. But Jaynes shows that by expanding your set of macrovariables, you can extend the application of thermodynamics to the latter as well.

For example, the classic examples that teachers use to explain the 2nd law as a universal tendency to disorder are their children's rooms or their own desks. Kids' bedrooms start out clean and end up a mess, and professors' desks begin the term in an immaculate state but turn into a mountain of unorganized papers. It never goes the other way spontaneously. But then, if the teacher is careful, he has to backtrack and say that this is just an analogy to the 2nd law, not an example. Thermodynamics only cares about the total energy of the desk, which doesn't change because some papers are unorganized. So it is not an increase in the "physical" entropy of the desk.

But after reading Jaynes' paper, I think this sells thermodynamics short. You start the term out with a well-defined place for every paper, book, pen, coffee mug, etc. on your desk. So you could attach to each item on the desk a string and pulley that lifts a small weight when the item gets randomly bumped out of position. The heights of all these little weights are your additional macrovariables. Now when a student comes in and accidentally knocks over a stack of papers, an array of small weights is lifted a few centimeters, and you can tie off the strings so the weights stay put. You do this every time something gets bumped or moved on your desk. By the time your desk is a complete mess, a bunch of little weights have each been raised a small distance. So the knowledge of these new macrovariables has allowed you to extract some free energy from an initially organized desk.

EDIT: I should add on to that last sentence so it reads "...extract free energy from an initially organized desk in combination with the random bumps it receives from you and your students".
 
  • #14
Okay, after a long delay, I have some questions:

On p.2 Jaynes says:
In the present note we consider the "Gibbs Paradox" about entropy of mixing and the logically inseparable topics of reversibility and the extensive property of entropy.

Regarding extensivity and its connection with the entropy of mixing calculation, is Jaynes referring to the fact that we assume that the total increase in entropy is the sum of the contributions from the two gases, A1 and A2? I guess that would not hold generally, for example if the gases interact in some way.
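For reference, the textbook ideal-gas result being assumed here is additive in exactly that sense (this is the standard free-expansion/mixing calculation for non-interacting gases, stated as background rather than quoted from Jaynes): if gas A1 with N1 molecules expands from volume V1 into the full volume V, and A2 with N2 molecules expands from V2 into V, then

ΔS_mix = N1 k ln(V/V1) + N2 k ln(V/V2),

which for V1 = V2 = V/2 and N1 = N2 = N/2 reduces to the N k ln 2 of the earlier posts. An interaction between the gases would add contributions that are not a sum of single-gas terms, and the simple additivity would fail.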

Jaynes mentions gravity as a long-range interaction that can make the entropy of a system non-extensive. Does that mean that in a system where things move in a gravitational field, the entropy depends on the positions of these objects? For example, does raising a weight involve a change in entropy, apart from the possible rise in entropy associated with the production of work?

Also, does my analysis of the messy desk in post #13 make sense?

Thanks.
 
  • #15
techmologist said:
Regarding extensivity and its connection with the entropy of mixing calculation, is Jaynes referring to the fact that we assume that the total increase in entropy is the sum of the contributions from the two gases, A1 and A2? I guess that would not hold generally, for example if the gases interact in some way.

Yes - if the gases interact, that would change things.

techmologist said:
Jaynes mentions gravity as a long-range interaction that can make the entropy of a system non-extensive. Does that mean that in a system where things move in a gravitational field, the entropy depends on the positions of these objects? For example, does raising a weight involve a change in entropy, apart from the possible rise in entropy associated with the production of work?

Hmm - I have not thought about that.

techmologist said:
Also, does my analysis of the messy desk in post #13 make sense?

It does, if you assume that there are many ways to have a messy desk, but only a few ways to have an orderly desk. That assumption is what makes the analogy valid.
 
  • #16
Rap said:
Hmm - I have not thought about that.

The association of entropy with gravity has bothered me since I read something about it in an article about the expanding universe by Steven Frautschi. If there is an entropy decrease, for example, associated with raising a weight (I don't know whether there is), then it seems like you could decrease the entropy of the universe by raising a weight with a reversible engine. Obviously I'm missing something.


Rap said:
It does, if you assume that there are many ways to have a messy desk, but only a few ways to have an orderly desk. That assumption is what makes the analogy valid.

That's a good point. The way I defined 'orderly' above is kind of artificial, since it applies to only one position for every item on the desk. Clearly there is more than one way for a desk to be orderly, but given enough items, I think there must be many more ways for a desk to be disorderly. For instance, it should take fewer words to describe a desk that is orderly than one that is all jumbled up. I'm hoping that the second law can be legitimately applied to this situation--more than just an analogy.

EDIT: one possible problem with my application of the 2nd law to the messy desk is that it doesn't make any reference to the microstates, i.e. the positions and momenta of all particles that make up the items on the desk. Clearly there are many, many microstates corresponding to the observation "the center of mass of the coffee mug is here." It almost seems necessary to introduce an intermediate level of description, perhaps "mesostate", in which the exact position of every item on the desk is specified. Then every macrostate (which determines "ordered" vs. "disordered") is compatible with a number of mesostates. And each mesostate is in turn compatible with roughly the same (very large) number of microstates. "Order" corresponds to a few mesostates, while "disorder" corresponds to many more.

My earlier attempt associated "order" with exactly one mesostate, which is too restrictive. If I were sufficiently clever, I could probably think of some way to extract some work when the coffee mug gets tipped over, without having to worry about its exact position on the desk. Similar things could be done with stacks of papers, books, etc.
 
  • #17
techmologist said:
That's a good point. The way I defined 'orderly' above is kind of artificial, since it applies to only one position for every item on the desk. Clearly there is more than one way for a desk to be orderly, but given enough items, I think there must be many more ways for a desk to be disorderly. For instance, it should take fewer words to describe a desk that is orderly than one that is all jumbled up. I'm hoping that the second law can be legitimately applied to this situation--more than just an analogy.

EDIT: one possible problem with my application of the 2nd law to the messy desk is that it doesn't make any reference to the microstates, i.e. the positions and momenta of all particles that make up the items on the desk. Clearly there are many, many microstates corresponding to the observation "the center of mass of the coffee mug is here." It almost seems necessary to introduce an intermediate level of description, perhaps "mesostate", in which the exact position of every item on the desk is specified. Then every macrostate (which determines "ordered" vs. "disordered") is compatible with a number of mesostates. And each mesostate is in turn compatible with roughly the same (very large) number of microstates. "Order" corresponds to a few mesostates, while "disorder" corresponds to many more.

There is entropy in the informational sense and there is thermodynamic entropy; they are related by the Boltzmann constant, but they are not the same thing. The desk analogy is more of an informational entropy, while the second law uses thermodynamic entropy, so I don't think you can apply the second law to the desk analogy. Also, what you call a "mesostate" is the analog of a microstate in the desk analogy. The objects on the desk are analogous to molecules. If you want to break down the objects into actual microstates involving particles, then there is no analogy to entropy, it IS entropy.

Regarding Jaynes and informational entropy, I read a very interesting thing - the informational entropy is the minimum number of yes/no questions you have to ask about a system in a given macrostate in order to determine its microstate. (Times some constant which I forget). You can see this is true because if each question you ask cuts the number of possible microstates in half, then after n questions the number of possibilities remaining is W/2^n, where W is the number of microstates in the macrostate; you have pinned down the microstate when that reaches 1, which takes n = log2(W) questions.
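A tiny numerical check of that statement (a sketch added for illustration; the only physics input is the standard relation S = k_B ln W, and W = 1024 is an arbitrary example):

from math import log, log2

k_B = 1.380649e-23                  # Boltzmann constant, J/K

W = 1024                            # equally likely microstates in some macrostate
questions = log2(W)                 # ideal yes/no questions needed to pin down the microstate
S_bits = log2(W)                    # information entropy in bits
S_thermo = k_B * log(W)             # thermodynamic entropy, S = k_B ln W, in J/K

print(questions, S_bits, S_thermo)  # 10.0  10.0  about 9.6e-23

The conversion factor between the two is k_B ln 2 joules per kelvin per bit, which is presumably the "constant which I forget" and the Boltzmann-constant bridge between information entropy and thermodynamic entropy mentioned in this post.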

So if you know you have an "ordered" desk, you need to ask fewer questions about where everything is. If by "ordered" you mean "a place for everything and everything in its place", then, once you know it is ordered, the number of questions you have to ask is zero.
 
  • #18
Rap said:
There is entropy in the informational sense and there is thermodynamic entropy; they are related by the Boltzmann constant, but they are not the same thing. The desk analogy is more of an informational entropy, while the second law uses thermodynamic entropy, so I don't think you can apply the second law to the desk analogy. Also, what you call a "mesostate" is the analog of a microstate in the desk analogy. The objects on the desk are analogous to molecules. If you want to break down the objects into actual microstates involving particles, then there is no analogy to entropy, it IS entropy.

At the top of p.8 of the Jaynes paper, he hints that the distinction between thermodynamic entropy and "human" information is not as clear as some physicists would have you believe. I got excited by this, and perhaps I read too much into it. I was thinking (hoping) that he is saying the 2nd law can be applied to information entropy, too. This being the case because whenever you have a process where information entropy increases, there should be some way that you could have extracted some useful work out of the process if you had more detailed information.

Rap said:
Regarding Jaynes and informational entropy, I read a very interesting thing - the informational entropy is the minimum number of yes/no questions you have to ask about a system in a given macrostate in order to determine its microstate. (Times some constant which I forget). You can see this is true because if each question you ask cuts the number of possible microstates in half, then after n questions the number of possibilities remaining is W/2^n, where W is the number of microstates in the macrostate; you have pinned down the microstate when that reaches 1, which takes n = log2(W) questions.

That is indeed very interesting...it shows that thermodynamic entropy is a special case of information entropy, right?

Rap said:
So if you know you have an "ordered" desk, you need to ask fewer questions about where everything is. If by "ordered" you mean "a place for everything and everything in its place", then, once you know it is ordered, the number of questions you have to ask is zero.

Exactly. A description of the desk could be written as the list of answers to a sequence of yes/no questions. For example, a "yes" answer to "Is the coffee inside the coffee mug?" or "Is the pencil in its drawer?" would eliminate many of the possible mesostates corresponding to disorder, while still leaving open those mesostates corresponding to order.
 
  • #19
techmologist said:
At the top of p.8 of the Jaynes paper, he hints that the distinction between thermodynamic entropy and "human" information is not as clear as some physicists would have you believe. I got excited by this, and perhaps I read too much into it. I was thinking (hoping) that he is saying the 2nd law can be applied to information entropy, too. This being the case because whenever you have a process where information entropy increases, there should be some way that you could have extracted some useful work out of the process if you had more detailed information.

No, thermodynamic entropy is a special case of information entropy, and the second law is thermodynamics, so I don't think it can be applied to information entropy. Thermodynamic entropy equals the Boltzmann constant times the informational entropy. The Boltzmann constant is physics, and this is where certain physicists have a fit. They want the word entropy to refer to physical entropy and they want information entropy to not use the word entropy. I don't get hung up on semantic arguments, but I like "information entropy" and "thermodynamic entropy".



techmologist said:
That is indeed very interesting...it shows that thermodynamic entropy is a special case of information entropy, right?

Yes, that's how I would put it.
 
  • #20
Rap said:
No, thermodynamic entropy is a special case of information entropy, and the second law is thermodynamics, so I don't think it can be applied to information entropy. Thermodynamic entropy equals the Boltzmann constant times the informational entropy. The Boltzmann constant is physics, and this is where certain physicists have a fit. They want the word entropy to refer to physical entropy and they want information entropy to not use the word entropy.

Yep. Surprisingly, Hubert Yockey, a physicist who applies Shannon information theory to biological problems, is one of those who adamantly maintains that thermodynamic entropy and information entropy have nothing to do with one another. They merely look similar. Not being a scientist myself, I don't feel qualified to argue with him. I can only note the existence of other credentialed people who don't agree with him.

Rap said:
I don't get hung up on semantic arguments, but I like "information entropy" and "thermodynamic entropy".

Sounds good to me.
 
  • #21
Rap said:
No, thermodynamic entropy is a special case of information entropy, and the second law is thermodynamics, so I don't think it can be applied to information entropy.

I don't understand what you are saying- can you explain why you think there are multiple kinds of entropy, and why some forms are covered by the second law while others are not?
 
  • #22
Andy Resnick said:
I don't understand what you are saying- can you explain why you think there are multiple kinds of entropy, and why some forms are covered by the second law while others are not?

Well, I think you have to look at the history. First, there was classical thermodynamic entropy. Classical thermodynamics has nothing to do with statistical mechanics; it's a theory of macroscopic measurements of heat, work, etc. It doesn't even need to know that there are atoms. Classical thermodynamic entropy is defined by the second law, and it's the result of macroscopic measurements on thermodynamic systems and has nothing to do with "information entropy". Then Boltzmann developed statistical mechanics, which explained entropy as being equal to the Boltzmann constant times the log of the number of microstates which give the same macrostate. The Boltzmann constant has units of physical entropy. Then Shannon came up with what he called information entropy. It is very general, and when applied to statistical mechanics, it is the logarithm part of the statistical mechanical definition of physical entropy. Then some physicists went ballistic and said entropy is what is defined in classical thermodynamics, which makes no use of information entropy, and wanted the whole name "information entropy" discarded. Others (whom I tend to agree with) said that statistical mechanics showed that the essential nature of physical entropy is information entropy.

Thermodynamics and statistical mechanics develop a whole lot of physical relationships between entropy, heat, work, and temperature which I don't think have an analog in information entropy. The Boltzmann constant provides the bridge from information entropy to physical entropy. But I might be wrong, I haven't thought about it that much. Maybe it's worth looking into. The first thing to look for is the analog of temperature, which is the "conjugate variable" to entropy (i.e. like a close relative). If there was an information analog of temperature, that would be very interesting. Come to think of it, there is a theory that internet connections behave as a thermodynamic Bose gas - see http://en.wikipedia.org/wiki/Bose–Einstein_condensation:_a_network_theory_approach - the bottom line, I guess, is that maybe there is something that can be done.
 
  • #23
Rap said:
Well, I think you have to look at the history. First, there was classical thermodynamic entropy. Classical thermodynamics has nothing to do with statistical mechanics; it's a theory of macroscopic measurements of heat, work, etc. It doesn't even need to know that there are atoms. Classical thermodynamic entropy is defined by the second law, and it's the result of macroscopic measurements on thermodynamic systems and has nothing to do with "information entropy". Then Boltzmann developed statistical mechanics, which explained entropy as being equal to the Boltzmann constant times the log of the number of microstates which give the same macrostate. The Boltzmann constant has units of physical entropy. Then Shannon came up with what he called information entropy. It is very general, and when applied to statistical mechanics, it is the logarithm part of the statistical mechanical definition of physical entropy. Then some physicists went ballistic and said entropy is what is defined in classical thermodynamics, which makes no use of information entropy, and wanted the whole name "information entropy" discarded. Others (whom I tend to agree with) said that statistical mechanics showed that the essential nature of physical entropy is information entropy.

Thermodynamics and statistical mechanics develop a whole lot of physical relationships between entropy, heat, work, and temperature which I don't think have an analog in information entropy. The Boltzmann constant provides the bridge from information entropy to physical entropy. But I might be wrong, I haven't thought about it that much. Maybe it's worth looking into. The first thing to look for is the analog of temperature, which is the "conjugate variable" to entropy (i.e. like a close relative). If there was an information analog of temperature, that would be very interesting. Come to think of it, there is a theory that internet connections behave as a thermodynamic Bose gas - see http://en.wikipedia.org/wiki/Bose–Einstein_condensation:_a_network_theory_approach - the bottom line, I guess, is that maybe there is something that can be done.

I agree with your first paragraph, for the most part. The second law is a statement about processes, and is valid for any process - mechanical, chemical, transmission of a signal, etc. There are several equivalent statements of the second law, and as a result there are several equivalent formulations of entropy.

Your second paragraph is slightly more suspect - information is not a mechanical degree of freedom (neither is chemical potential), but you correctly identify the interplay between temperature and entropy. Signal transmission does involve an effective temperature (usually called the noise-equivalent temperature):

http://en.wikipedia.org/wiki/Noise_temperature
 
  • #24
Andy Resnick said:
Your second paragraph is slightly more suspect - information is not a mechanical degree of freedom (neither is chemical potential), but you correctly identify the interplay between temperature and entropy. Signal transmission does involve an effective temperature (usually called the noise-equivalent temperature):

http://en.wikipedia.org/wiki/Noise_temperature

Well, I never said information was a mechanical degree of freedom. But neither is entropy. Anyway, I have to think about the analog of temperature in information entropy. Thx for the link, that helps.
 
  • #25
Rap said:
Well, technically, that's not true. The precision we need in our measurements to predict how the system evolves to within some fixed error grows exponentially in time, so for any time, the precision we need is finite. It's just that it grows ridiculously large (but not infinite) in a short amount of time.

Once you reach the level of precision limited by the Heisenberg uncertainty principle, you have run into a limit on how far ahead and how precisely you can model the evolution of the system.
 
  • #26
mrspeedybob said:
Once you reach the level of precision limited by the Heisenberg uncertainty principle, you have run into a limit on how far ahead and how precisely you can model the evolution of the system.

Yes, absolutely. But the statement I was responding to was:

techmologist said:
... the positions and momenta of the individual particles are infinite precision real numbers, and that no matter how good our measurements were, there would be no way of predicting how the system evolves more than a microsecond or two into the future.

I just wanted to make sure that the idea of the butterfly effect was not misinterpreted as the error going to infinity in a finite time. With the reference to the "positions and momenta", that's what it seemed to be implying to me.
 

