Introduced to the concept of entropy in school

In summary, my teacher said that entropy can be described as "disorder or chaos" and that the second law states the change in entropy is greater than zero.
  • #1
strid
Last week we were introduced to the concept of entropy in school... I was quite suspicious of it at first sight...

The idea is something like: in a closed system, chaos always increases...
meaning that the universe is heading towards chaos... (i.e. towards a state where matter is spread all over the universe, and not clumped into planets and stars)


But what about the creation of our solar system?
Don't they believe it started as a cloud of gas and dust that clumped together into many bodies of matter, which collided with each other and ended up as nine big bodies orbiting the Sun?

That goes against the laws of entropy, doesn't it?
 
  • #2
strid said:
Last week we were introduced to the concept of entropy in school... I was quite suspicious of it at first sight...

The idea is something like: in a closed system, chaos always increases...
meaning that the universe is heading towards chaos... (i.e. towards a state where matter is spread all over the universe, and not clumped into planets and stars)
It is not really correct or helpful to think of increasing entropy as increasing chaos or disorder. Entropy is a thermodynamic concept that started out as a simple observation: "heat cannot flow from a cold reservoir to a hot reservoir without the addition of work" or "heat from a reservoir cannot be completely converted to work - some of the heat must always flow to a reservoir at a colder temperature". The change in entropy is a measure of the amount of work required to make heat flow from a hot to a cold reservoir, or a measure of the amount of work that can be extracted from heat flowing from a hot reservoir to a reservoir at lower temperature.
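The Clausius picture described above can be made concrete with a small numerical sketch. The reservoir temperatures and the amount of heat below are made-up illustrative values; the point is only that the entropy lost by the hot reservoir is smaller in magnitude than the entropy gained by the cold one, so the total entropy change of spontaneous heat flow is positive.

```python
# Illustrative numbers: 1000 J of heat flows spontaneously from a hot
# reservoir at 500 K to a cold reservoir at 300 K.
# Clausius entropy change for each reservoir: dS = Q / T.
Q = 1000.0                    # joules transferred (assumed)
T_hot, T_cold = 500.0, 300.0  # reservoir temperatures in kelvin (assumed)

dS_hot = -Q / T_hot    # hot reservoir loses heat, so its entropy drops
dS_cold = +Q / T_cold  # cold reservoir gains the same heat at lower T
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.3f} J/K")
print(f"dS_cold  = {dS_cold:+.3f} J/K")
print(f"dS_total = {dS_total:+.3f} J/K")  # positive for the spontaneous direction
```

Running the flow the other way (cold to hot) would flip both signs and give a negative total, which is why that direction never happens without added work.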

Attempts were made in the 19th century to explain entropy in terms of statistics, but they have created much confusion and difficulty. As a result, entropy is a difficult and frequently misunderstood concept.

AM
 
  • #3
really?

my teacher said that entropy could be described as "disorder or chaos" and that the second law is 'the change in entropy is greater than zero'.

Meaning that entropy (chaos) always increases...
Straight from my notes:
"Locally entropy can decrease, as long as entropy increases at least as much somewhere else"

the heat stuff you mentioned is one area where entropy can be applied; my teacher mentioned that too... but still, that is also just an example where "chaos" increases...

to rephrase my questions:

Do the laws of entropy say that chaos always increases?
If yes, how do you explain my example of the solar system?
 
  • #4
strid said:
really?
Really.
Do the laws of entropy say that chaos always increases?
No. The law of increasing disorder does not apply to gravitational collapse, for example.
If yes, how do you explain my example of the solar system?
You cannot use the law of entropy to explain examples of highly ordered and disordered systems. You have to know the history of the 'system', i.e. how the system originated.

AM
 
  • #5
Entropy was introduced in equilibrium thermodynamics, true. However, with the work of Boltzmann and Gibbs, it became clear that the nature of entropy is not thermodynamical at all, but statistical. Indeed, the laws of empirical thermodynamics (CTPCN formulation) are recovered elegantly by means of statistical physics.

There are 2 definitions of entropy:
[tex] S_{class}=:-k \langle \ln \rho \rangle [/tex]

[tex] S_{quant}=:-k \langle \ln \hat{\rho} \rangle [/tex]

Anything else is just a bunch of logical consequences.
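One such consequence can be checked numerically. For a discrete ensemble the classical formula above reduces to S = -k Σᵢ pᵢ ln pᵢ, and for a uniform distribution over W states it collapses to Boltzmann's S = k ln W. This is a sketch with an arbitrary four-state example, not anything specific to the post above:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """Discrete form of S = -k <ln rho>: -k * sum(p_i * ln p_i)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over W states reduces to Boltzmann's S = k ln W.
W = 4
uniform = [1.0 / W] * W
assert math.isclose(gibbs_entropy(uniform), k_B * math.log(W))

# Any non-uniform distribution over the same states has lower entropy.
skewed = [0.7, 0.1, 0.1, 0.1]
print(gibbs_entropy(skewed) < gibbs_entropy(uniform))  # True
```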

Daniel.
 
  • #6
dextercioby said:
Entropy was introduced in equilibrium thermodynamics, true. However, with the work of Boltzmann and Gibbs, it became clear that the nature of entropy is not thermodynamical at all, but statistical. Indeed, the laws of empirical thermodynamics (CTPCN formulation) are recovered elegantly by means of statistical physics.

There are 2 definitions of entropy:
[tex] S_{class}=:-k \langle \ln \rho \rangle [/tex]

[tex] S_{quant}=:-k \langle \ln \hat{\rho} \rangle [/tex]

Anything else is just a bunch of logical consequences.

Daniel.


What is [tex] \rho [/tex] in this case? I am aware of the [tex]S = -k_B \ln T[/tex] relationship from thermodynamics, but have not seen this.

Thanks.
 
  • #7
Nope. It can't be the formula you wrote there. In what case would it even be valid (assuming, for the sake of argument, that it were correct)?

[tex]\rho[/tex] is the probability density on the statistical ensemble, and [tex]\hat{\rho}[/tex] is the density operator.

Daniel.
 
  • #8
dextercioby said:
Nope. It can't be the formula you wrote there. In what case would it even be valid (assuming, for the sake of argument, that it were correct)?

[tex]\rho[/tex] is the probability density on the statistical ensemble, and [tex]\hat{\rho}[/tex] is the density operator.

Daniel.

Oh yeah, I had a sudden memory lapse, I meant to write:

[tex] S = k_B \ln W [/tex]

where W is the number of states.
So that makes more sense now.

I must have been semi-remembering a Free Energy formula or something.
Cheers for that.
 
  • #9
Nope, that W is not the number of states. You may check a book on SM for its name and its significance. Anyway, you have tried to write down Boltzmann's formula. In the axiomatic formulation of equilibrium SM, it's nothing but a consequence.
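For readers following along: the W in Boltzmann's formula is the multiplicity of a macrostate, i.e. the number of microstates compatible with it, not the total number of states of the system. A toy sketch with 100 two-state particles (my own example, with made-up parameters) shows why the evenly split macrostate has the highest entropy:

```python
import math

k_B = 1.380649e-23  # J/K

def multiplicity(N, n):
    """Number of microstates with n of N two-state particles 'up': C(N, n)."""
    return math.comb(N, n)

def boltzmann_entropy(W):
    """Boltzmann's formula S = k ln W, with W the multiplicity."""
    return k_B * math.log(W)

N = 100
# Entropy is largest for the macrostate with the most microstates (n = N/2).
entropies = {n: boltzmann_entropy(multiplicity(N, n)) for n in (0, 25, 50)}
print(entropies[50] > entropies[25] > entropies[0])  # True
```

The all-up macrostate (n = 0) has W = 1 and hence zero entropy, which is the counting-argument version of "perfect order".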

Daniel.
 
  • #10
I've never been comfortable with the concept of "entropy". It sounds too much like the energy is going somewhere, but nobody knows where.
 
  • #11
It may be because you were taught the Boltzmann concept of entropy rather than the simpler Clausius concept first. Entropy is not about disorder. It is about the reversibility of thermodynamic processes. There is no principle that disorder always increases. Analogies to Humpty Dumpty are quite misleading.

AM
 
  • #12
Andrew said:
Entropy is not about disorder.

Does that mean that the work done by Gibbs, von Neumann and C. Shannon is wrong...?

:bugeye:

Daniel.
 
  • #13
strid said:
Last week we were introduced to the concept of entropy in school... I was quite suspicious of it at first sight...

The idea is something like: in a closed system, chaos always increases...
meaning that the universe is heading towards chaos... (i.e. towards a state where matter is spread all over the universe, and not clumped into planets and stars)


But what about the creation of our solar system?
Don't they believe it started as a cloud of gas and dust that clumped together into many bodies of matter, which collided with each other and ended up as nine big bodies orbiting the Sun?

That goes against the laws of entropy, doesn't it?

You have received a bunch of very good answers here, which really tried to convey that fact that the term "entropy" has a deeper meaning than just "chaos" or "disorder". BTW, be VERY careful in using the word "chaos", because in physics and mathematics, chaos is NOT equal to disorder. There's a definite meaning to the term chaos that should not be confused with the pedestrian usage of that word.

Now, having said that, I will attempt at answering your question using the level of understanding that you have been given. In other words, I will try to show you why the formation of the solar system, etc., does not violate any thermodynamics laws, even the assumption that "entropy" is disorder, which is the way you understand it.

Let's assume that your idea is correct, that the formation of planets and the solar system is a reduction in entropy of the planets+sun system (pay careful attention to what the whole "system" in question is here). Now, the 2nd Law of Thermodynamics clearly indicates that the net entropy can only increase or remain constant. However, this only applies (at least in the conventional formulation) to a closed system with no interaction from outside the system. Our solar system has two different "external sources" of energy:

1. The energy and fields from other nearby systems
2. The gravitational potential energy that is not included in the statistics of randomly moving particles [i.e. the standard thermodynamics principle ignores particle-particle interactions other than elastic collisions between them]

These two factors cause the planets+sun system to be NOT an isolated, adiabatic, closed system, but rather an open one. There is no reason to expect that the entropy of such a system cannot decrease, in the very same way that the entropy of one part of a Carnot cycle actually decreases.

So, even if we apply what you understood as "entropy", and use your assumption that formation of planets and sun is a reduction in disorder, there is still nothing here that violates any physics principles.

Zz.
 
  • #14
dextercioby said:
Does that mean that the work done by Gibbs,von Neumann and C.Shannon is wrong...?
No. Information theory and statistical mechanics have distinct concepts of entropy. They have really very little usefulness in thermodynamics. They do nothing to illustrate the concept of thermodynamic entropy and do everything to confuse it.

The problem is that students struggle to acquire the thermodynamic concept but instead are thrown statistical concepts that even the teachers don't fully understand. Gibbs' statistical explanation is incomprehensible and does nothing to deepen one's understanding. It is far better to keep information theory and thermodynamics separate, for purely pedagogical reasons if not for scientific reasons.

As far as disorder is concerned, it depends on how one defines order. Humpty Dumpty is not more ordered than a particular arrangement of smashed HD pieces. It is just that there are many more ways to smash an egg and far fewer ways to make the pieces resemble an egg. But inside the chicken, Humpties are being made all the time, and that has nothing really to do with entropy.

AM
 
  • #15
Just to add a bit to what ZapperZ has already said, remember that as particles fall down a gravitational well to smash into and stick onto the surface of a growing planet or moon, there will be heat generated in the collision. This heat will create electromagnetic energy (much of it in the infrared, I reckon) which for the most part escapes from our solar system. So to do your energy and entropy accounting with a closed system, you would have to have your system expanding outward at the speed of light, starting four billion or so years ago when the solar disk first appeared on the scene. By today, that has become a mighty large system! In fact, it covers a substantial fraction of the visible universe.
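The bookkeeping described above can be caricatured numerically. All the figures below are made up purely for illustration (the heat radiated, the surface temperature of the accreting body, and the temperature of the radiation sink): heat leaving the collapsing system at a high temperature carries away far less entropy than the cold surroundings gain on absorbing it.

```python
# Toy entropy bookkeeping for accretion. ALL numbers are illustrative
# assumptions, not astrophysical data.
Q = 1.0e30          # joules of collision heat radiated away (assumed)
T_surface = 1000.0  # K, temperature of the hot accreting body (assumed)
T_space = 3.0       # K, rough temperature of the cold radiation sink (assumed)

dS_body = -Q / T_surface        # entropy carried out of the local system
dS_surroundings = Q / T_space   # entropy gained by the cold surroundings

dS_net = dS_body + dS_surroundings
print(dS_net > 0)  # True: the total entropy still increases
```

Because T_space is so much smaller than T_surface, the positive term dominates by orders of magnitude, so the "ordering" of matter into planets is paid for many times over in radiated heat.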
 
  • #16
I'd like to point out that a similar argument has been put forth by creationists in their attempt to discredit evolution. They argued that since "life" is essentially order out of disorder in the evolutionary picture, then the evolution of life on Earth violates the 2nd Law of Thermodynamics. If you don't believe me, do a google on "evolution thermodynamics".

I have explained in one of my journal entries why this is nothing more than ignorance of basic physics. The explanation is similar to the one I've given here in this thread. Such foolishness is simply ONE of many examples where people learn bits and pieces of something but nevertheless assume they have understood all there is to know. As I've said many times, imagination without knowledge is ignorance waiting to happen.

Zz.
 
  • #17
As my chemistry teacher once put it:
Entropy is disorder.
High entropy means high probability of things to happen :approve:

Like, for instance, yours truly tripping over something in his incredibly messy study and ending up in the hospital with a broken ankle... :rolleyes:
 
  • #18
gschjetne said:
As my chemistry teacher once put it:
Entropy is disorder.
High entropy means high probability of things to happen :approve:
This would require defining 'disorder' in a statistical sense - a measure related to the variation (eg. width of the standard deviation) of speeds or energies of molecules in the warm reservoirs compared to that of the hot and cold reservoirs. That is a very specific concept and it is related to thermodynamic entropy. But you cannot then replace that very specific statistical concept of 'disorder' (ie. populations of molecules with greater variation in speeds having more disorder) with a general non-statistical concept of 'disorder' and say that the general concept is also related to thermodynamic entropy. It isn't. Thermodynamic entropy is related to disorder only if you define disorder in the specific statistical sense.
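The statistical sense of 'disorder' described above can be sketched with two hypothetical speed histograms (the bin probabilities are invented for illustration): a population whose speeds are spread over more bins has a higher statistical entropy than one concentrated near a single speed.

```python
import math

def statistical_entropy(probs):
    """Dimensionless statistical entropy of a distribution: -sum p ln p."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical probabilities over 5 molecular-speed bins (assumed values):
narrow = [0.0, 0.1, 0.8, 0.1, 0.0]  # most molecules near one speed
wide   = [0.2, 0.2, 0.2, 0.2, 0.2]  # speeds spread evenly across bins

print(statistical_entropy(wide) > statistical_entropy(narrow))  # True
```

This is the narrow, technical sense in which greater 'spread' means greater entropy; it says nothing about messy desks or broken eggs.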

The principles behind information theory (also statistically based), which deal with information loss, are again different. Similar statistical analyses are used, but to suggest that 'bandwidth' in the information sense explains thermodynamic entropy is a big stretch and, in any event, is not very instructive or helpful in explaining the thermodynamic second 'law'. The information theory equivalent of the 'second law' is not really a law at all. It can be, and often is, broken. If it weren't, the internet would not function.

So entropy is not related to disorder in the general sense such as your broken ankle (which sounds like a true story, so you have my sympathies if it is). To start a discussion of entropy and the second law of thermodynamics with a general discussion of order and disorder simply confuses the student and is most often quite incorrect.

AM
 
  • #20
I think that this has always been a problem faced by educators, especially at the elementary level: when teaching something new, what exactly do you tell the students it is, so that they have some idea? Most educators, understandably, would simply describe entropy as disorder. Those of us who are lucky enough to further our knowledge in this field later on realize that such a definition is, at best, inadequate and incomplete. However, there are many people walking around who did not have the chance to study this further and are therefore stuck with the understanding that "entropy = disorder".

For people who do fall into that category, if there's anything you can get out of this thread, it is that you need to at least be aware that in physics, the definition "entropy = disorder" should not be taken as a clear statement of what entropy is. Several links have been given above on why this isn't true. I will give one more, which I strongly recommend people read (it has material from the elementary to the advanced level):

http://www.entropysite.com/

Hopefully, this will start to eradicate a little of that misunderstanding.

Zz.
 
  • #21
I had to teach freshman chemistry lab this past year, and our professor invoked the word "disorder" to describe entropy. In my research field (biophysics) we often talk about entropy in terms of the number of states a system can access. For instance, when a protein binds to DNA, the entropy decreases, as the DNA cannot access as many configurations in conformational space as it can as a random coil.
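The state-counting language in the post above translates directly into Boltzmann's formula. The conformation counts below are invented round numbers, not measured values for any real protein–DNA system; the sketch only shows that restricting the accessible states makes the entropy change negative.

```python
import math

k_B = 1.380649e-23  # J/K

# Toy counting argument with ASSUMED state counts:
W_free = 10**6   # conformations accessible to the free random coil (assumed)
W_bound = 10**2  # conformations remaining once the protein is bound (assumed)

# Entropy change on binding, from S = k ln W:
dS = k_B * (math.log(W_bound) - math.log(W_free))
print(dS < 0)  # True: binding reduces the conformational entropy
```

In a real binding reaction this conformational entropy loss is offset elsewhere (e.g. by released heat or freed solvent), so the second law is not threatened.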
 

1. What is entropy?

Entropy is a scientific concept that describes the level of disorder or randomness in a system. In simpler terms, it is a measure of how much energy is unavailable for use or work in a system.

2. How is entropy related to the concept of energy?

Entropy and energy are closely related but distinct concepts: energy is the capacity to do work, while entropy is a measure of how much of a system's energy is unavailable for doing work.

3. Why is entropy often referred to as the "arrow of time"?

This phrase refers to the idea that entropy tends to increase over time. This means that systems tend to become more disordered and chaotic as time passes, rather than becoming more organized.

4. How is entropy relevant to everyday life?

Entropy plays a role in many aspects of our daily lives, from the food we eat to the energy we use. For example, food spoils and energy dissipates because of the natural tendency for entropy to increase.

5. Can entropy be reversed or decreased?

Entropy can be decreased in a particular system, but only with an input of work, and the entropy of the surroundings then increases by at least as much. In everyday situations, the natural tendency is for total entropy to increase.
