A Question about Entropy

  • #26
Mister T
Science Advisor
Gold Member
2,604
851
To my mind, if we come to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.
If you knew the positions and velocities of each of the particles, you'd have a definition of one single microstate. But there are lots of other microstates that would also be states of equilibrium. Now look at a nonequilibrium state, and you find that there are, again, lots of different microstates that can form that nonequilibrium state. It turns out, though, that there are more microstates for the equilibrium state than for the nonequilibrium state, so the equilibrium state is more likely, usually far more likely. That is the reason things tend toward equilibrium.
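To make the counting concrete, here is a rough Python sketch (my own toy illustration, not from this thread: the choice of ##N = 100## particles and the two-halves-of-a-box setup are assumptions). It counts the microstates belonging to the macrostate "k particles in the left half" and shows how overwhelmingly the count is peaked at the even split.

Code:
# Toy model: N distinguishable particles, each in either the left or right
# half of a box. The macrostate "k particles on the left" contains C(N, k)
# microstates, which is sharply peaked at the even split k = N/2.
from math import comb

N = 100  # toy number of particles (an arbitrary choice)

counts = {k: comb(N, k) for k in range(N + 1)}
total = sum(counts.values())  # equals 2**N, the total number of microstates

for k in (0, 10, 25, 50):
    print(f"k = {k:3d}: {counts[k]:.3e} microstates, "
          f"probability {counts[k] / total:.3e}")

Even at ##N = 100##, the even split has roughly ##10^{29}## microstates while ##k = 0## has exactly one; for a macroscopic number of particles the disparity becomes astronomically larger, which is why the equilibrium macrostate is the one you actually see.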

The same scientific methods that produced the 2nd Law also produced the theory of evolution. So when you argue that one of those ideas is false because the other one is true, you accept the validity of one of them based on science but reject the validity of the other, and you also base that on science! Thus you are saying that science is wrong because science is right. :wideeyed:
 
  • #27
Dale
Mentor
Insights Author
2020 Award
30,697
7,293
Thanks everyone for their answers...

A thing comes to my mind here...

'Everything is debatable'...
Maybe this line is also debatable☺☺
So are you going to apply the rigorous definition of entropy to your last question? You cannot expect to learn without some personal effort.
 
  • #28
stevendaryl
Staff Emeritus
Science Advisor
Insights Author
8,427
2,607
The same scientific methods that produced the 2nd Law also produced the theory of evolution. So when you argue that one of those ideas is false because the other one is true, you accept the validity of one of them based on science but reject the validity of the other, and you also base that on science! Thus you are saying that science is wrong because science is right. :wideeyed:
Exactly. Once upon a time, I used to engage in debates with creationists who used "entropy says things should get more disordered" as an argument against evolution. Over time, I came to the point of view that this is not only wrong but, in a way, the opposite of the truth. The tendency toward increasing entropy is what drives evolution and other living processes, in much the same way that the tendency for water to run downhill can drive waterwheels to generate power.
 
  • #29
So are you going to apply the rigorous definition of entropy to your last question? You cannot expect to learn without some personal effort.
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
 
  • #30
stevendaryl
Staff Emeritus
Science Advisor
Insights Author
8,427
2,607
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
I don't think that rigorous ever means that. Rigorous means careful, painstaking: avoiding hand-waving and appeals to intuition. A rigorous derivation or proof is one where the assumptions and rules are clear and the steps are laid out in detail, so that there is pretty much no room for doubt about the conclusion (as long as you also accept the assumptions and rules).
 
  • Like
Likes BruceW
  • #31
1,468
117
OK. Fine.

But I have a problem regarding this.

When the ice cube melts completely, an equilibrium will be reached, i.e. a state of highest entropy will be reached.

Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

Why call such a 'balanced' state a disordered one?

When the ice cube has melted, at the microscopic level there is more imbalance: different molecules have different speeds, that kind of imbalance.

There are some microstates where the molecules all have the same speed, but not many. So it is very, very likely that the melted ice never visits those microstates during the lifetime of the universe.
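As a rough illustration of how spread out the speeds really are, here is a quick sketch (assuming Maxwell-Boltzmann velocities, i.e. independent Gaussian velocity components in arbitrary units; the sample size is an arbitrary choice):

Code:
# Quick sketch: draw 3 Gaussian velocity components per molecule (a stand-in
# for a Maxwell-Boltzmann distribution in arbitrary units) and look at the
# spread of the resulting speeds -- they are nowhere near "all the same".
import numpy as np

rng = np.random.default_rng(0)
n_molecules = 100_000  # arbitrary sample size

velocities = rng.normal(size=(n_molecules, 3))  # vx, vy, vz per molecule
speeds = np.linalg.norm(velocities, axis=1)     # speed of each molecule

print(f"mean speed        : {speeds.mean():.3f}")
print(f"std. dev. of speed: {speeds.std():.3f}")
print("fraction within 1% of the mean speed:",
      np.mean(np.abs(speeds - speeds.mean()) < 0.01 * speeds.mean()))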
 
  • #32
Vanadium 50
Staff Emeritus
Science Advisor
Education Advisor
26,160
9,550
Sometimes, rigorous tends to mean 'accepted'.
No it doesn't. And we're back in Humpty-Dumptyland.
 
  • #33
Mister T
Science Advisor
Gold Member
2,604
851
Sometimes, rigorous tends to mean 'accepted'.
So what? An argument can be done in a way that's rigorous, and yet you still might not accept it.

For example, researchers were able to present a rigorous argument that the 2nd Law is based on probabilities and the existence of atoms, but it was not accepted by many physicists, because they didn't accept the premises upon which the rigorous argument was based.

The rest of the story is not relevant to this discussion, but I'll add it for completeness. It was subsequently established that the premises are valid and that the conclusion is valid. That makes it harder not to accept the 2nd Law as being valid, but there are still plenty of people who don't accept it despite its rigor. Just look at the number of attempted patent submissions for devices that violate the 2nd Law.
 
  • #34
Dale
Mentor
Insights Author
2020 Award
30,697
7,293
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
Do you have any intention of putting in some personal effort to actually learn this material?
 
  • #35
BruceW
Homework Helper
3,611
119
You may feel annoyed, but...

Sometimes, rigorous tends to mean 'accepted'.
The more precise your question, the better the answer (in science, at least). Ultimately, the idea of scientific endeavour is to build a theory that has practical implications. To do this, precise (rigorous) definitions are needed.

I'm still not 100% sure what your concerns were. Taking the lumpy cake to a more physics-y example, suppose you have a box with lots of gas atoms inside. Call the fraction of atoms on the left-hand side of the box ##f##. We expect ##f=1/2##: since all the atoms are bouncing around, they are equally likely to end up on either side. If we measure ##f## again, we again expect ##f=1/2##. Your intuition seems to be that perhaps next time ##f=0.7## or ##f=0.2## or some other random fraction. This is like your lumpy cake: you are not expecting the cake to be homogeneous. But the important point is the number of microstates corresponding to each macrostate. In our example of the box of atoms, you can think of ##f## as labelling the macrostate. There are so many microstates corresponding to ##f## being very close to ##1/2## that, in practice, ##f## comes out essentially equal to ##1/2## every time.
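Here is a small simulation sketch along these lines (my own illustration: it assumes each atom independently ends up on the left with probability 1/2, and the atom numbers and trial count are arbitrary choices). It shows how the spread of ##f## around ##1/2## collapses as the number of atoms grows:

Code:
# Each of N atoms independently lands on the left half with probability 1/2.
# Measure the fraction f on the left many times; the fluctuations of f about
# 1/2 shrink roughly as 1/sqrt(N) as the number of atoms grows.
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000  # number of repeated "measurements" of f

for n_atoms in (10, 1_000, 1_000_000):
    lefts = rng.binomial(n_atoms, 0.5, size=n_trials)  # atoms on the left, per trial
    f = lefts / n_atoms
    print(f"N = {n_atoms:>9,d}: mean f = {f.mean():.4f}, "
          f"typical fluctuation = {f.std():.2e}")

For anything close to a macroscopic number of atoms, the fluctuation is so small that ##f## is indistinguishable from ##1/2## every time you look.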

p.s. Also, what Dale said.
 
  • #36
13
0
Having read several articles on nanosystems, which defy the 2nd law, I have to wonder whether a macro system comprised of multiple linked nanosystems could produce more output than input. And if not, why not?

If searching for the articles I found, include this in your search: /small-systems-defy-second-law
 
  • #37
Dale
Mentor
Insights Author
2020 Award
30,697
7,293
Having read several articles
Please cite the specific articles. Keep in mind that the research being done on fluctuation theorems is often portrayed incorrectly by popular media sources.
 
  • #39
1,241
189
If searching for the articles I found, include this in your search: /small-systems-defy-second-law
First hit with the exact title, from PhysicsWorld.com; the last paragraph sums it up well: "Evans and colleagues say that their discovery could be important in the design of nanomachines. They also point out that as thermodynamic systems become smaller, the probability that they will run ‘in reverse’ increases, and this could improve our understanding of how many small biological systems – such as ‘protein motors’ – work." Totally not a 2nd-law-violating situation, as there are outside influences...
 
  • #40
Dale
Mentor
Insights Author
2020 Award
30,697
7,293
This isn't about me needing help doing a search; it was about finding out what sources you were using. As I predicted above, all three of these articles are popular science sources which mischaracterize the work being done on fluctuation theorems (at least the first and third do; I only read the beginning of the second).

This is why pop-sci sources are not considered valid, particularly for easily distorted topics like work on fluctuation theorems.
 
  • Like
Likes NFuller
  • #41
Dale
Mentor
Insights Author
2020 Award
30,697
7,293
Having read several articles on nanosystems, which defy the 2nd law; I have to wonder if a macro system comprised of multiple linked nanosystems could produce more output than input? And if not, why not?
The fluctuation theorem work is not about defying the 2nd law. It is about deriving the 2nd law as a classical limit of a deeper statistical mechanics law.

As their name indicates, they are about fluctuations, often fluctuations about equilibrium. The question they address is: statistically, how long would you expect fluctuations away from equilibrium to last? Even these small systems do not decrease entropy for long, nor predictably.
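For reference, one common way the transient fluctuation theorem (Evans-Searles) is stated, with the entropy production ##\Sigma_t## over an observation time ##t## written in units of ##k_B##, is

$$\frac{P(\Sigma_t = A)}{P(\Sigma_t = -A)} = e^{A},$$

so trajectories that decrease entropy (##\Sigma_t < 0##) are exponentially suppressed and only show up at appreciable rates for very small systems observed over very short times.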
 
  • #42
Having read several articles on nanosystems, which defy the 2nd law, I have to wonder whether a macro system comprised of multiple linked nanosystems could produce more output than input. And if not, why not?

If searching for the articles I found, include this in your search: /small-systems-defy-second-law
I'm also skeptical here. From what I understand, it's not that there is more output than input; it's that some quantum systems appear to be able to do work without dumping any heat, and therefore have a thermodynamic efficiency of unity.

Also remember that you cannot scale up most quantum systems while still keeping the quantum effects.
 
  • #43
rude man
Homework Helper
Insights Author
Gold Member
7,871
793
I think 99.999999% of people on this planet only acquire stuff along the way....

It's only a miniscule minority that ever says something new...
Minuscule, not miniscule.
 
  • #44
rude man
Homework Helper
Insights Author
Gold Member
7,871
793
There is also entropy in communications theory, i.e. average self-information per message or symbol. For a group of ##N## equally probable symbols this is ##\log_2 N##.
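A minimal sketch of that idea (the function name and the example distributions are just for illustration): the Shannon entropy ##H = -\sum_i p_i \log_2 p_i##, in bits, reduces to ##\log_2 N## when all ##N## symbols are equally probable.

Code:
# Shannon entropy H = -sum(p_i * log2(p_i)) of a symbol distribution, in bits.
# For N equally probable symbols this reduces to log2(N).
from math import log2

def shannon_entropy(probs):
    """Average self-information per symbol, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([1 / 8] * 8))                # 3.0 bits = log2(8)
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits < log2(4) = 2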
 
  • #45
241
56
Our universe is considered a closed system. The second law says that the entropy of a closed system is bound to increase.

Then how could living beings evolve when they are extremely ordered systems?
Things can be ordered (inefficiently) using energy. You can stack things up. They may fall later, but you can again take energy and stack them. The tendency to disorder can be defeated locally using energy.

If you take a bunch of calories in, as sugar, you can convert that sugar to carbon dioxide, and use the energy to create order within your cells. The overall process, of burning sugar, breathing in oxygen and out carbon dioxide, giving off waste heat to the air ... the overall process is one where the entropy increases. But the microscopic view of the cell shows the order THERE increasing.

The earth is not a closed system. It has an energy input (the sun). That energy is used by living organisms to create order. Living things are engines that take in energy, and (inefficiently) do work. Living things evolved because there were energetically favorable conditions.

EDIT: Another stab at it. Consider a burning piece of wood, going from an ordered state to a bunch of smoke, ash, and combustion gases, along with a healthy dose of heat. Now suppose you put a crucible of ore above it, make a small ingot of metal, and then forge a ring. You have managed to generate an improbably well-ordered thing. But the entire process, including the fire, was one with increasing entropy.

The universe may be a closed system that is increasing in entropy. That does not mean that entropy is uniformly increasing at every location. The forged ring was a location that decreased in entropy, even as the overall system increased in entropy.
 
Last edited:
  • Like
Likes BruceW
  • #46
BillTre
Science Advisor
Gold Member
2020 Award
1,663
3,866
If you take a bunch of calories in, as sugar, you can convert that sugar to carbon dioxide, and use the energy to create order within your cells. The overall process, of burning sugar, breathing in oxygen and out carbon dioxide, giving off waste heat to the air ... the overall process is one where the entropy increases. But the microscopic view of the cell shows the order THERE increasing.
While I would agree that cells use energy to create, maintain, and restore some order in themselves, biological organisms also have other ways of maintaining/creating order.

For example, molecules in organisms can sort out into sub-cellular groups based on adhesion properties between themselves (things that strongly stick to each other will tend to end up in a group together).
For proteins, properties like this would be due to the details of their encoding DNA sequences, which direct the production of the proper protein amino acid sequence.
The energy immediately used in the currently living cell would be put into production of the encoded proteins (this would be like a proximate cause in biology: an immediately preceding driver).
Something more akin to a biological ultimate cause (more closely related to the reason for its evolution) would be to consider the energy put into the evolution of the encoded protein by all of the gene's predecessors, and by the cells or organisms in which those sequences resided as they went through their generations of evolutionary history.

This seems to me to be the accrued expense of evolutionarily building the DNA sequences (stored information) from which these other properties are produced anew in each cell, using currently available energy and the cellular environment, for what is probably a relatively small energetic cost. That same cost could just as easily be used to produce a similar protein with a different amino acid sequence and slightly or vastly different properties. This expense builds up over time and simultaneously affects all the encoding sequences in the genome that are under selection in its circumstances (some traits may be selectively neutral). It's not clear to me how this energy contribution could be easily determined and assigned to a particular result. The same organisms would also be evolving all of the other sequences they have, some organisms and genes would be evolutionary dead ends, and genes may even be acquired from other species, viruses, or nowadays lab efforts.

I figure this is a less direct use of energy to generate and maintain biological order.
It should reduce the amount of energy currently needed to maintain order in currently existing cells.
 
  • #47
Our universe is considered a closed system. The second law says that the entropy of a closed system is bound to increase.

Then how could living beings evolve when they are extremely ordered systems?
Think of it as a contract with the Universe: you can be alive, i.e. a living being, but as payment you have to increase the amount of entropy by more than your constituent parts would by themselves.
 
