How does evolution theory fit the entropy theory

In summary: the second law of thermodynamics states that there will be more chaos, and natural selection is the opposite of that.
  • #1
Mathijsgri
the second law of thermodynamics states that there will be more chaos, and natural selection is the opposite of that.

Maybe this is a stupid question, but can somebody answer this?
 
  • #4
Thanks, I get it now.
 
  • #5
Just one more thing: they said no energy goes in or out of a closed system, but what if I take the whole galaxy as a system? Or the whole universe?
 
  • #6
The universe is a closed system. A galaxy isn't, because it is emitting light, and we can see other galaxies with telescopes. It's very hard to have a true closed system, but often we can treat something as a closed system if we can quantitatively show that the energy transfer across the border is negligible compared to what we are focused on.
 
  • #7
Mathijsgri said:
Just one more thing: they said no energy goes in or out of a closed system, but what if I take the whole galaxy as a system? Or the whole universe?

Then when one part of the world has a decrease in entropy, another part will have an equal or greater increase in entropy.

You don't have to use the entire universe for this. Just look at a heat engine (Carnot cycle). You can calculate how much entropy one part gains and how much another part loses.
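A minimal sketch of that bookkeeping, with made-up reservoir temperatures and heat flow (the numbers are purely illustrative, not from any particular engine):

```python
# Hypothetical numbers: a reversible engine draws heat from a hot
# reservoir and rejects some of it to a cold one.
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures in kelvin (assumed)
Q_hot = 1000.0                 # heat drawn from the hot reservoir, in joules

# For a reversible (Carnot) cycle, Q_cold / Q_hot = T_cold / T_hot.
Q_cold = Q_hot * T_cold / T_hot

dS_hot = -Q_hot / T_hot    # the hot reservoir loses entropy
dS_cold = Q_cold / T_cold  # the cold reservoir gains entropy

# One part goes down, the other goes up; the total is zero for the
# reversible limit and positive for any real (irreversible) engine.
print(dS_hot, dS_cold, dS_hot + dS_cold)
```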

Zz.
 
  • #8
I recently read 'Life on the Edge' by Jim Al-Khalili. In the book he suggests that evolution can only occur as a result of quantum interactions and that the statistics will not allow it to work by thermodynamics alone. I know you can't take a single source to be enough to explain something as consequential as that, but he does refer to a fair number of credible sources in the book.
He quotes a number of biological processes that can only be explained in terms of Quantum Tunnelling and offers reasons why it does not actually require super low temperatures.
A very good read, in any case.
 
  • #9
sophiecentaur said:
In the book he suggests that evolution can only occur as a result of Quantum interactions and that the statistics will not allow it to work by thermodynamics.
Sorry, but that sounds like someone overselling their idea.
sophiecentaur said:
He quotes a number of biological processes that can only be explained in terms of Quantum Tunnelling and offers reasons why it does not actually require super low temperatures.
That quantum effects are important for life as we find it, ok, I can buy that. But that does not mean that life requires quantum effects (although, at the same time, since the world is quantum, it makes sense that evolution has made use of all that is available).
 
  • #10
DrClaude said:
Sorry, but that sounds like someone overselling their idea.
That quantum effects are important for life as we find it, ok, I can buy that. But that does not mean that life requires quantum effects (although, at the same time, since the world is quantum, it makes sense that evolution has made use of all that is available).

I can understand your skepticism, but the numbers are too significant just to be dismissed. He concludes, for instance, that simple random mutation processes would take too long for a simple bacterium to evolve.
The book is not expensive on Kindle and makes a good read.
I think my post has more bs than that book!
 
  • #11
DrClaude said:
overselling their idea.
Waayyy oversold... The entropy of mixing plus the entropy of very wasteful chemical reactions is more than enough to drive evolution; the intuitive equivalence of "disorder" with "the dark side of the Force" confuses more than a few people.
 
  • #12
sophiecentaur said:
simple random mutation processes would take too long for a simple bacterium to evolve.

I have not been much impressed by these kinds of arguments.

They often seem to leave a lot of things out of their calculations, such as:
1) there are many copies (really big numbers if you are talking about molecules) of whatever is evolving;
2) things (such as genes) can be duplicated and then changed, so information builds up much more rapidly than doing everything "from scratch";
3) many genes can be involved in evolving something (increasing the number of targets for mutation);
4) evolution has not only huge time spans (millions of years) to change things, but some organisms have very short generation times (bacterial generations can be as short as 20 minutes, or 26,280 generations/year). Many iterations let little things add up over time.
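That generations-per-year figure is just arithmetic:

```python
# One bacterial generation every 20 minutes, running around the clock.
minutes_per_year = 365 * 24 * 60          # 525,600 minutes in a (non-leap) year
generations_per_year = minutes_per_year // 20

print(generations_per_year)  # 26280
```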
 
  • #13
I linked a vid in another thread showing bacteria mutating in real time.
 
  • #14
Sorry to come to this discussion late, but I'd like to expand on the original question. I don't buy the quantum argument, but I do think life (and computers and solved jigsaws etc) "seem" too ordered to say they took that order from the sun. My answer involves looking at the 2nd law itself.

Summary: the 2nd law is based on random state-changes, but natural selection drives state-change in another direction, so there's no reason evolution should satisfy the 2nd law.

Some observations, let me know what I got right or wrong:
  • As I understand it, the 2nd law of thermodynamics is a law of statistics, not physics (and thus not really a law of thermodynamics, although it applies to it).
  • Any system that changes randomly from state to state is more likely to change to a less ordered state, simply because there are more ways to be disordered than ordered. It's only tied to the physics of our universe because our universe is such a system.
  • It seems to be overlooked that the 2nd law is explicitly based on randomly changing between states. Without this, there's no reason to presume it will go to a less ordered state. And this is the key. Natural selection is a process that directs the state-change in a different direction. It pushes uphill, to a more ordered state.
  • The standard answer to this has been given above already: that this local increase in order is balanced by a greater increase in entropy elsewhere (the sun). But why is this necessary? The 2nd law is based on random state-changes, but natural selection drives state-change in a non-random direction. Even if it only does this locally, it doesn't need to be fuelled from elsewhere, because we've broken the rules the 2nd law requires.
  • It "seems" (in quotes because I acknowledge my limitations in not knowing how to quantify this) that a living creature has a ludicrous level of order. Same for a computer, or gluing a broken mug back together, or a sorted pack of cards. Apparently if you shuffle a deck of cards well, it's extremely unlikely to match any other shuffled deck that ever existed, yet I can easily sort the cards into any order I want. I am able to defy the 2nd law, not because I've taken order from elsewhere, but because I've taken away the thing the 2nd law is based on, random state-change.
  • Is the amount of order in the sun really so much more than all the living creatures, computers, solved Rubik's cubes etc on Earth? The sun is obviously huge, but there are lots of stars, and few (presumably) places where such ordered things exist as here on Earth. There are many more ways to rearrange parts of the sun and still have a functioning sun than there are to rearrange living creatures or solved Rubik's cubes without damaging them.
  • Consider a simple system where phrases are randomly generated. Let's say we start with a very ordered phrase, like "Let's test the 2nd law!". Randomly change one character at a time. It will follow the 2nd law and become more disordered over time (a good example of how this law is a law of statistics, not physics). Now add a driving force akin to natural selection to this tiny "universe": pick a target phrase, and only allow random state-changes in which no fewer characters match the target. It won't take long to reach the target phrase with this driving force at play. We have easily defied the 2nd law in this simplified universe, without the need for the increased order being matched by decreasing order elsewhere in the system.
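The last bullet describes something very like Dawkins' "weasel" program. Here's a minimal sketch of that toy universe; the target phrase is from the post, while the alphabet and acceptance rule are my own assumptions:

```python
import random
import string

random.seed(42)  # for reproducibility

target = "Let's test the 2nd law!"
# Assumed alphabet: letters, digits, and the punctuation the target needs.
alphabet = string.ascii_letters + string.digits + " !'"

def matches(phrase):
    """Count positions where the phrase agrees with the target."""
    return sum(a == b for a, b in zip(phrase, target))

# Start from a fully random ("disordered") phrase.
phrase = "".join(random.choice(alphabet) for _ in range(len(target)))

steps = 0
while phrase != target:
    i = random.randrange(len(target))
    candidate = phrase[:i] + random.choice(alphabet) + phrase[i + 1:]
    # The selection rule from the post: never accept a change that
    # reduces the number of matching characters.
    if matches(candidate) >= matches(phrase):
        phrase = candidate
    steps += 1

print(f"reached the target in {steps} random changes")
```

Purely random changes would essentially never find the target; with the acceptance rule it is found quickly, which is the post's point about a directed driving force.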
 
  • #15
Do you have any equations and or data to support these claims?
 
  • #16
Robert Webb said:
let me know what I got right or wrong:
  • Wrong
  • Partially Wrong
  • Partially Wrong
  • Wrong
  • Wrong
  • Wrong
 
  • #17
Really? Do you really disagree that the 2nd law, which explains increasing entropy, is based on the assumption of random state-change? I thought that part at least was uncontroversial. If I'm wrong, that's fine, but I'm here to understand why I'm wrong and gain a better understanding, which the replies so far haven't helped with.

As for supporting equations or data, my basic argument isn't dependent on them. The claims about the amount of order in the sun vs. the order in life etc. ARE dependent on them, which is why I presented that part in the most tentative terms. But my overall argument doesn't depend on them. See the summary I gave previously.
 
  • #18
Robert Webb said:
Really? Do you really disagree that the 2nd law, which explains increasing entropy, is based on the assumption of random state-change?
There was a really good discussion of Maxwell's Demon somewhere on PF. I will see if I can find a link. In the meantime, https://en.wikipedia.org/wiki/Maxwell's_demon
 
  • #19
OK, thanks for the Maxwell's Demon reference, that helped. I was viewing order as "meaningful" (depends on your point of view) subsets of state-space and their ratio (or some function of it) to the total state-space. Now I see alternative definitions which tie the 2nd law specifically to energy. With that definition I'd agree: energy is conserved, but the total amount of useful energy is always decreasing. The concept of increasing disorder IS a statistical law, for any meaningful measure of order, but it seems maybe entropy and the 2nd law define order explicitly as it relates to energy. Is that right?

So to be clear, am I right that a solved Rubik's cube or a sorted deck of cards would have exactly the same amount of entropy as their unsolved/shuffled counterparts, as far as physics is concerned, because the amount of useful energy in either is the same?
 
  • #20
Robert Webb said:
So to be clear, am I right that a solved Rubik's cube or a sorted deck of cards would have exactly the same amount of entropy as their unsolved/shuffled counterparts, as far as physics is concerned, because the amount of useful energy in either is the same?
The Rubik's cube has the same number of microstates and macrostates; however, getting it to a specific state requires energy. The multiplicity of a solved Rubik's cube is one, so its entropy is the smallest possible (i.e. it can only increase). You have to expend energy to solve the Rubik's cube. Even though your cube is now pretty, the entropy of the total system has increased. As you point out, the amount of useful energy in either configuration is the same, but you have dissipated energy in solving it, thus increasing the entropy of the total system.

My advice is to reference http://hyperphysics.phy-astr.gsu.edu/hbase/Therm/entrop2.html and make sure you completely understand the paragraph on "Entropy in Terms of Heat and Temperature" (specifically, why it's defined in terms of dS instead of S).
 
  • #21
Just see the Earth as a box floating in space that you can't see into: low entropy goes in, high entropy comes out. Because of that increase in entropy, some complexity can arise inside the box. But more disorganisation radiates away than complexity accumulates inside the box.
 
  • #22
bland said:
I like to look at it this way: moderately high energy, less entropic photons rain down on the Earth, stuff gets more organised, and very much lower energy, very much less organised photons radiate off into space.
I would be hesitant to describe it that way unless you can describe how a photon is "organized."
 
  • #23
Jamison Lahman said:
I would be hesitant to describe it that way unless you can describe how a photon is "organized."

Yeah, fair enough, describing it as 'organised' is a bit problematic; perhaps I should have just stuck to 'higher energy', and therefore capable of causing organisation while dropping to a lower energy.
 
  • #24
Jamison Lahman said:
The Rubik's cube has the same number of microstates and macrostates; however, getting it to a specific state requires energy. The multiplicity of a solved Rubik's cube is one, so its entropy is the smallest possible (i.e. it can only increase). You have to expend energy to solve the Rubik's cube. Even though your cube is now pretty, the entropy of the total system has increased. As you point out, the amount of useful energy in either configuration is the same, but you have dissipated energy in solving it, thus increasing the entropy of the total system.

My advice is to reference http://hyperphysics.phy-astr.gsu.edu/hbase/Therm/entrop2.html and make sure you completely understand the paragraph on "Entropy in Terms of Heat and Temperature" (specifically, why it's defined in terms of dS instead of S).
Thanks for the link. It uses rolling a pair of dice as an analogy, but this is just an analogy, right? If entropy in physics is measured by the amount of useful energy, then all dice rolls would be equal. Entropy is meaningless without an agreement on what counts as ordered. E.g. rather than seeing the total value of the dice as significant, maybe we only care about getting the same value on both dice, or maybe we specifically want a 2 and a 4. With a Rubik's cube we could have any specific arrangement in mind rather than the usual solution.

Talking about the energy required to solve the cube seems beside the point, because all states have the same amount of useful energy, so it's comparing apples and oranges, isn't it? We can talk about the energy-based entropy increase required to power a low energy-based entropy elsewhere, but not a low arbitrary-measure-based entropy.

For example, we could define our most special state as however the universe will be in 3 minutes, and then by definition entropy will inexorably plummet towards this low state. If we can measure any kind of order, it seems easy to fabricate states with ludicrous amounts of order by that measure.

If we look at life in terms of useful energy, it's a very different matter, and the 2nd law isn't challenged. This seems like an important part of the answer to the original question.
 
  • #25
Robert Webb said:
Thanks for the link. It uses rolling a pair of dice as an analogy, but this is just an analogy, right? If entropy in physics is measured by the amount of useful energy, then all dice rolls would be equal.
That is why they use an example with two dice. The possible sums of two dice are not equally probable: there is one way to make 12 (6+6), but there are six ways to make a sum of 7 (6+1, 5+2, ...). In that case, 6+1 is a microstate and the sum of 7 is the macrostate. Since there are six ways to make a sum of 7, its multiplicity is 6.
If your dice read twelve, the entropy of your system is 0; your dice are the "most ordered" they will ever be. If your dice read 7, the entropy is k_B ln(6), which is greater than 0, hence the entropy has increased.
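In code, the same multiplicity count together with Boltzmann's S = k_B ln W:

```python
from collections import Counter
from math import log

# Count the microstates (ordered pairs of faces) behind each
# macrostate (the sum of the two dice).
multiplicity = Counter(a + b for a in range(1, 7) for b in range(1, 7))

k_B = 1.380649e-23  # Boltzmann constant, J/K

for total in (12, 7):
    W = multiplicity[total]       # number of microstates for this sum
    S = k_B * log(W)              # Boltzmann entropy S = k_B ln W
    print(total, W, S)

# Sum of 12 has one microstate, so S = k_B ln 1 = 0;
# sum of 7 has six microstates, so S = k_B ln 6 > 0.
```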

I think the Rubik's cube is a flawed example because of the differences between combinations versus permutations. This was not my strong suit and I am tired, so I'm not sure I should dare go further! :-)

The problem is that it is hard to equate anything physical to entropy. The configuration of the Rubik's cube is a matter of statistics, but 'solving' it correctly requires mechanical energy.
Robert Webb said:
For example, we could define our most special state
Our "most-special" state is the one with the lowest multiplicity and is defined by statistics, not us.
 
  • #26
Jamison Lahman said:
That is why they use an example with two dice. The possible sums of two dice are not equally probable: there is one way to make 12 (6+6), but there are six ways to make a sum of 7 (6+1, 5+2, ...). In that case, 6+1 is a microstate and the sum of 7 is the macrostate. Since there are six ways to make a sum of 7, its multiplicity is 6.
My point was that WE decided what counted as important macrostates. We could have found something else significant, e.g. doubles (1+1, 2+2, etc.), in which case we'd get different results.

I think the Rubik's cube is a flawed example because of the differences between combinations versus permutations. This was not my strong suit and I am tired, so I'm not sure I should dare go further! :-)

The problem is it is hard to equate anything physical to entropy. The configuration of the rubik's cube is a matter of statistics but 'solving' it correctly would require mechanical energy.
This is what I mean about apples and oranges. If we're allowed to count ordered states of the Rubik's cube as low entropy as far as the 2nd law is concerned, then we're back to the problem I started with.

Think about it this way. Consider having multiple Rubik's cubes and consider them in unison. Regardless of the number, they still have only a single ordered state, but each time you add a cube, the number of states sky-rockets. However, the amount of energy required to solve these cubes only goes up linearly. Solving 2 cubes takes only twice as long, and twice the energy, but the number of states has been squared. As we add cubes, the energy required is quickly dwarfed by the amount of order measured this way. But it only makes sense to talk about the energy used if we also measure order in terms of useful energy.
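To put rough numbers on that: a standard 3x3x3 cube has about 4.3 x 10^19 reachable states, and any position can be solved in at most 20 moves ("God's number"), so with n cubes the joint state count gets raised to the nth power while the move count only multiplies by n:

```python
# Well-known cube facts: the reachable state count of a 3x3x3 cube,
# and the worst-case number of moves needed to solve one.
STATES_PER_CUBE = 43_252_003_274_489_856_000   # ~4.3e19
MOVES_PER_SOLVE = 20                           # "God's number"

for n in (1, 2, 3):
    joint_states = STATES_PER_CUBE ** n   # grows exponentially in n
    total_moves = MOVES_PER_SOLVE * n     # grows only linearly in n
    print(n, joint_states, total_moves)
```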

Although now that I've made that argument I'm not sure why it doesn't apply to any method of measuring order o_O

Our "most-special" state is the one with the lowest multiplicity and is defined by statistics, not us.
But we decide how to measure order and what states count as significant. Again, I think this works if we have defined order in terms of useful energy, but it won't work if we allow order to be measured in any way.

This does mean though that a solved puzzle would have the same entropy as an unsolved puzzle, as far as the 2nd law is concerned, because they all have the same amount of useful energy.
 
  • #27
Robert Webb said:
But we decide how to measure order and what states count as significant. Again, I think this works if we have defined order in terms of useful energy, but it won't work if we allow order to be measured in any way.

This does mean though that a solved puzzle would have the same entropy as an unsolved puzzle, as far as the 2nd law is concerned, because they all have the same amount of useful energy.
No matter how we define the macrostates, the lowest-entropy state is the one with the lowest multiplicity. Unless your puzzle is binary, the solved puzzle will have lower entropy. Equating this entropy to some sort of mechanical process is difficult.
 
  • #28
Are we still talking about evolution violating Thermo 2nd Law here, or are we now discussing entropy states?

If it is the former, then why are we ignoring all the various papers that have been published on this? I've pointed one already in this thread, and here's more:

https://arxiv.org/abs/1003.3937
https://arxiv.org/abs/0903.4603 <- I pointed this out earlier
D. Styer, Am. J. Phys. v.76, 1031 (2008)

In fact, there are claims that the 2nd law is a NECESSARY component for evolution on earth:
https://phys.org/news/2008-08-evolution-law-thermodynamics.html

I haven't seen any new arguments to counter any of these since this thread was first created. Are we going around in circles and ignoring the bullseye?

Zz.
 
  • #29
ZapperZ said:
are we going around in circles and ignoring the bullseye?
Zapper has a point here. If we're going to be discussing the basic understanding of entropy and the second law that is a prerequisite to the original topic of this thread, it should take place in a different thread.

This thread is closed, but PM any mentor if you want it reopened to continue on-topic discussion.
 

1. How does the theory of evolution relate to the concept of entropy?

The theory of evolution explains how living organisms change over time through natural selection. Entropy is a measure of the disorder, or more precisely the number of microscopic configurations, of a system. The two are connected because living organisms are highly organized, low-entropy structures. This does not conflict with thermodynamics: organisms are open systems that maintain their internal order by taking in low-entropy energy (ultimately sunlight) and exporting higher-entropy waste heat to their surroundings, so the total entropy of organism plus environment still increases.

2. Can the second law of thermodynamics be applied to evolution?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. Living organisms are not isolated systems; they constantly exchange energy and matter with their environment, which allows their local entropy to decrease while the entropy of their surroundings increases by at least as much. Evolution therefore does not violate the second law; the law simply has to be applied to the organism together with its environment, not to the organism alone.

3. Does the process of natural selection contradict the concept of entropy?

No. Natural selection drives the increase in complexity and organization of living organisms, which corresponds to a local decrease in entropy. The overall trend of entropy in the universe is still towards disorder, and the local decrease is always paid for by a greater entropy increase elsewhere, for example in the heat the Earth radiates to space. The two concepts are compatible.

4. How does the evolution of species impact entropy?

Locally, the evolution of more complex species increases the order and organization of the biosphere: species occupy new niches and interact with their environment in new ways. Globally, however, the processes that sustain and drive evolution (metabolism, reproduction, the capture of sunlight) generate entropy, so the net effect on the universe is still an entropy increase. Evolution changes where the order is; it does not reduce the total entropy budget.

5. Is there evidence for a connection between evolution and entropy?

Yes, in the sense that the thermodynamic books balance. Quantitative estimates, such as the calculation in D. Styer, Am. J. Phys. 76, 1031 (2008) cited above, show that the entropy decrease associated with biological evolution is vanishingly small compared to the entropy the Earth exports by radiating heat to space. Some authors go further and argue that the second law is a necessary component of evolution, since life is an efficient way of dissipating free energy; see the links earlier in the thread.
