Entropy vs Life: Exploring the Probability of Creation

In summary, the conversation discussed the origin of cells and bacteria and whether it violates the second law of thermodynamics. The probability of a "soup" randomly assembling a cell is weighed against the tendency of entropy to increase over time. The second law is clarified as applying to isolated systems, which living organisms are not, and the role of the sun in providing energy for life on Earth is discussed. Finally, the misconception that "organized" structures are synonymous with "low entropy" is addressed. The conversation highlights the need for a better understanding of thermodynamics and its application to the study of life.
  • #1
waht
We don't know how cells or bacteria, which are highly organized chunks of matter, were first created. But if one arose randomly from a soup of the necessary ingredients, would that violate the second law of thermodynamics, which says that entropy increases with time?

The probability of the "soup" creating a perfect square of molecules would be no different from that of creating a cell. In time, the perfect square disappears, but the cell lives on by picking up other molecules and dividing. So the order begins at the microscopic level.
 
  • #2
I do not know the immediate answer to your question, but let me point out that the second law of thermodynamics is not quite as strict as most physical laws. It's more of a guideline. The overwhelming majority of the time, the entropy of a closed system will increase with time, but there is a minuscule possibility that a closed system's entropy can decrease.
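That "minuscule possibility" can be made concrete with a toy calculation. The sketch below is my own illustration, not from the thread: it computes the probability that all N molecules of a gas spontaneously gather in one half of a box, the classic fluctuation that would lower entropy.

```python
# Sketch (my own illustration): the chance that a spontaneous fluctuation puts
# all N independently-placed gas molecules in one chosen half of a box is
# (1/2)**N, so "entropy can decrease" becomes unobservable very quickly.
def half_box_probability(n_molecules):
    """Probability that n independent molecules all sit in one chosen half."""
    return 0.5 ** n_molecules

for n in (10, 100, 1000):
    print(n, half_box_probability(n))
```

For any macroscopic amount of gas (N of order 10^20 and up) the probability is so small that the "guideline" is, for all practical purposes, absolute.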
 
  • #3
The simple answer is that life pays for its order by increasing the rate at which the Universe in general is being disordered. So light from the sun is captured by a forest and then released at a much lower temperature than it would have been off bare rock.

This kind of perspective changes life and other dissipative structures from a surprise to a necessity. Structures that are better at disposing of entropy gradients are likely to be favoured by nature. Some would call this the basis of a fourth law of thermodynamics. A controversial area still of course.
 
  • #4
I quote from ZapperZ's journal, which I think answers your question:

[...]
2. Evolution violates the Second Law of Thermodynamics

Already, this is something that affects physicists, because inadvertently, our area is being dragged directly into this battle.

The argument comes from the apparent understanding of two things: (i) living beings are "ordered" structures and (ii) the 2nd Law of Thermodynamics reflects an increase in entropy or, to put it crudely, disorder.

Now, I will not go into detail on why equating entropy with disorder is inaccurate (that will be saved for another time). So let's assume that both (i) and (ii) are correct. ID advocates point to the fact that if evolution did happen, it implies a trend towards order in our Earth system. Random distributions of atoms and molecules on the primordial Earth somehow formed ordered and more sophisticated conglomerations that eventually formed life. Thus, the Earth went from disorder to order. This clearly violates the 2nd Law of Thermodynamics and thus is not very likely. So evolution cannot be the explanation for life.

Again, such an argument is being made without an understanding of the 2nd Law, or even basic thermodynamics in the first place. The 2nd Law clearly states that in an ISOLATED SYSTEM (no energy of any kind going in or out), entropy cannot decrease. The Earth is certainly NOT an isolated system. In fact, the Earth DEPENDS predominantly on one source of external energy, the sun! So even in the most simplified treatment, we have to consider the sun and the Earth together as the complete isolated system, not just the Earth alone. Within this system, there is nothing to prevent one part from having a lower entropy with time (example: the Carnot cycle). Thus, even if the Earth really does show a lowering of entropy, this certainly does not violate the 2nd Law of Thermodynamics.

One would be surprised that, even though this has already been explained in several articles and books, there are still numerous websites supporting creationism/ID that carry this argument (do a Google search if you don't believe me). Either the authors are not aware of how ridiculous such an argument is, or they are hoping that their readers are not aware of it, or are not good at simple thermodynamics. This isn't a stretch of the imagination, because the general public does not have any significant understanding of basic thermodynamic principles and thus can easily be fooled into thinking that physics has made evolution impossible! It costs nothing to perpetuate the lie.

[09-05-2004 07:33 AM] - Imagination without knowledge is Ignorance waiting to happen - Part 2
https://www.physicsforums.com/journal.php?s=&action=view&journalid=6230&perpage=10&page=10 [Broken]
 
  • #5
waht said:
We don't know how cells or bacteria, which are highly organized chunks of matter, were first created. But if one arose randomly from a soup of the necessary ingredients, would that violate the second law of thermodynamics, which says that entropy increases with time?
The 2nd law says entropy for a closed system increases with time. The entropy of that "soup" could still be increasing while the first proto-cells were forming.
 
  • #6
As others have pointed out, living organisms are not a closed system and require continuous inputs of energy from the environment to maintain internal order (homeostasis). The majority of this energy is provided by the sun through photosynthesis, in which photosynthetic organisms convert the sun's energy into chemical energy. Non-photosynthetic organisms then consume photosynthetic organisms to extract that chemical energy and use it for themselves.

So, you need to consider the sun as part of the system of life on Earth and the loss of energy from the sun that accompanies the use of that energy to maintain order by living organisms.
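The Sun-Earth entropy budget can be sketched numerically. The figures below are standard textbook values, not from this thread: sunlight arrives with an effective temperature of roughly 5800 K while Earth re-radiates at roughly 255 K, so every absorbed joule is exported back to space carrying far more entropy than it brought in.

```python
# Back-of-envelope sketch using standard textbook figures (assumed, not from
# this thread): low-entropy sunlight in, high-entropy infrared out.
T_SUN = 5800.0   # K, effective solar photosphere temperature (assumed)
T_EARTH = 255.0  # K, Earth's effective radiating temperature (assumed)

def entropy_export_per_joule(t_in=T_SUN, t_out=T_EARTH):
    """Net entropy (J/K) shed to space per joule absorbed and re-radiated."""
    return 1.0 / t_out - 1.0 / t_in

print(entropy_export_per_joule())  # positive: the Sun-Earth system obeys the 2nd law
```

This positive export is the "room" in which local order on Earth can grow without any violation of the second law.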
 
  • #7
waht said:
(snip)a highly organized chunk of matter. (snip) So the order begins at the microscopic level.

"Organized," "order," and "low entropy" are NOT synonyms, never have been synonyms, and never will be synonyms. Entropy is the integral of dq/T from 0 K to the temperature of interest for the system of interest. The change in entropy associated with the formation of chemical compounds is the entropy of the compound minus the entropies of the elements comprising the compound; this is nearly always greater than zero (can't think of any exceptions off the top of my head). The entropies of mixing compounds to form solutions, cell walls, micelles, and other biological structures is also positive.

Anyone who asserts that "highly organized" structures such as living cells are also "low entropy" entities ain't been near a thermo book in his/her/its life, or is engaged in fraud.
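The definition above, S as the integral of dq/T from 0 K, can be sketched as a numerical integration of Cp/T. The heat-capacity table below is invented purely for illustration and is not real data for any substance.

```python
# Minimal sketch of S = ∫(dq/T) = ∫(Cp/T) dT from near 0 K, by trapezoidal
# rule. The Cp(T) data points are MADE UP, not measurements of anything.
def entropy_from_cp(temps, cps):
    """Trapezoidal integral of Cp/T over the given temperature grid."""
    s = 0.0
    for (t1, c1), (t2, c2) in zip(zip(temps, cps), zip(temps[1:], cps[1:])):
        s += 0.5 * (c1 / t1 + c2 / t2) * (t2 - t1)
    return s

# Hypothetical Cp(T) table, J/(mol*K), starting just above 0 K:
temps = [10.0, 50.0, 100.0, 200.0, 298.15]
cps = [0.5, 8.0, 15.0, 22.0, 25.0]
print(entropy_from_cp(temps, cps))
```

Since Cp/T is positive, the integral only grows with temperature, which is the quantitative content of "entropy is not a synonym for disorder": it is a measured, tabulated quantity.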
 
  • #8
waht said:
We don't know how cells or bacteria, which are highly organized chunks of matter, were first created. But if one arose randomly from a soup of the necessary ingredients, would that violate the second law of thermodynamics, which says that entropy increases with time?

The probability of the "soup" creating a perfect square of molecules would be no different from that of creating a cell. In time, the perfect square disappears, but the cell lives on by picking up other molecules and dividing. So the order begins at the microscopic level.

Living organisms are dissipative structures (the Nobel Prize in Chemistry was awarded to Prigogine for them in 1977). They are studied with a "generalization" of thermodynamics.

dS = diS + deS

diS > 0 for dissipative phenomena. If the system is isolated, deS = 0 and then dS = diS > 0.

If the system is open, dS ≠ diS. In a dissipative structure,

dS = diS + deS

with |deS| >> |diS| and deS < 0, so

dS = diS + deS < 0;

that is, the system self-organizes.
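The bookkeeping in this post can be written out with invented numbers (the values are illustrative only, not taken from any real system):

```python
# Sketch of dS = diS + deS: internal entropy production diS is never negative,
# but total dS can be negative when the system exports enough entropy (deS < 0).
def total_entropy_change(diS, deS):
    assert diS >= 0, "internal entropy production cannot be negative"
    return diS + deS

# An open, self-organizing system: produces 1 unit internally, exports 5.
print(total_entropy_change(1.0, -5.0))  # -4.0: local order, no law violated
# An isolated system: deS = 0, so dS = diS >= 0, the familiar second law.
print(total_entropy_change(1.0, 0.0))   # 1.0
```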
 
  • #9
εllipse said:
I do not know the immediate answer to your question, but let me point out that the second law of thermodynamics is not quite as strict as most physical laws. It's more of a guideline. The overwhelming majority of the time, the entropy of a closed system will increase with time, but there is a minuscule possibility that a closed system's entropy can decrease.


This is one of the more common errors in the physics literature.

Of course the second law of thermodynamics is a strict law. It is as strict as Newton's ("rational") laws of mechanics.

This typical error in the interpretation of the second law is the basis of a completely wrong article published in Physical Review Letters some years ago:

Wang, G. M.; Sevick, E. M.; Mittag, E.; Searles, D. J.; Evans, D. J. Experimental Demonstration of Violations of the Second Law of Thermodynamics for Small Systems and Short Time Scales. Phys. Rev. Lett. 2002, 89(5), 050601.

and was invalidated by several authors including Juan R. (CPS:physchem/0309002).
 
  • #10
There are many examples of processes in nature that reduce entropy locally, only to increase entropy elsewhere. Take water vapor, for example. The evaporation of water increases entropy, but the condensation of water is a local decrease in entropy. So water vapor that forms into rain droplets actually represents a decrease in entropy for the water.

There's nothing special about local decreases in entropy; they happen all the time without the help of life.
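The condensation example can be made quantitative. The latent heat below is the textbook value for water near its boiling point; the surroundings temperature is chosen arbitrarily for illustration.

```python
# Numeric sketch: condensing water loses entropy, but the surroundings (at a
# lower temperature) absorb the latent heat and gain more than the water lost.
DH_VAP = 40700.0  # J/mol, latent heat of vaporization of water near 373 K
T_COND = 373.0    # K, condensation temperature
T_SURR = 298.0    # K, assumed temperature of the surroundings

dS_water = -DH_VAP / T_COND  # the water's entropy drops on condensing
dS_surr = DH_VAP / T_SURR    # the surroundings gain entropy from the heat
print(dS_water, dS_surr, dS_water + dS_surr)  # the total change is positive
```

The same pattern (local decrease paid for by a larger increase elsewhere) is the thread's recurring answer to the original question.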
 
  • #11
Juan R. said:
Of course the second law of thermodynamics is a strict law. It is as strict as Newton's ("rational") laws of mechanics

Exactly, and just as Newton's laws are not true in the more general case of high speeds, so the second law is not true in the general case of open systems, where its hypotheses are violated:

In every closed system the entropy is non-decreasing.
 
  • #12
Bystander said:
"Organized," "order," and "low entropy" are NOT synonyms, never have been synonyms, and never will be synonyms. Entropy is the integral of dq/T from 0 K to the temperature of interest for the system of interest. The change in entropy associated with the formation of chemical compounds is the entropy of the compound minus the entropies of the elements comprising the compound; this is nearly always greater than zero (can't think of any exceptions off the top of my head). The entropies of mixing compounds to form solutions, cell walls, micelles, and other biological structures is also positive.

Anyone who asserts that "highly organized" structures such as living cells are also "low entropy" entities ain't been near a thermo book in his/her/its life, or is engaged in fraud.
Glad you brought this up. I've always wanted to ask an ID proponent to point me to literature quoting the value of [itex]\Delta S^o_f ~(human~being)[/itex].

But note here that the relevant entropy change is not for formation from the elements, but rather from H2O, O2, CO2, and N2 (primarily).
 
  • #13
Gokul43201 said:
(snip)But note here that the relevant entropy change is not for formation from the elements, but rather from H2O, O2, CO2, and N2 (primarily).

... and?

Your homework for the day: count translational, rotational, internal rotational, and vibrational degrees of freedom contributing to the heat capacities of a system of n amino acids, n(NH2-R-CO2H), in aqueous solution; synthesize an n-peptide from the n amino acids; count the contributions to heat capacities of the products; note the absence of any change in total heat capacity; configure the peptide in any "active" arrangement you wish (prion, enzyme, whatever), using the water from the condensations and additional water from the original solution if necessary; count contributions to heat capacity; integrate the difference in heat capacities between the configured peptide and that of n amino acids in solution.

The "relevant" entropy change is between smaller reactant molecules and larger product molecules. Losses of translational degrees of freedom, and "locked" configurations appear as increases in product heat capacities relative to the sum of reactant heat capacities; hence, an increase in product entropy.

Edit: Addendum: Would everyone please note that this process takes place in a "closed system," i.e., there is NO necessity to flounder about defining open, closed, and isolated systems and the second law statements applying to such.
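The degree-of-freedom counting in the "homework" above can be sketched as follows. The molecule sizes are invented for illustration; the point is only that the grand total of 3N degrees of freedom per set of atoms is conserved when amino acids condense into a peptide plus water.

```python
# Sketch of the counting: a nonlinear molecule of N atoms has 3N total degrees
# of freedom, split as 3 translational + 3 rotational + (3N - 6) vibrational
# and internal. Condensation shuffles atoms between molecules but conserves
# the grand total, which is the bookkeeping the post asks the reader to do.
def dof(n_atoms, linear=False):
    """(translational, rotational, vibrational) degrees of freedom."""
    trans, rot = 3, (2 if linear else 3)
    vib = 3 * n_atoms - trans - rot
    return trans, rot, vib

def total_dof(molecule_sizes):
    return sum(sum(dof(n)) for n in molecule_sizes)

# Toy example: three "amino acids" of 10 atoms each (30 atoms total) condense
# into one 24-atom peptide plus two 3-atom waters; sizes invented, totals match.
print(total_dof([10, 10, 10]), total_dof([24, 3, 3]))  # both 90
```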
 
  • #14
selfAdjoint said:
Exactly, and just as Newton's laws are not true in the more general case of high speeds, so the second law is not true in the general case of open systems, where its hypotheses are violated:

In every closed system the entropy is non-decreasing.

You are completely misguided.

1) In appealing to the strictness of the second law I was (as is obvious to anyone with a minimum of knowledge) rebutting the popular but wrong idea in the physics literature that Newton's laws are laws of nature whereas the second law is merely a "statistical" law.

2) I never said that the second law was absolute. In fact, in my papers and preprints I never said that. For example, in (CPS: physchem/0309002) I said:

Either the validation or the possible invalidation of classical thermodynamics in those "exotic" regimes, for a definite class of systems, has both great theoretical (e.g. arrows of time, theory of quantum dissipative systems, etc.) and technical (e.g. nanotechnology, molecular biology, supramolecular chemistry, etc.) interest. The author thinks that dogmas have no room in science. The scientific laws of nature are formulated in restrictive experimental and/or theoretical frameworks. Therefore, it is useful to believe that the formulation of final scientific laws that are always valid anywhere is a very difficult objective.

3) The second law is perfectly valid in open systems. What you say is completely wrong. Have you heard of the thermodynamics of open systems?

I recommend to you very, very, very old literature on the topic. See, for example, Prigogine's famous monograph Introduction to Thermodynamics of Irreversible Processes (1955).

Prigogine received the 1977 Nobel Prize in Chemistry for his work on open systems, especially dissipative structures.

As already said in post #8:

The second law reads

diS > 0 for dissipative phenomena and

diS = 0 for equilibrium phenomena.

If, and only if, the system is isolated (not "closed" as you incorrectly say), then deS = 0, and dS = diS > 0 or dS = diS = 0.

That is, the entropy S is non-decreasing (*).

If the system is not isolated (e.g. open), then dS ≠ diS, and

dS can be positive, negative, or zero depending on the external flows of matter and energy.

In mature living systems the steady-state approach can be used: the production of entropy by metabolic processes is almost exactly canceled by external flows of entropy (mainly the expulsion of waste by cellular metabolism), and dS = 0.

Therefore, the living body maintains its structures (its order) "forever".

(*) Note that I did not say that this was valid elsewhere. Your "in every closed system the entropy is non-decreasing" is also wrong when one studies small (nano) systems.

However, and this was the error of the Wang et al. paper I cited, that does not imply a violation of the second law. Precisely, the results obtained by Wang et al. are compatible with the second law up to second order in the perturbation series of thermodynamic fluctuation theory.
 
  • #15
Juan R. said:
Moreover, your "in every closed system the entropy is non-decreasing" is also wrong when one studies small (nano) systems.
When one says the above, there is an implicit statement that the change is measured in the limit of a large number of particles or over large timescales.
 
  • #16
Gokul43201 said:
When one says the above, there is an implicit statement that the change is measured in the limit of a large number of particles or over large timescales.

I'm sorry, but I cannot agree.

If one says "every", that means "every". Precisely this error of interpretation of the second law was the basis for the wrong claims of the Wang et al. paper and the subsequent flood of news items from the APS, Nature news, BBC science news, and others.

"The second law was violated!" they claimed.

But it was not, either implicitly or explicitly :biggrin:

Moreover, you also appear to conflate closed and isolated systems. Is the definition of an isolated system also implicit in the above "closed"?
 
  • #17
mccrone said:
The simple answer is that life pays for its order by increasing the rate at which the Universe in general is being disordered. So light from the sun is captured by a forest and then released at a much lower temperature than it would have been off bare rock.

This kind of perspective changes life and other dissipative structures from a surprise to a necessity. Structures that are better at disposing of entropy gradients are likely to be favoured by nature. Some would call this the basis of a fourth law of thermodynamics. A controversial area still of course.

Part of the energy absorbed by trees and other plants is converted into complex organic molecules, which may in turn be consumed by animals that use the molecules to grow and move around. In other words, biological organisms tend to increase order.
 
  • #18
Biological life, however it came to exist, reduces the potential entropy of a planet. Energy reflected or radiated back into space is essentially lost because it cannot do anything. By using energy for constructive purposes biological life reduces the amount of entropy that would otherwise occur if there were no biological life.
 
  • #19
reasonmclucus said:
(snip)In other words biological organisms tend to increase order.

This is a bold assertion. You have written the partition function for an abiotic Earth and compared it to that for a biotic Earth? And found the second case to yield a smaller entropy than the first?

I KNOW you haven't. Life ain't an "ordered" process. Mother nature is an entropic, wasteful sloven. Roget's Thesaurus is not a suitable source for establishing the equivalence of order, organization, complexity, structure, and other ID words with a state of low entropy.

Write the partition functions or keep your peace. Hand waving is not suffered gladly on this forum.
 
  • #20
There's a trick there, Bystander, that I think you missed:
reasonmclucus said:
By using energy for constructive purposes biological life reduces the amount of entropy that would otherwise occur if there were no biological life.
That statement doesn't say that life decreases the entropy of a system; it says that a system with life involves less of an increase in entropy than one without. I would tend to agree, but it's a pretty useless thing to discuss.
 
  • #21
russ_watters said:
There's a trick there, Bystander, that I think you missed: That statement doesn't say that life decreases the entropy of a system; it says that a system with life involves less of an increase in entropy than one without. I would tend to agree, but it's a pretty useless thing to discuss.

Initial state: abiotic "primordial soup."
Final state: biotic hodgepodge.

Calculate, estimate, or measure changes in the various thermodynamic state functions.

I've never run into an analysis that involved postulation of an alternative final state, calculation of the differences in state functions between the observed and hypothetical final states, and the presentation of those differences as a thermodynamic description of the final state. Yeah, we play with hypothetical reference states, but, the initial and final states are both referred to the same hypothetical reference state, and only the difference between the initial and final states has any meaning.
 
  • #22
russ_watters said:
There's a trick there, Bystander, that I think you missed: That statement doesn't say that life decreases the entropy of a system; it says that a system with life involves less of an increase in entropy than one without. I would tend to agree, but it's a pretty useless thing to discuss.

Those who study these things argue that life actually increases the rate of entropy production. So add in the order of the life form itself and still there is overall a faster move towards degradation of gradients. Life could be considered a catalyst to speed the reaction.

Google for maximum entropy production principle. Then there are those who say that the immature stages of any ecology do just blast away, producing waste heat, but then there is a switch to mature systems where entropy production becomes minimised - as now there is a lot of free energy getting locked up as configuration energy, or information.

So newly cleared forest radiates a lot of heat. Mature jungle runs colder as there is indeed a lot more getting locked up as order (all those complex ecowebs and niches). But even with the move to mature dissipative structure, overall more entropy is produced than order created. Life never contradicts the second law.
 
  • #23
If you have a jar of stardust and rocks that is totally isolated from everything (no energy going in, no energy going out), it will end up clumping into bigger and bigger pieces, which lessens the disorder. It goes from disorder to order. The only thing that would increase the entropy would be the clumps smashing into each other, dislodging pieces of dust and rock, which would again clump together, eventually smashing into more clumps and falling apart.

The point is, if no energy is going in or out, things are just going to change their order.

I'm eager to be wrong.

Why not?
 
  • #24
The reason clumping under gravity generally represents a disordering is that the universe started out with mass spread out very evenly, this smoothness being more orderly than the messy fractal clumpiness we see now.

If you put a bit of the universe in a jar, then you are constructing orderly boundaries. The effort involved should dissipate more energy than the energy you would trap - or at least that is the kind of result the second law would lead you to expect.

Your clumped bit of stardust and rock would also be doing something - tumbling about the jar and banging off the walls. So either the lump would crumble or convert its kinetic energy to waste heat. As a one particle system, it would still seem pretty disorderly. At least that would be the view of the jar you constructed.

The universe itself is not in a box or jar of course but is freely expanding and so mass can randomise its location just a little quicker than gravity can collapse it.
 
  • #25
OK. Since this is STILL in the physics section of PF (and not, let's say, General Discussion), I think it is imperative that proper citations and references be given, or else I don't see this thing going anywhere. I tend to agree with the "spirit" (not necessarily the point) of Bystander's request for something more definitive in some of the points being presented (a partition function would be a good start). Or else how is anyone able to make any convincing point?

In case one is not aware, there have been TONS of published material on such issues:

http://www.fes.uwaterloo.ca/u/jjkay/pubs/Life_as/lifeas.pdf [Broken]
http://www.mdpi.net/entropy/papers/e1020009.pdf
http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=105137
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=3709900&dopt=Citation
http://bruce.edmonds.name/combib/compref274.html

etc...

In any case, the OP that started this whole thread is clearly wrong and is based on very faulty understanding of the 2nd Law.

Zz.
 
  • #26
Didn't Poincaré say that every closed system, being in a given state at a given time, will eventually return (arbitrarily closely) to that state? Admittedly, the time needed for a Poincaré return for anything but a simple system may exceed the age of the universe, but wouldn't it contradict the second law?
 
  • #27
Gerinski said:
Didn't Poincaré say that every closed system, being in a given state at a given time, will eventually return (arbitrarily closely) to that state? Admittedly, the time needed for a Poincaré return for anything but a simple system may exceed the age of the universe, but wouldn't it contradict the second law?

Yes, it contradicts the second law. This is known as the problem of the arrow of time.

This is the reason that there is still no fully developed nonequilibrium statistical mechanics. There is an incompatibility between thermodynamics and mechanics, just as there is one between quantum mechanics and general relativity.

Solution: develop a new mechanics, an irreversible one.
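Poincaré recurrence can be illustrated with the Ehrenfest urn model (my own example, not from the thread): N balls in two urns, one randomly chosen ball moved per step. The all-in-one-urn starting state is guaranteed to recur, but the expected recurrence time grows like 2^N, which is why recurrence never conflicts with the second law on observable timescales.

```python
# Toy recurrence demo: starting with all balls in urn A, count steps until
# the system first returns to that state. Expected recurrence time ~ 2**N.
import random

def recurrence_time(n_balls, seed=0, max_steps=10**6):
    """Steps until all balls are back in urn A, or None if not seen in time."""
    rng = random.Random(seed)
    in_a = n_balls
    for step in range(1, max_steps + 1):
        if rng.randrange(n_balls) < in_a:  # picked a ball currently in urn A
            in_a -= 1
        else:
            in_a += 1
        if in_a == n_balls:
            return step
    return None

for n in (2, 4, 8, 16):
    print(n, recurrence_time(n))
```

Already at N = 16 the recurrence time runs to tens of thousands of steps; for a mole of particles it dwarfs the age of the universe.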
 
  • #28
Let systems A and B together form a closed system, with A and B separated by an adiabatic wall. The systems have different temperatures TA and TB, with TA > TB. Now let A and B be separated instead by a diathermic wall, which permits energy to be exchanged between them. The systems will eventually come to thermal equilibrium, upon which they have the same temperature T. Figuratively speaking, this final state of systems A and B is like the state of the disorganized "soup". Now there is no way that systems A and B can come to a more organized state unless they come in contact with another system C with TC < T.
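The two-system setup in this post can be checked numerically. The heat capacities and temperatures below are invented for illustration: with equal heat capacities the final temperature is the mean, the hot body loses entropy, the cold body gains more, and the total change is positive.

```python
# Numeric sketch of two bodies equilibrating through a diathermic wall.
# Heat capacity C and temperatures are arbitrary illustrative values.
import math

def equilibrate(C, T_a, T_b):
    """Final temperature and total entropy change for equal heat capacities."""
    T_f = (T_a + T_b) / 2.0            # equal C: final temperature is the mean
    dS_a = C * math.log(T_f / T_a)     # negative for the hot body
    dS_b = C * math.log(T_f / T_b)     # positive, and larger in magnitude
    return T_f, dS_a + dS_b

T_f, dS_total = equilibrate(C=100.0, T_a=400.0, T_b=300.0)
print(T_f, dS_total)  # dS_total > 0: equilibration is irreversible
```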
 

1. What is entropy?

Entropy is a measure of the disorder or randomness of a system. More precisely, it is related to the number of microscopic arrangements consistent with the system's macroscopic state.

2. How does entropy relate to life?

Living organisms maintain locally organized structures by continuously taking in usable energy and exporting entropy to their surroundings. Life therefore operates within the second law of thermodynamics rather than against it.

3. Can life exist in a high entropy state?

A system at or near maximum entropy offers no usable energy gradients, and such a state is not conducive to the highly organized structures found in living organisms. Life needs an environment that is still far from equilibrium.

4. How does the probability of creation relate to entropy?

The probability of creation concerns the likelihood of a system or universe having the conditions needed for life. Entropy is central to this question because life requires free-energy gradients, which exist only while a system remains away from its maximum-entropy equilibrium.

5. Is there a link between entropy and the origin of life?

While there is no conclusive evidence, some scientists argue that the low-entropy state of the early universe supplied the free-energy gradients needed for complex, highly organized systems, such as living organisms, to form.
