Can Organized Systems Defy the Trend Towards Maximum Entropy?

AI Thread Summary
The discussion centers around a homework question regarding the concept of entropy in relation to Intelligent Design and evolution. The original poster disagrees with the claim that systems cannot become organized due to the tendency towards maximum entropy, providing an example of an endothermic reaction where entropy in the surroundings increases while the system's entropy decreases, thus adhering to the Second Law of Thermodynamics. Participants contribute additional examples, such as the transition of water from gas to liquid, and emphasize that living systems, like the human body, are not closed systems and can decrease their own entropy by increasing the entropy of their surroundings. The conversation also touches on the definition of entropy, with some participants clarifying that it is a measure of disorder rather than potential energy. A blog post is referenced for a clearer understanding of entropy, which explains it in terms of probability and quantum mechanics, highlighting the relationship between entropy and quantum numbers. Overall, the discussion reflects a deep engagement with thermodynamic principles and their implications for understanding biological systems and entropy.
pzona
This was a question on a homework assignment I had a few weeks ago:

""Intelligent Design" believers sometimes argue against evolution by saying that it is impossible for a system (i.e. a human being) to become so organized since everything tends towards maximum entropy. Do you agree with this statement? If not, give a simple chemical counterexample."

I already turned in the assignment, so that's why I didn't put this in the homework help section. Here's what I answered:

"I disagree. In an endothermic reaction, work/heat is added to the system by the surroundings once the reaction is complete, as the surroundings try to reach thermal equilibrium with the system. Entropy in the system decreases as the entropy in the surroundings increases. This does not violate the Second Law because S(surroundings) > S(system), which means that universal entropy does increase."

Obviously this is extremely simplified. My answer was marked correct, but I was wondering if anyone else could think of some simple examples like this, just out of curiosity. Also, any critique on my phrasing/reasoning is welcome. I look forward to hearing some other responses.
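To make the bookkeeping concrete, here is a minimal Python sketch (not tied to the specific reaction above; the heat and temperature values are made up purely for illustration) showing how a decrease in a system's entropy can coexist with a net increase overall:

Code:
# Toy entropy bookkeeping: a process that lowers the system's entropy can still
# satisfy the Second Law if the surroundings gain more entropy than the system loses.
q_released = 6000.0        # J of heat leaving the system (illustrative value)
T_system = 263.0           # K, temperature of the system (illustrative value)
T_surroundings = 253.0     # K, temperature of the colder surroundings

dS_system = -q_released / T_system               # the system loses entropy
dS_surroundings = q_released / T_surroundings    # the surroundings gain more

dS_total = dS_system + dS_surroundings
print(f"dS_system       = {dS_system:+.2f} J/K")
print(f"dS_surroundings = {dS_surroundings:+.2f} J/K")
print(f"dS_total        = {dS_total:+.2f} J/K  (positive, so the Second Law holds)")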
 
How about the simple phenomenon of rain? Gaseous water has much more entropy than liquid water.
 
The simplest example, albeit not exactly a chemical one, is that with such an understanding of thermodynamics it would be impossible to build a house or a car, yet there are plenty of them around. So either this understanding is wrong, or whatever we see around us doesn't exist :wink:

 
Here's my thought: the human body is not a closed system (eating, for example, completely changes the system). In order to lower the entropy of our bodies, we must increase the entropy of our surroundings. The change in our entropy plus the change in the entropy of our surroundings results in a net increase, so the Second Law of Thermodynamics still holds.
 
Ygggdrasil said:
How about the simple phenomenon of rain? Gaseous water has much more entropy than liquid water.

That's a really good example, and it leads me to a question: I've read a definition of entropy as a measure of the change in potential energy. Is this too oversimplified? I'm a first-year undergrad, so obviously I haven't gotten to much of the mathematics involving entropy yet, and I'm still trying to come to grips with the idea of disorder in thermodynamic terms. If this definition is applicable in most cases, that would definitely help me out a lot.
 
No, you cannot think of entropy as a potential energy. The simplest explanation of entropy is that it's a measure of the amount of disorder in a system. A more mathematically correct explanation is that entropy identifies which situations are most probable. For a really good, simple explanation of entropy, see http://gravityandlevity.wordpress.com/2009/04/01/entropy-and-gambling/ (also a really good physics blog in general). A related post explains the concept of free energy very well too: http://gravityandlevity.wordpress.c...e-plays-skee-ball-the-meaning-of-free-energy/
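To give a concrete feel for "the most probable situation wins", here is a small Python coin-flip toy model (my own illustration, not taken from the blog post): as the number of coins grows, the roughly even split overwhelms any perfectly "ordered" outcome.

Code:
# Flip N fair coins and compare the probability of an exactly even split
# (a "disordered" macrostate with many microstates) with the probability of
# getting all heads (a single, perfectly "ordered" microstate).
from math import comb

for N in (10, 100, 1000):
    total = 2 ** N
    p_even = comb(N, N // 2) / total    # exactly half heads
    p_all_heads = 1 / total             # all heads
    print(f"N={N:5d}: P(half heads) = {p_even:.3e}, P(all heads) = {p_all_heads:.3e}")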
 
Wow, that entropy description was completely unlike anything else I've read on the subject. I'll definitely be reading more of that blog; I like the style very much. Another quick question, though. I don't expect you to go into detail on this, but does entropy deal a lot with quantum numbers? This was one of the first things that came to mind when I read the gambling description, and it seems to me like quantum numbers would play heavily into the whole combinatorial aspect of it.
 
Hi pzona and Ygggdrasil,

I saw that you stumbled across my blog entry. I'm certainly glad you liked it!

pzona, you're right to think that entropy can be defined in terms of quantum numbers. In a way, that's the most correct way to define entropy. If N is the number of distinct sets of quantum numbers associated with a particular state, then the entropy of that state is S = k_B ln N.
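A rough Python illustration of that formula, with completely made-up energy levels and particle numbers: count the distinct assignments of quantum numbers (microstates) compatible with a fixed total energy by brute force, then apply S = k_B ln N.

Code:
# Count microstates of n distinguishable particles over a few single-particle
# levels at fixed total energy, then compute S = k_B * ln(N).
from itertools import product
from math import log

k_B = 1.380649e-23       # J/K
levels = [0, 1, 2, 3]    # single-particle energies in arbitrary units (made up)
n_particles = 4
E_total = 6              # fixed total energy, same arbitrary units (made up)

N = sum(1 for assignment in product(levels, repeat=n_particles)
        if sum(assignment) == E_total)

print(f"N microstates = {N}")
print(f"S = k_B ln N  = {k_B * log(N):.3e} J/K")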
 
As gravityandlev said, this view of entropy deals with how particles are distributed among different energy levels, so in this respect it implicitly relies on quantum mechanics (which predicts discrete energy levels rather than continuous ones). However, in many cases you can assume that the energy is continuous and still calculate the entropy (for example, we make this assumption when deriving the Boltzmann distribution and the ideal gas equation).
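For instance, here is a small Python sketch (with illustrative, made-up energy levels) that computes the Boltzmann occupation probabilities of a few discrete levels and the corresponding Gibbs entropy S = -k_B * sum(p_i ln p_i):

Code:
# Boltzmann probabilities p_i proportional to exp(-E_i / (k_B T)) over discrete
# levels, and the Gibbs entropy of the resulting distribution.
import math

k_B = 1.380649e-23                    # J/K
T = 300.0                             # K
levels = [0.0, 1e-21, 2e-21, 4e-21]   # level energies in J (illustrative values)

weights = [math.exp(-E / (k_B * T)) for E in levels]
Z = sum(weights)                      # partition function
p = [w / Z for w in weights]          # occupation probabilities

S = -k_B * sum(pi * math.log(pi) for pi in p)
for E, pi in zip(levels, p):
    print(f"E = {E:.1e} J -> p = {pi:.3f}")
print(f"S = {S:.3e} J/K")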
 