
A Question about Entropy

  1. Aug 31, 2017 #1
    Our universe is considered a closed system, and the second law of thermodynamics says that the entropy of a closed system is bound to increase.

    Then how could living beings evolve when they are extremely ordered systems?
     
  3. Aug 31, 2017 #2

    anorlunda

    Staff: Mentor

    The earth is not a closed system. It gets energy from the sun.
     
  4. Aug 31, 2017 #3

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    Within the closed system, there can be pockets where entropy decreases, as long as that is balanced out by pockets where entropy increases. It is just that the TOTAL overall entropy increases.

    Zz.
     
  5. Aug 31, 2017 #4
    Is there any experimental proof for this statement?
     
  6. Aug 31, 2017 #5

    anorlunda

    Staff: Mentor

    Us. We exist.
     
  7. Aug 31, 2017 #6

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    Drop an ice cube into a cup of warm water. Compute!

    Zz.
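
    For anyone who wants to actually run the numbers, here is a minimal sketch of that computation (an editor's illustration, not ZapperZ's: the 20 g ice cube at 0 °C, the 200 g of water at 40 °C, and the material constants are all assumed values):

    Code:
    import math

    # Assumed, illustrative values
    m_ice = 20.0    # g of ice at 0 degC
    m_w   = 200.0   # g of water at 40 degC
    c     = 4.186   # J/(g K), specific heat of liquid water
    L_f   = 334.0   # J/g, latent heat of fusion of ice
    T_m   = 273.15  # K, melting point
    T_w   = 313.15  # K, initial water temperature

    # Final temperature from the energy balance (assumes all the ice melts)
    T_f = (m_w*c*T_w + m_ice*c*T_m - m_ice*L_f) / ((m_w + m_ice)*c)

    # Entropy changes: melt the ice at T_m, warm the meltwater to T_f,
    # and cool the warm water from T_w to T_f
    dS_ice   = m_ice*L_f/T_m + m_ice*c*math.log(T_f/T_m)
    dS_water = m_w*c*math.log(T_f/T_w)

    print(f"T_final  = {T_f:.2f} K")
    print(f"dS_ice   = {dS_ice:+.2f} J/K")    # positive
    print(f"dS_water = {dS_water:+.2f} J/K")  # negative: a local decrease
    print(f"dS_total = {dS_ice + dS_water:+.2f} J/K")  # > 0, as the second law demands

    The warm water's entropy decreases (a local "pocket", as in post #3), but the melting ice more than compensates, so the total entropy rises.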
     
  8. Aug 31, 2017 #7
    OK. Fine.

    But, I have a problem regarding this.

    When the ice cube melts completely, equilibrium will be reached,
    i.e. a state of highest entropy.

    Isn't equilibrium itself a kind of 'ordered state' (a state where there is perfect balance)?

    Why call such a 'balanced' state a disordered one?
     
  9. Aug 31, 2017 #8

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    You have a strange way of defining "equilibrium". In fact, I find much of the stuff you've "acquired" along the way in many of your posts to be rather strange.

    "Equilibrium" simply means, in this case, d(something)/dt = 0.

    You really need to look up and read the PHYSICS (not the pedestrian) definition of entropy. How about starting with the material on the entropy site:

    http://entropysite.oxy.edu/

    Zz.
     
  11. Aug 31, 2017 #10

    Dale

    Staff: Mentor

    Heat pumps work.
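
    A quick numerical sketch of why that settles it (an editor's illustration with made-up numbers): a heat pump uses work W to move heat Q_c out of a cold reservoir at T_c and dump Q_h = Q_c + W into a hot reservoir at T_h. The cold reservoir's entropy genuinely decreases, yet the total still goes up:

    Code:
    # Illustrative numbers only
    Q_c = 1000.0          # J removed from the cold reservoir
    W   = 150.0           # J of work driving the pump
    Q_h = Q_c + W         # J delivered to the hot reservoir
    T_c = 268.0           # K (outdoors in winter)
    T_h = 295.0           # K (indoors)

    dS_cold = -Q_c / T_c  # local entropy DECREASE
    dS_hot  = +Q_h / T_h  # larger entropy increase elsewhere

    print(f"dS_cold  = {dS_cold:+.3f} J/K")
    print(f"dS_hot   = {dS_hot:+.3f} J/K")
    print(f"dS_total = {dS_cold + dS_hot:+.3f} J/K")  # must be >= 0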
     
  12. Aug 31, 2017 #11
    I think 99.999999% of people on this planet only acquire stuff along the way....

    It's only a minuscule minority that ever says something new...
     
  13. Aug 31, 2017 #12

    Dale

    Staff: Mentor

    I think this may be a big source of your conceptual unease. You have this exactly backwards. Thermal equilibrium is not an ordered state.

    Rigorously, you should stick with the standard and unambiguous notion of entropy, and not your colloquial concept of order. However, if you do insist on thinking in colloquial terms then you at least need to think carefully about your concept.

    Colloquially, order is when the books are in the book case, the clean laundry is in the drawers, and the dirty laundry is in the hamper while disorder is when they are all on the floor. Order has boundaries and separations and disorder is uniform. Equilibrium is disorder.

    Saying that equilibrium is ordered is wrong both rigorously and colloquially.
     
  14. Sep 2, 2017 #13
    If icing is to be spread on a cake and we do it in lumps (boundaries), it is disordered...

    But if we spread the icing evenly (uniformly), it is 'ordered'...
     
  15. Sep 2, 2017 #14

    Dale

    Staff: Mentor

    You are confusing "ugly" with "disordered". Just because you have an aesthetic preference for smooth rather than lumps doesn't mean it is more ordered. If you start with big lumps with sharp boundaries and randomly perturb it (e.g. heat or vibration) then you can get smooth icing and the boundaries will reduce. If you start with smooth icing and randomly perturb it then you will not suddenly get big lumps with sharp boundaries. Despite your dislike for such lumpy icing, it is in fact more ordered.

    It is clear that your intuitive concept of "disordered" is simply wrong. This is one of the reasons why we develop rigorous quantitative definitions. Please stick with the technical concept of entropy. Your intuition for this concept will improve over time, but right now you need to use the rigorous definition as you work to correct some faulty assumptions.

    Don't worry and don't give up. This sort of thing happens all the time, and it can be overcome by consistently relying on the rigorous definition until the intuition builds later.
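
    To make the icing argument concrete, here is a toy simulation (an editor's sketch, not Dale's): pile the icing in one sharp lump on a 1D "cake", let random perturbations mix neighbouring sites, and track the lumpiness as the variance of the height profile. It falls toward a small fluctuation floor, and the sharp lump never re-forms:

    Code:
    import random

    random.seed(0)
    # 1D "cake": icing heights, starting as one sharp lump with hard boundaries
    icing = [10.0]*5 + [0.0]*15

    def perturb(heights):
        """Random local mixing: two neighbouring sites share their icing
        in a near-even random split (a crude stand-in for heat/vibration)."""
        i = random.randrange(len(heights) - 1)
        total = heights[i] + heights[i + 1]
        share = random.uniform(0.4, 0.6)
        heights[i], heights[i + 1] = share*total, (1 - share)*total

    def lumpiness(heights):
        """Variance of the height profile: large for sharp lumps, small when smooth."""
        mean = sum(heights) / len(heights)
        return sum((h - mean)**2 for h in heights) / len(heights)

    for step in range(6):
        print(f"after {step*1000:4d} perturbations: lumpiness = {lumpiness(icing):7.3f}")
        for _ in range(1000):
            perturb(icing)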
     
    Last edited: Sep 2, 2017
  16. Sep 3, 2017 #15
    So, I ask further...

    To my mind, if we came to know everything about all the particles at equilibrium (just suppose), we wouldn't call it disorder AT ALL. Then the entropy at equilibrium would be zero.

    So, it seems our 'inability' is translated as disorder...

    I hope there is some sense in my pedestrian views.
     
  17. Sep 3, 2017 #16

    Dale

    Staff: Mentor

    What does the definition of entropy say? Apply the actual rigorous definition to the situation.
     
    Last edited: Sep 3, 2017
  18. Sep 3, 2017 #17

    Vanadium 50

    Staff Emeritus
    Science Advisor
    Education Advisor
    2017 Award

    Deepak, why do you come here? When you're told you're wrong, you just dig in harder. I'm afraid that doesn't make you right. It also isn't the behavior of a student, it's the behavior of a crackpot. Finally and most importantly, the process of learning is exchanging wrong ideas for right ones. If you don't give up the wrong ideas, you're not learning, and just wasting your time - and everybody else's.

    Redefining entropy makes discussion impossible. As I wrote a week ago, if you don't use standard definitions, we cannot communicate.

    And here we are, Humpty-Dumpting away again.
     
  19. Sep 3, 2017 #18

    BruceW

    Homework Helper

    On one hand, there is the idea of entropy as expressing our lack of knowledge of the system. On the other hand, we have the thermodynamic idea of entropy, which features in the laws of thermodynamics. They're related, but perhaps a little different.

    If we had a system of particles in equilibrium, the thermodynamic entropy would be a definite, nonzero calculated value. But thinking of entropy as our lack of knowledge, if we could see all the particles' positions, the system would have zero entropy.

    As another example, take hard spheres in a box: at low density they are arranged very randomly, but if you shrink the box, they order into a lattice pattern, so the knowledge entropy decreases. No energy is involved, though, so it's not clear how to interpret this in terms of the thermodynamic definition of entropy.
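
    To put a number on the "lack of knowledge" reading (an editor's sketch using the Gibbs/Shannon formula ##S = -k_B \sum_i p_i \ln p_i##, which the thread itself has not written down): if the exact microstate is known, one ##p_i = 1## and S = 0; if all ##W## microstates are equally likely, S = ##k_B \ln W##, matching the Boltzmann formula quoted later in the thread:

    Code:
    import math

    k_B = 1.380649e-23  # J/K, Boltzmann constant

    def gibbs_entropy(probs):
        """S = -k_B * sum(p * ln p), with the 0*ln(0) = 0 convention."""
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    W = 10**6  # number of accessible microstates (illustrative)

    certain = [1.0] + [0.0]*(W - 1)  # exact microstate known
    uniform = [1.0/W]*W              # only the macrostate known

    print(gibbs_entropy(certain))    # 0.0: full knowledge, zero entropy
    print(gibbs_entropy(uniform))    # k_B * ln(W): no knowledge beyond macrostate
    print(k_B * math.log(W))         # Boltzmann's formula, same value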
     
  20. Sep 3, 2017 #19

    ZapperZ

    Staff Emeritus
    Science Advisor
    Education Advisor

    No, the concept of STATISTICS is the expression of our lack of knowledge of every individual particle in the system, not just entropy. The entire fields of thermodynamics and statistical mechanics are included in that. This is not the definition of entropy.

    Where did you get that?

    Again, as with the wrong idea exhibited by the OP, entropy is NOT disorder, or even a lack of knowledge.

    http://news.fnal.gov/2013/06/entropy-is-not-disorder/
    http://entropysite.oxy.edu/entropy_isnot_disorder.html
    http://www2.ucdsb.on.ca/tiss/stretton/CHEM2/entropy_new_1.htm
    http://home.iitk.ac.in/~osegu/Land_PhysLettA.pdf

    Zz.
     
  21. Sep 3, 2017 #20

    BruceW

    Homework Helper

    This wiki article https://en.wikipedia.org/wiki/Entropy_(information_theory) describes the kind of entropy I'm thinking about when I say the amount of disorder, or lack of knowledge. It actually uses the word 'surprisal', but that is maybe a bit intimidating for beginners, which is why people say disorder instead (I would guess).

    So yes, this information definition of entropy would be one way to express our lack of information and there are many other statistics that you could choose.

    Thermodynamics and statistical mechanics are different disciplines. They should agree on all concepts where they overlap, but so much weird stuff happens in statistical mechanics that I feel it's best to specify at the outset whether someone is talking about statistical mechanics or thermodynamics, for clarity.
     
  22. Sep 3, 2017 #21
    You are taking a very loose interpretation of the article, which does not clarify anything for the OP.
    Thermodynamics and statistical mechanics are the respective macroscopic and microscopic theories of the same physical processes. They are not competing theories but are quite complementary. On the macroscopic scale, entropy is defined such that its differential change equals the differential heat reversibly transferred to a system divided by the temperature at which it is transferred:
    $$\frac{\delta Q}{T}=\delta S$$
    Notice that there is nothing in this equation that directly implies disorder or "lack of information".
    On the microscopic scale, entropy is defined as proportional to the natural logarithm of the number of microstates ##W## available at equilibrium:
    $$S=k\text{ln}(W)$$
    Again, this does not imply "disorder". It says that entropy is related to the number of ways one can arrange the constituent particles of a system while still maintaining the same macroscopic configuration.
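
    A minimal sketch of that microscopic definition in action (an editor's example, not the poster's): for ##N## two-state particles, the number of microstates with ##n## particles "up" is the binomial coefficient ##W = \binom{N}{n}##, and ##S = k\ln W## peaks at the evenly "balanced" macrostate:

    Code:
    import math

    k_B = 1.380649e-23  # J/K

    def entropy(N, n):
        """S = k_B ln W for N two-state particles with n in the 'up' state;
        W = C(N, n) counts the microstates behind that macrostate."""
        W = math.comb(N, n)
        return k_B * math.log(W)

    N = 100
    for n in (0, 10, 25, 50):
        print(f"n_up = {n:3d}: S = {entropy(N, n):.3e} J/K")
    # S is largest at n = 50: the 'balanced' macrostate has the MOST
    # microstates -- equilibrium is maximum entropy, not maximum order.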
     
  23. Sep 3, 2017 #22

    BruceW

    Homework Helper

    hm, as an example, the free energy of the 1D Ising model is $$f(\beta ,h)=-\lim _{L\to \infty }{\frac {1}{\beta L}}\ln(Z(\beta ))=-{\frac {1}{\beta }}\ln \left(e^{\beta J}\cosh \beta h+{\sqrt {e^{2\beta J}(\sinh \beta h)^{2}+e^{-2\beta J}}}\right)$$ I would prefer to say this macroscopic quantity is a result from statistical mechanics rather than thermodynamics, but I guess it just depends on preference. You could probably list it in either section of a journal.

    Also, maybe these articles are a bit better than the other one I linked to: https://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory and https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics). I think our main difference is that I would like to interpret entropy as being related to information, but you would prefer not to.
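
    For the curious, that closed-form free energy is easy to evaluate numerically (an editor's sketch; the values of ##J##, ##h##, and ##\beta## are arbitrary):

    Code:
    import math

    def ising_free_energy(beta, h, J=1.0):
        """Free energy per site of the 1D Ising model (the transfer-matrix result)."""
        lam = (math.exp(beta*J)*math.cosh(beta*h)
               + math.sqrt(math.exp(2*beta*J)*math.sinh(beta*h)**2
                           + math.exp(-2*beta*J)))
        return -math.log(lam) / beta

    for beta in (0.1, 1.0, 10.0):
        print(f"beta = {beta:5.1f}: f = {ising_free_energy(beta, h=0.5):+.4f}")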
     
  24. Sep 3, 2017 #23

    anorlunda

    Staff: Mentor

    The Wikipedia disambiguation page on entropy shows 16 scientific definitions (presumably all correct). IMO that makes entropy particularly hard to discuss. People talk past each other with differing definitions in their heads.
     
  25. Sep 3, 2017 #24
    I think our goal here should be to help the OP to understand the canonical definition of entropy rather than discussing higher level concepts such as information theory. In order to do this, we should stick to the definition of entropy as stated in introductory statistical mechanics books.

    @Vanadium 50 has already stated that we should be using standard definitions here.
     
  26. Sep 3, 2017 #25
    Thanks, everyone, for your answers...

    A thought comes to my mind here...

    'Everything is debatable'...
    Maybe this line is also debatable ☺️☺️
     