
How can one calculate entropy? What is entropy?

  Dec 24, 2013 #1
    From what I've been taught, the entropy of a system is a measure of the number of microstates its macrostate can correspond to.

    A microstate refers to the configuration of the system at the microscopic level (the energy and position of every particle), while a macrostate refers to the bulk parameters of the system (volume, pressure, temperature, etc.).
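
    In symbols, that count of microstates Ω becomes an entropy through Boltzmann's formula S = k_B ln Ω, with k_B Boltzmann's constant. Here is a minimal sketch of the counting idea, assuming a toy model of independent two-level particles (the model and the numbers are only an illustration):

        import math

        # Boltzmann's constant in J/K (exact SI value)
        K_B = 1.380649e-23

        def entropy_two_level(n_particles, n_up):
            """Boltzmann entropy S = k_B * ln(Omega) for a toy system of independent
            two-level particles: the macrostate is "n_up particles are excited",
            and Omega counts the ways of choosing which particles those are."""
            # ln C(n, k) = ln n! - ln k! - ln (n-k)!, via lgamma so a huge Omega never overflows
            ln_omega = (math.lgamma(n_particles + 1)
                        - math.lgamma(n_up + 1)
                        - math.lgamma(n_particles - n_up + 1))
            return K_B * ln_omega

        # We never enumerate the microstates of ~10^23 particles; we only count them.
        N = 6.02214076e23                     # one mole of particles
        print(entropy_two_level(N, N / 2))    # about N * k_B * ln 2, roughly 5.76 J/K

    The point of the logarithm is that even an astronomically large Ω turns into an ordinary number of joules per kelvin.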


    My problem is: how can one possibly calculate the entropy of a system? With billions upon billions of particles, it seems impossible to analyse every single particle, or to list the enormous number of positions each particle could occupy within the macrostate and the energies each particle could have. So how is it that there are standard entropy tables? Where do those values come from, and what is the reasoning behind them?

    Also, why is the unit for standard entropy J/(mol·K)? How does knowing the amount of energy per mole at a given temperature tell us how many microstates a macrostate can have?

    I have a test coming up, and I've been trying to understand the concept of entropy for days now, to no avail.

    Thank you for reading and hopefully answering!
     
    Last edited: Dec 24, 2013
  Dec 24, 2013 #2

    haruspex (Science Advisor, Homework Helper, Gold Member, 2016 Award)

    I can shed some light on part of that. The temperature of an object is, in effect, a measure of the energy per available state. Since the laws of thermodynamics arrange for energy to spread out roughly equally over all states, the net heat flow is from hotter to cooler.
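
    A small sketch of that "spreading out" idea, assuming the standard Einstein-solid toy model (the solids' sizes and the amount of energy below are made-up numbers): two objects in thermal contact end up sharing energy in the way that has overwhelmingly the most combined microstates, which is the same as saying heat flows from hotter to cooler until the energy per state matches.

        import math

        def ln_multiplicity(n_oscillators, q_units):
            """ln Omega for an Einstein solid: Omega = C(q + N - 1, q) ways to
            distribute q energy quanta among N oscillators."""
            return (math.lgamma(q_units + n_oscillators)
                    - math.lgamma(q_units + 1)
                    - math.lgamma(n_oscillators))

        # Two solids, A and B, in thermal contact, sharing 100 quanta of energy.
        N_A, N_B, Q_TOTAL = 300, 200, 100

        # The split of energy with the most combined microstates is the one the
        # system is driven towards.
        best_q_a = max(range(Q_TOTAL + 1),
                       key=lambda q_a: ln_multiplicity(N_A, q_a)
                                       + ln_multiplicity(N_B, Q_TOTAL - q_a))
        print(best_q_a)   # about 60: energy shared in proportion to size, i.e. equal energy per oscillator

    Scale the numbers up to ~10^23 oscillators and the peak in the combined multiplicity becomes so sharp that any other split of the energy is essentially never observed.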
     
  Dec 25, 2013 #3
    Entropy is a measure of the randomness within a system. Another useful way to look at it is as "energy no longer available to do work". I'll admit I still have kind of a loose handle on it myself. The entropy change of an isolated system is zero or positive for any process. So, unlike energy, entropy is not conserved: it is always being generated, never destroyed. Another good way to look at it is as a measure of the "quality" of the energy within (or outside of) a system.
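
    A concrete illustration of "always being generated": when heat Q flows from a hot reservoir at temperature T_hot to a cold one at T_cold, each reservoir's entropy changes by ΔS = ±Q/T, and the total change is positive. The numbers below are made up for the example:

        # Heat Q flowing from a hot reservoir to a cold one
        T_HOT, T_COLD = 500.0, 300.0      # reservoir temperatures in kelvin
        Q = 1000.0                        # heat transferred, in joules

        dS_hot = -Q / T_HOT               # hot reservoir loses entropy: -2.0 J/K
        dS_cold = +Q / T_COLD             # cold reservoir gains more:   +3.33 J/K
        print(dS_hot + dS_cold)           # +1.33 J/K > 0: entropy is generated, never destroyed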
     