
Difference between conventional and fundamental entropy?

  1. Apr 8, 2012 #1
    I'm reading Thermal Physics by Kittel and I feel I'm having trouble really grasping what entropy is. From my lower-division courses, entropy can pretty much be defined as a measure of the randomness in a system. With ice, there is very little randomness, so lower entropy; with water, there is more randomness because the molecules can move around, so higher entropy. Kittel brings up conventional entropy and fundamental entropy, and I'm having trouble understanding the difference and why we have both. Conventional entropy seems to be the one you learn in lower-division courses; it has units. It's the fundamental entropy multiplied by Boltzmann's constant. The book states that the fundamental entropy is a pure number. So why have both, and how does one benefit over the other in certain problems? Thank you.
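The relation described in the question can be sketched numerically. This is a minimal illustration (the variable names `sigma` and `conventional_entropy` are mine, not Kittel's notation): the fundamental entropy is a pure number, and multiplying by Boltzmann's constant gives the conventional entropy in J/K.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K (exact in the 2019 SI)
N_A = 6.02214076e23  # Avogadro's number, 1/mol (exact in the 2019 SI)

def conventional_entropy(sigma):
    """Convert a dimensionless (fundamental) entropy to conventional entropy in J/K."""
    return k_B * sigma

# Example: a mole of two-state systems, each contributing ln(2) to the
# fundamental entropy. The fundamental entropy is a pure number:
sigma = N_A * math.log(2)

# The conventional entropy carries units of J/K:
S = conventional_entropy(sigma)  # equals R * ln(2), about 5.76 J/K
print(S)
```

The only difference between the two quantities is the factor of k_B, i.e. the choice of units; the physical content is the same.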
  3. Apr 9, 2012 #2
    What you are calling "fundamental entropy" is, I think, better known as "information entropy", or "Shannon entropy" as it is applied to thermodynamics. "Conventional entropy" is probably better known as "thermodynamic entropy". I think the best way to think of entropy is as "missing information". Any time you have probabilities, you don't know completely what is going on, so you have "missing information", or information entropy.

    Take a coin flip. Before you flip the coin, you have a 50 percent probability of heads and 50 percent of tails. Information theory says you are missing one "bit" of information: the entropy is one bit.

    In thermodynamics, suppose you have a gas and you know its temperature, pressure, and number of particles. That means you know its thermodynamic state, or "macrostate". But you don't know the position or velocity of every particle of the gas; that's the "microstate". Many different microstates could give you the same macrostate, and which microstate the gas is actually in is "missing information", or the "fundamental entropy" of the gas. Thermodynamic entropy is something you can measure thermodynamically, and it was Boltzmann's great discovery that the two are related by a constant, called "Boltzmann's constant".
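The coin-flip picture above can be sketched in a few lines. This is just an illustrative sketch (the function name `shannon_entropy_bits` is mine): the missing information for a fair coin comes out to exactly one bit, and for g equally likely microstates the fundamental entropy is ln(g), which Boltzmann's relation converts to a thermodynamic entropy via k_B.

```python
import math

def shannon_entropy_bits(probs):
    """Missing information, in bits, for a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely outcomes -> exactly 1 bit of missing information.
coin_entropy = shannon_entropy_bits([0.5, 0.5])
print(coin_entropy)  # 1.0

# For g equally likely microstates, the fundamental (dimensionless) entropy
# is sigma = ln(g), and Boltzmann's relation gives S = k_B * ln(g) in J/K.
k_B = 1.380649e-23  # Boltzmann's constant, J/K
g = 2               # number of microstates compatible with the macrostate
S = k_B * math.log(g)
```

Note the two logarithm bases: information theory conventionally uses log base 2 (bits), while the fundamental entropy uses the natural log; they differ only by a constant factor of ln(2).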