Development of 2nd Law Of Thermodynamics

In summary, the 2nd law of thermodynamics implies that the efficiency of a reversible heat engine depends only on the temperatures of the hot and cold reservoirs (Carnot's theorem). The 1st law of thermodynamics was demonstrated by Joule using a paddle wheel in a container of water; the 2nd law was then demonstrated experimentally as well.
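For reference, the Carnot result summarized above, written compactly for a reversible engine operating between absolute reservoir temperatures $T_H$ and $T_C$:

\[
\eta_{\text{Carnot}} \;=\; \frac{W}{Q_H} \;=\; 1 - \frac{T_C}{T_H}.
\]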
  • #106
Andy Resnick,

Thank you; however, your answer regarding thermodynamic entropy...

"I like to think of entropy simply as 'energy unavailable to perform useful work'."

...left me wondering.

Thermodynamic Entropy does not have units of energy. Whether the energy is available or unavailable for work, it is still energy and has units of energy. Entropy, by virtue of its units, is something different from energy. I am not questioning the fact that energy has been transferred from a state of potential usefulness to a state of un-usefulness for the Carnot engine to which the exchange of energy applies; a lower-temperature Carnot engine could still make use of the lost energy. I know that you know this. It is just that an answer which verbally, as opposed to physically, appears to relate or convert entropy into energy does not pass by unnoticed.

Thermodynamic Entropy is the transferred energy divided by the temperature of the heat sink. Where did the temperature go? What did it mean? Thermodynamic Entropy appears to me to be a process and not a state: it occurs and then it is gone. The energy involved in the entropy process remains. However, the entropy process by which the energy was transferred from a state of usefulness to un-usefulness remains unexplained. What do you think?
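(Written out, the definition being paraphrased here is the Clausius form: for heat $\delta Q_{\text{rev}}$ exchanged reversibly at absolute temperature $T$,

\[
dS = \frac{\delta Q_{\text{rev}}}{T}, \qquad \Delta S = \int \frac{\delta Q_{\text{rev}}}{T},
\]

which is why the units come out as energy per degree rather than energy.)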

James
 
  • #107
Ha! Be careful, you may actually learn something! :)

Yes, the units of entropy are energy/degree. I don't know how to measure that, so I don't really know what that means. I do understand energy, and I understand (after a fashion) temperature, because I can measure those things.

Honestly, biochemists are way ahead of physicists with some of this material. They work directly with the Gibbs free energy:

http://www.tainstruments.com/main.aspx?id=214&n=1&gclid=CKv-2Kv6uaECFRIeDQodRHb9BA&siteid=11

http://www.setaram.com/Microcalorimetry.htm

I was lucky enough to work with a biochemist for a while and learn some of this stuff. What entropy 'is' or 'is not' is a much less useful question than asking how the Gibbs free energy changes during a process, mostly because the entropy can't be measured directly. So the 'free energy' is how much energy *is* available to do work (especially for chemical reactions and whatnot), making the T*dS term the energy that is *not* available.
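For reference, the split described above, using the standard definitions at constant temperature and pressure:

\[
G = H - TS, \qquad \Delta G = \Delta H - T\,\Delta S,
\]

where $\Delta G$ is the maximum non-expansion work obtainable from the process and the $T\,\Delta S$ term is the part of $\Delta H$ that is not available for work.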
 
  • #108
As you mention free energy and biochemists: there was an interesting paper by Jarshinsky (I have probably misspelled the name) in Phys. Rev. Lett. about 10 years ago, on what I vaguely remember as a non-equilibrium relation between the mean exponentiated work and the free energy. It has been confirmed experimentally, e.g. in experiments stretching protein molecules at different speeds.
 
  • #109
DrDu said:
As you mention free energy and biochemists: there was an interesting paper by Jarshinsky (I have probably misspelled the name) in Phys. Rev. Lett. about 10 years ago, on what I vaguely remember as a non-equilibrium relation between the mean exponentiated work and the free energy. It has been confirmed experimentally, e.g. in experiments stretching protein molecules at different speeds.

Yes! The Jarzynski equality:

http://en.wikipedia.org/wiki/Jarzynski_equality

I discovered this in the context of laser tweezer experiments on protein folding.
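For reference, the equality relates an average of the exponentiated work over repeated non-equilibrium realizations to the equilibrium free-energy difference:

\[
\left\langle e^{-W/k_B T} \right\rangle = e^{-\Delta F/k_B T},
\]

and Jensen's inequality then gives the second-law-like statement $\langle W \rangle \ge \Delta F$.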
 
  • #110
I have a son who is a graduate student studying meteorology. He uses entropy frequently. It is very useful; now if only we knew what it was!

James
 
  • #112
I saw it and I know the Keldysh formalism. I once attended a seminar he gave on the subject.
 
  • #113
Andy Resnick said:
James,

<snip>

For example, I've started discussing the structure of the p-V surface with some mathematician colleagues here (CSU), as I feel certain issues raised on this thread merit additional consideration. I can't comment on those discussions yet as they are too preliminary for PF.

Here's an interesting reference:

S.G. Rajeev, Quantization of contact manifolds and thermodynamics, Annals of Physics, Volume 323, Issue 3, March 2008, Pages 768-782, ISSN 0003-4916, DOI: 10.1016/j.aop.2007.05.001.

(http://www.sciencedirect.com/science/article/B6WB1-4NNYJBK-1/2/56df1f8184b786a722d43733e3d92765)

Here's the gist (as much as I understand): given the conservation of energy

dU = T dS - p dV

How many independent variables are there?

Answer: only 2. The others are given by constitutive relations. Additionally, the 'phase space' has an *odd* number of variables, as opposed to 'normal' mechanics, which has an *even* number. The phase space of mechanics has a symplectic structure; the phase space of thermodynamics has a contact structure.

An aside: "entropy" is the conjugate variable to "temperature". So there's another definition of what entropy 'is'...
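A sketch of that counting, assuming the sign convention $dU = T\,dS - p\,dV$ (the precise construction is in the Rajeev paper): the thermodynamic phase space has the five coordinates $(U, S, V, T, p)$ and carries the contact one-form

\[
\alpha = dU - T\,dS + p\,dV .
\]

Equilibrium states lie on the two-dimensional surface where $\alpha = 0$; once the two constitutive relations $T = T(S,V)$ and $p = p(S,V)$ are specified, only two of the five variables remain independent, with $(T, S)$ and $(-p, V)$ appearing as conjugate pairs, which is the sense in which entropy is conjugate to temperature.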

I guess I have some summer reading lined up...
 
  • #114
Andy Resnick,

I live in Colorado. Thank you for those links and the thoughts that helped get me started.

James
 
