What is the real explanation for entropy and its role in chemical reactions?

  • Thread starter Senex01
  • Tags
    Entropy
  • #36
atyy said:
"microcanonical ensemble"
I have a mental image of a choir of pint-sized clerics, but I'm reading up on it. Thanks.
 
  • #37
Senex01 said:
I think I need "heat" explained to me as well then. I thought heat was the total kinetic energy of the particles of the system. And temperature is a somewhat arbitrary measurement of the kinetic energy or momentum (I'm not sure) that the particles transfer, directly or via radiation, on other particles. Therefore two systems will be "at the same temperature" when they transfer equal amounts of heat energy to each other. But two systems could have quite different ratios between internal kinetic energy and the energy they transfer to external systems.

I mean, that is not supposed to be a statement of fact, just what I thought I understood.

"Heat = kinetic energy" is a good heuristic, but in terms of jargon, it is strictly speaking wrong. Here are the correct statements. Suppose you have a gas in which the particles don't interact with each other through potential energy. In such a gas the internal or total energy is just the total kinetic energy of the particles. To increase the kinetic energy, one may add either heat or work to the gas, thus knowing the kinetic energy alone does not tell you whether you got there by adding heat or work. This illustrates that the kinetic energy is a state variable, and does not depend on the history of the gas, while heat and work do depend on the history of the gas. However, in such a gas the kinetic energy is related to the *temperature, which is a state variable*, and indicates the direction of heat flow if two containers of this gas at different temperatures are placed in thermal contact with each other.
 
  • #38
My professor used to say that the only true answer is: entropy is equilibrium in phase space.

If there are particles, their locations and momenta are equally distributed across the whole accessible range (given a cutoff before infinity).

Of course there is entropy in computer science as well. Perhaps one can generalize it even more.
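For instance, here is a minimal Python sketch of the information-theoretic version (Shannon entropy, H = -sum p ln p); for a uniform distribution over Omega states it reduces to ln(Omega), which is Boltzmann's S/k:

[code]
import math

def shannon_entropy(p):
    # H = -sum p_i ln(p_i), in nats; the uniform case gives ln(len(p)),
    # i.e. Boltzmann's S/k for that many equally likely microstates
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4               # "equilibrium": all states equally likely
peaked = [0.97, 0.01, 0.01, 0.01]  # far from equilibrium

print(shannon_entropy(uniform))  # ln(4) ~ 1.386: the maximum for 4 states
print(shannon_entropy(peaked))   # ~0.168: a low-entropy distribution
[/code]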

Regarding chemical reactions this may not be the answer, but you may be able to deduce something from it.
 
  • #39
Senex01 said:
All right, I'm sorry to be difficult, thanks for being patient. Does this mean that they do not teach the laws of thermodynamics in high school any longer?

They do, but they don't explain what heat, work, temperature, and entropy really are. These concepts are explained phenomenologically, using examples. When students arrive at university, they are asked to forget what they were previously told before they are given the rigorous definitions. This is because the intuitive high-school explanations look pretty watertight while in fact being full of holes, which would otherwise confuse students.
 
  • #40
Entropy has always seemed rather confusing but fascinating... Here are some explanations from famous people... maybe they will offer some clues. Keep in mind that Boltzmann, the father of S = k log W, committed suicide... so maybe entropy should not be thought about too much! (LOL)
Brian Greene, The Fabric of the Cosmos:
high entropy means that there are many ways; low entropy means there are few ways... If the pages of War and Peace are stacked in proper numerical order, that is a low-entropy configuration... two essential features of entropy... entropy is a measure of the amount of physical disorder in a system. (In the absence of gravity) a uniformly spread gas configuration is insensitive to an enormous number of rearrangements of its molecular components and so is in a state of high entropy. Just as thinner books have fewer page reorderings, smaller spaces have fewer places for molecules... so allow for fewer rearrangements.

All forms of energy are not equal... every life form takes in low-entropy energy (in the form of food) and gives off high-entropy energy (as heat)... plants maintain their low-entropy state via photosynthesis... deriving energy from the (low-entropy) sun.

When gravity matters, as it did in the early universe, clumpiness, not uniformity, is the norm... for the initially diffuse gas cloud, the entropy decrease through the orderly formation of clumps (stars, planets, etc.) is more than compensated by the heat generated as the gas compresses... and ultimately by the enormous amount of heat and light released as nuclear processes begin to take place.
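Here is a toy Python version of Greene's page-ordering picture above, with S = k ln W and a made-up page count (the numbers are purely illustrative):

[code]
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    # S = k ln W, where W counts the configurations
    return k_B * math.log(W)

n_pages = 700                         # illustrative page count for the novel
W_ordered = 1                         # exactly one correct page order
W_shuffled = math.factorial(n_pages)  # any ordering counts as "disordered"

print(boltzmann_entropy(W_ordered))   # 0: the low-entropy configuration
print(boltzmann_entropy(W_shuffled))  # k_B * ln(700!) ~ k_B * 3.9e3
[/code]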

Back later... if this is helpful I'll find some other explanations...
 
  • #41
Of course if someone had told me at 15 years old that I couldn't understand my homework until I'd finished the first year of university, I would have decided "Well, I suppose I'll just have to be a lawyer then." In fact, that is what I decided, but for other reasons.

I was talking to a German who knows a bit about the history of science, and from what he said, the following makes some sense.

Each force in the universe acts on objects until it can act no more on them. Clausius's original term, when he was explaining the second law of thermodynamics, was Verwandlung, meaning transformation (such as a frog into a prince, or a petty neurotic clerk into a cockroach). As each transformation (or reaction, perhaps) occurs, the differences in potential energy (in terms of heat originally, but gravity, electricity, or the strong/weak forces would do) between each "particle" or "part of the system" decrease. So they shunt their way down from reaction to reaction, to ever-decreasing energy states - or to states of ever-decreasing energy difference. Eventually they will theoretically reach the "heat death", when no more work can be done, but in practice they will get hung up in some state where one force acts to counterbalance another.

You can knock them (the particles or parts of the system) out of their tenuous balance by giving them a jolt of energy that changes their state in a particular way, counteracting or overcoming some force so that they can shunt their way down to a state of lower energy difference. For example, by applying a match.

It's slightly more complex in quantum mechanics, because of the probabilistic nature of the reactions, but the same principle applies when you consider things in toto.

Brian Greene's explanation made a lot of sense to me, until I started seeing the equations, which made it clear that entropy had something to do with energy and forces, and was not simply a matter of the ordering of states of the system. Both the classical equation and the quantum mechanical equation involve energy:

[itex]S = k \ln \Omega(E + dE)[/itex] -- it was a wonderful piece of devilry to leave out the last two terms...

You could, I suppose, measure the energy differences of each ordered pair of particles/parts - classically or quantum mechanically, it wouldn't make much difference in principle - and then take the total of the energy differences, or perhaps the mean energy difference. Omega, if my understanding of the explanation is right, is in fact an enumeration of the energy states - and of the changes in energy states, of course.

Entropy, then, is the inverse of these differences in energy levels.
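For concreteness, here is a toy enumeration of Omega in Python - the standard Einstein-solid counting exercise (N oscillators sharing q energy quanta, so Omega = C(q+N-1, N-1)), which is an assumption about the model, not necessarily what Greene has in mind:

[code]
import math

k_B = 1.380649e-23  # J/K

def omega(N, q):
    # Microstate count for an Einstein solid: N oscillators sharing q
    # energy quanta. Omega = C(q + N - 1, N - 1).
    return math.comb(q + N - 1, N - 1)

N = 50
for q in (10, 50, 100):  # increasing total energy, in units of one quantum
    W = omega(N, q)
    print(q, W, k_B * math.log(W))  # Omega and S = k ln(Omega) grow with E
[/code]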

(Note: this is a request for clarification, again, this is what I understand, I'm just wondering what your response would be...)
 
  • #42
http://necsi.org/projects/baranger/cce.pdf
"Yes, you are the one who increased the entropy! It is not physics, it is not chaos, it is not Liouville: you drew the smooth volume to make your life easier, you are the one. It was chaos who manufactured the fractal, but you chose to smooth it out. ... One final question: if entropy is really totally subjective, why does it appear so objective to so many people? Ask any physical chemist. You will be told that entropy is a fundamental, permanent property of matter in bulk. You can measure it, you can calculate it accurately, you can look it up in tables, etc. And the answer to that new paradox is: large numbers. It is well-known that very large numbers have a way of making probabilities turn into absolute certainty."

http://ocw.mit.edu/NR/rdonlyres/Physics/8-333Fall-2007/1E2D4D68-EC43-44C7-91A8-9FC8A7698A78/0/lec9.pdf "However, the advent of powerful computers has made it possible to simulate the evolution of collections of large numbers of particles, governed by classical, reversible equations of motion. Although simulations are currently limited to relatively small numbers of particles (10^6), they do exhibit the irreversible macroscopic behaviors similar to those observed in nature (10^23). For example, particles initially occupying one half of a box proceed to irreversibly, and uniformly, occupy the whole box. (This has nothing to do with limitations of computational accuracy; the same macroscopic irreversibility is observed in exactly reversible integer-based simulations, such as with cellular automata.) Thus the origin of the observed irreversibilities should be sought in the classical evolution of large collections of particles. ... The Liouville equation and its descendants contain precise information about the evolution of a pure state. This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."
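The half-a-box example from those notes is easy to reproduce. A minimal Python sketch, assuming non-interacting particles with exactly reversible ballistic dynamics in a 1-D box: every trajectory is reversible, yet the left-half occupation settles irreversibly at 1/2.

[code]
import random

random.seed(0)
N, L = 10_000, 1.0
x = [random.uniform(0, L / 2) for _ in range(N)]  # start in the left half
v = [random.uniform(-1, 1) for _ in range(N)]     # random velocities

def step(dt=0.01):
    # Exactly reversible free flight with reflecting walls
    for i in range(N):
        x[i] += v[i] * dt
        if x[i] < 0:
            x[i], v[i] = -x[i], -v[i]
        elif x[i] > L:
            x[i], v[i] = 2 * L - x[i], -v[i]

for t in range(501):
    if t % 100 == 0:
        left = sum(1 for xi in x if xi < L / 2) / N
        print(f"t = {t:3d}: fraction in left half = {left:.3f}")
    step()
[/code]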
 
  • #43
Back to the equations, if that is okay.

S entropy
Q energy release or gain
T temperature
c specific heat

We had

S = Q / T = c dT / T

Nice and derivable: S = Q / T (a definition, at least for now), and c = Q / dT (by definition)

Therefore, if dT is equal to T - that is, if we are calculating the total heat energy of the system:

S = c.

Let's see:

At 298 K, S of He is 126 J K-1 mol-1 (Keeler and Wothers, "Why Chemical Reactions Happen", p. 11)

But of course the specific heat of He is 5193 J K-1 kg-1.
A mole of He is about 4 g, i.e. 0.004 kg.
The specific heat per mole is 5193 × 0.004002602 ≈ 20.8 J K-1 mol-1.

Out by a factor of six.

Is what I'm doing wrong related to the specific heat of a system not being constant at different temperatures? Does specific heat drop significantly at lower temperatures?
 
  • #44
If

dS = dQ / T = c dT / T

S entropy
Q energy release or gain
T temperature
c specific heat
d (infinitesimal change)

then

TdS = c dT

Does entropy actually have a practical use that you couldn't get from specific heat by just jiggling the equations?
 
  • #45
The fact that entropy can only be increased has to be added in by hand in the phenomenological approach to thermodynamics. You can then e.g. derive equations for the maximum efficiency of heat engines.

Note that there is no such thing as "total heat". The heat absorbed by a system as you go from one thermodynamic state to another depends on the path you take. There is no function Q such that the difference of Q between the two states will give you the heat. We say that there is no thermodynamic state variable that corresponds to heat. Entropy, on the other hand, is a thermodynamic state variable. This is why entropy appears in the thermodynamic equations instead of heat.
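For instance, here is a minimal Python sketch of the heat-engine result: for a reversible engine, the entropy drawn from the hot reservoir must balance the entropy dumped into the cold one (Q_h/T_h = Q_c/T_c), which caps the efficiency at 1 - T_c/T_h.

[code]
def carnot_efficiency(T_hot, T_cold):
    # From Q_h / T_h = Q_c / T_c: work = Q_h - Q_c, so
    # efficiency = work / Q_h = 1 - T_cold / T_hot
    return 1.0 - T_cold / T_hot

print(carnot_efficiency(500.0, 300.0))  # 0.4: no engine between these
                                        # reservoirs can do better than 40%
[/code]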
 
  • #46
Taking TdS = c dT above, and for He:
S = 126 J K-1 mol-1
c = 20.8 J K-1 mol-1

Then 1 mole of He losing 1 K when at 298 K emits how much heat?

TdS = 298*126 = 37548 J

c dT = 20.8*1 = 20.8 J

Out by a factor of more than a thousand.
 
  • #47
How did anyone calculate entropy if it's not related to reactions?

Did they simply divide the heat given off by a reaction by the temperature at which the reaction took place, and then said: "We'll call that the 'entropy'!"?
 
  • #48
You need dS not S. Look up the change in S if the temperature changes by 1 K.
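For example, in Python (assuming c_p ~ 20.8 J K-1 mol-1 for He gas):

[code]
c_p, T, dT = 20.8, 298.0, 1.0
dS = c_p * dT / T
print(dS)  # ~0.070 J/(K*mol): tiny compared with S(298 K) ~ 126 J/(K*mol)
[/code]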
 
  • #49
Senex01 said:
How did anyone calculate entropy if it's not related to reactions?

Did they simply divide the heat given off by a reaction by the temperature at which the reaction took place, and then said: "We'll call that the 'entropy'!"?


That's essentially how Clausius did it, building on Carnot's analysis of heat engines. He showed that the entropy change between a final and initial state does not depend on which path you take.
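A quick numerical check of that claim, in Python, for 1 mol of a monatomic ideal gas taken between the same two states along two different quasi-static paths (illustrative numbers):

[code]
import math

R, Cv = 8.314, 1.5 * 8.314
T1, T2 = 300.0, 400.0
V1, V2 = 1.0, 2.0

def leg_isochoric(Ta, Tb):
    # Constant volume: Q = Cv dT, dS = Cv dT / T integrated
    return Cv * (Tb - Ta), Cv * math.log(Tb / Ta)

def leg_isothermal(T, Va, Vb):
    # Constant temperature: Q = R T dV / V integrated, dS = Q / T
    return R * T * math.log(Vb / Va), R * math.log(Vb / Va)

# Path A: heat at constant volume, then expand at constant temperature
QA = leg_isochoric(T1, T2)[0] + leg_isothermal(T2, V1, V2)[0]
SA = leg_isochoric(T1, T2)[1] + leg_isothermal(T2, V1, V2)[1]
# Path B: expand at constant temperature first, then heat
QB = leg_isothermal(T1, V1, V2)[0] + leg_isochoric(T1, T2)[0]
SB = leg_isothermal(T1, V1, V2)[1] + leg_isochoric(T1, T2)[1]

print(f"Q:  path A = {QA:.0f} J, path B = {QB:.0f} J    (path dependent)")
print(f"dS: path A = {SA:.3f} J/K, path B = {SB:.3f} J/K (path independent)")
[/code]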
 
  • #50
Senex01 said:
We had

S = Q / T = c dT / T

Nice. Derivable S = Q / T (definition, at least for now), and c = Q / dT (by definition)

Therefore, if dT is equal to T - that is, if we are calculating the total heat energy of the system:

S = c.

Whoa! You're conflating and canceling terms incorrectly. Some people use Q to denote an infinitesimal quantity, some a finite quantity. You can't mix the two. Let's denote the first as q and the second as Q. Then [itex]S=Q/T[/itex] and [itex]dS=q/T[/itex], and [itex]dS=c\,dT/T[/itex]. And it's never valid to equate dT and T!
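To see those relations in action, a Python sketch (assuming a constant molar heat capacity c_p = 5/2 R ~ 20.8 J K-1 mol-1 for He gas):

[code]
import math

R = 8.314
c_p = 2.5 * R  # ~20.8 J/(K*mol), molar heat capacity of He gas

def delta_S(T1, T2, c=c_p):
    # Integrating dS = c dT / T gives delta S = c ln(T2 / T1) for constant c
    return c * math.log(T2 / T1)

print(delta_S(100.0, 298.0))           # ~22.7 J/(K*mol)
print(delta_S(298.0, 298.0 * math.e))  # equals c_p only when T2 = e * T1
[/code]

The tabulated S(298 K) ~ 126 J K-1 mol-1 is the whole accumulated integral from absolute zero, so there is no reason for it to equal c.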
 
  • #51
Count Iblis said:
The heat absorbed by a system as you go from one thermodynamic state to another depends on the path you take. There is no function Q such that the difference of Q between the two states will give you the heat.

Does that mean that there is no such thing as enthalpy, in terms of the heat given off by a reaction?
 
  • #52
Count Iblis said:
That's essentially how Clausius did it, building on Carnot's analysis of heat engines. He showed that the entropy change between a final and initial state does not depend on which path you take.

I thought you just said it did depend on the path you took?
 
  • #53
Senex01 said:
Does that mean that there is no such thing as enthalpy, in terms of the heat given off by a reaction?

Enthalpy change equals the absorbed heat when the pressure is kept constant. So, while enthalpy change is path independent, it will only give you the absorbed heat for specific paths.
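A small Python sketch of that point, assuming 1 mol of a monatomic ideal gas heated by 100 K: the enthalpy change is the same state-function quantity either way, but only the constant-pressure heat matches it.

[code]
R = 8.314
Cv, Cp = 1.5 * R, 2.5 * R  # monatomic ideal gas, per mole
dT = 100.0

dH = Cp * dT  # enthalpy change; for an ideal gas H depends on T only

Q_const_pressure = Cp * dT  # heat absorbed at constant P
Q_const_volume = Cv * dT    # heat absorbed at constant V

print(dH, Q_const_pressure)  # 2078.5 == 2078.5: Q equals dH at constant P
print(dH, Q_const_volume)    # 2078.5 vs 1247.1: Q is not dH on this path
[/code]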
 
  • #54
Senex01 said:
I thought you just said it did depend on the path you took?

Heat is path dependent, entropy and enthalpy are path independent.
 
  • #55
OK. Thank you. I think I am getting it.

If it's okay, I want to keep going though.

So the entropy change ( = heat change / temperature, q/T ) is identical from one state to another, whatever the route. But the heat change itself can vary from case to case, according to the path taken from one state to the other.

In what way would one path differ from another? You mentioned pressure; that sounds like a pretty relevant difference, which would affect the heat gain/loss.

I assume a sequence of reactions, some exothermic and some endothermic, would count as a different path if the sequence varied, but at constant pressure the net heat change would be identical from the initial state to the final state. Right? The same for the rapidity of the change. Both scenarios are actually very common in cellular systems, as you know.

What else would constitute a different path?
 
  • #56
Actually I suppose the answer to my last question is pretty obvious. One path will have a different result from another if it happens at a different temperature. (Duh!) Obviously differing pressures will affect temperature.

Obviously, chemical reactions in a living cell will generally happen under conditions of constant pressure and temperature. Not sure what the effect of different catalysts would be, but I would guess that wouldn't make a difference to the heat loss/gain.
 
  • #57
So the Gibbs energy of a reaction is the energy that the reaction releases at a certain temperature which is available as work.

ΔG = ΔH - TΔS

where ΔH is the heat released by the reaction (at constant pressure), and
TΔS is a term which shows the energy that will not be available as work at this temperature.
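A worked Python example with textbook numbers (standard values for H2 + 1/2 O2 -> H2O(l) at 298 K, approximate):

[code]
dH = -285.8e3  # J/mol: heat released at constant pressure
dS = -163.2    # J/(K*mol): entropy change of the system
T = 298.15     # K

dG = dH - T * dS
print(dG / 1e3)  # ~ -237 kJ/mol available as (non-expansion) work
[/code]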
 
  • #58
No one is obliged to ever respond, especially to a stroppy old bugger like me. But I'm assuming that the posts nobody has come back to me on are probably not too far wrong: like the one about entropy being the inverse of the sum/average energy differences of parts of the system. Actually, looking at the Gibbs energy equation, I'm not sure about that one now.

Well, I am genuinely grateful to everyone who has responded so far: Count Iblis, Mapes, atyy, and all of you.
 
