What is the real explanation for entropy and its role in chemical reactions?

  • Thread starter: Senex01
  • Tags: Entropy
AI Thread Summary
Entropy plays a crucial role in chemical reactions, often described in terms of order and disorder, but this explanation lacks clarity. The discussion critiques the conventional understanding of entropy as a measure of information needed to specify a system's microstate, emphasizing that the definitions are often vague and context-dependent. It highlights the relationship between entropy, heat transfer, and Gibbs free energy, questioning how these concepts interrelate and whether they truly reflect the underlying physical processes. The conversation also points out that the categorization of microstates is influenced by thermodynamic properties, which complicates the understanding of entropy. Ultimately, the need for a precise definition of microstates remains a significant challenge in comprehending entropy's role in chemical reactions.
Senex01
This must have been posted on here before, but I can't find any reference to it.

I've had to learn a little biochemistry related to my work. This led me to realize that I knew little chemistry, so I set out to learn some. Chemical reactions, according to what I read, depend significantly on entropy. This got me looking at entropy again, which I remember vaguely from school.

Some books try to explain entropy in terms of order/disorder. This seems a rather poor explanation to me. For example, they show a box with two gases in it separated by a barrier: the barrier is removed, and the gases mix. Thus order -> disorder, and the entropy increases. This still seems to beg the question of what "order" is. It also helps not one whit when you then try to define Gibbs free energy in terms of entropy.

An attempt I came across to tighten up the gas-dispersion explanation of entropy stated that it was a movement from a low-probability state to a high-probability state. Of course, in this example that is nonsense, because any particular arrangement of gas particles is exactly as probable as any other: we only get a "higher probability" because we have grouped a large number of particular arrangements into a single category, so that a category with many arrangements ("dispersed") has a higher probability than a category with fewer arrangements ("compacted"). We can arbitrarily create any kind of categorization we like to describe arrangements of gas particles. Thus entropy and chemical reactions would depend on what categories we choose to define. This seems very improbable, or at least a very confusing explanation.
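To make that counting argument concrete, here is a minimal sketch (a toy model of my own, not from any of the books mentioned): label each of N gas particles as sitting in the left or right half of the box. Every specific left/right assignment is equally probable, but the categories are wildly unequal in size.

```python
from math import comb

N = 50          # toy number of gas particles (assumed for illustration)
total = 2 ** N  # every left/right assignment ("arrangement") is equally probable

# Group the arrangements into categories by how many particles sit in the left half.
for k in (0, 5, 25):
    ways = comb(N, k)
    print(f"{k:2d} particles on the left: {ways:>18,} arrangements, "
          f"probability {ways / total:.2e}")
```

The "dispersed" category dominates only because it contains astronomically more individual arrangements.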

Then there was the attempt to explain it in terms of information. We have good information on where the gas particles are at the start, the instant the barrier is removed. As the two mingle, we have less information on the locations of the gas particles of one kind relative to the gas particles of the other kind. This is really just the same as the order/probability explanation given before, but in different terminology. Still the same problem arises: does a chemical or physical reaction depend on how much knowledge we have about a system? Surely chemical and physical events occur even if we don't know about them. And even if we had perfect knowledge, the reaction would still happen. We in fact do have pretty good knowledge of the reaction that occurs when a chromosome replicates itself: we know the structure of the chromosome, and the intermediate molecules that read it and reconstruct its copy. Our near-perfect knowledge has no effect on the reaction.

So we'll drop this order/probability/knowledge analogy, unless someone explains it better to me.
 
Entropy is the amount of information that you need in order to specify the exact microstate a system is in.

If a chemical reaction is happening in a test tube and you describe what is going on by looking at the test tube and taking some measurements, then what is relevant is the number of microstates compatible with that imprecise macroscopic data.
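To attach a number to "amount of information": if the measurements are compatible with Omega equally likely microstates, singling out the actual one takes log2(Omega) bits, and the corresponding thermodynamic entropy is k ln(Omega). A minimal sketch, with the value of Omega assumed purely for illustration:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

omega = 10 ** 24            # assumed number of compatible microstates (illustrative only)
bits = math.log2(omega)     # information needed to single out one microstate
S = k_B * math.log(omega)   # the thermodynamic entropy of that macrostate

print(f"bits needed to specify the exact microstate: {bits:.1f}")
print(f"S = k ln(Omega) = {S:.2e} J/K")
```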
 
I've been reading Keeler and Wothers' Why Chemical Reactions Happen.

Then they bounce immediately into explaining entropy mathematically, in relation to heat. Of course, if we describe heat as roughly a measurement of the kinetic energy, or alternatively, the momentum of the movement of particles, there is a connection with the order/disorder explanations, but it is a bit tenuous.

So entropy S, itself, nobody seems to want to define in precise terms. However, they are happy to define the change in entropy dS, in terms of heat Q and temperature T.

dS = Q / T

Fine. So entropy is measured in Joules and is a measure of the heat absorbed. Then they, and people posting in Wikipedia and elsewhere happily substitute S for Q/T and vice versa.

How is this related to the specific heat capacity, which I remember studying in school? Specific heat capacity of a substance is the heat energy in Joules required to raise a certain mass of the substance by a certain number of degrees. (Kilogram per degree, when I did it, but you could use moles or Fahrenheit or whatever, I suppose.) The specific heat "c" for a body of mass "m" is given by

Q = mc * dT

So therefore

mc = Q / dT

and specific heat is of course measured in joules as well. If you take the mass as a single unit (one kilo or one mole) and also take the change in temperature as a single unit, then specific heat capacity would equal entropy. What is the actual difference? Is there one?
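To make the units concrete, a small sketch (assuming 1 kg of water with c roughly 4186 J kg-1 K-1): the product m*c comes out in joules per kelvin rather than plain joules, which is the same unit entropy is tabulated in.

```python
m = 1.0      # kg of water (assumed example)
c = 4186.0   # J kg^-1 K^-1, approximate specific heat of liquid water
dT = 10.0    # K, temperature rise

Q = m * c * dT           # heat absorbed, in joules
heat_capacity = m * c    # Q / dT, in joules per kelvin

print(f"heat absorbed:           Q   = {Q:.0f} J")
print(f"heat capacity of sample: m*c = {heat_capacity:.0f} J/K")
```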

Also, we were given tables when I was at school that implied that specific heat capacity was constant at different temperatures, although it would seem perfectly reasonable to me if it were to change as the temperature changes, and to change for different substances in different ways.

The equation we had for entropy assumes a constant temperature during heat loss or gain, which is not particularly plausible except for infinitesimal changes. The specific heat capacity equation assumes that the temperature changes as the substance gains or loses heat. This seems far more reasonable. However, it makes entropy seem an even more confusing and ill-defined concept.
 
Count Iblis said:
Entropy is the amount of information that you need in order to specify the exact microstate a system is in.

If a chemical reaction is happening in a test tube and you describe what is going on by looking at the test tube and taking some measurements, then what is relevant is the number of microstates compatible with that imprecise macroscopic data.

That would mean that if I make more precise measurements - say a kind of Hamiltonian vector of all the particles (I remember Hamiltonian vectors from Penrose's Emperor's New Mind) - then the chemical or physical status of the reaction would be different than if I just took a general measurement of its temperature, volume and density? I must be misunderstanding you.
 
Senex01 said:
We can arbitrarily create any kind of categorization we like to describe arrangement of gas particles. Thus entropy and chemical reactions depend on what categories we choose to define. This seems very improbable, or at least a very confusing explanation.
You raise a good point. What they don't usually tell you is that the categories are determined by a certain set of thermodynamic properties.

Basically, any system can be in one of a large number of possible microstates. A microstate is just a particular arrangement of the particles in the system. Now, each microstate has a certain pressure P, temperature T, and number of particles N, and you can group the microstates by the values of those three variables, so that all the microstates in each group have the same values of P, T, and N. These groups, or macrostates, are the "categories" you're thinking of. By measuring the pressure, temperature, and number of particles (well in practice usually you'd measure something else, like volume, and then calculate number of particles), you can tell which macrostate a system is in, and the number of microstates in that macrostate determines the state's entropy.

Why do those particular variables, and no others, determine the groups? Well, actually you can use certain other variables, like volume in place of number of particles as I mentioned. But you still need 3 variables which are related to P, T, and N. I think the reasoning there is that the 3 variables represent the 3 ways in which a system can interact with its environment: it can transfer heat, it can do work, or it can gain or lose particles. And it stands to reason that from the point of view of the environment, the state of a system is fully defined by how it can interact - beyond that, what's really going on inside the system doesn't matter.
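A toy version of that grouping (my own illustration, using a particle's left/right position as the only coarse variable instead of P, T, and N): enumerate every microstate, sort them into macrostates, and take the log of each group's size.

```python
from itertools import product
from collections import Counter
from math import log

# Toy system: 6 particles, each in the left (0) or right (1) half of a box.
# A microstate is the full tuple of positions; the macrostate is the coarse
# variable "number of particles on the left".
microstates = list(product((0, 1), repeat=6))
groups = Counter(state.count(0) for state in microstates)

for n_left, omega in sorted(groups.items()):
    print(f"macrostate n_left = {n_left}: Omega = {omega:2d} microstates, "
          f"S/k = ln(Omega) = {log(omega):.2f}")
```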
 
Count Iblis said:
Entropy is the amount of information that you need in order to specify the exact microstate a system is in.

If a chemical reaction is happening in a test tube and you describe what is going on by looking at the test tube and taking some measurements, then what is relevant is the number of microstates compatible with that imprecise macroscopic data.

Let me put it another way. I have test-tube A and test-tube B, that contain substances X and Y. I measure them in whatever way, and discover that the particles are more randomly distributed in B than in A: in A, the X particles tend to sit higher up the tube, say. Therefore, the mixture in test-tube B has higher entropy than that in test-tube A. Fine.

Then, by using some super measurement, I discover that there is a precise mathematical distribution, perhaps some kind of Fibonacci sequence, that determines the exact position and movement of the particles in test-tube B: I wasn't aware of this before. So then, suddenly, test-tube A now has the higher entropy. Again, the problem is that we are making the chemical reaction dependent on my knowledge of the substances involved.

Which brings me to the same point: unless you have a precise technical definition of "microstate", to define the exact microstate of a system I have to describe all the positions/momenta of the particles in it. This is exactly the same quantity of information, no matter what state the system is in.
 
diazona said:
Now, each microstate has a certain pressure P, temperature T, and number of particles N...

That's false. A microstate does not have a temperature at all.
 
diazona said:
Basically, any system can be in one of a large number of possible microstates. A microstate is just a particular arrangement of the particles in the system. Now, each microstate has a certain pressure P, temperature T, and number of particles N, and you can group the microstates by the values of those three variables, so that all the microstates in each group have the same values of P, T, and N. These groups, or macrostates, are the "categories" you're thinking of.

Thanks, that, and the rest of your post makes sense to me. I can see how that connects with the 2nd law of thermodynamics, both when stated with and without reference to entropy.

Of course, the pressure, temperature and number of particles of a system won't necessarily change in the spreading-gas example, or in the other favourite example people use, the melting ice cube, but I'll just put that down to misleading examples and follow yours because it makes sense. So entropy is a relationship between a macrostate and a theoretical (but I presume practically incalculable) number of microstates.

How does this relate to heat transfer and Gibbs energy then? Or to the mathematical definition of entropy as a quantity of heat absorbed at a certain temperature?
 
Count Iblis said:
That's false. A microstate does not have a temperature at all.

so what actually is a "microstate" then?
 
  • #10
Senex01 said:
That would mean that if I make more precise measurements - say a kind of Hamiltonian vector of all the particles (I remember Hamiltonian vectors from Penrose's Emperor's New Mind) - then the chemical or physical status of the reaction would be different than if I just took a general measurement of its temperature, volume and density? I must be misunderstanding you.


If you know the exact state of the system, and if it is an isolated system, you could simply predict the state it will be in some time later.


Let me put it another way. I have test-tube A and test-tube B, that contain substances X and Y. I measure them in whatever way, and discover that the particles are more randomly distributed in B than in A: in A, the X particles tend to sit higher up the tube, say. Therefore, the mixture in test-tube B has higher entropy than that in test-tube A. Fine.

Then, by using some super measurement, I discover that there is a precise mathematical distribution, perhaps some kind of Fibonacci sequence, that determines the exact position and movement of the particles in test-tube B: I wasn't aware of this before. So then, suddenly, test-tube A now has the higher entropy. Again, the problem is that we are making the chemical reaction dependent on my knowledge of the substances involved.

Which brings me to the same point: unless you have a precise technical definition of "microstate", to define the exact microstate of a system I have to describe all the positions/momenta of the particles in it. This is exactly the same quantity of information, no matter what state the system is in.

A microstate of a closed system is the exact quantum state of the system. If you have one particle in a box, then it will have certain energy levels it can be in, just like an electron of the hydrogen atom. If you specify the energy of the particle in the box with some small uncertainty, then it can be in any energy level that falls within that uncertainty. The amount of information you need to specify exactly which one of these energy levels the particle really is in is proportional to the logarithm of this number (the logarithm will be proportional to the number of digits of this huge number).

Now for an isolated system, all possible accessible microstates are equally likely. This leads to the conclusion that the entropy can only increase.
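A rough numerical sketch of that picture for one particle in a one-dimensional box (all of the numbers below are assumed for illustration): count how many energy levels fall inside the stated energy window, then take k times the log of that count.

```python
import math

h = 6.626e-34     # Planck constant, J s
m = 6.646e-27     # mass of a helium atom, kg (assumed example particle)
L = 1e-2          # box length, 1 cm (assumed)
k_B = 1.381e-23   # Boltzmann constant, J/K

# Levels of a particle in a 1-D box: E_n = n^2 h^2 / (8 m L^2), so n = sqrt(8 m L^2 E) / h.
def n_of(E):
    return math.floor(math.sqrt(8 * m * L**2 * E) / h)

E0 = 1.0e-21      # specified energy, J (roughly the thermal scale, assumed)
dE = 1.0e-24      # small uncertainty in that energy, J (assumed)

omega = n_of(E0 + dE) - n_of(E0)   # number of levels inside the window
print(f"levels between E and E + dE: {omega}")
print(f"S = k ln(Omega) = {k_B * math.log(omega):.2e} J/K")
```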
 
  • #12
Count Iblis said:
Now for an isolated system, all possible accessible microstates are equally likely. This leads to the conclusion that the entropy can only increase.

Don't those two sentences contradict each other? Unless you have a definition of "accessible"? Does "accessible" just mean "that we can predict"?

If I know something about the status of a system at time t, and it is changing randomly, then at time t+t' I know less about it, unless I make new measurements. That is obviously true, but it can't be what you mean. It would imply that if someone had taken measurements before me, then the entropy of the system would be, for them, higher than the entropy of a system for me.

So if we were measuring two different substances, and then the two substances reacted, the entropy of the two systems could be different for each of us, which would imply that we would each observe a different reaction. I am sure this is not what you mean.
 
  • #14
Count Iblis said:
To see how the definition S = k Log(Omega) yields the relation dS = dq/T, see http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Proof_of_the_Second_Law

What I'm looking for is a definition of entropy that does not use the term "entropy" in trying to define itself, which was why I was happier with the S = Q/T definition, even if it didn't make much sense on reflection (as T could not remain constant under such conditions).
 
  • #15
Senex01 said:
That is interesting, but it assumes the definition of "entropy" in defining "entropy", so I'm not so happy with the definition.


Well, S = k Log(Omega) is the definition of entropy for a system in thermal equilibrium. You can derive the relation dS = dq/T from that definition, not the other way around.
 
  • #16
Senex01 said:
Don't those two sentences contradict each other? Unless you have a definition of "accessible"? Does "accessible" just mean "that we can predict"?

If I know something about the status of a system at time t, and it is changing randomly, then at time t+t' I know less about it, unless I make new measurements. That is obviously true, but it can't be what you mean. It would imply that if someone had taken measurements before me, then the entropy of the system would be, for them, higher than the entropy of a system for me.

So if we were measuring two different substances, and then the two substances reacted, the entropy of the two systems could be different for each of us, which would imply that we would each observe a different reaction. I am sure this is not what you mean.

Accessible means that the state is a possible state the system can be in given what we know about the system. If the energy of a state does not fall within the known limits of the total energy of the isolated system then, due to conservation of energy, that state is not an accessible state.

The fundamental postulate of statistical physics is that all accessible states are a priori equally likely. If a system is in equilibrium, then it can be found in all these accessible states with equal chance. If a system is not in equilibrium, then that means it is more likely to be in some accessible states than in others.

E.g. suppose you have a gas that was constrained to move in one part of an enclosure and you remove that constraint. Immediately after the constraint has been removed the gas has not had the time to move into the other part. So, not all of the states are equally likely. To say that they are a priori equally likely means that if you are given a system in which the gas can move freely as in this example and you are not told anything about its history, then all the states are equally likely. But the conditional probability distribution over all the accessible states, given that a constraint has just been removed, is not a uniform distribution.
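Two numbers to go with that example (a sketch, assuming an ideal gas that is suddenly allowed to fill twice its original volume): the entropy gain per mole, and the chance of ever catching all the particles back in the original half.

```python
import math

k_B = 1.380649e-23   # J/K
N_A = 6.022e23       # particles per mole

# Doubling the accessible volume doubles the number of positions available to
# each particle, so Omega grows by 2^N and S = k ln(Omega) grows by N k ln 2.
dS = N_A * k_B * math.log(2)
print(f"Delta S for free expansion of 1 mol into double the volume: {dS:.2f} J/K")

# Probability of finding all N particles back in the original half:
for N in (10, 100, 1000):
    print(f"N = {N:4d}: probability = 2^-{N} = {2.0 ** -N:.3e}")
```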
 
  • #17
You are well named Count Iblis, the charm and devilry of your explanations does you credit.

I think we can avoid going into the explanation of eigenstates and the adiabatic theorem of quantum mechanics. The maths is entertaining. I notice though, that that definition depends directly on the total energy of the system, which brings us back to the definition that entropy is just a measurement of energy gain.

That definition is really equivalent to the definition that dia... whatever his/her name was ... gave: that it depends on the relationship between the microstates and the macrostate. The equation defines the macrostate in terms of the total energy in the system - presumably this would be the theoretical energy required to raise the system from absolute zero to its current temperature, although there could be other interesting complications - and dia... whatever defined it in terms of ... was it pressure, temperature, number of particles? ... which would ultimately amount to the same thing.

Entropy was originally created to describe the mechanics of steam engines, and I think we can more or less keep it at that level.
 
  • #18
I'm still very charmed.

The a priori equal likelihood of states is the assumption I'm making too.

If the entropy depends on our knowledge of the history of the system, then two observers who had different knowledge could observe different reactions.

Saying that it depends on knowledge is not the same as saying that it depends on the energy level of the system, and on energy transfer between one system (or one part of a system) and another. Which is what your fancy equation said.
 
  • #19
So if I have a chart that says, inter alia, regarding the values of absolute entropy at 298 K

He (g): 126 J K-1 mol-1

This means that another system brought into contact with He at 298 K will gain 126 J per K of difference in temperature per mol of He. Is that correct?
 
  • #20
But then, this site

http://www2.ucdsb.on.ca/tiss/stretton/Database/Specific_Heat_Capacity_Table.html

says that the specific heat capacity of He Gas at 26 C (= 299 K??) is 5.3 J per gram... per degree K or C...

And according to http://www.ausetute.com.au/moledefs.html , a mole of He is 4.003 g,

SO

If a (cooler) substance is brought into contact with He gas at 299 K it will gain 5.3 x 4.003 = 21.2159 J per degree difference in temperature per mole.

I'm doing something wrong: what is the relationship between specific heat and entropy then? If any?
 
  • #21
I mean, entropy is measured in joules per kelvin per mole,

and specific heat capacity is measured in joules per kelvin per mole, or else in joules per kelvin per kilo.

... and they both express the heat gained or lost by a system. So what is the difference between them, and why do charts give different values for them?
 
  • #22
Count Iblis said:
A microstate of a closed system is the exact quantum state of the system. If you have one particle in a box, then it will have certain energy levels it can be in, just like an electron of the hydrogen atom. If you specify the energy of the particle in the box with some small uncertainty, then it can be in any energy level that falls within that uncertainty.
...

Now for an isolated system, all possible accessible microstates are equally likely. This leads to the conclusion that the entropy can only increase.

Look, I'm sorry to be rude, but you didn't learn about quantum mechanics and energy levels of hydrogen, and the eigenstate functions and the other stuff mentioned in the Wikipedia section you cited, before you learned what entropy was. Otherwise you would never have learned any more physics at all.

I have a vague idea of quantum mechanics, like most literate people, but I'm just asking about entropy and the basics of the laws of thermodynamics: either you're just showing off, or you're being incredibly perverse in bringing the mathematical basis of quantum mechanics into the problem.

I really have to assume that you don't actually know what entropy is yourself but have just learned some equations without understanding what they mean, if you can't explain them without using concepts which beg the question. I can do the maths required by the problems, but the equations are completely meaningless unless they have some relationship to reality. At the moment, I can't see any relationship to the reality of heat transfer.

What is the difference between entropy and specific heat?
 
  • #23
What I mean is "you weren't told to learn about eigenstates and the adiabatic theorem of quantum mechanics before you learned what entropy was, otherwise you wouldn't have learned any more physics at all..." Why are you being so perverse?
 
  • #24
Even worse, once we get onto Gibbs energy, which is defined in the literature, both in Wikipedia and in Keeler and Wothers, as

dG = dH - TdS

G is the Gibbs free energy, the amount in joules of useful heat available for work, as I understand it. Here dG is the gain in free energy

H is the enthalpy, and dH the energy gain of the system, which under conditions of constant pressure is equal to dQ, the heat absorbed.

T is the absolute temperature of the system.

dS is the change in entropy of the system.

From the definition of entropy which we got from the mathematics, dS = dQ/T.

So, substituting, we get

dG = dH - TdQ/T
dG = dH - dQ

from dQ = dH under constant pressure, which is the norm for chemical reactions

dG = 0

which is also complete nonsense.

What's going wrong here??
 
  • #25
Sorry, but there is no way you can understand what entropy is before you have studied quantum mechanics. Historically, the concept of entropy was developed phenomenologically, i.e. without a deep understanding in terms of the fundamentals. This is what we call: "Classical Thermodynamics", and it is not taught at university anymore. The only correct way to learn thermodynamics is by first familiarizing yourself with the basics of quantum mechanics and then learning the basics of statistical physics.



The relation between entropy and specific heat is given by the relation:

dS = dq/T = c dT/T
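A sketch of how that relation is used in practice (assuming a constant heat capacity C of about 75 J/K, roughly one mole of liquid water, warmed from 300 K to 310 K): the heat comes from q = C dT, the entropy change from integrating C dT / T.

```python
import math

C = 75.0             # J/K, heat capacity of the sample (assumed constant)
T1, T2 = 300.0, 310.0

q = C * (T2 - T1)            # total heat absorbed
dS = C * math.log(T2 / T1)   # entropy change, the integral of C dT / T

print(f"heat absorbed:   q = {q:.0f} J")
print(f"entropy change: dS = {dS:.2f} J/K")
```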
 
  • #26
dG = 0

which is also complete nonsense.

What's going wrong here??

In your derivation you were implicitly assuming thermal equilibrium. G tends to decrease for a system that is not in equilibrium and is thus a minimum in equilibrium. So, if you assume equilibrium from the start you can't find anything other than dG = 0.

See the detailed discussion about the analogous case of the Helmholtz free energy here:

http://en.wikipedia.org/wiki/Helmholtz_free_energy
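For what it's worth, here is a sketch of how dG = dH - T dS is used for a reaction away from equilibrium, with assumed textbook-style values for melting ice; the dS in the formula is the tabulated entropy change of the reacting system itself rather than dQ/T substituted back in, which is why the expression does not collapse to zero.

```python
# Assumed approximate values for melting one mole of ice:
dH = 6010.0   # J/mol, enthalpy of fusion
dS = 22.0     # J/(mol K), entropy of fusion

for T in (263.0, 273.0, 283.0):   # -10 C, 0 C, +10 C
    dG = dH - T * dS
    if dG > 50:
        verdict = "not spontaneous (ice stays frozen)"
    elif dG < -50:
        verdict = "spontaneous (ice melts)"
    else:
        verdict = "close to equilibrium"
    print(f"T = {T:5.1f} K: dG = {dG:+7.1f} J/mol -> {verdict}")
```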
 
  • #27
Count Iblis said:
Sorry, but there is no way you can understand what entropy is before you have studied quantum mechanics.

All right, I'm sorry to be difficult, thanks for being patient. Does this mean that they do not teach the laws of thermodynamics in high school any longer?
 
  • #28
Count Iblis said:
The relation between entropy and specific heat is given by the relation:

dS = dq/T = c dT/T

That's very helpful.

Dividing by T seems a strange thing to do, but if it produces the ratio you are looking for...
 
  • #29
Count Iblis said:
In your derivation you were implicitly assuming thermal equilibrium. G tends to decrease for a system that is not in equilibrium and is thus a minimum in equilibrium. So, if you assume equilibrium from the start you can't find anything other than dG = 0.

See the detailed discussion about the analogous case of the Helmholtz free energy here:

http://en.wikipedia.org/wiki/Helmholtz_free_energy

I thought that was the equation for dG, but I suppose it's all the same.

I was assuming not just thermal equilibrium, but that H and TS both referred to the same system at the same moment of time. This obviously can't be the case if G is either positive or negative. Either H or TS have to refer to systems disassociated from each other in either space or time.
 
  • #30
Keeler and Wothers don't explain G = H - TS, but from their examples it is clear they mean:

H = the energy produced or consumed by the chemical reaction we are arbitrarily considering

TS = the temperature of the system * the entropy of the system.

Since TS = T(Q/T) = Q, I don't see why they don't just use Q, but since S seems to be available from tables, I suppose it has a purpose.
 
  • #31
Since dG = dH - TdS

and

dS = dQ/T = c dT/T

then

dG = dH - c*dT

Doesn't it? Actually, that makes real sense.
 
  • #32
In thermodynamics there are 2 sorts of quantities. Heat and work are "path dependent" and require a knowledge of the history of the system. Energy, entropy are "state variables" and are properties of the system at that point in time - no knowledge of the history of the system needed. In the formula dS=dQ/T the left hand side is a state variable, the right hand side is path dependent, so the right hand side must be specified along a reversible path for the equality to hold.
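A concrete sketch of that distinction (assuming one mole of a monatomic ideal gas taken between the same two states by two different reversible routes): the heat absorbed depends on the route, the entropy change does not.

```python
import math

R = 8.314        # J/(mol K)
Cv = 1.5 * R     # constant-volume heat capacity of a monatomic ideal gas, per mole

T1, V1 = 300.0, 1.0    # initial state (temperature in K, volume in arbitrary units)
T2, V2 = 600.0, 2.0    # final state

# Route A: isothermal expansion at T1, then heating at constant volume.
Q_A = R * T1 * math.log(V2 / V1) + Cv * (T2 - T1)
# Route B: heating at constant volume, then isothermal expansion at T2.
Q_B = Cv * (T2 - T1) + R * T2 * math.log(V2 / V1)

# Entropy change depends only on the two end states:
dS = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

print(f"heat along route A: {Q_A:7.0f} J")
print(f"heat along route B: {Q_B:7.0f} J")
print(f"entropy change:     {dS:7.2f} J/K  (the same whichever route is taken)")
```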
 
  • #33
As far as I see it, though - going by the information/microstate definition of entropy, we can only say "higher probability" = "lower entropy" (by definition) if we add the corollary

"lower entropy" = "higher probability" of having other states at the same energy level
 
  • #34
Senex01 said:
As far as I see it, though - going by the information/microstate definition of entropy, we can only say "higher probability" = "lower entropy" (by definition) if we add the corollary

"lower entropy" = "higher probability" of having other states at the same energy level

Yes - this is called the "microcanonical ensemble" if you'd like to read up on it. (I'm not sure about the information thing though).
 
  • #35
atyy said:
In thermodynamics there are 2 sorts of quantities. Heat and work are "path dependent" and require a knowledge of the history of the system. Energy, entropy are "state variables" and are properties of the system at that point in time - no knowledge of the history of the system needed. In the formula dS=dQ/T the left hand side is a state variable, the right hand side is path dependent, so the right hand side must be specified along a reversible path for the equality to hold.

I think I need "heat" explained to me as well then. I thought heat was the total kinetic energy of the particles of the system. And temperature is a somewhat arbitrary measurement of the kinetic energy or momentum (I'm not sure) that the particles transfer, directly or via radiation, to other particles. Therefore two systems will be "at the same temperature" when they transfer equal amounts of heat energy to each other. But two systems could have quite different ratios between internal kinetic energy and the energy they transfer to external systems.

I mean, that is not supposed to be a statement of fact, just what I thought I understood.
 
  • #36
atyy said:
"microcanonical ensemble"
I have a mental image of a choir of pint-sized clerics, but I'm reading up on it. Thanks.
 
  • #37
Senex01 said:
I think I need "heat" explained to me as well then. I thought heat was the total kinetic energy of the particles of the system. And temperature is a somewhat arbitrary measurement of the kinetic energy or momentum (I'm not sure) that the particles transfer, directly or via radiation, to other particles. Therefore two systems will be "at the same temperature" when they transfer equal amounts of heat energy to each other. But two systems could have quite different ratios between internal kinetic energy and the energy they transfer to external systems.

I mean, that is not supposed to be a statement of fact, just what I thought I understood.

"Heat = kinetic energy" is a good heuristic, but in terms of jargon, it is strictly speaking wrong. Here are the correct statements. Suppose you have a gas in which the particles don't interact with each other through potential energy. In such a gas the internal or total energy is just the total kinetic energy of the particles. To increase the kinetic energy, one may add either heat or work to the gas, thus knowing the kinetic energy alone does not tell you whether you got there by adding heat or work. This illustrates that the kinetic energy is a state variable, and does not depend on the history of the gas, while heat and work do depend on the history of the gas. However, in such a gas the kinetic energy is related to the *temperature, which is a state variable*, and indicates the direction of heat flow if two containers of this gas at different temperatures are placed in thermal contact with each other.
 
  • #38
my professor used to say the only true answer is: entropy is equilibrium in phase space.

If there are particles, their locations and momenta are equally distributed across the whole range (given a cutoff before infinity).

Of course there is entropy in computer science as well. Perhaps one can generalize it even more.

Regarding chemical reactions this may not be the answer, but you may be able to deduce something from this.
 
  • #39
Senex01 said:
All right, I'm sorry to be difficult, thanks for being patient. Does this mean that they do not teach the laws of thermodynamics in high school any longer?

They do, but they don't explain what heat, work, temperature, and entropy really are. These are explained phenomenologically using examples. When students arrive at university, they are asked to forget what they have previously been told before they are given the rigorous definitions. This is because the intuitive high school explanations look to be pretty water-tight, while in fact they are full of holes, so students can become confused.
 
  • #40
Entropy has always seemed rather confusing but fascinating...Here are some explanations from famous people...maybe they will offer some clues. Keep in mind that Boltzmann, the father of S = k logW committed suicide...so maybe entropy should not be thought about toooo much! (LOL)
Brian Greene, The Fabric of the Cosmos:
High entropy means that there are many ways; low entropy means there are few ways... If the pages of War and Peace are stacked in proper numerical order, that is a low-entropy configuration... two essential features of entropy. ... entropy is a measure of the amount of physical disorder in a system. (In the absence of gravity) a uniformly spread gas configuration is insensitive to an enormous number of rearrangements of its molecular components and so is in a state of high entropy. Just as thinner books have fewer page reorderings, smaller spaces have fewer places for molecules... so allow for fewer rearrangements.

All forms of energy are not equal... every life form takes in low-entropy energy (in the form of food) and gives off high-entropy energy (as heat)... plants maintain their low-entropy state via photosynthesis... deriving energy from the (low entropy) sun.

When gravity matters, as it did in the early universe, clumpiness - not uniformity - is the norm... for the initially diffuse gas cloud, the entropy decrease through the orderly formation of clumps (stars, planets, etc.) is more than compensated by the heat generated as the gas compresses... and ultimately by the enormous amount of heat and light released as nuclear processes begin to take place.

back later..if this is helpful I'll find some other explanations...
 
  • #41
Of course if someone had told me at 15 years old that I couldn't understand my homework until I'd finished the first year of university, I would have decided "Well, I suppose I'll just have to be a lawyer then." In fact, that is what I decided, but for other reasons.

I was talking to a German who knows a bit about the history of science, and from what he said, the following makes some sense.

Each force in the universe acts on objects, until it can act no more on them. Clausius's original term, when he was explaining the second law of thermodynamics, was Verwandlung, meaning transformation (such as frog into a prince, or a petty neurotic clerk into a cockroach). As each transformation (or reaction perhaps) occurs, the differences in potential energy (in terms of heat originally, but gravity, electricity, strong/weak would do) between each "particle" or "part of the system" decreases. So they shunt their way down from reaction to reaction, to ever-decreasing energy states - or to states of ever-decreasing energy difference. Eventually they will theoretically encounter the "heat death" when no more work can be done, but in practice they will get hung up in some state where some force acts to counter-balance the other.

You can jolt them (the particles or parts of the system) out of their tenuous balance by giving them a jolt of energy that changes their state in a particular way to counter-act or overcome some force, so that they can shunt their way down to a state of lower energy difference. For example, by applying a match.

It's slightly more complex in quantum mechanics, because of the probabilistic nature of the reactions, but the same principle applies when you consider things in toto.

Brian Greene's explanation made a lot of sense to me, until I started seeing the equations, which made it clear that entropy had something to do with energy and forces, and was not simply a matter of the ordering of states of the system. Not just the classical equation but the quantum mechanical equation is a measure of energy:

S = k ln(Omega(E + dE)) -- it was a wonderful piece of devilry to leave out the last two terms...

You could make a measurement I suppose of the energy differences of each ordered pair of particles/parts - classically or quantum mechanically, it wouldn't make much difference in principle - and then make a measurement of the total energy differences, or perhaps mean energy difference. Omega, if my understanding of the explanation is right, is in fact an enumeration of the energy states - and the change in energy states of course.

Entropy then, is the inverse of these differences in energy levels.

(Note: this is a request for clarification, again, this is what I understand, I'm just wondering what your response would be...)
 
  • #42
http://necsi.org/projects/baranger/cce.pdf
"Yes, you are the one who increased the entropy! It is not physics, it is not chaos, it is not Liouville: you drew the smooth volume to make your life easier, you are the one. It was chaos who manufactured the fractal, but you chose to smooth it out. ... One final question: if entropy is really totally subjective, why does it appear so objective to so many people? Ask any physical chemist. You will be told that entropy is a fundamental, permanent property of matter in bulk. You can measure it, you can calculate it accurately, you can look it up in tables, etc. And the answer to that new paradox is: large numbers. It is well-known that very large numbers have a way of making probabilities turn into absolute certainty."

http://ocw.mit.edu/NR/rdonlyres/Physics/8-333Fall-2007/1E2D4D68-EC43-44C7-91A8-9FC8A7698A78/0/lec9.pdf "However, the advent of powerful computers has made it possible to simulate the evolution of collections of large numbers of particles, governed by classical, reversible equations of motion. Although simulations are currently limited to relatively small numbers of particles (10^6), they do exhibit the irreversible macroscopic behaviors similar to those observed in nature (10^23). For example, particles initially occupying one half of a box proceed to irreversibly, and uniformly, occupy the whole box. (This has nothing to do with limitations of computational accuracy; the same macroscopic irreversibility is observed in exactly reversible integer-based simulations, such as with cellular automata.) Thus the origin of the observed irreversibilities should be sought in the classical evolution of large collections of particles. ... The Liouville equation and its descendents contain precise information about the evolution of a pure state. This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."
 
  • #43
Back to the equations, if that is okay.

S entropy
Q energy release or gain
T temperature
c specific heat

We had

S = Q / T = c dT / T

Nice. Derivable S = Q / T (definition, at least for now), and c = Q / dT (by definition)

Therefore, if dT is equal to T, - if we are calculating the total heat energy of the system:

S = c.

Let's see:

At 298 K, S of He is 126 J K-1 mol-1 (Keeler and Wothers, Why Chemical Reactions Happen, p. 11)

But of course the specific heat of He is 5193 J K-1 kg-1
A mole of He is about 4g
The specific heat per mole is 5193/4.002602 = 1297.4 J K-1 mol-1

Out by a factor of ten.

Is what I'm doing wrong related to the specific heat of a system not being constant at different temperatures? Does specific heat drop significantly at lower temperatures?
 
  • #44
If

dS = dQ / T = c dT / T

S entropy
Q energy release or gain
T temperature
c specific heat
d (delta)

then

TdS = c dT

Does entropy actually have a practical use that you couldn't get from specific heat just by jiggling the equations?
 
  • #45
The fact that entropy can only be increased has to be added in by hand in the phenomenological approach to thermodynamics. You can then e.g. derive equations for the maximum efficiency of heat engines.

Note that there is no such thing as "total heat". The heat absorbed by a system as you go from one thermodynamic state to another depends on the path you take. There is no function Q such that the difference of Q between the two states will give you the heat. We say that there is no thermodynamic state variable that corresponds to heat. Entropy, on the other hand, is a thermodynamic state variable. This is why entropy appears in the thermodynamic equations instead of heat.
 
  • #46
Taking TdS = c dT above

and He
S = 126
c = 5193

Then 1 mole of He losing 1 K when at 300 K emits how much heat.

TdS = 126*298 = 37548 J

c dT = 5193*1 = 5193 J

Out by a factor of 70+.
 
  • #47
How did anyone calculate entropy if it's not related to reactions?

Did they simply divide the heat given off by a reaction by the temperature at which the reaction took place, and then said: "We'll call that the 'entropy'!"?
 
  • #48
You need dS not S. Look up the change in S if the temperature changes by 1 K.
 
  • #49
Senex01 said:
How did anyone calculate entropy if it's not related to reactions?

Did they simply divide the heat given off by a reaction by the temperature at which the reaction took place, and then said: "We'll call that the 'entropy'!"?


That's how Carnot did it. He showed that the entropy change between a final and initial state does not depend on which path you take.
 
  • #50
Senex01 said:
We had

S = Q / T = c dT / T

Nice. Derivable S = Q / T (definition, at least for now), and c = Q / dT (by definition)

Therefore, if dT is equal to T, - if we are calculating the total heat energy of the system:

S = c.

Whoa! You're conflating and canceling terms incorrectly. Some people use Q to define an infinitesimal quantity, some a finite quantity. You can't mix the two. Let's denote the first as q and the second as Q. Then S = Q/T and dS = q/T, and dS = c dT/T. And it's not ever valid to equate dT and T!
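To put numbers on that correction (a sketch, using standard values for helium: specific heat about 5193 J K-1 kg-1, molar mass about 4.003 g, tabulated absolute entropy 126 J K-1 mol-1 at 298 K): the dS for a 1 K change is a small number, quite different from the absolute S in the table, which is the accumulated integral of dq/T all the way up from absolute zero.

```python
import math

c_p_per_kg = 5193.0            # J/(K kg), specific heat of helium gas (approximate)
molar_mass = 4.003e-3          # kg/mol
c_p = c_p_per_kg * molar_mass  # about 20.8 J/(K mol)

S_abs = 126.0                  # J/(K mol), tabulated absolute entropy of He at 298 K

T1, T2 = 298.0, 299.0
q = c_p * (T2 - T1)            # heat absorbed warming 1 mol by 1 K
dS = c_p * math.log(T2 / T1)   # corresponding entropy change

print(f"molar heat capacity c_p      ~ {c_p:.1f} J/(K mol)")
print(f"heat absorbed (1 K rise)   q ~ {q:.1f} J")
print(f"entropy change            dS ~ {dS:.4f} J/(K mol)")
print(f"tabulated absolute S(298 K)  = {S_abs} J/(K mol)  (a different quantity)")
```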
 