What is meant by molecular temperature?

shrumeo

What is meant by "molecular temperature"?

Hello,

I got into a debate with someone about the term "molecular temperature."

I said "You can't define the temperature of a single molecule. It doesn't make sense."

And they said, "But look at all these papers where people use the term molecular temperature. They must mean the temperature of a single molecule."

I said "I don't know what they mean, but they can't mean that."

So can anyone out there answer this?

**********************************************************

Here are some examples of its usage:

Coherent matter waves of fullerenes:
http://arxiv.org/abs/quant-ph/0412003

http://arxiv.org/PS_cache/quant-ph/pdf/0402/0402146.pdf [Broken]

Femtosecond laser absorption studies:

http://www.chemie.hu-berlin.de/ernsting/publications/abstract27.html [Broken]

Computational molecular dynamics:
http://www.nd.edu/~izaguirr/papers/SAC03_MaIz0x.pdf [Broken]

http://cphys.s.kanazawa-u.ac.jp/icc...html/p2-49.html [Broken]

Ultracold stuff:
http://newton.ex.ac.uk/aip/physnews.393.html [Broken]

http://www.aip.org/enews/physnews/2002/split/615-2.html [Broken]

Gas-phase electron diffraction:
http://www.genetical.com/dc/ScientificResearch/Rowland/GPED/Usage.html

Atmosphere?:
http://www.sworld.com.au/steven/space/atmosphere/

*********************************************************

So can someone please explain what any of these people mean when they invoke the phrase "molecular temperature"?

I seriously doubt that they mean the temperature of a single molecule.


Nicky

In my opinion, the term "molecular temperature" does make intuitive sense. A single molecule can have excited electronic states, vibrational states and rotational states. The molecule may also lie in an external potential well, in which it is bouncing around. All of these excitations contribute to the molecule's "temperature".

Crosson

It is possible to define temperature for any system that has multiple ways to store a given amount of energy. The key missing point in your understanding is that you don't know what temperature is.

Is it some kind of measurement of the average energy? No, and this is a common mistake.

In order to understand what temperature is, you must understand what entropy is. Entropy is the number of ways a system can arrange a given amount of energy.

Think about a cold object. If I add to it some energy, then its entropy goes up significantly because so many particles share so little energy (lots of ways to arrange).

Think about a hot object. If I add to it some energy, its entropy hardly increases because the entropy is already so large.

Entropy is the key to understanding temperature. Once you see how we can define molecular entropy it is easy to define molecular temperature.
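Crosson's argument (entropy rises fast with added energy when a system is cold, slowly when hot, and temperature comes from $$1/T = \partial S/\partial E$$) can be seen in a toy model. The following sketch is my own illustration, not from the thread: N two-level particles, multiplicity $$\Omega(n) = \binom{N}{n}$$, entropy $$S = k \ln \Omega$$ (with k set to 1), and a finite-difference temperature.

```python
from math import comb, log

# Toy model: N two-level particles, each storing 0 or 1 unit of energy.
# The multiplicity Omega(n) = C(N, n) counts the ways to distribute
# n energy units among the particles.

N = 1000

def entropy(n):
    # S/k = ln Omega, the Boltzmann-Planck entropy of the macrostate
    return log(comb(N, n))

def inv_temperature(n):
    # 1/T = dS/dE via a centered finite difference (k = 1, unit energy steps)
    return (entropy(n + 1) - entropy(n - 1)) / 2.0

cold = inv_temperature(50)    # little energy shared among many particles
hot  = inv_temperature(450)   # lots of energy already present

# Adding energy to the cold system raises entropy far more per unit
# energy, so its 1/T is larger, i.e. its temperature is lower.
print(cold, hot)
```

Running this shows `cold` is roughly an order of magnitude larger than `hot`, which is exactly the cold-object/hot-object contrast described above.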

Juan R.

Thermal and statistical

shrumeo said:
Hello,

I got into a debate with someone about the term "molecular temperature."

I said "You can't define the temperature of a single molecule. It doesn't make sense."

And they said, "But look at all these papers where people use the term molecular temperature. They must mean the temperature of a single molecule."

I said "I don't know what they mean, but they can't mean that."

So can anyone out there answer this?

So can someone please explain what any of these people mean when they invoke the phrase "molecular temperature"?

I seriously doubt that they mean the temperature of a single molecule.
You are confusing thermal with statistical temperature.

Thermal temperature is an old concept; it is derived from thermal entropy and is applicable only to macroscopic bodies.

Statistical entropy is a modern quantum generalization, as is statistical temperature. It is applicable to single molecules, atoms, or even a single atomic nucleus.

In fact, from quantum thermodynamics one can predict phase transitions in nuclei. For more information on molecular, atomic, and mesoscopic temperatures, see CPS: physchem/0309002 and the references therein on nuclear studies (a copy will be available online at www.canonicalscience.com shortly).

Note: In the macroscopic limit both temperatures agree.


shrumeo

Nicky said:
In my opinion, the term "molecular temperature" does make intuitive sense. A single molecule can have excited electronic states, vibrational states and rotational states. The molecule may also lie in an external potential well, in which it is bouncing around. All of these excitations contribute to the molecule's "temperature".
Ok, of course a single molecule can be in a particular vibrational, rotational, and electronic state. At any given instant it will be in one energy state for each of these things. My understanding of temperature is that it represents an average of these things (including translational kinetic energy and such) over large numbers of bodies. I have always thought of it as requiring at least some degree of a distribution of energy states before being able to average these up into an overall temperature.

I just don't see how you could measure the temperature of a single molecule in Kelvin or Celsius. You would have to talk about its energy in units of energy like joules or eV or something.

shrumeo

Crosson said:
It is possible to define temperature for any system that has multiple ways to store a given amount of energy.
At any given instant.

Instantaneously.

A molecule will not be in multiple energy states at the same instant. Will it?

The key missing point in your understanding is that you don't know wat temperature is.
Could you then explain it to me?

Is it some kind of measurement of the average energy? No, and this is a common mistake.
Well, this must be the mistake I am encountering, please elaborate...

In order to understand what temperature is, you must understand what entropy is. Entropy is the number of ways a system can arrange a given amount of energy.

Think about a cold object. If I add to it some energy, then its entropy goes up significantly because so many particles share so little energy (lots of ways to arrange).

Think about a hot object. If I add to it some energy, its entropy hardly increases because the entropy is already so large.

Entropy is the key to understanding temperature. Once you see how we can define molecular entropy it is easy to define molecular temperature.
I understand how entropy works. I dig the 2nd law and stuff.
But please explain to me why if I can define molecular entropy then I can define molecular temperature.

I'm curious as to your explanation of both of these terms.

If absolute zero is when particles come to a complete halt, then would the hottest temperature be when those particles are zooming around at light speed? If so, what temperature is that?

shrumeo

Juan R. said:
You are confusing thermal with statistical temperature.

Thermal temperature is an old concept; it is derived from thermal entropy and is applicable only to macroscopic bodies.

Statistical entropy is a modern quantum generalization, as is statistical temperature. It is applicable to single molecules, atoms, or even a single atomic nucleus.

In fact, from quantum thermodynamics one can predict phase transitions in nuclei. For more information on molecular, atomic, and mesoscopic temperatures, see CPS: physchem/0309002 and the references therein on nuclear studies (a copy will be available online at www.canonicalscience.com shortly).

Note: In the macroscopic limit both temperatures agree.
How can you have statistics on one molecule, atom, or nucleus?

What is temperature if not a thermal measurement?

shrumeo

And can anyone please explain at least what the folks who wrote the papers on the fullerene matter beams meant by the term "molecular temperature."

shrumeo

If absolute zero is when particles come to a complete halt, then would the hottest temperature be when those particles are zooming around at light speed? If so, what temperature is that?
I guess it would depend on how massive the particles were.

Nicky

shrumeo said:
Ok, of course a single molecule can be in a particular vibrational, rotational, and electronic state. At any given instant it will be in one energy state for each of these things. My understanding of temperature is that it represents an average of these things (including translational kinetic energy and such) over large numbers of bodies. ...
But a molecule is itself an assembly of smaller bodies (nuclei and electrons). Consider that, for example, a large protein molecule may contain many thousands of constituent atoms.

shrumeo

Nicky said:
But a molecule is itself an assembly of smaller bodies (nuclei and electrons). Consider that, for example, a large protein molecule may contain many thousands of constituent atoms.
So are you saying that a molecule can be large enough to contain so many different energy states that one could possibly measure a temperature somehow from this molecule?

As in a strand of DNA or cellulose or something. It could be so long and wrapped up in itself and be of such a size as to have a single temperature?

Apologies for the following stream:

I was thinking of this too as a possibility. I brought this up to someone else, and we got into a discussion about what we mean by a molecule. Is a big sheet of cross-linked polymer like latex one big (or a few big) molecule(s)? If so, then could one consider a plate of glass, a crystal, or a chunk of metal one large molecule? I mean, all the atoms are continuously bonded to other atoms in the same structure. We don't usually think of these things as molecules, and they really aren't, but for some reason we normally consider very large but still microscopic molecules, such as in biology, to be molecules. It's as if we didn't have another name for them. Well, "macromolecule" is a typical term, and I suppose it may have been introduced to differentiate between small molecules that behave very discretely and very large polymeric-type molecules.

So besides all that, it doesn't seem to me that the authors of the C70 matter beam paper(s) are using the term "molecular temperature" this way. Do you think they mean the temperature of single fullerene molecules in the sense that we have discussed above?

Nicky

shrumeo said:
So are you saying that a molecule can be large enough to contain so many different energy states that one could possibly measure a temperature somehow from this molecule?

As in a strand of DNA or cellulose or something. It could be so long and wrapped up in itself and be of such a size as to have a single temperature?
Yes, that is what I mean, except that the temperature concept is even applicable to molecules much smaller than DNA.

So besides all that, it doesn't seem to me that the authors of the C70 matter beam paper(s) are using the term "molecular temperature" this way. Do you think they mean the temperature of single fullerene molecules in the sense that we have discussed above?
Here is an excerpt from one of the fullerene beam papers:

arxiv.org/quant-ph/0412003 said:
This interferometric setup is now complemented by a heating stage about 1 m behind the oven and 7 cm in front of the first grating. It serves to vary the internal energy of the molecules by photon absorption.
They appear to mean "temperature" as the excitation of individual molecules. This is C70 they are talking about, so each one is really a little sheet of graphite curled in on itself. It's practically a macroscopic object.

Juan R.

shrumeo said:
How can you have statistics on one molecule, atom, or nucleus?

What is temperature if not a thermal measurement?
It appears that you have read books on equilibrium statistical thermodynamics or kinetics and are conflating "statistical" in the broad sense with statistics over a population of molecules or atoms. It also appears that you believe that thermal temperature and kinetic temperature are the same as statistical temperature.

One can perfectly well do statistics on a single molecule, atom, or nucleus, taking the different quantum states of that system as the "population".

The general formula for the entropy of a system is

$$S = -k \, \mathrm{Tr} \{ \rho \ln \rho \}$$

where $$\rho$$ is the density matrix of the single molecule, atom, or nucleus and $$k$$ is the Boltzmann constant. One also knows the energy $$E$$ of a single molecule, atom, or nucleus, therefore

$$\frac{1}{T} \equiv \frac{\partial S}{\partial E}$$

I think your error is in believing that entropy is a macroscopic quantity applicable only to a great number of molecules, which is not true.

From the above definition of temperature one can obtain kinetic temperature (the average of kinetic motion) and thermal temperature (internal energy $$U$$ per unit entropy for an Avogadro number of molecules) as special cases.
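The two definitions above can be checked against each other in a toy numerical example. This is my own sketch with made-up energy levels, not from the thread: for a single few-level system in a Gibbs state at parameter T (with k = 1), the slope dS/dE obtained by finite differences comes out as 1/T, as the definition requires.

```python
import numpy as np

# Toy "molecule": four energy levels in arbitrary units.
levels = np.array([0.0, 1.0, 2.5, 4.0])

def state(T):
    """Gibbs state at temperature T (k = 1): populations, entropy, energy."""
    w = np.exp(-levels / T)
    p = w / w.sum()                 # diagonal of the density matrix
    S = -np.sum(p * np.log(p))      # von Neumann entropy (rho is diagonal)
    E = np.sum(p * levels)          # mean energy Tr(rho H)
    return S, E

T = 1.3
dT = 1e-4
S1, E1 = state(T - dT)
S2, E2 = state(T + dT)
dS_dE = (S2 - S1) / (E2 - E1)       # numerical dS/dE along the Gibbs family

print(dS_dE, 1.0 / T)               # the two values agree closely
```

The agreement is not an accident: along the family of Gibbs states, $$\partial S/\partial E = 1/T$$ holds exactly, and the finite-difference estimate reproduces it to high precision.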


shrumeo

Nicky said:
Yes, that is what I mean, except that the temperature concept is even applicable to molecules much smaller than DNA.
Well DNA can be all sizes.

So where do you draw the line?

At what point do decide that a molecule is large enough to have its own temperature? And at what size does a molecule stop being a molecule?

Here is an excerpt from one of the fullerene beam papers:

arxiv.org/quant-ph/0412003 said:
This interferometric setup is now complemented by a heating stage about 1 m behind the oven and 7 cm in front of the first grating. It serves to vary the internal energy of the molecules by photon absorption.

They appear to mean "temperature" as the excitation of individual molecules. This is C70 they are talking about, so each one is really a little sheet of graphite curled in on itself. It's practically a macroscopic object.
{I know what C70 is, thanks. And I wouldn't go around saying that it was practically a macroscopic object since it's maybe 8 angstroms across.}
Are you so sure?
See, they used the term "internal energy" when speaking of the molecules.
Why would they use the term internal energy in the excerpt you chose?
They could definitely calculate the internal energy of one fullerene molecule.
Are you saying they can also calculate the temperature?
The way I read that excerpt is that they had a heating stage right near the grating to increase the temperature of the beam which is a result of varying or increasing the internal energy of the molecules.

Do you see why I'm having such a hard time of this?
The normal definition of temperature doesn't fit here.

I sincerely doubt they mean to express in their paper that a C70 molecule has enough variation in its states of energy to be able to calculate (you couldn't measure it) an instantaneous temperature in Kelvin.

I think they just threw the word "molecular" in before "temperature" to make it sound cool. I think they just mean the temperature of the molecular beam.

{edit: Yes, I see now, by actually reading the paper instead of just skimming, that they intend to mean (or mean to intend?) that the term molecular temperature refers to the temperature of single molecules.}


shrumeo

Juan R. said:
It appears that you have read books in equilibrium statistical thermodynamics or kinetics and are misunderstanding statistical in a broad sense with statistics of a population of molecules or atoms. It also appears that you believe that thermal temperature and kinetic temperature are the same that statistical temperature.

One can perfectly do statistics of a single molecule, atom, or nucleus, taking the different quantum states of that system like a "population".
Wouldn't this be a rather small population?

{No, I'm just a chemist who took the word of certain professors back in the day...}

The general formula for the entropy of a system is

$$S = -k \ Tr \{ \rho \ ln \rho \}$$

where $$\rho$$ is the density matrix of the single molecule, atom, or nucleus and $$k$$ is the Boltzman constant.
What's $$r$$?
This is the first time I've seen this equation.
Of course, since I am a simple chemist, I've only seen something like:

$$S = k \ln \Omega$$

It's this sort of understanding (or really this level of understanding) and this definition of entropy that I am working with. Or really I am thinking along the lines of relating entropy and temperature via the heat capacity of a given substance. These are all "macroscopic" terms in this line of thinking, of course.

But anyway, are you saying that perhaps the equation you have above is a quantum modification or a generalization of the old-school Boltzmann Principle?
One also know the energy $$E$$ of a single molecule, atom, or nucleus, therefore

$$\frac{1}{T} \equiv \frac{\partial S}{\partial E}$$
Yeah, see that's what my problem here is.
I always thought of the Boltzmann constant as a way to relate energy to temperature, especially since it's expressed in terms of J/K.

I guess here you are defining temperature as a relation between entropy and total energy? In this case, would the entropy be a component of this total energy? It's vague to me which $$E$$ you are talking about here.

Any time I see this sort of thing (where temperature is in an equation that can apply to a single particle) I assume that the temperature in the equation is the "ambient temperature" or the temperature of the surroundings. For example, when someone talks about the temperature dependence of some molecular or lattice vibration, they don't mean the temperature of the single vibration or the temperature of the single molecule, but the temperature of the surrounding medium. But I'm probably wrong in this way of looking at things.

I think that you error is in believing that entropy is a macroscopic quantity applicable only to a great number of molecules, what is not true.
Well, I guess I wasn't thinking in terms of entropy so much.
I was really looking at a temperature as an average and nothing more.
I guess you are saying effectively the same thing as Nicky.
That since there is an ensemble of energy states within a single molecule at a given instant, whatever their different manifestations (i.e., vibrations, rotations, electronic states, etc.), these energies can all be averaged into a temperature?

So, in the old-school Boltzmann principle, the value for $$\Omega$$ could be very small, but as long as it's more than 1, all is good?

From above definition of temperature one can obtain kinetic temperature (average of kinetic motion) and thermal temperature (internal energy $$U$$ by unit of entropy for an Avogadro number of molecules) like special cases.
So you are saying that in the fullerene beam paper, when they use the term "molecular temperature", the authors are perhaps referring to the kinetic temperature of single fullerene molecules?

For instance when they say:
Arndt et al. said:
We change the internal temperature of the molecules in a controlled way before they enter a near-field interferometer, and observe the corresponding reduction of the interference contrast.
that this is what they mean?

It's almost like they wanted to say internal energy, but said temperature instead.

Or here:
Decoherence of the fullerene matter waves can be induced by heating the molecules with multiple laser beams (514.5 nm, 40 µm waist radius, 0 − 10 W) before they enter the interferometer. The resulting molecular temperature can be assessed by detecting the heating dependent fraction of fullerene ions using the electron multiplier D1 over the heating stage.
Do you think they could have used the term "beam temperature" and it would have just meant the same thing?

But then they MUST mean the temperature of single fullerenes here:
The laser heating increases the molecular temperature by 140 K per absorbed photon. We calculate that they reach up to 5000 K for very short times, but the re-emission of thermal photons is so efficient that even the hottest molecules are cooled to below ~3000 K when they enter the interferometer 7.2 cm behind the heating stage.
But they then sort of talk about the way they calculated these temperatures:
FIG. 3: Spectral photon emission rate R of C70 molecules, as used for the calculation of thermal decoherence. We use the published [25] absorption cross-section for (S0 → S1) and a heat capacity of C_V = 202 k_B. The fall-off at short wavelengths is determined by the limited internal energy of the molecules, while the decrease at long wavelengths is due to the lack of accessible radiative transitions at energies below 1.5 eV. The figure shows that in the absence of cooling, a single molecule at 2500 K travelling at 190 m/s (that is, with a transit time of 4 ms through the interferometer) would emit an integrated number of three visible photons. This is sufficient to determine the path of the molecule if the emission occurs close to the second grating.
Is that heat capacity they are talking about a "molecular heat capacity"?

I'm convinced now, of course, that these folks are talking about the temperature of single fullerene molecules, but am I being too presumptuous in thinking that "molecular temperature" (or atomic or nuclear) cannot mean the same thing as traditional bulk temperature? They do talk later about molecular analogs to blackbody radiation, thermionic emission, and evaporative cooling, and it is obvious to me how these analogies are drawn, so the temperature they speak of is merely a molecular analog and not the same thing as a bulk temperature.

Is it like thermal entropy vs. informational entropy where people use the same word to actually mean two different (but maybe analogous) things?
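As a back-of-envelope check of the "140 K per absorbed photon" figure quoted above, one can combine the paper's stated heat capacity C_V = 202 k_B with the photon energy of the 514.5 nm laser line also quoted in the paper. The arithmetic below is my own sanity check, not a calculation from the paper:

```python
# Physical constants (SI units)
h = 6.62607e-34      # Planck constant, J s
c = 2.99792e8        # speed of light, m/s
kB = 1.38065e-23     # Boltzmann constant, J/K

wavelength = 514.5e-9            # m, laser line quoted in the paper
E_photon = h * c / wavelength    # energy deposited per absorbed photon, J
C_V = 202 * kB                   # J/K, heat capacity quoted for C70

# Temperature step per photon: dT = E_photon / C_V
dT = E_photon / C_V
print(round(dT, 1))              # ~138 K, consistent with the quoted 140 K
```

So the quoted number is just the photon energy divided by the molecular heat capacity, which supports reading "molecular temperature" here as a genuine per-molecule quantity with a per-molecule heat capacity.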

Juan R.

shrumeo said:
Wouldn't this be a rather small population?
Yes, but entropy is defined even for a population of a single element; it is then trivially zero.

shrumeo said:
What's $$r$$?
This is the first time I've seen this equation.
It is not $$r$$; it is $$Tr$$, which is the trace operator.

Your Boltzmann-Planck formula is a special case; it is not the proper definition of entropy, despite what many (all?) physical chemistry and statistical thermodynamics textbooks argue.

Only when the system is closed and at microcanonical equilibrium, without long-range correlations, external EM fields, gravitational effects, etc., is the quantum state of the system

$$\rho = \frac{1}{\Omega}$$

and the Boltzmann-Planck formula $$S = k \ln \Omega$$ follows from the von Neumann definition by direct substitution:

$$S = -k \, \mathrm{Tr} \{ \rho \ln \rho \}$$

But in general the quantum state is not defined by the inverse of the number of available states.
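The reduction described above is easy to verify numerically. In this sketch (my own toy example, with k = 1), the von Neumann entropy of the maximally mixed state is computed from the eigenvalues of ρ and compared with ln Ω:

```python
import numpy as np

# Maximally mixed state over Omega equally weighted states:
# rho = I / Omega, the microcanonical density matrix.
Omega = 6
rho = np.eye(Omega) / Omega

# von Neumann entropy S = -Tr(rho ln rho), evaluated via the
# eigenvalues of rho (here all equal to 1/Omega).
evals = np.linalg.eigvalsh(rho)
S_vn = -np.sum(evals * np.log(evals))

print(S_vn, np.log(Omega))   # identical up to rounding
```

For any state that is not maximally mixed, the same eigenvalue computation gives an entropy strictly below ln Ω, which is why the Boltzmann-Planck formula is only the special equal-weights case.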

You are relating entropy and temperature using specific concepts applicable only in certain situations.

The definition of temperature is the one I wrote. It is the standard definition of temperature in physics books. It has been known since at least 1930, but chemists' textbooks ignore rigorous formulations and still work with heat engines (19th-century thermodynamics).

In CPS: physchem/0309002 I criticized a 2002 paper by a group of very famous chemists (one was Evans), published in PR, claiming an experimental violation of the second law of thermodynamics in nanoclusters. They used arguments from the usual chemical thermodynamics (including a very, very wrong argument from Levine's physical chemistry text; do you know the book?). Of course the whole paper was wrong, since chemical thermodynamicists have no idea of rigorous thermodynamics. At least two physicists published a formal comment showing that the article was completely wrong, with basic misunderstandings of elementary 20th-century thermodynamics. A specialist from an institute of microelectronics contacted me and said that my criticism was correct, and that the only violation of thermodynamics "was in the title of the paper by Wang et al."

I have attempted to modify the ideas of chemists and correct their really great errors, but many of them ignore modern stuff and continue to publish wrong papers and books; many physicists, biologists, and engineers simply don't read the chemical literature. In some cases, engineers are at a more elevated level than chemical thermodynamicists, like the celebrated Klotz (do you know it?).

$$E$$ is the internal energy of the system; the notation $$E$$ is used in statistical theories and the notation $$U$$ in macroscopic thermal theory. However, one can also introduce the total energy (internal plus kinetic) into $$dS$$.

shrumeo said:
I assume that the temperature in the equation is the "ambient temperature" or the temperature of the surroundings.
This is a common error of chemical thermodynamics. The above definition gives the temperature of a body from its entropy and energy; it is an intrinsic property of that body. Only at equilibrium (chemical thermodynamics deals only with that) is the body's $$T$$ equal to the external $$T$$, so that you can talk of an "external" or "ambient" temperature.

Temperature is not an average. It is defined as I said. Only in special systems and situations does it "look like" one. For example, applying the above definition to an equilibrium ideal gas of classical point molecules, one recovers the idea of temperature as an average of kinetic motion (velocity).

shrumeo said:
So, in the old-school Boltzmann principle, the value for $$\Omega$$ could be very small, but as long as it's more than 1, all is good?
If $$\Omega = 2$$ then the entropy is not zero. If $$\Omega = 1$$ the entropy also exists; it is simply zero. However, the Boltzmann-Planck definition is only valid under the restricted conditions stated above.

shrumeo said:
So you are saying that in the fullerene beam paper, when they use the term "molecular temperature", the authors are perhaps referring to the kinetic temperature of single fullerene molecules?
I didn't read the Arndt et al. paper, but I don't think they are referring just to kinetic energy. It appears they are talking about the thermal agitation of the molecules among different quantum states without kinetic motion (only internal motion) and the subsequent cooling of the molecule. Probably they are doing an experiment on nuclear temperature, measuring the (in)coherence of nuclear wavepackets and its reduction (localization) by an external perturbation.

shrumeo said:
Is that heat capacity they are talking about a "molecular heat capacity"?
I don't know, because I didn't read it. But I think you believe that heat capacity is a macroscopic concept, which it is not. In fact, the definition of heat capacity for any system is

$$C_{z} \equiv T \left(\frac{\partial S}{\partial T} \right) _{z}$$

Thermal (or bulk) temperature is the definition I wrote above,

$$\frac{1}{T} \equiv \frac{\partial S}{\partial E}$$

when applied to a macroscopic system:

$$\frac{1}{T_{bulk}} \equiv \frac{\partial S_{bulk}}{\partial E_{bulk}}$$

Informational entropy and informational temperature are other concepts; they are more conceptual (mathematical) than physical. For example, one can speak of the informational temperature of a message transmitted over a coaxial line, but this does not mean that the cable is heated.

shrumeo

Juan R. said:
Yes, but entropy is defined even for a population of a single element, trivially is zero then.
Ah, ok then.

It is not $$r$$, it is $$Tr$$, what is the trace operator.
Got it. So this is sort of the matrix mechanics version?
Your Boltzmann-Planck formula is a special case; it is not the proper definition of entropy, despite what many (all?) physical chemistry and statistical thermodynamics textbooks argue.
Yes, this is a real eye-opener.
The definition of temperature is the one I wrote. It is the standard definition of temperature in physics books. It has been known since at least 1930, but chemists' textbooks ignore rigorous formulations and still work with heat engines (19th-century thermodynamics).
So to try and put it into words, temperature is the inverse of the rate of increase of entropy with respect to energy?

That's it?

It's funny: I looked through 3 P.Chem books readily available to me, and the definition of temperature is never given in those terms; the equation defining temperature is not there. They all discuss it in terms of the zeroth law and how to construct a temperature scale based on an ideal gas, and that's it.

I couldn't find a copy of the Levine book but I know of it.

I have attempted to modify the ideas of chemists and correct their really great errors, but many of them ignore modern stuff and continue to publish wrong papers and books; many physicists, biologists, and engineers simply don't read the chemical literature.
Engineers should do so more often.
I see so many possibilities for engineers to carry the work of chemists to applications that it's crazy. Sometimes I want to try to start these things myself, but I am discouraged with "Let some engineer worry about that...":)

In some cases, engineers are at a more elevated level than chemical thermodynamicists, like the celebrated Klotz (do you know it?)
No, I never had this book.
But I looked it up, and here is a description that might make you squirm.

A new, millennium edition of the classic treatment of chemical thermodynamics Widely recognized for half a century for its first-rate, logical introduction to phenomenological thermodynamics, this classic work is now thoroughly revised for the new millennium. The Sixth Edition continues to cover the fundamentals and methods of thermodynamics with exceptional vigor and clarity, while incorporating many new developments.

If $$\Omega = 2$$ then the entropy is not zero. If $$\Omega = 1$$ the entropy also exists; it is simply zero. However, the Boltzmann-Planck definition is only valid under the restricted conditions stated above.
I guess that's what I really meant, that there would be no entropy, not really that entropy would not exist.

I didn't read the Arndt et al. paper, but I don't think they are referring just to kinetic energy. It appears they are talking about the thermal agitation of the molecules among different quantum states without kinetic motion (only internal motion) and the subsequent cooling of the molecule. Probably they are doing an experiment on nuclear temperature, measuring the (in)coherence of nuclear wavepackets and its reduction (localization) by an external perturbation.
I don't think they are talking about anything nuclear.
I believe they are discussing something closer to what you wrote before that.
I think they are talking about internal energy states (vibrational, etc).

I don't know, because I didn't read it. But I think you believe that heat capacity is a macroscopic concept, which it is not. In fact, the definition of heat capacity for any system is

$$C_{z} \equiv T \left(\frac{\partial S}{\partial T} \right) _{z}$$
Thanks. Just knowing that single particles can have a heat capacity should shed some light on future thoughts.

I work mostly with nanoparticles and the like, and this may be good to know for future reference. I might have to start looking more into physics books instead of chemistry ones. It might give me an edge in understanding, and I won't have the embarrassment of "violating the 2nd law..." :)

Informational entropy and informational temperature are other concepts; they are more conceptual (mathematical) than physical. For example, one can speak of the informational temperature of a message transmitted over a coaxial line, but this does not mean that the cable is heated.
Again, this is why I come here. I learn something every time. I have never heard of informational temperature, I'll look it up...

Juan R.

shrumeo said:
Got it. So this is sort of the matrix mechanics version?
I think you are mistaken there.

The trace operator in the entropy formula is not related to the matrix version of quantum mechanics (i.e., Heisenberg's). The definition of an observable $$A$$ in quantum mechanics (this appears in almost all books on quantum mechanics that I know) is

$$\langle A \rangle = \mathrm{Tr} \{ \rho A \}$$

with normalized quantum state $$\rho$$.

Only when the system can be described in terms of a wavefunction, $$\rho = | \Psi \rangle \langle \Psi|$$, does one recover the formula that appears in P. Chem and quantum chemistry textbooks:

$$\langle A \rangle = \langle \Psi|A| \Psi \rangle$$

for normalized wavefunctions. This formula is an approximation and does not apply, for example, to molecules in an external EM field. You can then express the above formula in a basis of functions and rewrite it in matrix (Heisenberg) form.
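The relation between the two expectation-value formulas is easy to check numerically. This is my own toy sketch, not from the thread: for a random pure state and a random Hermitian observable, $$\mathrm{Tr}\{\rho A\}$$ with $$\rho = |\psi\rangle\langle\psi|$$ matches $$\langle\psi|A|\psi\rangle$$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random normalized complex state and random Hermitian observable in dim 4.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)
M = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = (M + M.conj().T) / 2                  # Hermitian observable

rho = np.outer(psi, psi.conj())           # pure-state density matrix |psi><psi|

expect_trace = np.trace(rho @ A).real     # general form: Tr(rho A)
expect_braket = (psi.conj() @ A @ psi).real  # textbook form: <psi|A|psi>

print(expect_trace, expect_braket)        # the two agree to machine precision
```

For a mixed state (a weighted sum of several such projectors), only the trace form survives, which is the point being made above.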

Yes, the inverse of temperature is the derivative of entropy with respect to energy.

Regarding thermodynamics, P.Chem books are really bad, with lots of hidden assumptions, half-truths, wrong equations, etc.

I said that chemical engineers often do not read the literature on chemical thermodynamics because it often does not work (it is really outdated). For example, Klotz's manual is assumed to be a marvellous book, but it works at the level of 19th-century formalism!

For example, in that book the "universal" second law of thermodynamics is presented as

$$dS \geq \frac {DQ}{T}$$

with $$D$$ denoting an inexact differential. But that is false, because that law is not universal: it does not hold in a chemical reactor. For this reason, engineers work with more general laws; see for example (Ind. Eng. Chem. Res. 2002, 41, 2931-2940). Don't worry that the article uses $$dQ$$ instead of $$DQ$$, because this is correct; heat is not an inexact differential, as many physics and chemistry textbooks incorrectly argue from cycles and heat engines. Theoretical biologists also work with more advanced formalisms, because that "universal" law does not hold for living cells.

shrumeo said:
I guess that's what I really meant, that there would be no entropy, not really that entropy would not exist.
I mean that "there would be no entropy" is not correct. The entropy is zero; zero entropy is not the same as "no entropy".

If you are interested in nanoparticles, you may be interested in nanothermodynamics. I recommend you download the paper on nanothermodynamics from www.canonicalscience.com when the new web page is ready (in a few days).
