
- Thread starter Nathew

- #1

I know temperature is the measure of how much atoms are moving, so can a single atom even have a temperature? The quarks inside the protons and neutrons are moving around, so?

- #2

Vanadium 50


- #3

Baluncore


If you know why you calculated that instantaneous theoretical temperature then maybe you will know how to interpret and use the result.

- #4

Crazymechanic


Just like people are categorized into groups, races, and social classes because there are many of them. If there were only one single guy in the whole world and no one else, it would be hard to say anything about him, as there would be nothing to compare him to.

- #5

Electric Red


Well.. the definition of temperature is:

[itex](\frac{\partial S}{\partial E})^{-1}=T[/itex]

Which means: the temperature is, by definition, the inverse of the partial derivative of the entropy with respect to the energy.

So.. Since the molecule has a certain energy because it is moving, you only need to think about what the entropy of this one molecule is.

Then you will be able to calculate the temperature, I assume.
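As a numerical sketch of that definition (my own illustration, not from the thread): for a monatomic ideal gas the energy-dependent part of the entropy is S(E) = (3Nk/2) ln E + const, so (∂S/∂E)⁻¹ = 2E/(3Nk), and a finite difference recovers the temperature. The function names here are made up for the example.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(E, N):
    """Energy-dependent part of a monatomic ideal gas entropy: S = (3Nk/2) ln E + const."""
    return 1.5 * N * K_B * math.log(E)

def temperature(E, N, h=1e-6):
    """T = (dS/dE)^-1, estimated with a central finite difference."""
    dS_dE = (entropy(E * (1 + h), N) - entropy(E * (1 - h), N)) / (2 * E * h)
    return 1.0 / dS_dE

# One mole of gas with total kinetic energy E = (3/2) N k T at T = 300 K:
N = 6.022e23
E = 1.5 * N * K_B * 300.0
print(temperature(E, N))  # ~300 K
```

Note that this only works because the gas has a well-defined S(E); the whole thread is about whether such an entropy makes sense for one atom.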


- #6

Drakkith


Well.. the definition of temperature is:

[itex](\frac{\partial S}{\partial E})^{-1}=T[/itex]

Which means: the temperature is, by definition, the inverse of the partial derivative of the entropy with respect to the energy.

So.. Since the molecule has a certain energy because it is moving, you only need to think about what the entropy of this one molecule is.

Then you will be able to calculate the temperature, I assume.

I don't think this is correct. Accelerating a baseball to 0.99c doesn't increase its temperature. The kinetic energy of the object is not something temperature takes into account.

- #7

lightarrow


In an inertial frame of reference which is co-moving with the atom, the atom has velocity zero (for example the atom is in a box and the box follows its motion). So the atom's temperature would depend on the inertial frame I choose? The temperature of a gas in a box doesn't depend on the inertial frame I choose...

If you know why you calculated that instantaneous theoretical temperature then maybe you will know how to interpret and use the result.

--

lightarrow

- #8

lightarrow


I know temperature is the measure of how much atoms are moving, so can a single atom even have a temperature?

No.


- #9

Baluncore


- #10

ZapperZ

The OP has not explained to what extent he/she understands the concept of temperature. So let's wait until there is a response to such an inquiry before we delve into exotic aspects of this phenomenon.

Zz.

- #11

Crazymechanic


The only way to state its temperature, I assume, would be to determine its kinetic energy at that moment and then translate that into a possible temperature via maths.
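A minimal sketch of that translation (my own illustration, not the poster's): treat the atom's kinetic energy as if it were the average kinetic energy ⟨E⟩ = (3/2)kT of a gas at temperature T, and invert. The helium mass and speed are example values.

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K

def equivalent_temperature(kinetic_energy_joules):
    """Naive 'temperature' from one atom's kinetic energy via E = (3/2) k T.

    This treats the single atom as if it carried the average kinetic energy
    of a gas at temperature T -- a back-of-the-envelope translation, not a
    thermodynamic temperature.
    """
    return 2.0 * kinetic_energy_joules / (3.0 * K_B)

# A helium atom (m ~ 6.64e-27 kg) moving at 1000 m/s:
m, v = 6.64e-27, 1000.0
E = 0.5 * m * v**2
print(equivalent_temperature(E))  # roughly 160 K
```

As the later posts argue, this number is frame-dependent and arguably not a temperature at all, which is exactly the point under debate.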

- #12

Electric Red


I don't think this is correct. Accelerating a baseball to 0.99c doesn't increase its temperature. The kinetic energy of the object is not something temperature takes into account.

But.. if the entropy is a function of the energy, you actually can calculate the temperature with that formula. Imagine the baseball you were talking about: if you gave it an enormous velocity, of course the temperature of the environment would change.

- #13

Drakkith


But.. if the entropy is a function of the energy, you actually can calculate the temperature with that formula. Imagine the baseball you were talking about: if you gave it an enormous velocity, of course the temperature of the environment would change.

But not of the baseball...

A subtlety here is the distinction between the atom/object ITSELF, and the atom/object itself + the environment.

- #14

Crazymechanic


- #15

Electric Red


I have to say that my last formula isn't valid, because we are not talking about an equilibrium. And I'm getting more and more convinced that it is almost impossible to talk about the temperature of a single atom.. but think about this:

The Big Bang theory states that at the beginning of the universe, the universe was made of vibrating quarks, and they were vibrating at such speeds that it wasn't possible to create nucleons, because it was too hot.

So, what now?

- #16

ZapperZ

Zz.

- #17

Baluncore


The answer to the OP cannot be an unconditional definitive “NO” because there are peer reviewed Physicists reporting the theoretical temperature of individual atoms. But then if the answer is “YES” there must be more precise definitions of the particular concept of temperature that is being considered in each case.

If we restrict our definition of the term “temperature” to a subset of those used by others, then our answer could be “NO”, “MAYBE” or “YES”. It is all a question of definition.

- #18

Drakkith


The answer to the OP cannot be an unconditional definitive “NO” because there are peer reviewed Physicists reporting the theoretical temperature of individual atoms.

Please provide a reference for this.

- #19


If the entropy of a single atom is well defined (and it is), and the energy of a single atom is well defined (and it is), then its temperature is well defined.

The entropy of an atom is just a measure of the number of possible states it could have for constant energy. A single particle in a 3D box can have many different states for a given total energy, and this number goes down as the energy goes down until you reach the unique ground state, a state of minimum energy and zero entropy.

The energy of an atom is well defined to the extent that it can be calculated and measured. There are limits imposed by the uncertainty principle, but this doesn't affect what the energy levels themselves are.

Knowing this, it is perfectly legitimate to speak of the temperature of a single atom as a measure of how quickly the energy changes in response to a change in entropy.

Hope this helps:)
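The state counting described above can be made concrete with a short sketch (my own code, not the poster's): for a particle in a cubic 3D box the levels are E ∝ nx² + ny² + nz², the degeneracy Ω(E) is the number of quantum-number triples at that energy, and S = k ln Ω. Here I work in units where the energy prefactor and k are 1.

```python
from collections import Counter

def degeneracies(n_max=30):
    """Count states of a particle in a cubic box by energy.

    Levels are E = nx^2 + ny^2 + nz^2 in units where the box-size
    prefactor is 1; omega[E] is the number of states at energy E.
    """
    omega = Counter()
    for nx in range(1, n_max + 1):
        for ny in range(1, n_max + 1):
            for nz in range(1, n_max + 1):
                omega[nx**2 + ny**2 + nz**2] += 1
    return omega

omega = degeneracies()
# The ground state E=3 (nx=ny=nz=1) is unique, so S = ln(1) = 0 there;
# degeneracy, and hence S = ln(Omega), grows at higher energies.
print(omega[3], omega[6], omega[9])  # prints: 1 3 3
```

This is exactly the "unique ground state, zero entropy" picture from the post: only the lowest level has a single state.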

- #20

Drakkith


Knowing this, it is perfectly legitimate to speak of the temperature of a single atom as a measure of how quickly the energy changes in response to a change in entropy.

Can you elaborate on this?

- #21

Jolb


I think there are a few ways to define temperature, and they are equivalent in the thermodynamic limit of a large number of particles. When we ask about the temperature of a single atom, we run into trouble. For example, the thermodynamic definition of entropy 1/T=∂S/∂E is problematic since it begs the question of how we define entropy.

One way of looking at temperature that is a little more loose but, in my opinion, gives a really good intuitive way of thinking about temperature is in terms of Boltzmann statistics. This way can be used to develop an idea for strange concepts like the temperature of an individual atom or negative temperature. So here is my non-rigorous but intuitively helpful definition of temperature.

Suppose we have a system with energy levels E_{n} and a probability distribution describing the occupation of each of these levels. Experiments on such systems show that in most cases, the observed probability distributions for the occupation of the energy levels all follow the form P(E_{n}) ~ e^{-βE_{n}}

where different experimental conditions yield different values for β. We define the system's temperature T to be 1/(kβ) where k is Boltzmann's constant. I like to call this distribution the "Boltzmann distribution." So the temperature basically is defined as the reciprocal of the exponential decay constant appearing in the Boltzmann distribution describing the system.

To be more explicit, I used the ~ symbol to write the Boltzmann distribution above to indicate proportionality rather than equality. The more correct equation would be

P(E_{n}) = e^{-βE_{n}}/Z

where Z is the partition function: Z = Ʃ_{n} e^{-βE_{n}}

Now we can use this definition to play some gedanken games with a single atom. For simplicity let us say that we have an atom with only two energy levels, E_{1}=0 and E_{2}=ε.

If the atom is in its ground state, we would describe it by saying P(E=E_{1}=0) = 1 and P(E=E_{2}=ε)=0. This is obvious because if we know the atom is in its ground state, we will surely measure it to be there and not in the excited state--we know a priori what the probability distribution is [we didn't use any "temperature" so far, just common sense.] But now if we try to compute what β factor would reproduce this distribution, we find β=∞ ⇔ T=0. [This is because e^{-βE_{1}}=e^{0}=1 regardless of β, while e^{-βE_{2}}=e^{-βε} equals 0 only if β=∞.] So an atom in its ground state has zero temperature according to this statistical definition.

What if it were in the state with equal probability of being measured in its excited state and in its ground state? P(E=E_{1}=0)=1/2 and P(E=E_{2}=ε)=1/2? Doing the arithmetic like last time [if E_{1}≠E_{2}, then e^{-βE_{1}}=e^{-βE_{2}} if and only if β=0 ⇔ T=∞], we find the temperature must be infinite.

This should all make sense so far. The Boltzmann distribution starts out at T=0 ensuring everything is in its ground state, and it becomes a more and more uniform distribution [entropy increases] as T→+∞. You can try different probability distributions and see that this actually happens--temperature increases as the probability distribution goes from 100% E_{1} to a 50%-50% split.

Now what if we know the atom is in its excited state? P(E=E_{1})=0 and P(E=E_{2})=1. The only way this is possible is if β is negative, corresponding to negative temperature! So an atom in its excited state actually has negative temperature.

This is a general fact: negative temperatures indicate "population inversions" or situations where the occupation of a state exceeds that of a lower energy state. But these systems are NOT in thermodynamic equilibrium and you would NEVER see them in any thermodynamic ensemble. However population inversions can be created in different situations, such as in a ferromagnet or in a laser. Applying thermodynamic definitions to non-thermodynamic probability distributions is what leads you to semi-sense like "negative temperature," but strictly speaking this isn't real thermodynamics since these systems are not in thermodynamic equilibrium.
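These gedanken games can be run directly. Here is a minimal sketch (my own code; the function names are made up), working in units where k = ε = 1 and inverting P(excited)/P(ground) = e^(−βε):

```python
import math

def beta_from_probs(p_excited, epsilon=1.0):
    """Invert P(excited)/P(ground) = exp(-beta*epsilon) for a two-level system."""
    p_ground = 1.0 - p_excited
    if p_excited == 0.0:
        return math.inf    # ground state only: beta = +inf
    if p_ground == 0.0:
        return -math.inf   # excited state only: the negative-beta limit
    return -math.log(p_excited / p_ground) / epsilon

def temperature(p_excited, epsilon=1.0, k=1.0):
    """T = 1/(k*beta); a 50/50 occupation gives beta = 0, i.e. T = infinity."""
    b = beta_from_probs(p_excited, epsilon)
    if b == 0.0:
        return math.inf
    return 1.0 / (k * b)

print(temperature(0.0))   # ground state only: T = 0
print(temperature(0.5))   # uniform occupation: T = infinity
print(temperature(0.25))  # mostly ground state: positive T
print(temperature(0.75))  # population inversion: negative T
```

The four cases reproduce the post's conclusions: T = 0 for the ground state, T → ∞ at the 50/50 split, and negative T for a population inversion.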


- #22

Drakkith


Thanks, Jolb! Great explanation!

- #23

lightarrow


All this is very interesting and original (I have saved it on my PC); however, it doesn't seem to take into account the atom's kinetic energy, which was the OP's question. [...]

What if it were in the state with equal probability of being measured in its excited state and in its ground state? P(E=E_{1}=0)=1/2 and P(E=E_{2}=ε)=1/2?

[...]


- #24


Where [itex]U[/itex] is the internal energy, [itex]P[/itex] is pressure, [itex]V[/itex] is volume, [itex]N[/itex] is the number of particles, and [itex]\mu[/itex] is the chemical potential, the first law of thermodynamics (conservation of energy) can be written as:

[itex]dU= -PdV +TdS +\mu dN[/itex]

The temperature in this picture is defined as

[itex]T\equiv (\frac{\partial U}{\partial S})_{V,N}[/itex]

That the temperature happens to be proportional to the kinetic energy of particularly well-behaved (ideal) gases is a convenient coincidence, but the relationship between energy and temperature can be quite different for other systems (e.g., the energy of bodies emitting thermal (blackbody) radiation).

This isn't the only way to express the temperature: instead of describing our system in terms of the internal energy [itex]U(S,V,N)[/itex], we could use the Helmholtz free energy [itex]F(T,V,N)[/itex], the enthalpy [itex]H(S,P,N)[/itex], or the Gibbs free energy [itex]G(T,P,N)[/itex], and each of these pictures can be remarkably convenient in certain situations.
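As a worked check of that definition for a non-ideal-gas system (my own addition, using the standard blackbody relations [itex]U = aVT^4[/itex] and [itex]S = \tfrac{4}{3}aVT^3[/itex]): eliminating [itex]T[/itex] gives

[itex]U(S,V) = aV\left(\frac{3S}{4aV}\right)^{4/3}[/itex]

and differentiating at constant [itex]V[/itex]:

[itex]\left(\frac{\partial U}{\partial S}\right)_{V} = \frac{4}{3}\cdot\frac{3}{4aV}\cdot aV\left(\frac{3S}{4aV}\right)^{1/3} = \left(\frac{3S}{4aV}\right)^{1/3} = T[/itex]

so the definition reproduces the ordinary temperature even though [itex]U[/itex] here is radiation energy, not kinetic energy.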

- #25


This is a very mathematical argument, but it's really how I make sense of the concept of the temperature of a single particle. Things get especially complicated when we consider more realistic atoms with complicated internal dynamics, but you can be sure that to the extent that [itex]U[/itex], [itex]S[/itex], [itex]V[/itex], and [itex]N[/itex] are well defined, then so is [itex]T[/itex].

- #26

Nugatory


Knowing this, it is perfectly legitimate to speak of the temperature of a single atom as a measure of how quickly the energy changes in response to a change in entropy.

With one caution: the temperature that we're talking about here doesn't have much to do with the popular notion of temperature. I'm inclined to agree with ZapperZ's #10.

- #27


For me, understanding how to derive them for an object/reservoir system from conditional probabilities and maximum entropy of the whole was the most mind-blowing thing I learned in classical physics that didn't involve a Lagrangian.

- #28

Jolb


All this is very interesting and original (I have saved it on my PC); however, it doesn't seem to take into account the atom's kinetic energy, which was the OP's question.

--

lightarrow

Well, as far as I understand it, the same problem would apply to any system--take an ideal gas for example. We derive nice expressions for the temperature of an ideal gas, but those derivations implicitly assume a frame in which the gas as a whole is at rest.

It seems like this is a cheap way out, but I think the Ergodic 'theorem' (on which Thermodynamics relies crucially) actually justifies it by laying out assumptions that ensure thermodynamic systems should always be viewed "at rest". In the thermodynamic limit, systems have a "Poincare recurrence time", such that they start retracing their path through a phase space of canonical variables. How can a system come back to its original point in phase space if there is a constant linear motion? I think somewhere in the details of taking the thermodynamic limit and assuming ergodicity, you automatically enter the "mutually comoving" frame. This paragraph is more educated guess on my part, and I do think this is a good question that maybe someone else can answer better.

Also, what I said about negative temperature is mostly lifted from a textbook I used: Kardar's "Statistical Physics of Particles" which is actually mostly available online. Attached is the passage from Kardar that I am stealing from.

PS: A little wiki-ing yields some info about my conjecture with regards to the Ergodic *hypothesis*. It seems that once we assume Liouville's theorem holds, we are already forced to view things in the comoving frame. From wiki:

http://en.wikipedia.org/wiki/Ergodic_hypothesis

Liouville's Theorem shows that, for conserved classical systems, the local density of microstates following a particle path through phase space is constant as viewed by an observer moving with the ensemble (i.e., the total or convective time derivative is zero). Thus, if the microstates are uniformly distributed in phase space initially, they will remain so at all times. Liouville's theorem ensures that the notion of time average makes sense, but ergodicity does not follow from Liouville's theorem.

Let me know if that seems to answer the question.


- #29

Jolb


After rereading the thread I realize I am probably getting a little caught up worrying about the subtleties of the foundations of thermodynamics when there's actually a much simpler answer to the OP's question.


I know temperature is the measure of how much atoms are moving, so can a single atom even have a temperature?

I got caught up in answering the question "can a single atom even have a temperature?" and I didn't think about the first part of what the OP said.

The quarks inside the protons and neutrons are moving around, so?

Protons and neutrons, and more importantly electrons, are all actually described by quantum mechanics rather than classical mechanics. To say that they're "moving around" like ideal gas molecules is not a good description. It is often much more useful to describe them in terms of discrete energy states, like in the example I gave.

The example I gave stresses the statistical definition rather than the "average kinetic energy"/"how much atoms are moving" because it really is better. Entropy is really defined statistically so the definition 1/T=∂S/∂U is statistical.

Kardar even mentions how classical mechanics and its crucial thermodynamic theorems like Liouville's theorem rely on Hamiltonian evolution through a continuous phase space, and how discrete systems (like an atom) do not fall under that umbrella--so why should our thermodynamical laws apply? He hints at the idea that QM fills the gap and allows thermodynamics to apply to discrete systems. So applying thermodynamics to discrete systems is what I did to try and explain the "temperature" of a single atom.

As to the question of how an atom's linear motion affects its temperature--as I explained before, this is only an issue if you fail to realize the states are all quantized in reality. But if you are trying to think about it classically--like the example of how linear motion of a box of ideal gas affects its temperature--then you need to worry about Liouville's theorem and ergodicity.

PS: To illustrate the fact that thermodynamics runs deeper than the classical mechanics/quantum mechanics divide, allow me to (re)post one of my favorite Einstein quotes:

It is the only physical theory of universal content which I am convinced will never be overthrown, within the framework of applicability of its basic concepts.


- #30

Vanadium 50


The OP has not explained to what extent he/she understands the concept of temperature. So let's wait until there is a response to such an inquiry before we delve into exotic aspects of this phenomenon.

It's a pity this was ignored. Really a pity. The OP is in high school, for heaven's sake, and most of the posts are at a much higher level, and frankly, don't address the issue.

Thread closed.
