- #1

Nathew

I know temperature is a measure of how much atoms are moving, so can a single atom even have a temperature? The quarks inside the protons and neutrons are moving around, so what about that?


- #2

Vanadium 50

Staff Emeritus

Science Advisor

Education Advisor


- #3

Baluncore

Science Advisor


If you know why you calculated that instantaneous theoretical temperature, then maybe you will know how to interpret and use the result.

- #4


Just as people are categorized into groups, races, and social classes because there is more than one of them: if there were a single person in the whole world and no one else, it would be hard to say anything about him, as there would be nothing to compare him to.

- #5


Well.. the definition of temperature is:

[itex](\frac{\partial S}{\partial E})^{-1}=T[/itex]

That is, the temperature is, by definition, the inverse of the partial derivative of the entropy with respect to the energy.

So, since the molecule has a certain energy because it is moving, you only need to think about what the entropy of this one molecule is.

Then you will be able to calculate the temperature, I assume.
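
As a quick sanity check of this definition (an editorial sketch, not part of the original post): for a monatomic ideal gas, the Sackur-Tetrode formula gives an entropy that depends on energy as [itex]S(E) = \frac{3}{2}Nk\ln E[/itex] plus energy-independent terms, and the derivative definition above recovers the familiar [itex]E = \frac{3}{2}NkT[/itex]. SymPy can verify the algebra:

```python
import sympy as sp

# Entropy of a monatomic ideal gas as a function of energy: up to E-independent
# terms, the Sackur-Tetrode formula gives S(E) = (3/2) N k ln(E) + const.
E, N, k = sp.symbols('E N k', positive=True)
S = sp.Rational(3, 2) * N * k * sp.log(E)

# The definition quoted above: T = (dS/dE)^(-1)
T = 1 / sp.diff(S, E)
print(sp.simplify(T))  # 2*E/(3*N*k), i.e. E = (3/2) N k T as expected
```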



- #6

Drakkith

Staff Emeritus

Science Advisor


> Well.. the definition of temperature is:
>
> [itex](\frac{\partial S}{\partial E})^{-1}=T[/itex]
>
> That is, the temperature is, by definition, the inverse of the partial derivative of the entropy with respect to the energy.
>
> So, since the molecule has a certain energy because it is moving, you only need to think about what the entropy of this one molecule is.
>
> Then you will be able to calculate the temperature, I assume.

I don't think this is correct. Accelerating a baseball to 0.99c doesn't increase its temperature. The bulk kinetic energy of an object is not something temperature takes into account.

- #7


In an inertial frame of reference which is co-moving with the atom, the atom has velocity zero (for example the atom is in a box and the box follows its motion). So the atom's temperature would depend on the inertial frame I choose? The temperature of a gas in a box doesn't depend on the inertial frame I choose...

> If you know why you calculated that instantaneous theoretical temperature, then maybe you will know how to interpret and use the result.

--

lightarrow

- #8


> I know temperature is the measure of how much atoms are moving, so can a single atom even have a temperature?

No.

--

lightarrow

- #9

Baluncore

Science Advisor


- #10


Zz.

- #11


The only way to state its temperature, I assume, would be to determine its kinetic energy at that moment and then translate that into a possible temperature mathematically.
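
The suggestion above can be sketched numerically (an illustrative editorial sketch only; the helium mass and speed below are assumed values, not from the thread) by equating the atom's kinetic energy with the equipartition value [itex]\frac{3}{2}kT[/itex]. Whether the resulting number deserves the name "temperature" is exactly what this thread disputes:

```python
# Equate a single atom's kinetic energy with (3/2) k T and solve for T.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def equipartition_temperature(kinetic_energy_joules):
    # E_kin = (3/2) k T  =>  T = 2 E_kin / (3 k)
    return 2.0 * kinetic_energy_joules / (3.0 * K_B)

# A helium atom (m ~ 6.64e-27 kg) moving at 1350 m/s, a typical thermal speed
# at room temperature, comes out near 292 K:
m, v = 6.64e-27, 1350.0
print(equipartition_temperature(0.5 * m * v**2))
```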

- #12


> I don't think this is correct. Accelerating a baseball to 0.99c doesn't increase its temperature. The kinetic energy of the object is not something temperature takes into account.

But.. if the entropy is a function of the energy, you actually can calculate the temperature with that formula. Imagine the baseball you were talking about: if you gave it an enormous velocity, of course the temperature of the environment would change.

- #13

Drakkith

Staff Emeritus

Science Advisor


> But.. if the entropy is a function of the energy, you actually can calculate the temperature with that formula. Imagine the baseball you were talking about: if you gave it an enormous velocity, of course the temperature of the environment would change.

But not of the baseball...

A subtlety here is the distinction between the atom/object ITSELF, and the atom/object itself + the environment.

- #14


- #15


I have to say that my last formula isn't valid, because we are not talking about an equilibrium. And I'm getting more and more convinced that it is almost impossible to talk about the temperature of a single atom.. but think about this:

the Big Bang theory states that at the beginning of the universe, matter consisted of vibrating quarks, and they were moving at such speeds that it wasn't possible to form nucleons, because it was too hot.

So, what now?

- #16


Zz.

- #17

Baluncore

Science Advisor


But then, if the answer is “YES”, there must be more precise definitions of the particular concept of temperature that is being considered in each case.

If we restrict our definition of the term “temperature” to a subset of those used by others, then our answer could be “NO”, “MAYBE”, or “YES”. It is all a question of definition.

- #18

Drakkith

Staff Emeritus

Science Advisor


> The answer to the OP cannot be an unconditional definitive “NO” because there are peer reviewed Physicists reporting the theoretical temperature of individual atoms.

Please provide a reference for this.

- #19


If the entropy of a single atom is well defined (and it is), and the energy of a single atom is well defined (and it is), then its temperature is well defined.

The entropy of an atom is just a measure of the number of possible states it could have for constant energy. A single particle in a 3D box can have many different states for a given total energy, and this number goes down as the energy goes down until you reach the unique ground state, a state of minimum energy and zero entropy.
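
The state counting described above can be made concrete (an editorial sketch assuming the standard particle-in-a-cubic-box spectrum, where the energy is proportional to [itex]n_1^2+n_2^2+n_3^2[/itex] with positive integer quantum numbers):

```python
from itertools import product

def degeneracy(m, nmax=20):
    # Count quantum states (n1, n2, n3), each n_i >= 1, of a particle in a
    # cubic 3D box whose energy level n1^2 + n2^2 + n3^2 equals m
    # (measured in units of the base energy h^2 / (8 m L^2)).
    return sum(1 for n in product(range(1, nmax + 1), repeat=3)
               if n[0]**2 + n[1]**2 + n[2]**2 == m)

print(degeneracy(3))   # unique ground state (1,1,1): 1 state
print(degeneracy(6))   # permutations of (2,1,1): 3 states
print(degeneracy(14))  # permutations of (3,2,1): 6 states
```

The ground state is unique (one microstate, zero entropy), while higher levels generally admit several distinct states for the same total energy, which is what gives a single particle a nontrivial entropy-versus-energy curve.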

The energy of an atom is well defined to the extent that it can be calculated and measured. There are limits imposed by the uncertainty principle, but this doesn't affect what the energy levels themselves are.

Knowing this, it is perfectly legitimate to speak of the temperature of a single atom as a measure of how quickly the energy changes in response to a change in entropy.

Hope this helps:)

- #20

Drakkith

Staff Emeritus

Science Advisor


> Knowing this, it is perfectly legitimate to speak of the temperature of a single atom as a measure of how quickly the energy changes in response to a change in entropy.

Can you elaborate on this?

- #21


I think there are a few ways to define temperature, and they are equivalent in the thermodynamic limit of a large number of particles. When we ask about the temperature of a single atom, we run into trouble. For example, the thermodynamic definition [itex]1/T=\partial S/\partial E[/itex] is problematic since it raises the question of how we define entropy.

One way of looking at temperature that is a little more loose but, in my opinion, gives a really good intuitive way of thinking about temperature is in terms of Boltzmann statistics. This way can be used to develop an idea for strange concepts like the temperature of an individual atom or negative temperature. So here is my non-rigorous but intuitively helpful definition of temperature.

Suppose we have a system with energy levels [itex]E_n[/itex] and a probability distribution describing the occupation of each of these levels. Experiments on such systems show that in most cases, the observed probability distributions for the occupation of the energy levels all follow the form [itex]P(E_n) \sim e^{-\beta E_n}[/itex],

where different experimental conditions yield different values for [itex]\beta[/itex]. We define the system's temperature [itex]T[/itex] to be [itex]1/(k\beta)[/itex], where [itex]k[/itex] is Boltzmann's constant. I like to call this distribution the "Boltzmann distribution." So the temperature basically is defined as the reciprocal of the exponential decay constant appearing in the Boltzmann distribution describing the system.

To be more explicit, I used the [itex]\sim[/itex] symbol to write the Boltzmann distribution above to indicate proportionality rather than equality. The more correct equation would be

[itex]P(E_n) = e^{-\beta E_n}/Z[/itex]

where [itex]Z[/itex] is the partition function: [itex]Z=\sum_n e^{-\beta E_n}[/itex]

Now we can use this definition to play some gedanken games with a single atom. For simplicity, let us say that we have an atom with only two energy levels, [itex]E_1=0[/itex] and [itex]E_2=\varepsilon[/itex].

If the atom is in its ground state, we would describe it by saying [itex]P(E=E_1=0)=1[/itex] and [itex]P(E=E_2=\varepsilon)=0[/itex]. This is obvious because if we know the atom is in its ground state, we will surely measure it to be there and not in the excited state--we know a priori what the probability distribution is [we didn't use any "temperature" so far, just common sense]. But now if we try to compute what [itex]\beta[/itex] factor would reproduce this distribution, we find [itex]\beta=\infty \Leftrightarrow T=0[/itex]. [This is because [itex]e^{-\beta E_n}[/itex] equals 1 for [itex]E_n=0[/itex], and equals 0 for [itex]E_n=\varepsilon[/itex] only if [itex]\beta=\infty[/itex].] So an atom in its ground state has zero temperature according to this statistical definition.

What if it were in a state with equal probability of being measured in its excited state and in its ground state, [itex]P(E=E_1=0)=1/2[/itex] and [itex]P(E=E_2=\varepsilon)=1/2[/itex]? Doing the arithmetic like last time [if [itex]E_1\neq E_2[/itex], then [itex]e^{-\beta E_1}=e^{-\beta E_2}[/itex] if and only if [itex]\beta=0 \Leftrightarrow T=\infty[/itex]], we find the temperature must be infinite.

This should all make sense so far. The Boltzmann distribution starts out at [itex]T=0[/itex] ensuring everything is in its ground state, and it becomes a more and more uniform distribution [entropy increases] as [itex]T\rightarrow +\infty[/itex]. You can try different probability distributions and see that this actually happens--temperature increases as the probability distribution goes from 100% [itex]E_1[/itex] to a 50%-50% split.

Now what if we know the atom is in its excited state, [itex]P(E=E_1)=0[/itex] and [itex]P(E=E_2)=1[/itex]? The only way this is possible is if [itex]\beta[/itex] is negative, corresponding to negative temperature! So an atom in its excited state actually has negative temperature.

This is a general fact: negative temperatures indicate "population inversions," situations where the occupation of a state exceeds that of a lower-energy state. But these systems are NOT in thermodynamic equilibrium, and you would never see them in any thermodynamic ensemble. Population inversions can, however, be created in various situations, such as in a ferromagnet or in a laser. Applying thermodynamic definitions to non-thermodynamic probability distributions is what leads to semi-sense like "negative temperature," but strictly speaking this isn't real thermodynamics, since these systems are not in thermodynamic equilibrium.
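
The two-level bookkeeping above can be sketched in a few lines (an editorial sketch; the function name and probabilities are illustrative, not from the thread). Inverting [itex]P(E_2)/P(E_1) = e^{-\beta\varepsilon}[/itex] gives [itex]\beta[/itex] directly from the populations:

```python
import math

def beta_from_populations(p_ground, p_excited, eps):
    # For a two-level system with E1 = 0, E2 = eps, Boltzmann statistics give
    # p_excited / p_ground = exp(-beta * eps),
    # so beta = ln(p_ground / p_excited) / eps.
    return math.log(p_ground / p_excited) / eps

eps = 1.0  # energy gap, arbitrary units
print(beta_from_populations(0.999, 0.001, eps))  # large positive beta: mostly ground state, cold
print(beta_from_populations(0.5, 0.5, eps))      # beta = 0, i.e. infinite temperature
print(beta_from_populations(0.1, 0.9, eps))      # negative beta: population inversion, "negative temperature"
```

The three calls reproduce the three gedanken cases: near-certain ground-state occupation gives a large positive [itex]\beta[/itex] (low [itex]T[/itex]), a 50-50 split gives [itex]\beta=0[/itex] ([itex]T=\infty[/itex]), and an inverted population gives [itex]\beta<0[/itex].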


- #22

Drakkith

Staff Emeritus

Science Advisor


Thanks, Jolb! Great explanation!

- #23


All this is very interesting and original (I have saved it on my PC); however, it doesn't seem to take into account the atom's kinetic energy, which was the OP's question.[...]

> What if it were in a state with equal probability of being measured in its excited state and in its ground state, [itex]P(E=E_1=0)=1/2[/itex] and [itex]P(E=E_2=\varepsilon)=1/2[/itex]?

[...]

--

lightarrow

- #24


> [itex]T\equiv (\frac{\partial U}{\partial S})_{V,N}[/itex]

That the temperature happens to be proportional to the kinetic energy of particularly well-behaved (ideal) gases is a convenient coincidence, but the relationship between energy and temperature can be very different for other systems (e.g. the energy of bodies emitting thermal (blackbody) radiation).

Where [itex]U[/itex] is the internal energy, [itex]P[/itex] is pressure, [itex]V[/itex] is volume, [itex]N[/itex] is the number of particles, and [itex]\mu[/itex] is the chemical potential, the first law of thermodynamics (conservation of energy) can be written as:

[itex]dU= -PdV +TdS +\mu dN[/itex]

The temperature in this picture is defined as

[itex]T\equiv (\frac{\partial U}{\partial S})_{V,N}[/itex]

This isn't the only way to express the temperature, because instead of talking about our system in terms of the internal energy [itex]U(S,V,N)[/itex], we could express our system in terms of the Helmholtz free energy [itex]F(T,V,N)[/itex], the enthalpy [itex]H(S,P,N)[/itex], or the Gibbs free energy [itex]G(T,P,N)[/itex], and each of these pictures can be remarkably convenient in certain situations.
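
For completeness (standard thermodynamics, not something specific to this post), the differentials of these four potentials show at a glance which derivative produces [itex]T[/itex], or [itex]S[/itex] when [itex]T[/itex] is a natural variable:

```latex
\begin{aligned}
dU &= T\,dS - P\,dV + \mu\,dN &\quad\Rightarrow\quad T &= \left(\frac{\partial U}{\partial S}\right)_{V,N}\\
dH &= T\,dS + V\,dP + \mu\,dN &\quad\Rightarrow\quad T &= \left(\frac{\partial H}{\partial S}\right)_{P,N}\\
dF &= -S\,dT - P\,dV + \mu\,dN &\quad\Rightarrow\quad S &= -\left(\frac{\partial F}{\partial T}\right)_{V,N}\\
dG &= -S\,dT + V\,dP + \mu\,dN &\quad\Rightarrow\quad S &= -\left(\frac{\partial G}{\partial T}\right)_{P,N}
\end{aligned}
```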

- #25


This is a very mathematical argument, but it's really how I make sense of the concept of the temperature of a single particle. Things get especially complicated when we consider more realistic atoms with complicated internal dynamics, but you can be sure that to the extent that [itex]U[/itex], [itex]S[/itex], [itex]V[/itex], and [itex]N[/itex] are well defined, then so is [itex]T[/itex].
