I Interpretation of temperature in liquids/solids

dextercioby
Science Advisor
Insights Author
TL;DR Summary
What is the physical interpretation of temperature of a mass of liquid or a solid?
Usually, the mental image of temperature is this: an internal property of a bulk of matter that describes the average kinetic plus rotational/vibrational energy of its molecules. We imagine a gas in which temperature is a measure of how fast the molecules move and how frequently they collide with one another: the higher the temperature, the more energy the molecules have.

Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by van der Waals forces and hydrogen bonds?
 
The simplest interpretation is that "temperature is the average KE of the molecules". That may be too simple for some, and once you start considering changes of phase, it becomes complicated. Phase changes require a deeper energy analysis than temperature alone can provide.
 
dextercioby said:
TL;DR Summary: What is the physical interpretation of temperature of a mass of liquid or a solid?

Usually, the mental image of temperature is this: an internal property of a bulk of matter that describes the average kinetic plus rotational/vibrational energy of its molecules. We imagine a gas in which temperature is a measure of how fast the molecules move and how frequently they collide with one another: the higher the temperature, the more energy the molecules have.

Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by van der Waals forces and hydrogen bonds?
Regarding the "physical interpretation of temperature", I would rely on “An Introduction to Thermal Physics” by Daniel V. Schroeder (Oxford University Press 2021). Schroeder's proposal for a theoretical definition of temperature is:

Temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings. When two objects are in thermal contact, the one that tends to spontaneously lose energy is at the higher temperature.

On the basis of this theoretical definition, one arrives at the thermodynamic definition of temperature:

"The temperature of a system is the reciprocal of the slope of its entropy vs. energy graph. The partial derivative is to be taken with the system's volume and number of particles held fixed;* more explicitly,
$$\frac 1 T\equiv\left( \frac {\partial S} {\partial U}\right)_{N,V} .$$"
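This definition can be made concrete with a small numerical sketch in the spirit of Schroeder's Einstein-solid examples (my own illustration; the toy units ##k_B = 1## and oscillator quantum ##\epsilon = 1## are assumptions, not from the quote):

```python
from math import comb, log

# Sketch: an Einstein solid of N oscillators holding q energy quanta has
# multiplicity C(q + N - 1, q). With k_B = 1 and epsilon = 1, U = q, so the
# temperature 1/T = (dS/dU)_{N,V} can be estimated by a finite difference.

def entropy(N, q):
    """S = ln(multiplicity) of an Einstein solid."""
    return log(comb(q + N - 1, q))

def temperature(N, q):
    """T = (dS/dU)^(-1), via a central finite difference at U = q."""
    dS_dU = (entropy(N, q + 1) - entropy(N, q - 1)) / 2.0
    return 1.0 / dS_dU

N = 50
for q in (10, 50, 500):
    print(f"q = {q:3d}  ->  T = {temperature(N, q):.3f}")
# T rises with energy: an already energetic solid gains less entropy per
# added quantum, so the slope dS/dU shrinks and its reciprocal, T, grows.
```

The output illustrates the quoted definition directly: temperature is large exactly when extra energy buys little extra entropy.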
 
I think it is better to think of temperature in terms of how it behaves, rather than as an "interpretation" in terms of some other quantity. Yes, for a gas, it's average kinetic energy, but lots of things aren't gases.

Temperature determines the direction of heat flow, from higher temperature to lower temperature.
 
Temperature is the average kinetic energy, whether it is expressed as translation, as in liquids and gases, or as vibration, as in solids.
 
Please, let's get away from the "temperature is kinetic energy" high-school narrative.

The simple picture should be:
Lord Jestocost said:
Temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings. When two objects are in thermal contact, the one that tends to spontaneously lose energy is at the higher temperature.
 
I am sorry, no disrespect meant, but what makes you think I have not asked about the complicated phenomenological explanation?
 
DrClaude said:
Please, let's get away from the "temperature is kinetic energy" high-school narrative.
How is the energy transferred if it is not by kinetic interaction?
 
Baluncore said:
How is the energy transferred if it is not by kinetic interaction?
Radiation, for one.
 
  • #10
dextercioby said:
I am sorry, no disrespect meant, but what makes you think I have not asked about the complicated phenomenological explanation?
My post wasn't addressing the OP. Let me do that now.
dextercioby said:
Usually, the mental image of temperature is this: an internal property of a bulk of matter that describes the average kinetic plus rotational/vibrational energy of its molecules. We imagine a gas in which temperature is a measure of how fast the molecules move and how frequently they collide with one another: the higher the temperature, the more energy the molecules have.
I would consider the energy to be the "internal property of a bulk of matter." Generally speaking, an increase in temperature means an increase in energy, but the two are not the same. This is exemplified by heat capacity: the amount of energy needed for a given change in temperature is different for different substances. Then you also have to account for phase transitions, where a lot of energy can flow without a change in temperature.

This is why I like Schroeder's simple picture of temperature describing the tendency of a thermodynamic system to exchange energy with another.

dextercioby said:
Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by van der Waals forces and hydrogen bonds?
In more complex systems than ideal gases, the energy is distributed among the different degrees of freedom.
 
  • #11
Vanadium 50 said:
I think it is better to think of temperature in terms of how it behaves, rather than as an "interpretation" in terms of some other quantity. Yes, for a gas, it's average kinetic energy, but lots of things aren't gases.

Temperature determines the direction of heat flow, from higher temperature to lower temperature.
It's not as simple as that. Heat can flow in any direction in a solid, and in particular against the hot-to-cold direction.

Take Fourier's law, ##\vec q = -\kappa \nabla T##. Then remember that thermoelectricity exists, so consider the generalized Fourier law ##\vec q = -\kappa \nabla T + S T \vec J##. The first term is the usual Fourier term; the second is a Peltier heat, which exists whenever there is an electric current. ##S## can have either sign, ##\vec J## can have any direction, and ##S## can even be a non-symmetric tensor.

Alternatively, consider anisotropic materials, where ##\kappa## is a tensor. Write down Fourier's law in matrix form and you'll see that it's possible for a material to develop a transverse thermal gradient (and hence a heat flux in that direction) even though that isn't the hot-to-cold direction.
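As a hedged numerical illustration of the generalized law above, with made-up but plausible thermoelectric magnitudes (all values below are assumptions for illustration, not measured data):

```python
# Total 1D heat flux q = -kappa*dT/dx + S*T*J: with a large enough opposing
# electric current, the Peltier term overwhelms conduction and the heat flux
# points from cold to hot. All numbers are illustrative assumptions.

kappa = 1.5        # W/(m K), thermal conductivity
S = 200e-6         # V/K, Seebeck coefficient (Peltier coefficient = S*T)
T = 300.0          # K, local temperature
dT_dx = -1000.0    # K/m, temperature falls along +x
J = -50_000.0      # A/m^2, electric current driven along -x

q_fourier = -kappa * dT_dx            # conduction alone: hot -> cold, along +x
q_total = -kappa * dT_dx + S * T * J  # conduction plus Peltier heat

print(q_fourier, q_total)  # opposite signs: the total flux runs cold -> hot
```

This is the 1D version of the point being made: the sign of the total heat flux is not fixed by the temperature gradient once a current flows.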
 
  • #12
It is true that things are complicated. Negative absolute temperatures are another complication. But I think it is better to go from B-level to I-level to A-level definitions than to get to the end before the poster has reached the middle.
 
  • #13
Given that @dextercioby is already an SA/HH and is asking for an intermediate answer, I think really internalizing the definition from @Lord Jestocost's post will help immensely. Essentially, it says that if you increase the energy of a system and its entropy only goes up a little bit, then the system has a high temperature. Conversely, if you increase the energy and its entropy goes up a lot, it's at a low temperature.

If you think of entropy as a function which counts the number of microstates of a macroscopic system, then the above definition has an intuitive interpretation: at very low temperatures, the number of possible microstates is quite low and increasing the energy of the system causes this number to increase rapidly. However, at very high temperatures, the number of possible microstates becomes very large and increasing the energy doesn’t open up as many new microstates proportionally.
 
  • #14
TeethWhitener said:
Given that @dextercioby is already an SA/HH and is asking for an intermediate answer, I think really internalizing the definition from @Lord Jestocost's post will help immensely. Essentially, it says that if you increase the energy of a system and its entropy only goes up a little bit, then the system has a high temperature. Conversely, if you increase the energy and its entropy goes up a lot, it's at a low temperature.

If you think of entropy as a function which counts the number of microstates of a macroscopic system, then the above definition has an intuitive interpretation: at very low temperatures, the number of possible microstates is quite low and increasing the energy of the system causes this number to increase rapidly. However, at very high temperatures, the number of possible microstates becomes very large and increasing the energy doesn’t open up as many new microstates proportionally.
Taking this a step further, we can see that heat flow from high temperature to low temperature is simply a reflection of the second law of thermodynamics. When an object at high temperature comes into contact with an object at low temperature, the energy flows in the direction dictated by the second law, ##\Delta S\geq0##. Since a change in energy is associated with a much larger change in entropy for low-temperature objects than for high-temperature objects, the energy increase must go to the low-temperature object to be consistent with ##\Delta S\geq0##, so heat flows from hot to cold.
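This argument can be checked by brute-force microstate counting in a toy model, say two Einstein solids sharing a fixed number of energy quanta (the sizes and quanta counts below are arbitrary assumptions):

```python
from math import comb

# Two Einstein solids A (N_A oscillators) and B (N_B) share q_total quanta.
# The macrostate maximizing the joint microstate count Omega_A * Omega_B is
# the equal-temperature split, with energy divided in proportion to size; any
# other split has fewer microstates, so energy drifts toward the maximum,
# i.e. "heat flows" into the colder (entropy-hungrier) solid.

N_A, N_B, q_total = 300, 200, 100

def multiplicity(qA):
    """Joint microstate count when solid A holds qA of the quanta."""
    qB = q_total - qA
    return comb(qA + N_A - 1, qA) * comb(qB + N_B - 1, qB)

best_qA = max(range(q_total + 1), key=multiplicity)
print(best_qA)  # near q_total * N_A / (N_A + N_B) = 60
```

Starting from any unequal split (say all 100 quanta in B), every step toward `best_qA` strictly increases the microstate count, which is the counting version of ##\Delta S\geq0##.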
 
  • #15
TeethWhitener said:
Taking this a step further, we can see that heat flow from high temperature to low temperature is simply a reflection of the second law of thermodynamics. When an object at high temperature comes into contact with an object at low temperature, the energy flows in the direction dictated by the second law, ##\Delta S\geq0##. Since a change in energy is associated with a much larger change in entropy for low-temperature objects than for high-temperature objects, the energy increase must go to the low-temperature object to be consistent with ##\Delta S\geq0##, so heat flows from hot to cold.
Could heat flow from a colder to a warmer object if the warmer object increases the total entropy, for example by evaporating?
 
  • #16
Philip Koeck said:
Could heat flow from a colder to a warmer object if the warmer object increases the total entropy, for example by evaporating?
No. Of course, the warmer object will see its temperature decrease due to evaporation, and at some point it might cool down below the colder object's temperature.
 
  • #17
DrClaude said:
No. Of course, the warmer object will see its temperature decrease due to evaporation, and at some point it might cool down below the colder object's temperature.
But according to post 11 by @fluidistic something like that, just without a phase change, can happen in solids.
Would you agree with that?

I'm just trying to make up my mind whether heat always flows from hot to cold.
Most of the posts in this thread seem to say yes, but post 11 says no.
 
  • #18
fluidistic said:
It's not as simple as that. Heat can flow in any direction in a solid, and in particular against the hot-to-cold direction.

Alternatively, consider anisotropic materials, where ##\kappa## is a tensor. Write down Fourier's law in matrix form and you'll see that it's possible for a material to develop a transverse thermal gradient (and hence a heat flux in that direction) even though that isn't the hot-to-cold direction.
How does the total entropy increase in that case?
 
  • #19
dextercioby said:
Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by van der Waals forces and hydrogen bonds?
Temperature is not about average energy, but about energy distribution. In thermal equilibrium, the probability of being in a given state of energy ##E## is proportional to ##e^{-E/kT}##. The temperature ##T## is defined by this exponential distribution. If the energy is not distributed this way, then the system is not in thermal equilibrium and hence does not have a temperature. If a system out of thermal equilibrium interacts with a thermometer, the thermometer will still show some value, but this value is not really the temperature in the above sense.
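A sketch of how this definition works operationally (the level scheme and ##k = 1## units below are assumptions for illustration): sample a toy system with equally spaced, nondegenerate levels at a known temperature, then read the temperature back off the slope of the log-occupation curve.

```python
import math
import random

# Equilibrium occupations follow p_n ~ exp(-E_n / kT) for levels E_n = n
# (k = 1). Drawing samples at a known T_true and fitting the slope of
# ln(occupation) vs energy recovers the temperature from the distribution,
# not from any average energy.

random.seed(0)
T_true = 2.0
levels = list(range(12))
weights = [math.exp(-E / T_true) for E in levels]
samples = random.choices(levels, weights=weights, k=200_000)

counts = [samples.count(E) for E in levels]
# ln p_E = -E/T + const, so T is minus the inverse slope between two levels:
slope = (math.log(counts[6]) - math.log(counts[1])) / (6 - 1)
T_est = -1.0 / slope
print(T_est)  # close to T_true = 2.0
```

If the occupations were *not* exponential in energy, no single `T` would fit all level pairs, which is precisely the "no temperature out of equilibrium" statement above.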
 
  • #20
For a gas in a gravitational field, there is no “average kinetic energy” gradient in thermal equilibrium.
 
  • #21
Lord Jestocost said:
For a gas in a gravitational field, there is no “average kinetic energy” gradient in thermal equilibrium.
You are right. I've deleted a wrong part in my previous post. The Boltzmann distribution is proportional to
$$e^{-\beta(E_{\rm kin}+E_{\rm pot})}=e^{-\beta E_{\rm kin}} e^{-\beta E_{\rm pot}} = f_1(v) f_2(z)$$
so the probability factorizes into a ##v##-dependent and ##z##-dependent part, i.e. there is no correlation between kinetic energy and altitude.
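This factorization can be checked numerically: Metropolis-sample the joint weight as one unfactorized 2D density and verify that kinetic energy and altitude come out uncorrelated (a sketch with ##m = k_B = 1## and arbitrary step sizes, my own illustration):

```python
import math
import random

# Metropolis sampling of the joint Boltzmann weight exp(-beta*(v**2/2 + g*z))
# for one particle in a gravitational column (z >= 0). The sampler treats the
# weight as a single 2D density; the factorization shows up as a vanishing
# covariance between kinetic energy and altitude.

random.seed(1)
beta, g = 1.0, 1.0
v, z = 0.0, 1.0

def log_weight(v, z):
    return -beta * (0.5 * v * v + g * z) if z >= 0 else -math.inf

vs, zs = [], []
for step in range(200_000):
    v_new = v + random.uniform(-1, 1)
    z_new = z + random.uniform(-1, 1)
    if random.random() < math.exp(min(0.0, log_weight(v_new, z_new) - log_weight(v, z))):
        v, z = v_new, z_new
    if step >= 10_000:          # discard burn-in
        vs.append(v)
        zs.append(z)

n = len(vs)
ke = [0.5 * u * u for u in vs]
mean_ke = sum(ke) / n
mean_z = sum(zs) / n
cov = sum((a - mean_ke) * (b - mean_z) for a, b in zip(ke, zs)) / n
print(mean_ke, cov)  # mean KE near kT/2 = 0.5; covariance near 0
```

The mean kinetic energy matches equipartition for one translational degree of freedom, and the near-zero covariance is the "no average kinetic energy gradient" statement in sampled form.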
 
  • #22
dextercioby said:
What is the physical interpretation of temperature of a mass of liquid or a solid?

Usually, the mental image of temperature is this: an internal property of a bulk of matter that describes the average kinetic plus rotational/vibrational energy of its molecules
Demystifier said:
Temperature is not about average energy, but about energy distribution.
Vanadium 50 said:
I think it is better to think of temperature in terms of how it behaves, rather than as an "interpretation" in terms of some other quantity. Yes, for a gas, it's average kinetic energy, but lots of things aren't gases.
TeethWhitener said:
If you think of entropy as a function which counts the number of microstates of a macroscopic system, then the above definition has an intuitive interpretation: at very low temperatures, the number of possible microstates is quite low and increasing the energy of the system causes this number to increase rapidly. However, at very high temperatures, the number of possible microstates becomes very large and increasing the energy doesn’t open up as many new microstates proportionally.
It seems the original question was: what is the physical interpretation ("classical mechanical picture"?) of the "energy modes" in different phases (kinetic energy, potential energy, vibration of bond states, etc.)?

But I agree that the opposite is perhaps also an interesting question: what is the "information-theoretic" interpretation of energy modes? This is motivated by the fact that we already have an information-theoretic understanding of entropy, as the amount of missing information about the microstate, given the macrostate.

But temperature, as usually defined (as in post #3), is the inverse of the energy derivative of entropy. To get a similarly abstract interpretation of temperature, wouldn't it be nice to also get rid of the "kinetic energy" interpretations from mechanics?

The first thing that comes to my mind is that the relation between amount of energy and amount of entropy is quite similar to the relation between amount of memory and amount of information. We already have Landauer's principle, which relates the erasure of a given amount of memory to a given amount of heat. Can we take this a step further?

Another interesting association is that confined energy is related to inertia, even in classical mechanics. And in information processing, it seems that the more data you have, and the more evidence you have about something you inferred, the larger the resistance to revising it in the face of a smaller amount of contradicting information, right?

This is quite exciting. I was going to post this subquestion in the QM interpretations forum, but I am not sure it fits there either. It's a foundational, interpretational issue, but one that is not specific to QM.

So I suppose this results in the challenge: define "temperature" in abstract terms, without relating it to mechanical notions such as kinetic energy. Entropy is in principle already defined in abstract terms. But what about energy, and thus temperature?

/Fredrik
 
  • #23
I'm having difficulties relating temperature to potential energy in general.

It's clear that heat can be stored in the potential energy of molecular and atomic vibrations but I can't see how this would be true in general, for example for the potential energy of gas molecules in a gravitational field.

So in the first case an increase of average potential energy is related to an increase in temperature whereas in the second it's not, I would say.

I also get the impression that there's a qualitative difference between the potential energy in vibrations and that due to altitude, for example.
Normally the zero-point for potential energy is not defined, but for vibrations it seems you can define it very easily.
 
  • #24
The definition of temperature must be relative to a given microstructure (or phase), because entropy is. And "energy" must also implicitly refer to the energy stored in that specific microstructure. This is why the temperature of a microstructure isn't related to the energy stored in ANOTHER microstructure, except possibly via energy transfer, which I see conceptually as a phase transition. So the "energy" in the temperature definition should be counted per phase, or per microstructure.

The trouble with gravity is that I think its microstructure is unknown, or not well understood, and it seems to add another level of complexity. Would it make sense to see gravity as a continuous phase transition? It's an important question, though. I would personally attack that problem from the perspective of inertia, rather than as a fundamental interaction.

/Fredrik
 
  • #25
Lord Jestocost said:
Regarding the "physical interpretation of temperature", I would rely on “An Introduction to Thermal Physics” by Daniel V. Schroeder (Oxford University Press 2021). Schroeder's proposal for a theoretical definition of temperature is:

Temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings. When two objects are in thermal contact, the one that tends to spontaneously lose energy is at the higher temperature.

On the basis of this theoretical definition, one arrives at the thermodynamic definition of temperature:

"The temperature of a system is the reciprocal of the slope of its entropy vs. energy graph. The partial derivative is to be taken with the system's volume and number of particles held fixed;* more explicitly,
$$\frac 1 T\equiv\left( \frac {\partial S} {\partial U}\right)_{N,V} .$$"
The problem with definitions like that is that they don't really explain anything, in my opinion anyway.
This definition really only says that T is one of the quantities we study in thermodynamics and it's related to other quantities in a particular way.
After such a definition I would still ask: But what is temperature? ("actually")

However I do see the difficulties with defining T as average kinetic energy.
What's not clear to me is how small the objects have to be that have this energy.
For example if you had a mechanical construction with billions of small steel balls on springs, how small would the steel balls have to be so that you would assign a temperature to the vibration of these balls?
 
  • #26
Philip Koeck said:
How does the total entropy increase in that case?
In which of the two cases?

In general you start with ##dU = TdS +\overline{\mu}dN## (or an equivalent one if the volume or magnetization change, for example), you then compute the associated fluxes, and then use the continuity equation for the entropy. This equation gives you the rate of entropy production in any infinitesimal volume. Then you just have to integrate that expression over the whole volume. This gives you the answer you're looking for.

If there's an electric current, entropy will be generated due to Joule heat, and also due to Fourier's heat conduction.

There's nothing truly shocking about the fact that heat does not always flow from hot to cold. If you consider a "Peltier cell", it is a heat engine. Think of it like a fridge: you input current (energy) into it, and it performs its job of cooling down something that's already cold, i.e. it removes heat from something colder than its surroundings. The heat flux is therefore from the cold object towards the hotter one. The total entropy production is of course strictly positive in this case. No law of thermodynamics is broken.
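For the simplest instance of this recipe, pure Fourier conduction in 1D, the local entropy production is ##\sigma = \kappa (\nabla T)^2/T^2 \ge 0##, and its integral over the bar must match the reservoir bookkeeping ##q/T_c - q/T_h##. A sketch with assumed, copper-like numbers:

```python
# Steady 1D conduction through a bar between two reservoirs. The linear
# steady-state profile gives a constant heat flux q; integrating the local
# entropy production sigma = kappa*(dT/dx)**2 / T**2 over the bar reproduces
# the net entropy gained by the reservoirs. All numbers are assumptions.

kappa, L = 400.0, 1.0          # W/(m K), m
T_hot, T_cold = 400.0, 300.0   # K
q = kappa * (T_hot - T_cold) / L   # steady heat flux, W/m^2

n = 100_000
dx = L / n
dTdx = (T_cold - T_hot) / L        # constant for the linear profile
total = 0.0
for i in range(n):
    x = (i + 0.5) * dx             # midpoint rule
    T = T_hot + (T_cold - T_hot) * x / L
    total += kappa * dTdx**2 / T**2 * dx

reservoirs = q / T_cold - q / T_hot
print(total, reservoirs)  # both positive and numerically equal
```

The pointwise non-negativity of ##\sigma## is the local second law; the boundary bookkeeping is the global one, and the two agree.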
 
  • #27
Philip Koeck said:
The problem with definitions like that is that they don't really explain anything, in my opinion anyway.
This definition really only says that T is one of the quantities we study in thermodynamics and it's related to other quantities in a particular way.
After such a definition I would still ask: But what is temperature? ("actually")
This was partly what I meant above as well, and it then boils down to: what is entropy, and what is energy?

But you can question this definition at a deeper level. Here it seems that temperature, defined as such a derivative, describes how two microstructures "share energy" at equilibrium.

But what is this "energy" that they share, defined at some more abstract, information-theoretic level (the same level at which we already have the abstract definition of entropy)?

Right now, all we have is a semi-information-theoretic picture. This is not satisfactory.

/Fredrik
 
  • #28
fluidistic said:
In which of the two cases?
I meant the second example where you discuss anisotropic materials.
("... consider anisotropic materials, where kappa is a tensor. Write down Fourier's law in matrix form and you'll get that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot to cold direction.")

The example with an external current is understandable. It's just like a heat pump.
But in the above case I don't see any external work being done on the solid.
If heat flows spontaneously in a direction that's not hot to cold, I would expect an increase in entropy somewhere, for example from a phase change.
 
  • #29
Philip Koeck said:
However I do see the difficulties with defining T as average kinetic energy.
What's not clear to me is how small the objects have to be that have this energy.
For example if you had a mechanical construction with billions of small steel balls on springs, how small would the steel balls have to be so that you would assign a temperature to the vibration of these balls?
You could use the equipartition theorem as a starting point. Each little ball will have a translational energy of ##\frac{3}{2}k_BT## on average. If you have several thousand little balls at room temperature, that’s not much energy. If you have ##10^{23}## balls at room temperature, that’s significantly more energy.
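A back-of-envelope version of this estimate (real ##k_B##; the ball counts are the arbitrary values from the discussion):

```python
# Equipartition assigns (3/2) k_B T of average translational energy per ball;
# summing over N balls shows how the total only becomes macroscopic for
# mole-sized collections.

k_B = 1.380649e-23   # J/K, Boltzmann constant (exact SI value)
T = 300.0            # K, room temperature

for N in (1e4, 1e23):
    E_total = 1.5 * k_B * T * N
    print(f"N = {N:.0e}: total translational energy = {E_total:.2e} J")
# N = 1e4 gives ~6e-17 J; N = 1e23 gives ~6e2 J.
```

So a "thermal" description of the steel-ball contraption is a question of how many degrees of freedom share the energy, not of the balls' absolute size.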
 
  • #30
fluidistic said:
There's nothing truly shocking in that heat does not always flow from hot to cold.
In the case where ##1/T## is the only driving thermodynamic force?
 
  • #31
Philip Koeck said:
The problem with definitions like that is that they don't really explain anything, in my opinion anyway.
"Temperature is expressed as the inverse of the rate of change of entropy with internal energy, with volume V and number of particles N held constant. This is certainly not as intuitive as molecular kinetic energy, but in thermodynamic applications it is more reliable and more general."

http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/temper2.html
 
  • #32
Fra said:
The first thing that comes to my mind is that the relation between amount of energy and amount of entropy is quite similar to the relation between amount of memory and amount of information.
Robert Alicki and Michal Horodecki in "Information-thermodynamics link revisited" (https://arxiv.org/abs/1905.11057v1):

"Only information carriers are physical

It is true that, as Landauer wrote : '[Information] is always tied to a physical representation. It is represented by
engraving on a stone tablet, a spin, a charge, a hole in a punched card, a mark on paper, or some other equivalent. This ties the handling of information to all the possibilities and restrictions of our real physical word, its laws of physics and its storehouse'.
However, the legitimate questions concern the physical properties of information carriers like 'stone tablet, a spin, a charge, a hole in a punched card, a mark on paper', but not the information itself."

[Bold by LJ]
 
  • #33
Philip Koeck said:
I meant the second example where you discuss anisotropic materials.
("... consider anisotropic materials, where kappa is a tensor. Write down Fourier's law in matrix form and you'll get that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot to cold direction.")

The example with an external current is understandable. It's just like a heat pump.
But in the above case I don't see any external work being done on the solid.
If heat flows spontaneously in a direction that's not hot to cold, I would expect an increase in entropy somewhere, for example from a phase change.
Using my intuition only, rather than solving the heat problem (I could do it in a matter of minutes if I had access to my CPU), I would say that a bar connected between a hot reservoir at one end and a cold reservoir at the other would end up with a temperature distribution that increases as one moves towards the hotter end, but that would also be hotter on, say, the top of the bar and colder on its bottom, in the case where the thermal conductivity is anisotropic. The temperature gradient would not point in the direction from the hot to the cold reservoir.
 
  • #34
Lord Jestocost said:
Robert Alicki and Michal Horodecki in "Information-thermodynamics link revisited" (https://arxiv.org/abs/1905.11057v1):
Thanks, I don't think I've read that paper; I'll read it and see if there is something new in there.
Lord Jestocost said:
However, the legitimate questions concern the physical properties of information carriers like 'stone tablet, a spin, a charge, a hole in a punched card, a mark on paper', but not the information itself."
[Bold by LJ]
I agree, which also motivates my stance on all this. The physical properties of information carriers are precisely the microstructure of the physical "agents/observers", in my view. In a conservative view, it's easy to think, in regular quantum information theory, that the information carriers are classical, or at least macroscopic, as otherwise they fail to be reliable. But I think that will not work, so the story has to be more complicated.

/Fredrik
 
  • #35
Philip Koeck said:
I meant the second example where you discuss anisotropic materials.
("... consider anisotropic materials, where kappa is a tensor. Write down Fourier's law in matrix form and you'll get that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot to cold direction.")

The example with an external current is understandable. It's just like a heat pump.
But in the above case I don't see any external work being done on the solid.
If heat flows spontaneously in a direction that's not hot to cold, I would expect an increase in entropy somewhere, for example from a phase change.
I gave you a recipe to compute the entropy increase (over time) in the general case. You can certainly apply it to the case of a bar placed between 2 reservoirs at different temperatures, where the thermal conductivity of the bar is anisotropic. We can keep it simple by considering a 2D case and a crystal orientation such that kappa is a 2x2 matrix like so:
##\kappa =
\begin{bmatrix}
\kappa_{xx} & 0\\
0 & \kappa_{yy}
\end{bmatrix}
##
You can quickly get an analytical formula for the general expression for ##T(x,y)## and you'll see that this temperature distribution depends on both ##x## and ##y## and that the thermal gradient doesn't align along ##\hat x##.
Now onto the entropy question. The continuity equation is ##-\dot S =\nabla \cdot \vec J_S=-\nabla \cdot \left( \frac{\kappa}{T} \nabla T \right)##. That's the entropy production. You have everything in hand, and you can work out every detail of the problem you set up. As you can see, there is an entropy production due to the thermal gradient; this is why I mentioned earlier in this thread that there's an entropy production term due to Fourier's conduction term. The "external work being done on the solid" might be the heat flux, i.e. energy, entering the system.
To answer Lord Jestocost's question: yes, the driving force is ##1/T## (or ##T##, depending on your preferred convention).
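A minimal sketch of the anisotropy point, assuming an illustrative conductivity tensor (diagonal in the crystal frame, crystal rotated 30° with respect to the bar): the flux ##\vec q = -\kappa \nabla T## acquires a transverse component while still satisfying ##\vec q \cdot \nabla T < 0## locally.

```python
import math

# Illustrative anisotropic conduction (all numbers assumed): kappa is diagonal
# in the crystal frame, but the crystal is rotated by 30 degrees relative to
# the lab axes, so the lab-frame tensor K_lab = R @ K_crystal @ R^T has
# off-diagonal entries and q = -K_lab @ grad(T) is not antiparallel to grad(T).

th = math.radians(30)
c, s = math.cos(th), math.sin(th)
R = [[c, -s], [s, c]]
K_crystal = [[10.0, 0.0], [0.0, 1.0]]   # W/(m K), strongly anisotropic

tmp = [[sum(R[i][k] * K_crystal[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
K_lab = [[sum(tmp[i][k] * R[j][k] for k in range(2)) for j in range(2)] for i in range(2)]

grad_T = [-100.0, 0.0]   # K/m: temperature drops along +x only
q = [-(K_lab[i][0] * grad_T[0] + K_lab[i][1] * grad_T[1]) for i in range(2)]

print(q)  # nonzero q[1]: transverse heat flux, though grad(T) is along x
# Locally the second law still holds, since q . grad(T) < 0.
```

This is the tensor version of the statement above: the flux tilts away from the imposed gradient, yet the local entropy production ##\vec q \cdot \nabla(1/T)## stays positive.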
 
  • #36
Lord Jestocost said:
Robert Alicki and Michal Horodecki in "Information-thermodynamics link revisited" (https://arxiv.org/abs/1905.11057v1):

"The discussion of above suggests the following picture. If by information we mean the stable information i.e. one that can be reliably stored and transmitted, rather than information carried by fast changing random configurations of physical systems, we expect that the following holds"

As I suspected, that sounds like the picture of classical information, encoded in the common classical/macroscopic environment, i.e. "objective information" that can be communicated between observers.

To me, intrinsic information is what single physical agents/observers represent. And they can't clone their states without participating in an interaction that changes something. So a lot of their premises put me off. Also, what is "random" is in the eye of the beholder: random is just a generic word for what the agent fails to distinguish from noise, which may very well be due to limited computational power.

/Fredrik
 
  • #37
fluidistic said:
I gave you a recipe to compute the entropy increase (over time) in the general case. You can certainly apply it to the case of a bar placed between 2 reservoirs at different temperatures, where the thermal conductivity of the bar is anisotropic. We can keep it simple by considering a 2D case and a crystal orientation such that kappa is a 2x2 matrix like so:
##\kappa =
\begin{bmatrix}
\kappa_{xx} & 0\\
0 & \kappa_{yy}
\end{bmatrix}
##
You can quickly get an analytical formula for the general expression for ##T(x,y)## and you'll see that this temperature distribution depends on both ##x## and ##y## and that the thermal gradient doesn't align along ##\hat x##.
Now onto the entropy question. The continuity equation is ##-\dot S =\nabla \cdot \vec J_S=-\nabla \cdot \left( \frac{\kappa}{T} \nabla T \right)##. That's the entropy production. You have everything in hand, and you can work out every detail of the problem you set up. As you can see, there is an entropy production due to the thermal gradient; this is why I mentioned earlier in this thread that there's an entropy production term due to Fourier's conduction term. The "external work being done on the solid" might be the heat flux, i.e. energy, entering the system.
To answer Lord Jescott's question, yes, the driving force is 1/T (or T, depending on your preferred convention).
If I understand you correctly then the heat flow is actually from hot to cold, just not in the direction of the steepest temperature gradient.
If that's true then there's no problem with the second law and the temperature is the driving force.
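The recipe above can be sanity-checked numerically. Here is a minimal sketch with hypothetical, illustrative numbers (a 1D isotropic bar, not the anisotropic 2D case from the thread): the volume integral of the local entropy production ##\kappa (\nabla T)^2/T^2## should equal the net entropy exchanged with the two reservoirs, ##J_Q A (1/T_c - 1/T_h)##, and come out positive, consistent with the second law.

```python
import numpy as np

# Hypothetical 1D example (numbers are illustrative, not from the thread):
# an isotropic bar between two reservoirs, in steady state.
kappa = 400.0             # W/(m K), thermal conductivity
L = 0.1                   # m, bar length
A = 1e-4                  # m^2, cross-section area
T_h, T_c = 400.0, 300.0   # K, reservoir temperatures

# With constant kappa the steady-state profile is linear and J_Q is uniform.
N = 1000
x = np.linspace(0.0, L, N)
T = T_h + (T_c - T_h) * x / L
dTdx = (T_c - T_h) / L
J_Q = -kappa * dTdx                      # W/m^2, heat flux (hot -> cold)

# Local entropy production rate: sigma = kappa (dT/dx)^2 / T^2  [W/(K m^3)]
sigma = kappa * dTdx**2 / T**2
# Trapezoidal integration over the bar volume
S_dot_volume = A * float(np.sum(0.5 * (sigma[1:] + sigma[:-1]) * np.diff(x)))

# Global balance: entropy leaves the hot reservoir at rate J_Q A / T_h and
# enters the cold one at J_Q A / T_c; the difference is the net production.
S_dot_balance = J_Q * A * (1.0 / T_c - 1.0 / T_h)

print(f"entropy production: {S_dot_volume:.5f} W/K vs balance {S_dot_balance:.5f} W/K")
```

Both numbers agree and are positive, so the Fourier conduction term produces entropy exactly as claimed.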
 
  • Like
Likes russ_watters and Lord Jestocost
  • #38
Philip Koeck said:
If I understand you correctly then the heat flow is actually from hot to cold, just not in the direction of the steepest temperature gradient.
If that's true then there's no problem with the second law and the temperature is the driving force.
Hmm, I wouldn't say that's what I'm saying. I'm saying that if you connect 2 reservoirs at different temperatures via an anisotropic (in the sense that kappa is anisotropic) material, then the heat flux, at any point in the bar, will not point in the direction of "hot reservoir to cold reservoir". It will point in a direction such that there will be a component towards the cold reservoir, but also a component inward/outward the bar itself. This will disturb the temperature distribution in a way that T(x), if you will, will not be linear in x. In fact it would also depend on y, with parameters involving ##\kappa_{xx}##, ##\kappa_{yy}##, ##L## (length of the bar), ##l## (width of the bar), and even ##\kappa_{xy}## if the crystal making up the bar isn't oriented with a crystallographic axis along the bar itself.

I'm also saying that it's possible to have a bar in which heat flows from cold to hot. This can happen if you don't ignore thermoelectricity. In this case the "energy input" required to make this happen is also the heat flux entering the hot side of the bar.

In all cases, I am claiming that no thermodynamics law is broken, yes. You can convince yourself by computing ##\dot S## the way I suggested above. And saying that heat flows from hot to cold is not correct (unless you stick to simple cases). As a general statement, this is false.
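The misalignment claim is easy to see with a tiny numerical sketch (the tensor entries below are hypothetical, for illustration only): with an anisotropic ##\kappa##, Fourier's law ##\vec J_Q = -\kappa \nabla T## gives a flux that is tilted away from ##-\nabla T##, yet ##\vec J_Q \cdot (-\nabla T) = \kappa_{xx} g_x^2 + \kappa_{yy} g_y^2 > 0##, so heat still goes "downhill" overall and no thermodynamics law is broken.

```python
import numpy as np

# Hypothetical anisotropic conductivity tensor (illustrative numbers only)
kappa = np.array([[5.0, 0.0],
                  [0.0, 1.0]])      # W/(m K), kappa_xx != kappa_yy

# A local temperature gradient with both x and y components
gradT = np.array([1.0, 1.0])        # K/m (only the direction matters here)

J_Q = -kappa @ gradT                # Fourier's law with a tensor kappa

# Angle between the heat flux and the steepest-descent direction -grad T
cos_angle = (J_Q @ (-gradT)) / (np.linalg.norm(J_Q) * np.linalg.norm(gradT))
angle_deg = float(np.degrees(np.arccos(cos_angle)))

# J_Q . (-grad T) stays positive: the flux has a component down the
# gradient even though it is not parallel to it.
print(f"misalignment angle: {angle_deg:.1f} degrees")
```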
 
  • Like
Likes dextercioby
  • #39
fluidistic said:
I'm also saying that it's possible to have a bar in which heat flows from cold to hot. This can happen if you don't ignore thermoelectricity. In this case the "energy input" required to make this happen is also the heat flux entering the hot side of the bar.
Interesting!
Let's be very specific. If the bar is only in contact with the hot and cold reservoir at its ends and otherwise thermally insulated from the surroundings, is it possible for heat to flow from the cold to the hot reservoir without any sort of work from outside (such as a current from a battery)?
 
  • Like
Likes russ_watters
  • #40
Philip Koeck said:
Interesting!
Let's be very specific. If the bar is only in contact with the hot and cold reservoir at its ends and otherwise thermally insulated from the surroundings, is it possible for heat to flow from the cold to the hot reservoir without any sort of work from outside (such as a current from a battery)?
Yes, it is. I am in a hospital for a few days so no time to write a LaTeX answer. The short answer is that the heat flux from the hot reservoir provides the energy for the thermoelectric material to operate, even though it is only a bar. If you do the math you get that if ZT, the figure of merit, is greater than 1, and if the TE material is short-circuited with itself so as to allow an electric current through itself thanks to the voltage generated by the Seebeck effect, then heat can flow from cold to hot (depending on the sign of S).
 
  • #41
fluidistic said:
... if the TE material is short circuited with itself so as to allow an electric current through itself thanks to the voltage generated by the Seebeck effect, then heat can flow from cold to hot (depending on the sign of S).
Could you explain what you mean by S? Is it entropy or change of entropy or something else?
If it's entropy, the entropy of what?

Without doing any calculation, what you're describing, in effect, is that current is flowing due to the thermoelectric effect and at the same time heat is flowing from cold to hot, all without an external driving force such as a battery to produce the current.
Is that what you are saying?
 
  • #42
DrClaude said:
Please, let's get away from the "temperature is kinetic energy" high-school narrative.

The simple picture is
While this addresses two substances at different temperatures, it does not explain the actual temperature of either, which is their respective thermal kinetic energy. This is not a "high school narrative".
 
  • #43
Philip Koeck said:
Could you explain what you mean by S? Is it entropy or change of entropy or something else?
If it's entropy, the entropy of what?

Without doing any calculation, what you're describing, in effect, is that current is flowing due to the thermoelectric effect and at the same time heat is flowing from cold to hot, all without an external driving force such as a battery to produce the current.
Is that what you are saying?
Yes, except that the driving force exists: it is the heat flux from the hot reservoir into the bar, which inputs energy into the heat engine, which is the bar itself.
The S I referred to in your quote is the Seebeck coefficient (which I treated as a scalar for simplicity, but which is a tensor in the most general case).
 
  • #44
Hillbillychemist said:
While this addresses two substances at different temperatures, it does not explain the actual temperature of either, which is their respective thermal kinetic energy. This is not a "high school narrative".
Where is the kinetic energy in spin temperature?
 
  • #45
Don't you think you are getting a little far afield from

"Interpretation of temperature in liquids/solids"?​

 
  • #46
Hillbillychemist said:
Don't you think you are getting a little far afield from

"Interpretation of temperature in liquids/solids"?​

Spin temperature is relevant in some solids.

In any case, to understand temperature in systems other than ideal gases, one must go beyond the high-schoolish "temperature is kinetic energy".
 
  • Like
Likes russ_watters, Lord Jestocost and weirdoguy
  • #47
DrClaude said:
Spin temperature is relevant in some solids.

In any case, to understand temperature in systems other than ideal gases, one must go beyond the high-schoolish "temperature is kinetic energy".
Only if the situation requires it.
 
  • #48
fluidistic said:
Yes, except that the driving force exists: it is the heat flux from the hot reservoir into the bar, which inputs energy into the heat engine, which is the bar itself.
The S I referred to in your quote is the Seebeck coefficient (which I treated as a scalar for simplicity, but which is a tensor in the most general case).
I'm still missing something here.
Assume again the thermoelectric bar with its ends touching a hot and a cold reservoir, but otherwise thermally insulated.
The heat current from the hot reservoir drives the TE-current (assuming we've short circuited).
The heat current has nowhere else to go other than to the cold reservoir since the bar is insulated everywhere else.
This is a kind of heat engine, just like you say.
Now the electric current produces a heat current due to the Peltier effect.
I guess this second heat current could partly cancel the initial heat current that's due to the temperature gradient, but never completely.
If it did you would get unlimited electricity for free.

So in total, I would say, heat flows only from hot to cold even in a TE-device.
 
  • #49
Philip Koeck said:
I'm still missing something here.
Assume again the thermoelectric bar with its ends touching a hot and a cold reservoir, but otherwise thermally insulated.
The heat current from the hot reservoir drives the TE-current (assuming we've short circuited).
The heat current has nowhere else to go other than to the cold reservoir since the bar is insulated everywhere else.
This is a kind of heat engine, just like you say.
Now the electric current produces a heat current due to the Peltier effect.
I guess this second heat current could partly cancel the initial heat current that's due to the temperature gradient, but never completely.
If it did you would get unlimited electricity for free.

So in total, I would say, heat flows only from hot to cold even in a TE-device.
You are missing that the heat engine can do work, so not all the heat from the hot reservoir goes into the cold one.
It may be hard to visualize without a sketch, but here is a possible setup. Place a Bi2Te3 bar between 2 reservoirs kept at different fixed temperatures. The Seebeck effect is going to produce an emf (roughly) worth ##S\Delta T##. This comes from the generalized Ohm's law ##\vec J=-\sigma \nabla \overline{\mu}/e-\sigma S \nabla T## (in open circuit condition, so J equals 0). So it's like a battery in some way. Now you attach a Cu wire (whose Seebeck coefficient is near 0 V/K compared to the Bi2Te3 one) near the top and the bottom (say near the 2 reservoirs, but you're not obliged to; at the junctions there will be heat released/absorbed), to close the electric circuit. You can assume this Cu wire has a resistance Rload. The electric current going through the TE material and the Cu wire is ##I=S\Delta T/(R_{TE}+Rload)=A|\vec J_e|## where A is the cross-section area.
Now if you look at the heat flux ##\vec J_Q=-\kappa \nabla T +ST\vec J_e## in the Bi2Te3 material, and look for the condition where it flows against the ##-\nabla T## direction, you'll get that (in the case where Rload is negligible compared to ##R_{TE}##, which is feasible in practice) ##ZT>1## where Z is the usual thermoelectric figure of merit. Bi2Te3 satisfies this condition near room temperature. I have no time to show you the math, but it's extremely easy to do.
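The ##ZT>1## condition can be checked with a quick numerical sketch. The material parameters below are hypothetical, Bi2Te3-like order-of-magnitude values (not taken from the thread). With the bar short-circuited (Rload negligible against ##R_{TE}##), the Peltier term ##ST\vec J_e## opposes conduction, and the net heat flux changes sign exactly when ##ZT>1##:

```python
# Hypothetical Bi2Te3-like parameters (order-of-magnitude values, not measured data)
S = 210e-6        # V/K, Seebeck coefficient
sigma = 1.1e5     # S/m, electrical conductivity
kappa = 1.3       # W/(m K), thermal conductivity
T = 300.0         # K, mean temperature
L = 0.01          # m, bar length
dT = 10.0         # K, T_hot - T_cold

ZT = S**2 * sigma * T / kappa     # thermoelectric figure of merit

# Short-circuited bar (Rload << R_TE): |J_e| = sigma S dT / L, and inside
# the "battery" the current runs from the cold end toward the hot end.
J_e = sigma * S * dT / L          # A/m^2

# Heat flux along the bar, counting the hot -> cold direction as positive:
# conduction kappa dT / L (hot -> cold) minus the Peltier term S T J_e.
J_Q = kappa * dT / L - S * T * J_e

# Note J_Q = (kappa dT / L) * (1 - ZT), so J_Q < 0 exactly when ZT > 1.
print(f"ZT = {ZT:.2f}, J_Q = {J_Q:.0f} W/m^2")
```

With these numbers ZT comes out slightly above 1 and J_Q turns negative, i.e. the net heat flux in the bar points from cold to hot, as described above.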

Note that there is no Peltier effect right at the reservoirs/Bi2Te3, since there is no electric current through the reservoirs. The Peltier effect manifested as heat released/absorbed is at the Cu/Bi2Te3 junctions.

Things are in reality severely more complicated, though; the end result is that one can make almost anything one can imagine. (To open a can of worms and wormholes, one should consider the Thomson effect whenever a current and a thermal gradient coexist, as in our case; we can tune the temperature gradient to our liking, and hence the Thomson heat released/absorbed locally to our liking. But that's only the visible scratch. Researchers have overlooked other thermoelectric effects that exist but are not reported in the literature yet. I will not go there; this is not allowed in this forum.)
 
  • #50
fluidistic said:
Researchers have overlooked other thermoelectric effects that exist but are not reported in the literature yet. I will not go there; this is not allowed in this forum.
Is that because those effects would lead to perpetual motion machines?
 
  • Like
  • Haha
Likes russ_watters, DrClaude and fluidistic
Back
Top