Interpretation of temperature in liquids/solids

In summary, the temperature of a mass of liquid or of a solid can be physically interpreted as a measure of the tendency of an object to spontaneously give up energy to its surroundings. This is captured by the thermodynamic definition of temperature as the reciprocal of the slope of its entropy vs. energy graph. However, in more complex systems such as liquids and solids, the energy is distributed among different degrees of freedom, making it difficult to define temperature simply as an average kinetic energy. In these systems, temperature is better understood through how it behaves, for example in heat flow and thermoelectricity.
  • #1
dextercioby
TL;DR Summary
What is the physical interpretation of temperature of a mass of liquid or a solid?
Usually, the mental image of temperature is: an internal property of a bulk of matter, which typically describes the average kinetic plus rotational/vibrational energy of the molecules. So we imagine a gas in which temperature is a measure of how quickly the molecules move and how frequently they collide with one another. The higher the temperature, the more energy the molecules have.

Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by vdW forces and "hydrogen bonds"?
 
  • #2
The simplest interpretation is that "temperature is the average KE of the molecules". That may be too simple for some, but once you start considering changes of phase, it becomes complicated. It then requires a deeper energy analysis than temperature alone can provide.
 
  • #3
dextercioby said:
TL;DR Summary: What is the physical interpretation of temperature of a mass of liquid or a solid?

Usually, the mental image of temperature is: an internal property of a bulk of matter, which typically describes the average kinetic plus rotational/vibrational energy of the molecules. So we imagine a gas in which temperature is a measure of how quickly the molecules move and how frequently they collide with one another. The higher the temperature, the more energy the molecules have.

Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by vdW forces and "hydrogen bonds"?
Regarding the "physical interpretation of temperature", I would rely on “An Introduction to Thermal Physics” by Daniel V. Schroeder (Oxford University Press 2021). Schroeder's proposal for a theoretical definition of temperature is:

Temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings. When two objects are in thermal contact, the one that tends to spontaneously lose energy is at the higher temperature.

On the basis of this theoretical definition, one arrives at the thermodynamic definition of temperature:

"The temperature of a system is the reciprocal of the slope of its entropy vs. energy graph. The partial derivative is to be taken with the system's volume and number of particles held fixed;* more explicitly,
$$\frac 1 T\equiv\left( \frac {\partial S} {\partial U}\right)_{N,V} .$$"
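For concreteness, here is a minimal numerical sketch of this reciprocal-slope definition (my own toy example, not from the thread or the book): it uses an Einstein solid, whose multiplicity is known in closed form, with an arbitrary energy quantum and oscillator count.

```python
# Sketch: estimate T = (dS/dU)^{-1} for an Einstein solid toy model.
# The multiplicity of N oscillators sharing q energy quanta is C(q + N - 1, q).
from math import comb, log

k_B = 1.380649e-23   # Boltzmann constant, J/K
eps = 1.0e-21        # energy quantum per oscillator, J (made-up value)
N = 100              # number of oscillators (made-up value)

def entropy(q):
    """S = k_B * ln(Omega), with Omega = C(q + N - 1, q)."""
    return k_B * log(comb(q + N - 1, q))

def temperature(q):
    """1/T = dS/dU; estimated with a centered difference, since dU = eps per quantum."""
    dS_dq = (entropy(q + 1) - entropy(q - 1)) / 2.0
    return eps / dS_dq

for q in (10, 100, 1000):
    print(f"q = {q:5d}   T ≈ {temperature(q):7.1f} K")
```

Adding energy (larger ##q##) makes the entropy-vs-energy curve flatter, so the computed temperature rises, exactly as the definition above says.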
 
  • Like
Likes russ_watters, TeethWhitener, jbriggs444 and 3 others
  • #4
I think it is better to think of temperature as how it behaves, rather than as an "interpretation" in terms of some other quantity. Yes, for a gas, it's the average kinetic energy, but lots of things aren't gases.

Temperature determines the direction of heat flow, from higher temperature to lower temperature.
 
  • Like
Likes DrClaude
  • #5
Temperature is the average kinetic energy, whether it is expressed as translation, as in liquids and gases, or as vibration, as in solids.
 
  • Like
Likes Baluncore
  • #6
Please, let's get away from the "temperature is kinetic energy" high-school narrative.

The simple picture is:
Lord Jestocost said:
Temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings. When two objects are in thermal contact, the one that tends to spontaneously lose energy is at the higher temperature.
 
  • Like
Likes Demystifier, PhDeezNutz, Lord Jestocost and 2 others
  • #7
I am sorry, no disrespect meant, but what makes you think I have not asked about the complicated phenomenological explanation?
 
  • #8
DrClaude said:
Please, let's get away from the "temperature is kinetic energy" high-school narrative.
How is the energy transferred if it is not by kinetic interaction?
 
  • #9
Baluncore said:
How is the energy transferred if it is not by kinetic interaction?
Radiation, for one.
 
  • Like
Likes DaveE, Klystron and Lord Jestocost
  • #10
dextercioby said:
I am sorry, no disrespect meant, but what makes you think I have not asked about the complicated phenomenological explanation?
My post wasn't addressing the OP. Let me do that now.
dextercioby said:
Usually, the mental image of temperature is: an internal property of a bulk of matter, which typically describes the average kinetic plus rotational/vibrational energy of the molecules. So we imagine a gas in which temperature is a measure of how quickly the molecules move and how frequently they collide with one another. The higher the temperature, the more energy the molecules have.
I would consider the energy to be the "internal property of a bulk of matter." Generally speaking, an increase in temperature means an increase in energy, but the two things are not the same. This is exemplified by heat capacity: the amount of energy necessary for a certain change in temperature is different for different substances. Then you also have to account for phase transitions, where a lot of energy can flow without a change in temperature.
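As a quick back-of-the-envelope illustration of that heat-capacity point (standard textbook specific heats; the 10 K change is an arbitrary choice):

```python
# Energy needed to warm 1 kg of each substance by the same 10 K.
specific_heat = {"water": 4186.0, "iron": 449.0}   # J/(kg K), typical tabulated values
mass, dT = 1.0, 10.0                               # kg, K

for substance, c in specific_heat.items():
    print(f"{substance}: Q = m*c*dT = {mass * c * dT:.0f} J")
```

Same temperature change, roughly an order of magnitude difference in the energy required.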

This is why I like Schroeder's simple picture of temperature describing the tendency of a thermodynamic system to exchange energy with another.

dextercioby said:
Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by vdW forces and "hydrogen bonds"?
In more complex systems than ideal gases, the energy is distributed among the different degrees of freedom.
 
  • Like
Likes DaveE, russ_watters, Astronuc and 1 other person
  • #11
Vanadium 50 said:
I think it is better to think of temperature as how it behaves, rather than as an "interpretation" in terms of some other quantity. Yes, for a gas, it's the average kinetic energy, but lots of things aren't gases.

Temperature determines the direction of heat flow, from higher temperature to lower temperature.
It's not as simple as that. Heat can flow in any direction in a solid, and in particular against the hot-to-cold direction.

Take Fourier's law, ##\vec q = -\kappa \nabla T##. Then remember that thermoelectricity exists, so consider the generalized Fourier's law ##\vec q = -\kappa \nabla T + S T \vec J##. The first term is the usual Fourier term; the second is a Peltier heat, which exists whenever there is an electric current. ##S## can have either sign, ##\vec J## can have any direction, and ##S## can even be a non-symmetric tensor.

That, or consider anisotropic materials, where ##\kappa## is a tensor. Write down Fourier's law in matrix form and you'll find that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot-to-cold direction.
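Here is a minimal numerical sketch of that last point (made-up conductivity values and gradient; not a solution of any specific boundary-value problem): with a tensor ##\kappa##, the heat flux is generally not antiparallel to the temperature gradient.

```python
# Fourier's law in matrix form: q = -kappa @ grad_T, with anisotropic kappa.
import numpy as np

kappa = np.array([[10.0, 0.0],
                  [0.0,  1.0]])        # W/(m K), made-up principal conductivities
grad_T = np.array([1.0, 1.0])          # K/m, gradient at 45 degrees to the axes

q = -kappa @ grad_T                    # heat flux vector
cos_angle = q @ (-grad_T) / (np.linalg.norm(q) * np.linalg.norm(grad_T))
print("q =", q)                                            # [-10.  -1.]
print("angle between q and -grad_T:",
      round(float(np.degrees(np.arccos(cos_angle))), 1), "deg")
```

The flux picks up a component transverse to the "hot-to-cold" direction defined by ##-\nabla T##.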
 
  • Like
Likes DaveE and dextercioby
  • #12
It is true that things are complicated. Negative absolute temperatures are another complication. But I think it is better to go from B-level to I-level to A-level definitions than to get to the end before the poster has reached the middle.
 
  • Like
Likes DrClaude and Bystander
  • #13
Given that @dextercioby is already an SA/HH and is asking for an intermediate answer, I think really internalizing the definition from @Lord Jestocost ’s post will help immensely. Essentially, it says that if you increase the energy of a system and its entropy only goes up a little bit, then the system has a high temperature. Conversely, if you increase the energy and its entropy goes up a lot, it’s at a low temperature.

If you think of entropy as a function which counts the number of microstates of a macroscopic system, then the above definition has an intuitive interpretation: at very low temperatures, the number of possible microstates is quite low and increasing the energy of the system causes this number to increase rapidly. However, at very high temperatures, the number of possible microstates becomes very large and increasing the energy doesn’t open up as many new microstates proportionally.
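A small sketch of that microstate-counting picture (again using an Einstein solid with a made-up number of oscillators as a stand-in for "a macroscopic system"): the relative gain in the number of microstates per added unit of energy is large at low energy and approaches 1 at high energy.

```python
# Omega(q) = C(q + N - 1, q) microstates for N oscillators holding q energy quanta.
# The ratio Omega(q+1)/Omega(q) = (q + N)/(q + 1) measures how many new microstates
# one extra unit of energy opens up, relative to what is already there.
from math import comb

N = 50
for q in (1, 10, 100, 10_000):
    ratio = comb(q + N, q + 1) / comb(q + N - 1, q)
    print(f"q = {q:6d}   Omega(q+1)/Omega(q) = {ratio:.3f}")
```

Large ratio at small ##q## (low temperature: entropy rises steeply with energy), ratio near 1 at large ##q## (high temperature: entropy barely rises).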
 
  • Like
Likes Fra and DrClaude
  • #14
TeethWhitener said:
Given that @dextercioby is already an SA/HH and is asking for an intermediate answer, I think really internalizing the definition from @Lord Jestocost ’s post will help immensely. Essentially, it says that if you increase the energy of a system and its entropy only goes up a little bit, then the system has a high temperature. Conversely, if you increase the energy and its entropy goes up a lot, it’s at a low temperature.

If you think of entropy as a function which counts the number of microstates of a macroscopic system, then the above definition has an intuitive interpretation: at very low temperatures, the number of possible microstates is quite low and increasing the energy of the system causes this number to increase rapidly. However, at very high temperatures, the number of possible microstates becomes very large and increasing the energy doesn’t open up as many new microstates proportionally.
Taking this a step further, we can see that heat flow from high temperature to low temperature is simply a reflection of the second law of thermodynamics. When an object at high temperature comes into contact with an object at low temperature, the energy flows in the direction required by the second law, ##\Delta S\geq0##. Since a change in energy is associated with a much larger change in entropy for low-temperature objects than for high-temperature objects, the increase in energy will have to happen to the low-temperature object in order to be consistent with ##\Delta S\geq0##, so heat will flow from hot to cold.
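To spell out that step in one line (treating both temperatures as approximately constant during a small exchange, with ##Q## the heat received by the colder object):
$$\Delta S_{\rm tot} \approx -\frac{Q}{T_{\rm hot}} + \frac{Q}{T_{\rm cold}} = Q\left(\frac{1}{T_{\rm cold}} - \frac{1}{T_{\rm hot}}\right) \geq 0 \quad\Rightarrow\quad Q \geq 0 \ \text{ whenever } T_{\rm hot} > T_{\rm cold}.$$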
 
  • #15
TeethWhitener said:
Taking this a step further, we can see that heat flow from high temperature to low temperature is simply a reflection of the second law of thermodynamics. When an object at high temperature comes into contact with an object at low temperature, the energy flows in the direction required by the second law, ##\Delta S\geq0##. Since a change in energy is associated with a much larger change in entropy for low-temperature objects than for high-temperature objects, the increase in energy will have to happen to the low-temperature object in order to be consistent with ##\Delta S\geq0##, so heat will flow from hot to cold.
Could heat flow from a colder to a warmer object if the warmer object increases the total entropy, for example by evaporating?
 
  • #16
Philip Koeck said:
Could heat flow from a colder to a warmer object if the warmer object increases the total entropy, for example by evaporating?
No. Of course, the warmer object will see its temperature decrease due to evaporation, and at some point it might cool down below the colder object's temperature.
 
  • Like
Likes russ_watters, TeethWhitener and Lord Jestocost
  • #17
DrClaude said:
No. Of course, the warmer object will see its temperature decrease due to evaporation, and at some point it might cool down below the colder object's temperature.
But according to post 11 by @fluidistic something like that, just without a phase change, can happen in solids.
Would you agree with that?

I'm just trying to make up my mind whether heat always flows from hot to cold.
Most of the posts in this thread seem to say yes, but post 11 says no.
 
  • #18
fluidistic said:
It's not as simple as that. Heat can flow in any direction in a solid, and in particular against the hot-to-cold direction.

That, or consider anisotropic materials, where ##\kappa## is a tensor. Write down Fourier's law in matrix form and you'll find that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot-to-cold direction.
How does the total entropy increase in that case?
 
  • Like
Likes Lord Jestocost
  • #19
dextercioby said:
Let's switch to a liquid at room temperature (water) or a crystalline solid (a bar of pure iron). How would you define their temperature? Would it, for example, be a measure of the average energy of the "electron gas" in the conduction band? What about liquid water, which lacks a rigid crystalline structure, being just a scatter of molecules held together by vdW forces and "hydrogen bonds"?
Temperature is not about average energy, but about energy distribution. In thermal equilibrium, the probability of being in a given state of energy ##E## is proportional to ##e^{-E/kT}##. The temperature ##T## is defined by this exponential distribution. If the energy is not distributed this way, then the system is not in thermal equilibrium and hence does not have a temperature. If a system out of thermal equilibrium interacts with a thermometer, the thermometer will still show some value, but this value is not really the temperature in the above sense.
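A small sketch of how that distribution defines ##T## (a toy set of equally spaced energy levels with made-up spacing; not any specific material): if the populations are Boltzmann-distributed, the slope of ##\ln p_n## versus ##E_n## recovers ##-1/k_BT##.

```python
# Build Boltzmann populations for equally spaced levels, then recover T from the slope.
import numpy as np

k_B = 1.380649e-23      # J/K
T_true = 300.0          # K
dE = 2.0e-21            # level spacing in J (made-up value)

n = np.arange(20)
E = n * dE
p = np.exp(-E / (k_B * T_true))
p /= p.sum()            # normalized level populations

slope, _ = np.polyfit(n, np.log(p), 1)   # ln p_n = -n*dE/(k_B*T) + const
print("recovered T =", round(-dE / (k_B * slope), 1), "K")
```

If the populations did not fall on a straight line in this plot, no single ##T## would fit, which is the "no temperature out of equilibrium" statement above.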
 
  • Like
Likes TeethWhitener and weirdoguy
  • #20
For a gas in a gravitational field, there is no “average kinetic energy” gradient in thermal equilibrium.
 
  • Like
Likes Demystifier
  • #21
Lord Jestocost said:
For a gas in a gravitational field, there is no “average kinetic energy” gradient in thermal equilibrium.
You are right. I've deleted a wrong part in my previous post. The Boltzmann distribution is proportional to
$$e^{-\beta(E_{\rm kin}+E_{\rm pot})}=e^{-\beta E_{\rm kin}} e^{-\beta E_{\rm pot}} = f_1(v) f_2(z)$$
so the probability factorizes into a ##v##-dependent and ##z##-dependent part, i.e. there is no correlation between kinetic energy and altitude.
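As a sanity check of this factorization (an illustrative Monte Carlo draw from the factorized distribution for an isothermal ideal-gas column; the mass and temperature are made-up, roughly nitrogen-like):

```python
# Sample altitude z from the barometric weight e^{-mgz/kT} and velocity components
# from the Maxwell-Boltzmann distribution, independently, then compare the mean
# kinetic energy in a low-altitude and a high-altitude bin.
import numpy as np

rng = np.random.default_rng(0)
k_B, T, m, g = 1.380649e-23, 300.0, 4.8e-26, 9.81     # SI units

N = 200_000
z = rng.exponential(scale=k_B * T / (m * g), size=N)          # altitudes, m
v = rng.normal(scale=np.sqrt(k_B * T / m), size=(N, 3))       # velocities, m/s
KE = 0.5 * m * (v**2).sum(axis=1)

for lo, hi in [(0, 2_000), (8_000, 10_000)]:
    sel = (z > lo) & (z < hi)
    print(f"{lo:>5}-{hi:<6} m:  <KE> = {KE[sel].mean():.3e} J   (3/2 k_B T = {1.5*k_B*T:.3e} J)")
```

The mean kinetic energy comes out as ##\tfrac{3}{2}k_BT## in both altitude bins, consistent with there being no "average kinetic energy" gradient in equilibrium.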
 
  • Like
Likes Lord Jestocost
  • #22
dextercioby said:
What is the physical interpretation of temperature of a mass of liquid or a solid?

Usually, the mental image of temperature is: an internal property of a bulk of matter, which typically describes the average kinetic plus rotational/vibrational energy of the molecules
Demystifier said:
Temperature is not about average energy, but about energy distribution.
Vanadium 50 said:
I think it is better to think of temperature as how it behaves, rather than as an "interpretation" in terms of some other quantity. Yes, for a gas, it's the average kinetic energy, but lots of things aren't gases.
TeethWhitener said:
If you think of entropy as a function which counts the number of microstates of a macroscopic system, then the above definition has an intuitive interpretation: at very low temperatures, the number of possible microstates is quite low and increasing the energy of the system causes this number to increase rapidly. However, at very high temperatures, the number of possible microstates becomes very large and increasing the energy doesn’t open up as many new microstates proportionally.
It seems the original question was: what is the physical interpretation ("classical mechanical picture"?) of the "energy modes" in different phases (kinetic energy, potential energy, vibration of bond states, etc.)?

But I agree that the opposite is perhaps also an interesting question. What is the "information-theoretic" interpretation of energy modes? This is motivated by the fact that we already have an information-theoretic understanding of entropy, as the amount of missing information about the microstate, given the macrostate.

But temperature, as usually defined in post #3, is the inverse of the energy derivative of entropy. To get a similarly abstract interpretation of temperature, wouldn't it be nice to also get rid of the "kinetic energy" interpretations from mechanics?

The first thing that comes to my mind is that the relation between amount of energy and amount of entropy is quite similar to the relation between amount of memory and amount of information. We already have Landauer's principle, which relates the erasure of a given amount of memory to a given amount of heat. Can we take this a step further?

Another interesting association is that confined energy is related to inertia, even in classical mechanics. And in information processing, it also seems that the MORE data you have, the MORE evidence you have about something you inferred, the larger the resistance to revising it in the face of a smaller amount of new, contradicting information, right?

This is quite exciting. I was going to post this subquestion in the QM interpretations forum, but I am not sure it fits there either. It's a foundational, interpretational issue, but one that is not specific to QM.

So I suppose this results in the challenge: define "temperature" in abstract terms, without even referring to mechanical notions such as kinetic energy. Entropy is in principle already defined in abstract terms. But what about energy, and thus temperature?

/Fredrik
 
  • #23
I'm having difficulties relating temperature to potential energy in general.

It's clear that heat can be stored in the potential energy of molecular and atomic vibrations but I can't see how this would be true in general, for example for the potential energy of gas molecules in a gravitational field.

So in the first case an increase of average potential energy is related to an increase in temperature whereas in the second it's not, I would say.

I also get the impression that there's a qualitative difference between the potential energy in vibrations and that due to altitude, for example.
Normally the zero-point for potential energy is not defined, but for vibrations it seems you can define it very easily.
 
  • #24
The definition of temperature must be relative to a given microstructure (or phase), because entropy is. And "energy" must also implicitly refer to the energy stored in that specific microstructure. This is why the temperature of a microstructure isn't related to energy stored in ANOTHER microstructure, except possibly via energy transfer, which I see conceptually as a phase transition. So the "energy" in the temperature definition should be counted per phase, or per microstructure.

The trouble with gravity is that I think its microstructure is unknown or not well understood, and it seems to add another level of complexity. Would it make sense to see gravity as a continuous phase transition? It's an important question, though. I would personally attack that problem from the perspective of inertia rather than as a fundamental interaction.

/Fredrik
 
  • #25
Lord Jestocost said:
Regarding the "physical interpretation of temperature", I would rely on “An Introduction to Thermal Physics” by Daniel V. Schroeder (Oxford University Press 2021). Schroeder's proposal for a theoretical definition of temperature is:

Temperature is a measure of the tendency of an object to spontaneously give up energy to its surroundings. When two objects are in thermal contact, the one that tends to spontaneously lose energy is at the higher temperature.

On the basis of this theoretical definition, one arrives at the thermodynamic definition of temperature:

"The temperature of a system is the reciprocal of the slope of its entropy vs. energy graph. The partial derivative is to be taken with the system's volume and number of particles held fixed;* more explicitly,
$$\frac 1 T\equiv\left( \frac {\partial S} {\partial U}\right)_{N,V} .$$"
The problem with definitions like that is that they don't really explain anything, in my opinion anyway.
This definition really only says that T is one of the quantities we study in thermodynamics and it's related to other quantities in a particular way.
After such a definition I would still ask: But what is temperature? ("actually")

However I do see the difficulties with defining T as average kinetic energy.
What's not clear to me is how small the objects have to be that have this energy.
For example if you had a mechanical construction with billions of small steel balls on springs, how small would the steel balls have to be so that you would assign a temperature to the vibration of these balls?
 
  • Like
Likes Fra and dextercioby
  • #26
Philip Koeck said:
How does the total entropy increase in that case?
In which of the two cases?

In general you start with ##dU = TdS +\overline{\mu}dN## (or an equivalent one if the volume or magnetization change, for example), you then compute the associated fluxes, and then use the continuity equation for the entropy. This equation gives you the rate of entropy production in any infinitesimal volume. Then you just have to integrate that expression over the whole volume. This gives you the answer you're looking for.

If there's an electric current, entropy will be generated due to Joule heat, and also due to Fourier's heat conduction.

There's nothing truly shocking about the fact that heat does not always flow from hot to cold. If you consider a "Peltier cell", it is a heat pump. Think of it like a fridge. You input current (energy) into it, and it does its job of cooling down something that's already cold, i.e. it removes heat from something colder than its surroundings. The heat flux is therefore from the cold object towards the hotter one. The total entropy production is of course strictly positive in this case. No law of thermodynamics is broken.
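For a quick bookkeeping sketch of that fridge picture (made-up numbers for the heat removed and the electrical work, chosen so the implied COP is below the Carnot limit):

```python
# Peltier element run as a fridge: Q_c leaves the cold side, Q_c + W is rejected
# to the hot side, and the entropy of the two reservoirs still increases overall.
T_cold, T_hot = 280.0, 300.0   # K
Q_c, W = 10.0, 2.0             # J extracted from the cold side, J of electrical work
Q_h = Q_c + W                  # J rejected to the hot side

dS_total = Q_h / T_hot - Q_c / T_cold
print(f"dS_total = {dS_total:.4f} J/K  (>= 0, as required)")
```

Heat leaves the colder object, yet ##\Delta S_{\rm tot} > 0## because the hot side receives the extra electrical work as heat.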
 
  • #27
Philip Koeck said:
The problem with definitions like that is that they don't really explain anything, in my opinion anyway.
This definition really only says that T is one of the quantities we study in thermodynamics and it's related to other quantities in a particular way.
After such a definition I would still ask: But what is temperature? ("actually")
This was partly what I meant above as well, and it then boils down to what entropy is and what energy is.

But you can question this definition at a deeper level. And here it seems that temperature, defined as such a derivative, determines how two microstructures "share energy" at equilibrium.

But what is this "energy" that they share, defined at some more abstract, information-theoretic level (the same level where we already have the abstract definition of entropy)?

Right now, all we have is a semi-information-theoretic picture. This is not satisfactory.

/Fredrik
 
  • Like
Likes Philip Koeck
  • #28
fluidistic said:
In which of the two cases?
I meant the second example where you discuss anisotropic materials.
("... consider anisotropic materials, where kappa is a tensor. Write down Fourier's law under matricial form and you'll get that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot to cold direction.")

The example with an external current is understandable. It's just like a heat pump.
But in the above case I don't see any external work being done on the solid.
If heat flows spontaneously in a direction that's not hot to cold I would expect an increase in entropy somewhere, for example a phase change.
 
  • Like
Likes Lord Jestocost
  • #29
Philip Koeck said:
However I do see the difficulties with defining T as average kinetic energy.
What's not clear to me is how small the objects have to be that have this energy.
For example if you had a mechanical construction with billions of small steel balls on springs, how small would the steel balls have to be so that you would assign a temperature to the vibration of these balls?
You could use the equipartition theorem as a starting point. Each little ball will have a translational energy of ##\frac{3}{2}k_BT## on average. If you have several thousand little balls at room temperature, that’s not much energy. If you have ##10^{23}## balls at room temperature, that’s significantly more energy.
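Putting rough numbers on that (room temperature, ##T = 300\ \mathrm{K}## assumed):

```python
# Total average translational kinetic energy of N "little balls" at 3/2 k_B T each.
k_B, T = 1.380649e-23, 300.0   # J/K, K

for N in (1e4, 1e23):
    print(f"N = {N:.0e}:  (3/2) N k_B T ≈ {1.5 * N * k_B * T:.3e} J")
```

About ##6\times10^{-17}\ \mathrm{J}## for ten thousand balls versus roughly ##600\ \mathrm{J}## for ##10^{23}## of them.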
 
  • Like
Likes Philip Koeck
  • #30
fluidistic said:
There's nothing truly shocking about the fact that heat does not always flow from hot to cold.
In the case where ##1/T## is the only driving thermodynamic force?
 
  • #31
Philip Koeck said:
The problem with definitions like that is that they don't really explain anything, in my opinion anyway.
"Temperature is expressed as the inverse of the rate of change of entropy with internal energy, with volume V and number of particles N held constant. This is certainly not as intuitive as molecular kinetic energy, but in thermodynamic applications it is more reliable and more general."

http://hyperphysics.phy-astr.gsu.edu/hbase/thermo/temper2.html
 
  • Like
Likes Philip Koeck
  • #32
Fra said:
The first thing that comes to my mind is that the relation between amount of energy and amount of entropy is quite similar to the relation between amount of memory and amount of information.
Robert Alicki and Michal Horodecki in "Information-thermodynamics link revisited" (https://arxiv.org/abs/1905.11057v1):

"Only information carriers are physical

It is true that, as Landauer wrote: '[Information] is always tied to a physical representation. It is represented by engraving on a stone tablet, a spin, a charge, a hole in a punched card, a mark on paper, or some other equivalent. This ties the handling of information to all the possibilities and restrictions of our real physical world, its laws of physics and its storehouse'.
However, the legitimate questions concern the physical properties of information carriers like 'stone tablet, a spin, a charge, a hole in a punched card, a mark on paper', but not the information itself."

[Bold by LJ]
 
  • Like
Likes Fra
  • #33
Philip Koeck said:
I meant the second example where you discuss anisotropic materials.
("... consider anisotropic materials, where kappa is a tensor. Write down Fourier's law under matricial form and you'll get that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot to cold direction.")

The example with an external current is understandable. It's just like a heat pump.
But in the above case I don't see any external work being done on the solid.
If heat flows spontaneously in a direction that's not hot to cold I would expect an increase in entropy somewhere, for example a phase change.
Using only my intuition rather than solving the heat problem (I could do it in a matter of minutes if I had access to my CPU), I would say that a bar connected between a hot reservoir at one end and a cold reservoir at the other end would end up with a temperature distribution that increases as one moves towards the hotter end, but that it would also be hotter on, say, the top of the bar and colder on its bottom, in the case where the thermal conductivity is anisotropic. The temperature gradient would not point along the direction from the hot to the cold reservoir.
 
  • #34
Lord Jestocost said:
Robert Alicki and Michal Horodecki in "Information-thermodynamics link revisited" (https://arxiv.org/abs/1905.11057v1):
Thanks, I don't think I have read that paper; I'll read it and see if there is something new in there.
Lord Jestocost said:
However, the legitimate questions concern the physical properties of information carriers like 'stone tablet, a spin, a charge, a hole in a punched card, a mark on paper', but not the information itself."
[Bold by LJ]
I agree, which also motivates the stance I take on all this. The physical properties of information carriers are precisely the microstructure of the physical "agents/observers", in my view. In a conservative view, it's easy to think, in regular quantum information theory, that the information carriers are classical, or at least "macroscopic", as otherwise they fail to be reliable. But I think that will not work, so the story has to be more complicated.

/Fredrik
 
  • #35
Philip Koeck said:
I meant the second example where you discuss anisotropic materials.
("... consider anisotropic materials, where kappa is a tensor. Write down Fourier's law under matricial form and you'll get that it's possible for a material to develop a transverse thermal gradient (and so a heat flux in that direction) even though it isn't the hot to cold direction.")

The example with an external current is understandable. It's just like a heat pump.
But in the above case I don't see any external work being done on the solid.
If heat flows spontaneously in a direction that's not hot to cold I would expect an increase in entropy somewhere, for example a phase change.
I gave you a recipe to compute the entropy increase (over time) in the general case. You can certainly apply it to the case of a bar placed between two reservoirs at different temperatures, where the thermal conductivity of the bar is anisotropic. We can keep it simple by considering a 2D case and a crystal orientation such that ##\kappa## is a 2x2 matrix like so:
$$\kappa =
\begin{bmatrix}
\kappa_{xx} & 0\\
0 & \kappa_{yy}
\end{bmatrix}$$
You can quickly get an analytical formula for the general expression of ##T(x,y)##, and you'll see that this temperature distribution depends on both ##x## and ##y## and that the thermal gradient doesn't align with ##\hat x##.
Now onto the entropy question: the continuity equation is ##-\dot S =\nabla \cdot \vec J_S=-\nabla \cdot \left( \frac{\kappa}{T} \nabla T \right)##. That's the entropy production. You have everything at hand, and you can work out all the details for the problem you set up. As you can see, there is entropy production due to the thermal gradient; this is why I mentioned earlier in this thread that there is an entropy production term due to Fourier conduction. The "external work being done on the solid" might be identified with the heat flux, i.e. the energy, entering the system.
To answer Lord Jestocost's question: yes, the driving force is 1/T (or T, depending on your preferred convention).
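As a minimal check of that recipe in the simplest setting (a 1D bar with isotropic, constant conductivity; all numbers made up), the volume-integrated entropy production matches the net entropy handed to the reservoirs:

```python
# Steady 1D conduction: integrate kappa*(dT/dx)^2 / T^2 along the bar and compare
# with Q/T_cold - Q/T_hot for the two reservoirs.
import numpy as np

kappa, L, A = 50.0, 1.0, 1e-4      # W/(m K), bar length (m), cross-section (m^2)
T_hot, T_cold = 400.0, 300.0       # K

x = np.linspace(0.0, L, 10_001)
T = T_hot + (T_cold - T_hot) * x / L          # steady-state linear profile
dTdx = np.gradient(T, x)

sigma = kappa * dTdx**2 / T**2                                        # local production, W/(K m^3)
S_dot_bar = A * np.sum(0.5 * (sigma[1:] + sigma[:-1]) * np.diff(x))   # trapezoid rule
Q = kappa * A * (T_hot - T_cold) / L                                  # heat current, W
S_dot_reservoirs = Q / T_cold - Q / T_hot

print(f"integrated production: {S_dot_bar:.6e} W/K")
print(f"reservoir bookkeeping: {S_dot_reservoirs:.6e} W/K")
```

Both come out positive and equal (about ##4.2\times10^{-4}\ \mathrm{W/K}## with these numbers), which is the Fourier-conduction entropy production mentioned above.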
 
  • Like
Likes dextercioby
