Is there a deeper connection between thermal and information entropy?

In summary, entropy admits multiple interpretations, including thermodynamic entropy and information entropy. The popular idea that entropy simply means disorder is misleading, though not entirely wrong: entropy can also be thought of as a measure of uncertainty, or lack of information, about a system. However, thermodynamic entropy has additional physical implications and is not completely equivalent to information entropy. The distinction between macrostates and microstates is central to understanding what entropy really means.
  • #36
Andrew Mason said:
It says that entropy is a measure of the energy that is not available for useful work. That is not correct. These examples will show what I mean:

While I agree with AM, I found this reference which supposedly defends the idea that entropy is a measure of energy unavailable for work:

T. V. Marcella, "Entropy production and the second law of thermodynamics: An introduction to second law analysis," Am. J. Phys. 60, 888–895 (1992).

I cannot access it, if anyone can, I would be interested in what it says.
 
  • #37
Rap said:
I think Wikipedia is, at worst, wrong on the subject of entropy as a measure of energy unavailable for work.

Why is the definition of entropy as a measure of energy unavailable for work so common if it's wrong? Some people argue that this is just a different way of looking at entropy. Other than the exercise AM presented, what other reasons or examples can you provide that demonstrate that this explanation is in fact a bad one?
 
  • #38
revo74 said:
Why is the definition of entropy as a measure of energy unavailable for work so common if it's wrong? Some people argue that this is just a different way of looking at entropy. Other than the exercise AM presented, what other reasons or examples can you provide that demonstrate that this explanation is in fact a bad one?

I think what is needed is a quantitative explanation of why it is true. If the explanation works, then fine, if not, then that's another indication that it is flawed.
 
  • #39
revo74 said:
Why is the definition of entropy as a measure of energy unavailable for work so common if it's wrong? Some people argue that this is just a different way of looking at entropy. Other than the exercise AM presented, what other reasons or examples can you provide that demonstrate that this explanation is in fact a bad one?
You will find there are many misleading explanations or definitions of entropy. Repeating them over and over does not make them more accurate. Entropy = disorder is misleading unless you have a very specific definition of disorder. "Entropy is a measure of the energy unavailable for useful work" is also inaccurate and misleading.

An increase in entropy (of the system + surroundings) is certainly related to an increase in the amount of heat flow into a system that is not turned into useful work in a thermodynamic process. The greater the increase in total entropy the less efficient the thermodynamic process in terms of the ratio of useful work produced to input heat flow energy.

But you cannot conclude from this that entropy is a measure of the heat flow that is not available to do useful work. The change in entropy relates to only a part of the heat flow that does not produce useful work. Much of that heat flow is not available to do useful work even if there is no increase in entropy (reversible process).
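To put a number on that distinction (standard textbook relations, added here as an illustration rather than as part of the argument above): an engine drawing heat Q_h from a reservoir at T_h and rejecting heat to a sink at T_c must discard at least

[tex]Q_{c,\mathrm{min}} = \frac{T_c}{T_h}\,Q_h[/tex]

even in the fully reversible case, where the total entropy change is zero. Only the additional lost work,

[tex]W_{\mathrm{lost}} = T_c\,\Delta S_{\mathrm{total}},[/tex]

is measured by the total entropy increase.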

AM
 
  • #40
Andrew Mason said:
You will find there are many misleading explanations or definitions of entropy. Repeating them over and over does not make them more accurate. Entropy = disorder is misleading unless you have a very specific definition of disorder. "Entropy is a measure of the energy unavailable for useful work" is also inaccurate and misleading.

AM

Ok, I am convinced that information entropy and thermodynamic entropy don't mix well. However, even for the macroscopic, thermodynamic meaning of entropy, I still don't get what it is. I currently imagine it as a measure of energy that goes "elsewhere", to a place where it can't be changed into other forms. A system with high entropy has little "flow-able" energy, whether it flows within the system or out of it; a low-entropy system has much energy that could (potentially) flow. An analogy for my picture would be a glass of water with a fixed number of molecules: if more of the water is in the form of ice (entropy), less water (energy) can flow inside the cup or be poured away.

Does that match what macroscopic thermodynamic entropy is?
 
  • #41
lowerlowerhk said:
Ok, I am convinced that information entropy and thermodynamic entropy don't mix well.

Why are you convinced of this?
 
  • #42
Rap said:
Why are you convinced of this?

Because when the term entropy was first invented, people still didn't know the molecular nature of thermodynamics, and the original meaning had nothing to do with disorder. Before the concept was extended to other fields, the purpose of inventing the notion of entropy was just to understand the action of steam engines. So one should be able to understand what (thermodynamic) entropy is on the macroscopic scale without bothering with disorder or information.
 
  • #43
lowerlowerhk said:
Because when the term entropy was first invented, people still didn't know the molecular nature of thermodynamics, and the original meaning had nothing to do with disorder. Before the concept was extended to other fields, the purpose of inventing the notion of entropy was just to understand the action of steam engines. So one should be able to understand what (thermodynamic) entropy is on the macroscopic scale without bothering with disorder or information.

But then that would mean that we can understand temperature without understanding the energy of a molecule, we can understand pressure without understanding the impact of molecules on a surface. In other words, you are saying we don't need statistical mechanics or atomic theory to understand classical thermodynamics. Actually, that is true, but there is so much more understanding to be gained by understanding all of these things in terms of statistical mechanics. Once you try to gain more understanding of entropy by using statistical mechanics, then you get the connection to information entropy.
 
  • #44
It is important to understand the difference between molecules and classical particles.
 
  • #45
Rap said:
But then that would mean that we can understand temperature without understanding the energy of a molecule, we can understand pressure without understanding the impact of molecules on a surface. In other words, you are saying we don't need statistical mechanics or atomic theory to understand classical thermodynamics. Actually, that is true, but there is so much more understanding to be gained by understanding all of these things in terms of statistical mechanics. Once you try to gain more understanding of entropy by using statistical mechanics, then you get the connection to information entropy.

I understand how useful it is to understand phenomena in microscopic terms. But as a noob in thermodynamics, I don't even have a concept to begin my connection with (referring to your last sentence). It is like when talking about pressure: people just think of particles colliding with walls without realizing that the corresponding macroscopic phenomenon is the walls being "pushed" by a force. When talking about thermodynamic entropy, people just talk about disorder without linking the microscopic phenomenon to the macroscopic one. The lack of macro-micro linking is the main reason the notion of entropy is difficult to grasp for newcomers. Although it may turn out that the microscopic aspect is more general and hence more applicable, as a stepping stone it is still useful to know the macroscopic aspect of entropy.

Anyway, here is what I found on a site:
http://www.science20.com/train_thought/blog/entropy_not_disorder-75081

Historically, there was a problem of how to tell the thermal energy of a system from its temperature. Entropy was originally invented to state the relationship between temperature and thermal energy. Bodies at the same temperature can have different thermal energies. Common sense says higher temperature must mean more thermal energy, but that is not always true. In fact, a body can have high thermal energy but low temperature (due to high entropy), and vice versa. This is the macroscopic phenomenon of entropy.

To explain this common-sense-violating weirdo, the microscopic explanation comes into play. If the gas molecules have more ways to move, to vibrate, to oscillate, then they can gain more energy without flying faster (having higher KE). That's why bodies at the same temperature (same average KE) can have different thermal energies.

And then somebody proposed that the more ways to move a system has, the messier (more disordered) it is. Only then did entropy become related to disorder, and even later to randomness in information. Finally the term entropy was extended to other fields that have nothing to do with thermal energy.

To end with an analogy: we learn atoms first as tiny balls, then as a micro solar system with electrons orbiting the nucleus, then as electron clouds around the nucleus in a quantum-mechanical way. Although the quantum-mechanical model has the strongest explanatory power, we still learn the earlier, less predictive models because the quantum one is too unintuitive for newcomers to grasp. Similarly, it is a bad idea to talk only about the deepest meaning of entropy and hope a few geniuses can retro-deduce it back to the macroscopic aspect.
 
  • #46
lowerlowerhk said:
I understand how useful it is to understand phenomena in microscopic terms. But as a noob in thermodynamics, I don't even have a concept to begin my connection with (referring to your last sentence). It is like when talking about pressure: people just think of particles colliding with walls without realizing that the corresponding macroscopic phenomenon is the walls being "pushed" by a force. When talking about thermodynamic entropy, people just talk about disorder without linking the microscopic phenomenon to the macroscopic one. The lack of macro-micro linking is the main reason the notion of entropy is difficult to grasp for newcomers. Although it may turn out that the microscopic aspect is more general and hence more applicable, as a stepping stone it is still useful to know the macroscopic aspect of entropy.

Entropy is not only difficult for newcomers, it is difficult for almost everyone. Forget about "disorder", that only gives you a "feel" for entropy. "disorder" is not rigorously defined, and the idea gets squishy if you push it too far. The link between thermodynamic entropy and information entropy is the beginning of the "macro-micro linking" you are looking for. The thermodynamic entropy is equal to the Boltzmann constant times the information entropy, and the information entropy is the minimum number of yes/no questions you have to ask to determine the microstate, given that you know the macrostate (temperature, pressure, etc.). Sort of like the game of twenty questions. This doesn't give you the complete link, but it goes a LONG way towards it.
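A minimal numerical sketch of that statement (my own illustration; the microstate count below is arbitrary): for a macrostate compatible with Omega equally likely microstates, the information entropy is log2(Omega) yes/no questions and the thermodynamic entropy is k_B ln(Omega).

[code]
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropies(num_microstates):
    """Information entropy (bits) and thermodynamic entropy (J/K)
    for a macrostate compatible with equally likely microstates."""
    h_bits = math.log2(num_microstates)         # minimum number of yes/no questions
    s_thermo = K_B * math.log(num_microstates)  # S = k_B * ln(Omega)
    return h_bits, s_thermo

# Toy example: 2**20 equally likely microstates (an arbitrarily small "system").
h, s = entropies(2**20)
print(f"information entropy:   {h:.1f} bits")   # 20.0 bits -> 20 yes/no questions
print(f"thermodynamic entropy: {s:.3e} J/K")    # k_B * 20 * ln 2, about 1.9e-22 J/K
[/code]

The same proportionality, S = (k_B ln 2) x (number of bits), is what the rest of the discussion keeps coming back to.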

lowerlowerhk said:
Anyway, here is what I found on a site:
http://www.science20.com/train_thought/blog/entropy_not_disorder-75081

This guy uses "diversity" as giving a better feel for entropy. I think it is better. But to be exact, what is "diversity"? If I flip four fair coins, but I don't look at them yet, the info entropy is 4 bits. I have to ask at least 4 questions about them to know how they came out. With 8 coins, it's 8 questions. If you say that the 8-coin result is exactly twice as diverse as the 4-coin result, then yes, diversity is a good word, because it is exactly proportional to the information entropy. Call it whatever you want; as long as what you call it is proportional to the information entropy, you will be correct. The thermo entropy is proportional to the info entropy, with the Boltzmann constant as the proportionality factor, so either one is good.
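In symbols (just restating the coin example, with the thermodynamic conversion added for concreteness):

[tex]H_N = \log_2 2^N = N\ \text{bits}, \qquad S_N = k_B \ln 2^N = N k_B \ln 2,[/tex]

so 4 coins carry 4 bits and 8 coins carry 8 bits, and doubling the number of coins exactly doubles both entropies.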

lowerlowerhk said:
Historically, there was a problem of how to tell the thermal energy of a system from its temperature. Entropy was originally invented to state the relationship between temperature and thermal energy. Bodies at the same temperature can have different thermal energies. Common sense says higher temperature must mean more thermal energy, but that is not always true. In fact, a body can have high thermal energy but low temperature (due to high entropy), and vice versa. This is the macroscopic phenomenon of entropy.

To explain this common-sense-violating weirdo, the microscopic explanation comes into play. If the gas molecules have more ways to move, to vibrate, to oscillate, then they can gain more energy without flying faster (having higher KE). That's why bodies at the same temperature (same average KE) can have different thermal energies.

And then somebody proposed that the more ways to move a system has, the messier (more disordered) it is. Only then did entropy become related to disorder, and even later to randomness in information. Finally the term entropy was extended to other fields that have nothing to do with thermal energy.

To end with an analogy: we learn atoms first as tiny balls, then as a micro solar system with electrons orbiting the nucleus, then as electron clouds around the nucleus in a quantum-mechanical way. Although the quantum-mechanical model has the strongest explanatory power, we still learn the earlier, less predictive models because the quantum one is too unintuitive for newcomers to grasp. Similarly, it is a bad idea to talk only about the deepest meaning of entropy and hope a few geniuses can retro-deduce it back to the macroscopic aspect.

That's the part of thermodynamic entropy that information entropy does not give much help on - the connection to the rest of thermodynamics. Thermodynamic entropy is Boltzmann's constant times the information entropy. The internal energy is temperature times Boltzmann's constant times the information entropy. Boltzmann's constant is historically connected to entropy rather than temperature, but it's just as good to define a new temperature T'=kT and say that internal energy is the new temperature times the information entropy. Now the real problem is to understand temperature. I think that once information entropy is understood, the real problem lies in understanding temperature. People don't worry about temperature, because it's something we intuitively "understand" - hot and cold and all that. But the real problem is the bridge from information entropy to internal energy, and that bridge is the temperature. Anyone who claims to understand thermodynamic entropy MUST understand temperature, and not in the "hot and cold" sense, but in the sense of being the bridge between internal energy and information entropy, two things we can more easily understand.
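One standard way to make the "bridge" role of temperature precise (the textbook definition, quoted here for reference rather than taken from the post above):

[tex]\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N} = k_B \ln 2\,\left(\frac{\partial H_{\mathrm{bits}}}{\partial U}\right)_{V,N},[/tex]

so, at constant volume, adding roughly k_B T ln 2 of energy as heat increases the missing information about the microstate by one bit.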
 
  • #47
Rap said:
That's the part of thermodynamic entropy that information entropy does not give much help on - the connection to the rest of thermodynamics. Thermodynamic entropy is Boltzmann's constant times the information entropy. The internal energy is temperature times Boltzmann's constant times the information entropy. Boltzmann's constant is historically connected to entropy rather than temperature, but it's just as good to define a new temperature T'=kT and say that internal energy is the new temperature times the information entropy. Now the real problem is to understand temperature. I think that once information entropy is understood, the real problem lies in understanding temperature. People don't worry about temperature, because it's something we intuitively "understand" - hot and cold and all that. But the real problem is the bridge from information entropy to internal energy, and that bridge is the temperature. Anyone who claims to understand thermodynamic entropy MUST understand temperature, and not in the "hot and cold" sense, but in the sense of being the bridge between internal energy and information entropy, two things we can more easily understand.

Temperature is relatively easy to understand. Both the macro meaning (ease of conducting heat away) and the micro meaning (average KE of the molecules) are intuitive enough to understand and can be linked quickly.

Entropy, macroscopically, is a (not directly proportional) measure of the thermal energy that doesn't raise the temperature of a body. Microscopically, entropy is a measure of how many ways the molecules could oscillate (more available oscillations means less KE and lower temperature), which is directly proportional to the number of ways to arrange the molecules of a body. This is how thermodynamic entropy and information entropy are related.

The missing piece of logic that makes thermodynamic entropy and information entropy seem so unrelated is the hidden relationship between molecule arrangements and available oscillations, which is expressed mathematically by Boltzmann's constant.
 
  • #48
Temperature is relatively easy to understand. Both the macro meaning (ease of conducting heat away) and the micro meaning (average KE of the molecules) are intuitive enough to understand and can be linked quickly.

Entropy, macroscopically, is a (not directly proportional) measure of the thermal energy that doesn't raise the temperature of a body. Microscopically, entropy is a measure of how many ways the molecules could oscillate (more available oscillations means less KE and lower temperature), which is directly proportional to the number of ways to arrange the molecules of a body. This is how thermodynamic entropy and information entropy are related.

The missing piece of logic that makes thermodynamic entropy and information entropy seem so unrelated is the hidden relationship between molecule arrangements and available oscillations, which is expressed mathematically by Boltzmann's constant.

So was all this a statement that a cathode ray does or does not possess a definable and measurable temperature?
 
  • #49
Studiot said:
So was all this a statement that a cathode ray does or does not possess a definable and measurable temperature?

I don't see your logic. A cathode ray, i.e. electrons, is made of moving particles, so they have kinetic energy, and you can calculate the average kinetic energy as a temperature. Though when talking about beams we usually use power to describe the heat transfer instead of temperature, because you can't put a travelling beam of electrons next to an object the way you can a block.

And I don't understand what you want to demonstrate by throwing out that question.
 
  • #50
I notice you avoided my comment, just as you did for post #44 and this thread

https://www.physicsforums.com/showthread.php?t=517005

My comments were designed to help you tighten up some loose terminology, which I think is hindering your understanding.

In particular specific heat is a property of the system/material itself, not of the process.

You can only add or subtract heat from a material/system at the (not time) rate of specific heat.

Entropy is a process property. You can add more or less heat to a system and the process equations (first and second laws etc) lead to different entropy (and other variable) solutions.

go well
 
  • #51
Andrew Mason said:
Thermodynamic entropy is the logarithm of the number of microstates of the molecules of a system in phase space (momentum, position in 3D) multiplied by the Boltzmann constant. It seems to me that information entropy, dealing with probability, starts with a very different concept - the number of 1s and 0s required to convey an intelligible message.
AM

Wait, what?

1s and 0s ARE microstates. The map of all possible trajectories in the system's phase space IS the information about the system. If you only have one bit, you only have two microstates (yes, they call them 1 and 0, but names aren't important, we can call them state a and state b).

A system that is in steady state has 0 bits (it has no states to change to, so you don't even need a 1 or 0 to express its state).

"intelligible" has nothing to do with it. The "message" could be a billiard ball impacting another billiard ball. It's the energy, the wave, the propagation of a disturbance, not the matter itself.

Maxwell's demon requires information to do work.

If you can direct me to some authority who says that there is a real and substantial connection between the two concepts I'll be happy to reconsider.

What do you think of what Kolmogorov has done with information theory? (metric entropy, Kolmogorov complexity)
 
  • #52
lowerlowerhk said:
The missing piece of logic that makes thermodynamic entropy and information entropy seem so unrelated is the hidden relationship between molecule arrangements and available oscillations, which is expressed mathematically by Boltzmann's constant.

Come to think of it, you are right. I was thinking of Boltzmann's constant as something that could be set to one by the right definition of temperature, and so not that important. But that is an "information-centric" view. How do you come by that "right definition"? You cannot use a well-calibrated thermometer based on the second law, measure Boltzmann's constant, and then multiply the temperature by that constant to get your new temperature, because that begs the question. From an information theory point of view, it doesn't matter what the constant is, but from a thermodynamic + information point of view, the value of Boltzmann's constant is the key to understanding thermodynamic entropy.
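For reference, the numerical value in question (the constant is fixed exactly by the SI definition):

[tex]k_B = 1.380649\times 10^{-23}\ \mathrm{J/K}, \qquad S = (k_B \ln 2)\, H_{\mathrm{bits}} \approx \left(9.57\times 10^{-24}\ \mathrm{J/K}\right) H_{\mathrm{bits}},[/tex]

so one bit of missing information about the microstate corresponds to about 9.57 x 10^-24 J/K of thermodynamic entropy.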
 
  • #53
Studiot said:
In particular specific heat is a property of the system/material itself, not of the process.

You can only add or subtract heat from a material/system at the (not time) rate of specific heat.
Yes, we both agree on that.
Studiot said:
Entropy is a process property.

go well

Shouldn't entropy S, like internal energy U, be a state function, while dS and dU describe a process?

What you said contradicts what I see in Wikipedia and various other sources, which say that entropy is a state function.

http://en.wikipedia.org/wiki/List_of_thermodynamic_properties
 
  • #54
Rap said:
From an information theory point of view, it doesn't matter what the constant is, but from a thermodynamic point of view, it's the key to understanding thermodynamic entropy.

There is something that amuses me here. It is easier to start from thermodynamic entropy and search for an interpretation that links the two kinds of entropy together. But if you begin with information entropy, it is much harder to associate information with thermal energy. I tried the second approach and ended up with the instrumentalist conclusion that "it's just a dummy variable that makes things complete, and there is no point thinking about what it represents as long as things work". (To me) it looks like some points of view are easier to generalize and extend than others.
 
  • #55
I notice you have also joined this thread.

https://www.physicsforums.com/showthread.php?t=487647&highlight=wikipedia&page=2

There is a long discussion noting the shortcomings of Wikipedia in thermodynamics.
In particular have you read post #23?

Strictly, entropy is a state function, yes.

But consider the thermodynamic implications of this variation on a once popular auto fuel advert.

Take two different makes of car.

Put 1 litre of identical fuel in each.

Car A achieves 7.8 miles on its litre of fuel.

Car B achieves 13.4 miles on its litre of fuel.

Is the difference due to the system or the fuel?
 
  • #56
lowerlowerhk said:
I don't see your logic. A cathode ray, i.e. electrons, is made of moving particles, so they have kinetic energy, and you can calculate the average kinetic energy as a temperature.
Temperature is only defined for a substance in thermal equilibrium. In thermal equilibrium the translational kinetic energies of the molecules follow a Maxwell-Boltzmann distribution. The kinetic energies are not all the same. In a cathode ray the moving electrons all travel in one direction and all have the same energy. So its temperature cannot be defined.
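For reference, the equilibrium speed distribution referred to here has the standard form

[tex]f(v) = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2 \exp\!\left(-\frac{m v^2}{2 k_B T}\right),[/tex]

a spread of speeds characterized by a single parameter T; a monoenergetic, unidirectional beam has no such spread, so no temperature can be assigned to it in this sense.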

AM
 
  • #57
Pythagorean said:
Wait, what?

1s and 0s ARE microstates. The map of all possible trajectories in the system's phase space IS the information about the system. If you only have one bit, you only have two microstates (yes, they call them 1 and 0, but names aren't important, we can call them state a and state b).

A system that is in steady state has 0 bits (it has no states to change to, so you don't even need a 1 or 0 to express its state).

"intelligible" has nothing to do with it. The "message" could be a billiard ball impacting another billiard ball. It's the energy, the wave, the propagation of a disturbance, not the matter itself.

Maxwell's demon requires information to do work.



What do you think of what Kolmogorov has done with information theory? (metric entropy, Kolmogorov complexity)
There are some analogies between the entropy in information theory and entropy in thermodynamics, but the underlying concepts are very different. They have different origins and apply to different things having vastly different numbers of particles/events. Thermodynamic entropy is inextricably tied to a concept of temperature with an underlying assumption that all microstates are equally probable. Information entropy has nothing even analogous to temperature, as far as I can see, and assumes that the probabilities of individual microstates are not equal. So, as far as I can see, the only real connection is that they have the same name.

AM
 
  • #58
Good afternoon, Andrew.

Your last two posts could do with some amplification, I feel, else some might get the wrong impression.

So in post#56

(absolute) temperature may only be defined for a system in equilibrium, but temperature difference is OK for non-equilibrium situations.

and in post#57

A distinction should be made between the statistics of the average action of a large number of particles, leading to classical thermodynamic relationships in the physical world (which again is OK), and information states in the nonphysical world, which can be very different depending upon your system of logic and information.

Further comment on my cathode ray example.

My main aim in producing this was to offer a physical example not involving molecules to reinforce my effort to stress the difference between molecules and particles.

However if the beam impacts upon another physical object energy will be transferred, in accordance with the laws of thermodynamics.
In this case a temperature difference may be established.

go well
 
  • #59
Andrew Mason said:
There are some analogies between the entropy in information theory and entropy in thermodynamics, but the underlying concepts are very different. They have different origins and apply to different things having vastly different numbers of particles/events. Thermodynamic entropy is inextricably tied to a concept of temperature with an underlying assumption that all microstates are equally probable. Information entropy has nothing even analogous to temperature, as far as I can see, and assumes that the probabilities of individual microstates are not equal. So, as far as I can see, the only real connection is that they have the same name.

AM

Nobody argues that thermodynamic entropy is not tied to temperature, but the thermodynamic entropy is equal to Boltzmann's constant times the information entropy. The information entropy is proportional to the minimum number of yes/no questions you have to ask to determine the microstate, given the macrostate. As mentioned above, the Boltzmann constant is the "bridge" between thermodynamic and information entropy, and the full meaning of thermodynamic entropy is to be found in

#1: The understanding of information entropy, and

#2: The understanding of why the Boltzmann constant has the value it has.

You can understand #1 without reference to temperature. It's #2 that brings the temperature in. You cannot understand #2 without understanding temperature. This is the way that temperature is "unlinked" from information entropy. The question of whether thermodynamic entropy and information entropy both deserve to have the word "entropy" in their name is semantic, not very interesting. The link between the two is unavoidably there, whatever you want to call them. #2 is where the fun is, not in a semantic argument over the naming of things, and not in refusing to divide up the understanding of thermodynamic entropy into #1 and #2.

Also, they do not have "vastly different numbers of particles/events". Information entropy puts no limit on the number of "events" that it will deal with. Nor does it necessarily assume that microstates have different probabilities.
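For concreteness, the general formula being discussed is Shannon's

[tex]H = -\sum_i p_i \log_2 p_i,[/tex]

which places no restriction on the probabilities; in the special case where all [tex]\Omega[/tex] microstates are equally likely, [tex]p_i = 1/\Omega[/tex] and H reduces to [tex]\log_2 \Omega[/tex], so [tex]S = k_B \ln\Omega[/tex] is recovered as the equal-probability special case rather than a separate concept.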
 
  • #60
but the thermodynamic entropy is equal to Boltzmann's constant times the information entropy.

Such a statement by itself is not proof of linkage, any more than the following proves a linkage between a certain stone in my garden and the USS Forrestal.

The weight of the USS Forrestal is exactly 1.2345658763209 × 10^9 times the weight of a certain stone in my garden.
 
  • #61
Studiot said:
Such a statement by itself is not proof of linkage, any more than the following proves a linkage between a certain stone in my garden and the USS Forrestal.

The weight of the USS Forrestal is exactly 1.2345658763209 × 10^9 times the weight of a certain stone in my garden.

That's like saying that E=mc^2 by itself is not proof of linkage between energy and mass any more than your USS Forrestal example. You are interpreting it the wrong way. E=mc^2 is a statement that there is a linkage, not proof that there is a linkage. S=kH is a statement that there is a linkage between thermo entropy S and info entropy H, not a proof.
 
  • #62
The following is an interpretation of entropy inspired by the video below. I further develop its content and attempt to reconcile the seeming unrelatedness of thermo and info entropy. The result is rather successful. Please tell me if it works in explaining the equations.
http://www.youtube.com/watch?v=xJf6pHqLzs0

Let's begin with a ball falling vertically to the ground at a certain speed.

Initially the velocity vector of each particle in the ball is almost the same in both magnitude and direction. Each velocity has a component that is random in direction, because the particles are vibrating, but the magnitude of this random component is small and insignificant compared to the vertical velocity, which is the same for all particles.

When the ball collides with the ground, the ball particles at the contact surface receive some momentum from the particles at the ground surface. As the ground particles are vibrating, they carry velocity vectors whose directions are random (statistically, NOT individually). That means the momentum vectors (one arrow per particle) transferred to the ball particles are also random in direction.

So the colliding ball particles might receive momentum in various combinations, from an all-same-direction combination like up, up, up, up, up..., through a less ordered combination like up, up, up, up, left..., to totally random directions like left, up, left, right, up... or left, up, down, up, left...
Because there are vastly more disordered combinations than ordered ones, almost 100% of the time the ball will receive momentum kicks in disordered directions.
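A tiny counting sketch of how lopsided this gets (my own illustration; treating each kick as one of six discrete directions is an arbitrary simplification):

[code]
# Fraction of momentum-kick combinations that are fully ordered, assuming each
# of N particles receives a kick in one of d discrete directions.
# (Illustrative toy model only: d = 6 directions is an arbitrary choice.)

def ordered_fraction(n_particles, n_directions=6):
    ordered = n_directions               # all kicks share one of the d directions
    total = n_directions ** n_particles  # every possible combination of kicks
    return ordered / total

for n in (5, 20, 100):
    print(n, ordered_fraction(n))
# 5   -> ~7.7e-04
# 20  -> ~1.6e-15
# 100 -> ~9.2e-78  (already negligible for 100 particles, let alone ~10**23 of them)
[/code]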

So, after receiving the random momentum kicks, the magnitude of the random part of each particle's velocity becomes more significant compared to its initial velocity, and entropy is said to have increased. Thermodynamically speaking, entropy (in this picture) is the significance of the random part of the particle velocity compared to the whole velocity vector.

Let's verify this interpretation with the thermodynamic entropy equation:

[tex]dS=\frac{dQ}{T}[/tex], assuming the collision occurs on a small scale and does not affect the average KE (i.e. temperature) of the ball particles.

So, after the collision, some ball particles gain velocity, which results in a gain of kinetic energy dQ. Because the gained velocity is random in direction, the more "energy of this kind" the particles gain, the more significant the magnitude of the random velocity becomes. In other words, dS is proportional to dQ, which matches the equation.

The size of the increase in significance is related to the initial velocity of the particles. If they already have a very high average velocity (hence a very high temperature), introducing the random velocity won't change the particle velocities much. In other words, dS is inversely proportional to T, which matches the equation as well.
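A quick numerical check of the 1/T dependence (numbers chosen arbitrarily for illustration): adding the same heat dQ = 100 J reversibly at two different temperatures gives

[tex]dS = \frac{100\ \mathrm{J}}{300\ \mathrm{K}} \approx 0.33\ \mathrm{J/K}, \qquad dS = \frac{100\ \mathrm{J}}{600\ \mathrm{K}} \approx 0.17\ \mathrm{J/K},[/tex]

so the hotter the body already is, the less entropy the same amount of heat adds.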

(Note: temperature represents average kinetic energy, which doesn't tell you whether the velocities are random in direction or how random they are. But particles confined in a fixed-size container are certain to collide with each other, so their velocities are statistically random. For particles confined in a container of fixed size there is therefore no way to increase their average KE without increasing the random velocity.)

You could also read the equation the other way:
[tex]T\,dS=dQ[/tex]
Suppose a body loses some entropy, which means a PORTION (not an amount) of random velocity is lost, so the particles of the body lose some KE due to the decrease in velocity. But entropy only tells you the portion; to get the actual amount of KE lost you need to multiply it by how much KE the particles have (i.e. temperature). That is why dQ = T dS.

From the above paragraphs, thermodynamic entropy is the significance of the random part of the particle velocity vector compared to the whole velocity vector.
Let me change "particle velocity vector" to "value" (or information, if you like), and the last sentence becomes:
Information entropy is the significance of the random part of a value compared to the whole value. As I believe info entropy is easy to understand if you don't try to relate it to thermo-entropy (which is what I will do soon), I skip the verification part and leave that to you.

The relationship between thermo-entropy and info-entropy is that thermo-entropy measures the randomness of an actual physical phenomenon (particle velocity), while info-entropy measures the randomness of numbers. Info-entropy is the mathematical tool, and thermo-entropy is an application of that tool to measure a physical phenomenon. The two are both related and unrelated: related because thermo-entropy borrows the info-entropy concept, unrelated because one tells how particles behave and the other tells how numbers behave.

Entropy as a statistical phenomenon and a macroscopic phenomenon
Doing work requires particles hitting a wall together (statistically speaking). With increasing randomness in their velocities, the chance of them hitting a wall together becomes increasingly low, so doing work becomes statistically impossible. On a macroscopic scale, this statistical impossibility manifests as "less work is done (compared to frictionless predictions)". Furthermore, as entropy increases everywhere in the universe, the actual work done falls further and further short of the prediction, until finally no work is done at all.

(Note: the difficulty in learning the notion of entropy comes from:
1. Thermo-entropy is related to why less work is done during a heating process. So learners have a very strong temptation to think of entropy as "a measure of how much less work is done", or as some kind of energy-storing micro-mechanism. As demonstrated by Andrew Mason in post #23, there is no such proportionality.

2. Formalism
Learners usually start learning entropy by looking at the definition, which is highly condensed language with all the development history, from concepts to words to mathematical symbols, left unmentioned. Without proper conceptual linking, an equation means no more than "a value equals something times something".)
 
  • #63
marmoset said:
Ed Jaynes wrote a lot about this, check out this paper

http://bayes.wustl.edu/etj/articles/theory.1.pdf

Lots of his other papers are online, http://bayes.wustl.edu/etj/node1.html, interesting stuff.

marmoset, thanks for the links. Note #74 on thermal efficiency has, in my mind, validated the thoughts about multiple heat sinks in a system. I think it is now clear to me how to better describe my thoughts on a two-or-more-system generator.

As for this thread, my mind may have jumped ahead of what I have read or understood, but it seems that Boltzmann's treatment has generally been thought of in terms of a single particle and a single impact event.

Is it possible that the connection between thermal and information entropy goes much further, given the speed and number of electron interactions within the time frame of a single impact of one particle?
Just as a transformer changes and transmits voltage and current, with some resulting heat involved, it seems to me that what happens at the container wall surface and in the particle's electron-cloud region involves orders of magnitude more of what can be calculated.

Forgive my comments if they are in advance of what others have already said in writings I am about to study in the near future.

Ron
 
