# What is the definition of entropy in SM?

SanfordA
We physicists must be careful to ensure that theories begin with correct principles. One basic principle is that all quantities must be capable of being observed or measured. If a theory uses a quantity that cannot be observed, then it is not a physics theory, but a hypothesis or a phenomenological explanation. Such explanations are logically important, and possibly quite useful in the right applications, but they are not proper theories. We must continue our efforts to find a proper theory.

An example of a theory that does not begin from basic principles is special relativity as originally expressed by Einstein. He postulated that the speed of light is constant in all inertial frames. His mistake is that light is not fundamental, but a consequence of Maxwell's equations (ME). The correct way to develop the theory is to postulate that ME are valid in all inertial frames.

Entropy is phenomenological, not a basic concept of physics, at the current state of physics. In statistical mechanics (SM), entropy is defined as the log of the number of possible arrangements of the molecules. This number cannot be measured, and so current ideas about entropy are not part of a physics theory.
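For reference, the definition being criticized here is Boltzmann's standard textbook expression:

```latex
S = k_B \ln W
```

where $W$ is the number of possible arrangements (microstates) of the molecules and $k_B$ is Boltzmann's constant.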

The idea is that we can move a molecule of a gas to a higher energy state, and simultaneously move a molecule at that higher energy state to a lower state, so that the total energy is constant. We can imagine performing this experiment with lasers. There are two issues. First, there is no observable difference between the two resulting states, so discussing the number of such states is not meaningful physics. Second, we cannot actually perform this experiment: a principle of thermodynamics is that we cannot add energy to a system and simultaneously remove the same energy.

Let us illustrate this with superfluid helium. The entropy is zero. We cannot picture this as individual helium atoms each having position and momentum, the way we picture helium gas, except to say the atoms are in the container. Since there is no possible way to rearrange the atoms, the entropy is zero. The concepts of position and momentum are not valid for the atoms of superfluid helium. By the way, this is a beautiful example of quantum mechanics on the macroscopic level.

Just as we cannot picture the position and momentum of an atom of superfluid helium, we cannot picture the rearrangement, at constant system energy, of atoms of helium gas. If we cannot picture this and cannot measure it, it cannot be a fundamental principle.

Entropy is a very useful concept in thermodynamics. In SM, we can define the probability p(i) that a particle has energy E(i); this is possible because in QM the energy levels are discrete. We can define the temperature of a system by measuring heat flow. The Boltzmann assumption is a meaningful statement because it involves measurable quantities. From these we can define the partition function and the energy U.
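The quantities in this paragraph can be sketched numerically. This is a toy illustration with assumed energy levels (the values and names are mine, not from the thread): it computes the partition function Z, the Boltzmann probabilities p(i), and the internal energy U.

```python
import math

# Hypothetical discrete energy levels E(i), in units where kT = 1 (assumed values)
levels = [0.0, 1.0, 2.0, 3.0]

def boltzmann_stats(levels, kT=1.0):
    """Return (Z, p, U): partition function, occupation probabilities p(i),
    and internal energy U = sum_i p(i) * E(i) for the given energy levels."""
    weights = [math.exp(-E / kT) for E in levels]   # Boltzmann factors e^{-E_i/kT}
    Z = sum(weights)                                # partition function Z
    p = [w / Z for w in weights]                    # p(i) = e^{-E_i/kT} / Z
    U = sum(pi * E for pi, E in zip(p, levels))     # mean (internal) energy U
    return Z, p, U

Z, p, U = boltzmann_stats(levels)
```

By construction the probabilities sum to 1, and U lies between the lowest and highest level; every quantity here is defined from measurable inputs (the levels and the temperature), which is the point the post is making.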

We would like to continue and derive the thermodynamic potentials, such as A and S (entropy). If we succeed in this derivation, we can then say entropy is a meaningful concept. We need to search for a derivation of entropy that starts with SM and not with the unphysical idea that entropy is the log of the possible rearrangements. Once we have correctly derived entropy, we can then show, as a consequence of the theory's postulates, results like Gibbs' expression for entropy, and say that this shows entropy equals the log of all possible rearrangements. That would be fine; it is not fine to start with this as the first step.
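For reference, the Gibbs expression mentioned here is, in its standard form (with p_i the occupation probabilities defined above):

```latex
S = -k_B \sum_i p_i \ln p_i
```

When all $W$ arrangements are equally likely, $p_i = 1/W$, and this reduces to $S = k_B \ln W$, the log-of-rearrangements form. The post's claim is that this should appear at the end of the derivation, not as its starting postulate.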

## Answers and Replies

Studiot
> In statistical mechanics, SM, entropy is defined as the log of the number of possible arrangements of the molecules.

The statistical derivation doesn't only apply to molecules.

SanfordA
> The statistical derivation doesn't only apply to molecules.

Your comment does not answer my question. Consider an ideal gas of helium molecules.

Studiot
> Your comment does not answer my question.

But it does discuss your proposition.

The statistical variable is the states available to a system, not molecules. We may be able to relate this to the number of molecules in certain circumstances, but the fundamental variable is the set of possible states (of interest) the system might find itself in, or be observed in.

So yes, for any finite system the states are countable, and entropy is therefore measurable.
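A toy sketch of what "countable" means here, for an assumed two-level system (the function names and numbers are illustrative, not from the thread): the number of ways to place n of N distinguishable particles in the upper level is the binomial coefficient C(N, n), and the Boltzmann entropy S = k ln W follows directly from that count. A system with only one available state gives S = 0, consistent with the superfluid helium example above.

```python
import math

def multiplicity(N, n):
    """Number of ways to place n of N distinguishable particles
    in the upper level of a two-level system: C(N, n)."""
    return math.comb(N, n)

def boltzmann_entropy(W, k=1.0):
    """Boltzmann entropy S = k ln W (k = 1 in natural units)."""
    return k * math.log(W)

W = multiplicity(10, 3)        # 120 ways to excite 3 of 10 particles
S = boltzmann_entropy(W)       # S = ln 120 in natural units
S_single = boltzmann_entropy(1)  # one available state -> S = 0
```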

SanfordA
> But is does discuss your proposition.
>
> The statistical variable is states available to a system, not molecules. We may be able to refer this to the number of molecules in certain circumstances but the fundamental variable is the possible states (of interest) the system might find itself, or be observed.
>
> So yes, for any finite system, entropy is therefore countable and therefore measurable.

Do you mean the number of energy levels, and not the number of ways particles may be distributed among these levels?

cosmik debris
> An example of a theory that does not begin from basic principles is special relativity as originally expressed by Einstein. He postulated the speed of light is constant in all inertial frames. His mistake is that light is not fundamental, but a consequence of Maxwell's Equations, ME. The correct way to develop this theory is to postulate that ME is valid in all inertial frames.

Hmmm, I think you are already off to a bad start. The theory of relativity was developed from the realisation that Maxwell's equations behaved unexpectedly under Galilean transformations, and Einstein postulated that the speed of light was independent of the speed of the source.

SanfordA
> Hmmm, I think you are already off to a bad start. The theory of relativity was developed from the realisation that Maxwell's equations behaved unexpectedly, and he postulated that the speed of light was independent of the speed of the source.

Well, using special relativity we can derive ME from Coulomb's law. What bad start?