What is the definition of entropy in SM?

In summary: the number of energy levels is one ingredient in the statistical measure of entropy, but not the only one. The number of ways particles may be distributed among these levels also enters the count.
  • #1
SanfordA
We physicists must be careful to ensure that theories begin with correct principles. One basic principle is that all quantities must be capable of being observed or measured. If a theory uses a quantity that cannot be observed, then it is not a physics theory but a hypothesis or a phenomenological explanation. Such explanations are logically important, and possibly quite useful in the right applications, but they do not constitute a proper theory. We must continue our efforts to find a proper theory.

An example of a theory that does not begin from basic principles is special relativity as originally expressed by Einstein. He postulated that the speed of light is constant in all inertial frames. His mistake is that light is not fundamental but a consequence of Maxwell's equations (ME). The correct way to develop the theory is to postulate that ME are valid in all inertial frames.

At the current state of physics, entropy is phenomenological, not a basic concept. In statistical mechanics (SM), entropy is defined as the logarithm of the number of possible arrangements of the molecules. This cannot be measured, and so current ideas about entropy are not part of a physics theory.

The idea is that we can move a molecule of a gas to a higher energy state and simultaneously move a molecule at that higher energy state to a lower one, so that the total energy is constant. We can imagine performing this experiment using lasers. There are two issues. First, there is no observable difference between these two states, so discussing the number of such states is not meaningful physics. Second, we cannot in fact perform this experiment: a principle of thermodynamics is that we cannot add energy to a system and remove the same energy.
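To make the counting concrete, here is a minimal sketch in Python, assuming a toy system of three distinguishable particles sharing four unit quanta of energy; the levels and quanta are illustrative, not a model of a real gas:

```python
from itertools import product
from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def microstates(n_particles, total_quanta):
    """All assignments of energy quanta to distinguishable particles
    whose quanta sum to total_quanta (i.e., fixed total energy)."""
    return [s for s in product(range(total_quanta + 1), repeat=n_particles)
            if sum(s) == total_quanta]

states = microstates(3, 4)   # 3 particles sharing 4 quanta
W = len(states)              # number of arrangements at fixed energy: 15
S = K_B * log(W)             # Boltzmann entropy S = k ln W
print(f"W = {W}, S = {S:.3e} J/K")
# Moving one quantum from particle A to particle B yields a different
# microstate with the same total energy: exactly the swap described above.
```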

Let us illustrate this with superfluid helium, whose entropy is zero. We cannot picture it as individual helium atoms each having a position and momentum, the way we picture helium gas, except to say that the atoms are in the container. Since there is no possible way to rearrange the atoms, the entropy is zero. The concepts of position and momentum are not valid for the atoms of superfluid helium. Incidentally, this is a beautiful example of quantum mechanics on the macroscopic scale.

Just as we cannot picture the position and momentum of an atom of superfluid helium, we cannot picture the rearrangement, at constant system energy, of the atoms of a helium gas. If we can neither picture this nor measure it, it cannot be a fundamental principle.

Entropy is a very useful concept in thermodynamics. In SM we can define the probability p(i) that a particle occupies a state with energy E(i); this is possible because in QM the energy levels are discrete. We can define the temperature of a system by measuring heat flow. The Boltzmann assumption, p(i) ∝ exp(-E(i)/kT), is a meaningful statement because it involves only measurable quantities. From it we can define the partition function and the energy U.
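As a minimal numerical sketch, assuming four evenly spaced levels and a temperature of 300 K purely for illustration, the Boltzmann probabilities, partition function Z, and energy U follow directly:

```python
import numpy as np

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed temperature, K
E = np.array([0.0, 1.0, 2.0, 3.0]) * K_B * T   # assumed discrete levels E(i)

w = np.exp(-E / (K_B * T))   # Boltzmann factors exp(-E(i)/kT)
Z = w.sum()                  # partition function Z = sum_i exp(-E(i)/kT)
p = w / Z                    # occupation probabilities p(i)
U = (p * E).sum()            # mean energy U = sum_i p(i) E(i)

print(f"Z = {Z:.4f}")        # ~1.5530 for these levels
print("p(i) =", np.round(p, 4))
print(f"U = {U:.3e} J")
```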

We would like to continue and derive the thermodynamic potentials, such as A (the Helmholtz free energy) and S (the entropy). If we succeed in this derivation, we can then say entropy is a meaningful concept. We need to search for a derivation of entropy that starts with SM and not with the unphysical idea that entropy is the log of the possible rearrangements. Once we have correctly derived entropy, we can then show, as a consequence of the theory's postulates, results such as Gibbs' expression for entropy, and say that this shows entropy is the log of all possible rearrangements. That would be fine; it is not fine to start with this as the first step.
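Continuing that sketch (same illustrative levels and temperature), the Gibbs expression S = -k Σ p(i) ln p(i) and the Helmholtz potential A = U - TS can be evaluated and checked against the identity A = -kT ln Z:

```python
import numpy as np

K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # same assumed temperature as above, K
E = np.array([0.0, 1.0, 2.0, 3.0]) * K_B * T   # same assumed levels E(i)

w = np.exp(-E / (K_B * T))         # Boltzmann factors
Z = w.sum()                        # partition function
p = w / Z                          # probabilities p(i)
U = (p * E).sum()                  # energy U

S = -K_B * (p * np.log(p)).sum()   # Gibbs entropy S = -k sum_i p(i) ln p(i)
A = U - T * S                      # Helmholtz free energy A = U - TS
print(np.isclose(A, -K_B * T * np.log(Z)))   # identity A = -kT ln Z -> True
```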
 
  • #2
Studiot
SanfordA said:
In statistical mechanics (SM), entropy is defined as the logarithm of the number of possible arrangements of the molecules.

The statistical derivation doesn't only apply to molecules.
 
  • #3
SanfordA
Studiot said:
The statistical derivation doesn't only apply to molecules.

Your comment does not answer my question. Consider an ideal gas of helium molecules.
 
  • #4
Studiot
SanfordA said:
Your comment does not answer my question.

But it does discuss your proposition.

The statistical variable is the states available to a system, not the molecules. We may be able to relate this to the number of molecules in certain circumstances, but the fundamental variable is the set of possible states (of interest) the system might find itself in, or be observed in.

So yes, for any finite system the states are countable, and entropy is therefore measurable.
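As a sketch of this counting for a small finite system, assuming a toy collection of ten two-level particles with four of them excited:

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Assumed toy system: N two-level particles, n of them excited.
# The macrostate is fixed by n (i.e., by the total energy); the
# microstates are the ways of choosing which n particles are excited.
N, n = 10, 4
W = comb(N, n)        # number of accessible states: C(10, 4) = 210
S = K_B * log(W)      # Boltzmann entropy S = k ln W
print(f"W = {W}, S = {S:.3e} J/K")
```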
 
  • #5
SanfordA
Studiot said:
But it does discuss your proposition.

The statistical variable is the states available to a system, not the molecules. We may be able to relate this to the number of molecules in certain circumstances, but the fundamental variable is the set of possible states (of interest) the system might find itself in, or be observed in.

So yes, for any finite system the states are countable, and entropy is therefore measurable.

Do you mean the number of energy levels, and not the number of ways particles may be distributed among these levels?
 
  • #6
cosmik debris
SanfordA said:
An example of a theory that does not begin from basic principles is special relativity as originally expressed by Einstein. He postulated that the speed of light is constant in all inertial frames. His mistake is that light is not fundamental but a consequence of Maxwell's equations (ME). The correct way to develop the theory is to postulate that ME are valid in all inertial frames.

Hmmm, I think you are already off to a bad start. The theory of relativity was developed from the realisation that Maxwell's equations behaved unexpectedly under Galilean transformations, and Einstein postulated that the speed of light is independent of the speed of the source.
 
  • #7
SanfordA
cosmik debris said:
Hmmm, I think you are already off to a bad start. The theory of relativity was developed from the realisation that Maxwell's equations behaved unexpectedly under Galilean transformations, and Einstein postulated that the speed of light is independent of the speed of the source.

Well, using Special Relativity we can derive ME from Coulomb's law. What bad start?
 

FAQ: What is the definition of entropy in SM?

What is the definition of entropy in SM?

In SM (statistical mechanics), entropy is a measure of the disorder or randomness of a system. In simpler terms, it is a measure of the amount of energy that is unavailable to do work in a system.

How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that as the system becomes more disordered, the entropy increases. Entropy can therefore be seen as defining a direction of time (the thermodynamic arrow of time).
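A toy simulation, assuming N particles hopping randomly between two boxes, illustrates this tendency: starting far from equilibrium, the Boltzmann entropy of the occupation macrostate rises toward its maximum near an even split:

```python
import random
from math import comb, log

# Assumed toy model: N particles in two boxes; at each step one randomly
# chosen particle hops to the other box.  The entropy of the macrostate
# "n_left particles on the left" is S/k = ln C(N, n_left).
random.seed(0)
N = 100
n_left = N            # all particles start in the left box

for step in range(1, 501):
    if random.randrange(N) < n_left:
        n_left -= 1   # a left-box particle hops right
    else:
        n_left += 1   # a right-box particle hops left
    if step % 100 == 0:
        S = log(comb(N, n_left))   # entropy in units of k
        print(f"step {step:4d}: n_left = {n_left:3d}, S/k = {S:.2f}")
```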

What is the difference between microstate and macrostate in relation to entropy?

A microstate refers to the specific arrangement of particles in a system, while a macrostate refers to the overall properties of the system, such as temperature and pressure. The entropy of a system is related to the number of possible microstates that can occur within a given macrostate.
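A minimal sketch of the distinction, using four coin flips as an assumed stand-in for a physical system: each specific head/tail sequence is a microstate, while the total number of heads plays the role of the macrostate:

```python
from itertools import product
from collections import Counter

# Microstates: every specific sequence of 4 coin flips.
# Macrostates: the total number of heads, ignoring the order.
counts = Counter(seq.count("H") for seq in product("HT", repeat=4))
for heads, W in sorted(counts.items()):
    print(f"macrostate {heads} heads: W = {W} microstates")
# The 2-heads macrostate has the most microstates (W = 6),
# so it is the highest-entropy macrostate.
```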

How is entropy calculated in statistical mechanics?

In statistical mechanics, entropy is calculated using the Boltzmann formula S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of possible microstates in a given macrostate. This formula relates the microscopic properties of a system to its macroscopic properties.
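As a worked number, plugging the W = 6 count from the coin-flip sketch above into the Boltzmann formula:

```python
from math import log

K_B = 1.380649e-23         # Boltzmann constant, J/K
W = 6                      # microstates of the 2-heads macrostate above
S = K_B * log(W)           # S = k ln W
print(f"S = {S:.3e} J/K")  # ~2.474e-23 J/K
```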

What are some real-world examples of entropy?

Some examples of entropy in everyday life include the melting of ice cubes, the rusting of metal, and the mixing of different gases. These processes all result in an increase in disorder and an increase in entropy.
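For the gas-mixing example, the ideal entropy of mixing ΔS = -nR Σ x(i) ln x(i) can be evaluated in a short sketch, assuming one mole each of two ideal gases:

```python
from math import log

R = 8.314462618          # gas constant, J/(mol K)
n1, n2 = 1.0, 1.0        # assumed: one mole each of two ideal gases
n = n1 + n2
x1, x2 = n1 / n, n2 / n  # mole fractions

dS = -n * R * (x1 * log(x1) + x2 * log(x2))   # ideal entropy of mixing
print(f"dS_mix = {dS:.2f} J/K")               # ~11.53 J/K for a 50/50 mix
```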
