What is the definition of entropy in SM?

  • Context: Graduate
  • Thread starter: SanfordA
  • Tags: Definition, Entropy

Discussion Overview

The discussion centers on the definition of entropy within the context of statistical mechanics (SM) and its foundational principles. Participants explore the nature of entropy, its measurement, and its theoretical underpinnings, while also referencing related concepts in physics such as special relativity and Maxwell's Equations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant argues that entropy is a phenomenological concept and not a fundamental principle of physics, suggesting that it cannot be measured directly.
  • Another participant states that in statistical mechanics, entropy is defined as the log of the number of possible arrangements of molecules, but this definition may not apply universally.
  • A different viewpoint emphasizes that the fundamental variable in statistical mechanics is the states available to a system, rather than just the molecules themselves.
  • There is a discussion about whether entropy can be considered countable and measurable for finite systems, with some participants asserting that it can be.
  • One participant critiques the initial premise regarding special relativity, arguing that it was developed based on the unexpected behavior of Maxwell's equations, rather than the constancy of the speed of light alone.
  • Another participant defends the original assertion about special relativity, suggesting that it can be derived from Coulomb's law.

Areas of Agreement / Disagreement

Participants express multiple competing views regarding the definition and measurement of entropy, as well as the foundational principles of special relativity. The discussion remains unresolved, with no consensus reached on these topics.

Contextual Notes

Some claims rely on specific interpretations of statistical mechanics and thermodynamics, and there are unresolved questions about the applicability of certain definitions and principles across different contexts.

SanfordA
We physicists must be careful to ensure that theories begin with correct principles. One basic principle is that every quantity must be capable of being observed or measured. If a theory uses a quantity that cannot be observed, then it is not a physics theory but a hypothesis or a phenomenological explanation. Such explanations are logically important, and possibly quite useful in the proper applications, but they are not a proper theory. We must continue our efforts to find a proper theory.

An example of a theory that does not begin from basic principles is special relativity as originally expressed by Einstein. He postulated the speed of light is constant in all inertial frames. His mistake is that light is not fundamental, but a consequence of Maxwell's Equations, ME. The correct way to develop this theory is to postulate that ME is valid in all inertial frames.

Entropy is phenomenological, not a basic concept of physics, at least at the current state of the field. In statistical mechanics, SM, entropy is defined as the log of the number of possible arrangements of the molecules. This cannot be measured, and so current ideas about entropy are not part of a physics theory.

The idea is that we can move a molecule of a gas to a higher energy state and simultaneously move a molecule already at this higher energy state to a lower state, so that the total energy is constant. We can imagine performing this experiment using lasers. There are two issues. One is that there is no observable difference between these two states, so discussing the number of such states is not meaningful physics. Second, we could not actually perform this experiment: a principle of thermodynamics is that we cannot add energy to a system and simultaneously remove the same energy.

Let us illustrate this with superfluid helium. The entropy is zero. We cannot picture this as individual helium atoms each having position and momentum, the way we picture helium gas, except to say the atoms are in the container. Since there is no possible way to rearrange the atoms, the entropy is zero. The concepts of position and momentum are not valid for the atoms of superfluid helium. By the way, this is a beautiful example of quantum mechanics on the macroscopic level.

Just as we cannot picture the position and momentum of an atom of superfluid helium, we cannot picture the rearrangement, at constant system energy, of atoms of helium gas. If we cannot picture this and cannot measure it, it cannot be a fundamental principle.

Entropy is a very useful concept in thermodynamics. In SM, we can define the probability p(i) that a particle occupies a state with energy E(i). This is possible because in QM a system has a discrete set of energy levels. We can define the temperature of a system by measuring heat flow. The Boltzmann assumption is a meaningful statement because it involves meaningful quantities. We can define the partition function and the energy U.

We would like to continue and derive the thermodynamic potentials, such as A and S (entropy). If we succeed in this derivation, we can then say entropy is a meaningful concept. We need to search for a derivation of entropy that starts with SM and not with the unphysical idea that entropy is the log of the possible rearrangements. Once we have correctly derived entropy, we can then show, as a consequence of the theory's postulates, results like Gibbs' expression for entropy, and say that this shows entropy is the log of all possible rearrangements. That would be fine; it is not fine to start with this as the first step.
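The programme sketched here (define p(i), Z, and U, then ask whether S follows) can be illustrated numerically. Below is a minimal sketch, not from the thread, using three arbitrary energy levels and units where Boltzmann's constant k_B = 1; whatever one's view of its foundational status, it checks that the Gibbs entropy S = -Σ p(i) ln p(i) agrees with the thermodynamic identity S = ln Z + U/T:

```python
import math

# Hypothetical discrete energy levels, in units where k_B = 1.
E = [0.0, 1.0, 2.0]   # energy levels E(i)
T = 1.5               # temperature
beta = 1.0 / T

# Partition function Z and Boltzmann probabilities p(i)
Z = sum(math.exp(-beta * e) for e in E)
p = [math.exp(-beta * e) / Z for e in E]

# Mean energy U and Gibbs entropy S = -sum p(i) ln p(i)
U = sum(pi * e for pi, e in zip(p, E))
S = -sum(pi * math.log(pi) for pi in p)

# Identity: S = ln Z + U/T (equivalently A = U - T*S with A = -T ln Z)
assert abs(S - (math.log(Z) + U / T)) < 1e-12
print(Z, U, S)
```

The identity holds term by term: substituting ln p(i) = -beta*E(i) - ln Z into the Gibbs sum gives S = beta*U + ln Z, so the numerical check is exact up to floating-point rounding.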
 
SanfordA said:
In statistical mechanics, SM, entropy is defined as the log of the number of possible arrangements of the molecules.

The statistical derivation doesn't only apply to molecules.
 
Studiot said:
The statistical derivation doesn't only apply to molecules.

Your comment does not answer my question. Consider an ideal gas of helium molecules.
 
SanfordA said:
Your comment does not answer my question.

But it does discuss your proposition.

The statistical variable is the states available to a system, not the molecules. We may be able to relate this to the number of molecules in certain circumstances, but the fundamental variable is the possible states (of interest) in which the system might find itself, or be observed.

So yes, for any finite system, entropy is countable and therefore measurable.
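The claim that a finite system's states can be counted can be made concrete with a toy model (my construction, not part of the thread): N distinguishable particles, each in one of two energy levels, with n of them excited. The macrostate (N, n) is then compatible with a countable number of microstates, and the Boltzmann entropy follows directly:

```python
from math import comb, log

# Toy finite system: 10 distinguishable particles, two energy levels,
# 3 particles in the upper level.
N, n = 10, 3

# Number of microstates compatible with the macrostate (N, n)
omega = comb(N, n)   # C(10, 3) = 120

# Boltzmann entropy S = k_B ln(omega), in units where k_B = 1
S = log(omega)
print(omega, S)      # 120, ln(120) ≈ 4.787
```

The same formula also illustrates the superfluid-helium remark earlier in the thread: if only one microstate is accessible, omega = 1 and S = ln(1) = 0.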
 
Studiot said:
But it does discuss your proposition.

The statistical variable is the states available to a system, not the molecules. We may be able to relate this to the number of molecules in certain circumstances, but the fundamental variable is the possible states (of interest) in which the system might find itself, or be observed.

So yes, for any finite system, entropy is countable and therefore measurable.

Do you mean the number of energy levels, and not the number of ways particles may be distributed among these levels?
 
SanfordA said:
An example of a theory that does not begin from basic principles is special relativity as originally expressed by Einstein. He postulated the speed of light is constant in all inertial frames. His mistake is that light is not fundamental, but a consequence of Maxwell's Equations, ME. The correct way to develop this theory is to postulate that ME is valid in all inertial frames.

Hmmm, I think you are already off to a bad start. The theory of relativity was developed from the realisation that Maxwell's equations behaved unexpectedly, and Einstein postulated that the speed of light was independent of the speed of the source.
 
cosmik debris said:
Hmmm, I think you are already off to a bad start. The theory of relativity was developed from the realisation that Maxwell's equations behaved unexpectedly, and Einstein postulated that the speed of light was independent of the speed of the source.

Well, using Special Relativity we can derive ME from Coulomb's law. What bad start?
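For readers unfamiliar with that claim, the standard textbook sketch (my summary, not part of the thread) is that a point charge at rest has a pure Coulomb field with no magnetic part; boosting to a frame in which the charge moves with velocity v, the Lorentz transformation of the fields produces a magnetic component:

```latex
% Rest frame of charge q: pure Coulomb field, \mathbf{B}' = 0.
% In a frame where the charge moves with velocity \mathbf{v},
% the field transformation yields (for the uniformly moving charge)
\mathbf{B} \;=\; \frac{1}{c^{2}}\,\mathbf{v}\times\mathbf{E}.
```

A full derivation of Maxwell's equations along these lines needs additional inputs beyond Coulomb's law and the Lorentz transformation, notably the invariance of charge and the superposition principle, so this is an illustration of the idea rather than a complete proof.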
 
