What is the Conceptual Meaning of Boltzmann's Constant?


Discussion Overview

The discussion revolves around the conceptual meaning of Boltzmann's constant, exploring its relationship with entropy, both thermodynamic and informational. Participants delve into the implications of string theory, the nature of microstructures, and the complexities of measuring information and entropy.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Conceptual clarification

Main Points Raised

  • Some participants question whether changes in the frequencies of vibrating strings can be considered a cause of entropy, linking this to the second law of thermodynamics.
  • There is a discussion about the equivalence of informational entropy and thermodynamic entropy, with some proposing that they can be equal under certain conditions.
  • One participant expresses the view that constructing measures of information is ambiguous and dependent on chosen axioms or properties, suggesting that different entropy measures reflect different perspectives on information.
  • Another participant introduces the idea of dynamic microstructures and how they relate to the construction of observables, emphasizing that statistical treatments are relative to the chosen microstructure.
  • There is a claim that Boltzmann's constant serves as a scaling factor in entropy measures, linking it to the probabilities of microstates and the overall confidence in the measure itself.
  • Some participants highlight the differences between various entropy measures, such as Gibbs and Boltzmann entropy, noting their distinct foundational approaches.

Areas of Agreement / Disagreement

Participants express differing views on the relationship between informational and thermodynamic entropy, the nature of microstructures, and the implications of Boltzmann's constant. The discussion remains unresolved with multiple competing perspectives present.

Contextual Notes

Participants note that the choice of microstructure and the assumptions underlying entropy measures can significantly affect the conclusions drawn. The discussion reflects a range of interpretations and theoretical frameworks without reaching consensus.

Emanresu56
Are changes in the frequencies of vibrating strings the cause of entropy? Does entropy also apply in the case of changes in the frequency of vibrating strings?
 
You mean, do strings have modes that increase informational entropy anywhere?

..yea, that means: do they inform the universe?
 
Emanresu56 said:
Are changes in the frequencies of vibrating strings the cause of entropy?

What do you mean by "the cause of entropy"? I would guess that you are talking about a cause of the 2nd law of thermodynamics, which states that the entropy in an isolated system never decreases.

Then my answer would be that we do not need to use string theory or even quantum mechanics to explain this law, as it is already well explained in classical physics.

Entropy is a property of an ensemble of strings, which can pass energy between each other, and a higher energy means a higher frequency. The strings exchange energy until they arrive at a maximal entropy equilibrium state.
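A toy numerical sketch of this equilibration idea (this uses an Einstein-solid-style counting of energy quanta among oscillators rather than literal strings, and all the numbers are invented for illustration): the most probable way to split a fixed total energy between two subsystems is the one that maximizes the combined Boltzmann entropy.

```python
from math import comb, log

def multiplicity(N, q):
    """Number of ways to distribute q energy quanta among N oscillators."""
    return comb(q + N - 1, q)

# Two equal subsystems sharing a fixed total of 100 quanta.
N_A, N_B, q_total = 50, 50, 100

def total_entropy(q_A):
    """Boltzmann entropy (k = 1) of the combined system for a given split."""
    W = multiplicity(N_A, q_A) * multiplicity(N_B, q_total - q_A)
    return log(W)

# The equilibrium split is the one that maximizes the entropy.
equilibrium = max(range(q_total + 1), key=total_entropy)
print(equilibrium)  # the even split, 50
```

Energy exchange drives the system toward this maximum-multiplicity split, which is the sense in which the ensemble "arrives at" the maximal-entropy equilibrium state.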
 
When is informational entropy equal to thermodynamic entropy? What dimensions does information have?
 
personal view on this

sirchasm said:
When is informational entropy equal to thermodynamic entropy? What dimensions does information have?

I personally think the whole notion of information and entropy is really more complicated than what most basic treatments admit.

Constructing a measure of information is an ambiguous task in general. There are different entropy measures around, and all of them aim to somehow be a measure of information, or of missing information, depending on how you see it. But to construct a measure, how do you know (measure??) when you have the "best" measure?

Most constructions start either with axioms (such as Cox's axioms) that the author feels are obvious enough not to be questioned by most readers, or with various "desired properties". But these constructions are not innocent or universal.

Related to the problem of the "ergodic hypothesis": most measures of information start with a choice of microstructure, whose states supposedly encode the information. But the funny part is that you often equip it with an equiprobability hypothesis, or otherwise an ad hoc prior probability for each state, and that affects the measure constructed. The point is that you can choose a microstructure to tweak your measures as desired.
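The dependence on the chosen microstructure can be made concrete with a minimal sketch (the partitions below are invented for illustration): the same situation, described once with four resolved microstates and once with those states lumped pairwise into two coarse-grained states, yields different Shannon entropies.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum p log2 p."""
    return -sum(p * log2(p) for p in probs if p > 0)

# One observer resolves four equiprobable microstates...
fine = [0.25, 0.25, 0.25, 0.25]
# ...another lumps them pairwise into two coarse-grained states.
coarse = [0.5, 0.5]

print(shannon_entropy(fine))    # 2.0 bits
print(shannon_entropy(coarse))  # 1.0 bit
```

Neither number is "wrong"; each is relative to the chosen state space, which is the point being made above.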

That is, unless you can infer a naturally preferred microstructure and prior from some first principles. In string theory, the basic microstructure implied by the idea that particles are really strings is also such an ambiguous choice.

I personally think one solution, yet to be elaborated, is an evolutionary and relational approach to the very construction of these measures. This would suggest that there really is no preferred fixed microstructure; the microstructures also emerge dynamically in relation to their environment. In particular, the notion of a degree of freedom is emergent, arising when one system (= one observer) relates to something unknown (= its unknown environment, in the general case). Not only does an image emerge; the screen on which the image lives must also emerge by a related logic.

/Fredrik
 
In computer science you have an unquestionably given universal microstructure: digital memory. But such simple escapes won't do in this case IMO (i.e. when pondering the future of physics and QG); it is much more complicated. I see no hope in finding a universal Hilbert space; I think we must consider the idea of dynamical microstructures and Hilbert spaces as well, not ONLY dynamical microstates and state vectors moving around in a fixed background Hilbert space (~ microstructure).

/Fredrik
 


To clarify my comments: the problem of constructing intrinsic measures is pretty much the same as, or at worst related to, the problem of choosing/constructing what are to be considered observables, because I think the ideal is that, since we can only make sensible statements about observables, our model should treat and predict observables. Therefore, any kind of "statistical treatment" stands and falls with the choice of its microstructure, or put differently, the entire treatment is _relative_ to this choice.

/Fredrik
 
Well, thermodynamic entropy is equivalent to Shannon entropy when you set k to 1.
Heat then has the same units as information, so heat = information in a (0,1) basis.
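A minimal sketch of this k = 1 correspondence (the distribution is invented): the Gibbs entropy is just the Shannon entropy in natural units scaled by k, so with k = 1 the thermodynamic and informational measures coincide numerically.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_nats(probs):
    """Shannon entropy in natural units (nats): -sum p ln p."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def gibbs_entropy(probs, k=k_B):
    """Gibbs entropy S = -k * sum p ln p; in J/K for the physical k."""
    return k * shannon_nats(probs)

probs = [0.5, 0.3, 0.2]
# With k = 1 the two measures are the same number.
print(gibbs_entropy(probs, k=1) == shannon_nats(probs))  # True
```

The constant k thus carries all the units; the informational content of the measure is the dimensionless sum.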
 
sirchasm said:
Well, thermodynamic entropy is equivalent to Shannon entropy, when you set k to 1.
Then heat has the same units as information, then heat = information in (0,1) basis.

Yes, there are a few different issues I see here. The first is that some entropies, like Gibbs', are constructed from a probability space. Boltzmann's is constructed instead from a combinatorial approach to microstates; it effectively constructs a kind of discrete probability space.

The microstructure in physical systems is the set of possible microstates a system can have, the various degrees of freedom. But aside from that, the other major difference between the physical entropies is, as you note, Boltzmann's constant.
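The two constructions agree in the equiprobable case: with W equally likely microstates, the probabilistic (Gibbs) formula reduces to the combinatorial (Boltzmann) one, S = k ln W. A sketch with k set to 1 and W chosen arbitrarily:

```python
from math import log

W = 16  # number of microstates, chosen arbitrarily

# Gibbs entropy from a probability space (k = 1): -sum p ln p
gibbs = -sum((1 / W) * log(1 / W) for _ in range(W))

# Boltzmann entropy from counting microstates: ln W
boltzmann = log(W)

print(abs(gibbs - boltzmann) < 1e-12)  # True
```

For non-uniform distributions the Gibbs form is the more general of the two.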

When relating these measures, what is the conceptual meaning of Boltzmann's constant?

As I see it, its meaning becomes obvious only when you put the entropy measure in a larger context. For example, why do we need a measure of disorder in the first place? Well, because it helps us figure out what is more, or less, probable. Ultimately, entropies are linked to probabilities. There are also entropies (like the relative KL entropy) that relate to transition probabilities.
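The relative (Kullback-Leibler) entropy mentioned here compares two distributions; it is non-negative and zero exactly when they coincide. A minimal sketch (the distributions are invented):

```python
from math import log

def kl_divergence(p, q):
    """Relative entropy D(p || q) = sum p ln(p/q), in nats."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

print(kl_divergence(p, p))      # 0.0
print(kl_divergence(p, q) > 0)  # True
```

Unlike the absolute entropies, this measure needs no reference microstructure beyond the shared state space of p and q, which is one reason it shows up when comparing alternative descriptions.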

If you play with these expressions, the constant in place of Boltzmann's typically represents a total "mass" of the probability distribution. I.e., it scales the weight of the entire measure itself; it's a kind of measure of the measure.

This is where I think it gets more interesting, and it relates to probabilities in the sense that the probability of something "specifically unlikely" still depends on the total confidence _in the measure itself_, which is pretty much the function of Boltzmann's constant IMHO.

This can also be exploited when you consider dynamical and competing microstructures or probability spaces; then the constant in front is a weight between the structures.

So the full relevance of Boltzmann's constant enters the picture only when the entropy measure is used in probability calculations, or when the measure is weighed against alternative measures of disorder.

/Fredrik
 
