What is the Conceptual Meaning of Boltzmann's Constant?

In summary: in thermodynamics, entropy is a measure of disorder, while in information theory it is a measure of the information content of a system.
  • #1
Emanresu56
Are changes in the frequencies of vibrating strings the cause of entropy? That is, does entropy also apply in the case of changes in the frequency of vibrating strings?
 
  • #2
You mean, do strings have modes that increase informational entropy anywhere?

...yes, that is: do they inform the universe?
 
  • #3
Emanresu56 said:
Are changes in the frequencies of vibrating strings the cause of entropy?

What do you mean by "the cause of entropy"? I would guess that you are talking about a cause of the 2nd law of thermodynamics, which states that the entropy of an isolated system never decreases.

Then my answer would be that we do not need to use string theory or even quantum mechanics to explain this law, as it is already well explained in classical physics.

Entropy is a property of an ensemble of strings that can exchange energy with one another, where a higher energy means a higher frequency. The strings exchange energy until they arrive at a maximum-entropy equilibrium state.
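Here is a minimal toy sketch of that relaxation in Python (the discrete-quanta oscillator model and all names are my own illustrative assumptions, nothing from string theory): random pairwise energy exchanges drive the energy distribution toward its maximum-entropy equilibrium.

```python
import random
from collections import Counter
from math import log

# Toy model (illustration only, not string theory): N oscillators
# trade discrete energy quanta at random; the Shannon entropy of the
# empirical energy histogram climbs toward its equilibrium maximum.

def entropy(energies):
    """Shannon entropy (nats) of the empirical energy distribution."""
    counts = Counter(energies)
    n = len(energies)
    return -sum(c / n * log(c / n) for c in counts.values())

random.seed(0)
N, quanta_each, steps = 1000, 10, 20000
energies = [quanta_each] * N          # far from equilibrium: all equal

for step in range(steps):
    i, j = random.randrange(N), random.randrange(N)
    if energies[i] > 0:               # pass one quantum from i to j
        energies[i] -= 1
        energies[j] += 1
    if step % 5000 == 0:
        print(f"step {step:6d}  entropy = {entropy(energies):.3f}")
```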
 
  • #4
When is informational entropy equal to thermodynamic entropy? What dimensions does information have?
 
  • #5
A personal view on this

sirchasm said:
When is informational entropy equal to thermodynamic entropy? What dimensions does information have?

I personally think the whole notion of information and entropy is more complicated than most basic treatments admit.

Constructing a measure of information is an ambiguous task in general. There are different entropy measures around, and all of them aim to be a measure of information, or of missing information, depending on how you see it. But in constructing such a measure, how do you know (measure?) when you have the "best" one?

Most constructions start either from axioms (such as Cox's axioms) that the author feels are obvious enough not to be questioned by most readers, or from various "desired properties". But these constructions are not innocent or universal.

Regarding the problem of the "ergodic hypothesis": most measures of information start with a choice of microstructure, whose states supposedly encode the information. The funny part is that you often equip it with an equiprobability hypothesis, or otherwise an ad hoc prior probability for each state, and that affects the measure you construct. The point is that you can choose a microstructure to tweak your measure as desired (see the sketch below).
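A small sketch of this point in Python (the partitions and prior are made-up illustrations): the same eight underlying states, counted under different microstructures or priors, yield different entropies.

```python
from math import log

# Same eight underlying states, three different (made-up) measures:
# the entropy you get is relative to the chosen partition and prior.

def shannon(probs):
    """Shannon entropy in nats."""
    return -sum(p * log(p) for p in probs if p > 0)

fine = [1 / 8] * 8                    # 8 equiprobable microstates
print(f"fine partition:   {shannon(fine):.3f} nats")   # ln 8 = 2.079

coarse = [4 / 8, 4 / 8]               # same states lumped into 2 cells
print(f"coarse partition: {shannon(coarse):.3f} nats") # ln 2 = 0.693

biased = [0.5] + [0.5 / 7] * 7        # an ad hoc non-uniform prior
print(f"biased prior:     {shannon(biased):.3f} nats")
```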

Unless, that is, you can infer a natural preferred microstructure and prior from some first principles. In string theory, the basic microstructure implied by the idea that particles are really strings is also such an ambiguous choice.

I personally think one solution, yet to be elaborated, is an evolutionary and relational approach to the very construction of these measures. This would suggest that there really is no preferred fixed microstructure; the microstructures also emerge dynamically in relation to their environment. In particular, the notion of a degree of freedom is emergent when one system (= one observer) relates to something unknown (= its unknown environment, in the general case). Not only does an image emerge; the screen on which the image lives must also emerge by a related logic.

/Fredrik
 
  • #6
In computer science you have an unquestionably given universal microstructure: digital memory. But such simple escapes don't do in this case IMO (i.e., when pondering the future of physics and QG); it is much more complicated. Rather than hoping to find a universal Hilbert space, I think we must consider the idea of dynamical microstructures and Hilbert spaces as well, not ONLY dynamical microstates and state vectors moving around in a fixed background Hilbert space (~microstructure).

/Fredrik
 
  • #7
To clarify my comments: the problem of constructing intrinsic measures is pretty much the same as, or at worst related to, the problem of choosing/constructing what are to be considered observables. I think the ideal is that, since we can only make sensible statements about observables, our model should treat and predict observables. Therefore, any kind of "statistical treatment" stands and falls with the choice of its microstructure; put differently, the entire treatment is _relative_ to this choice.

/Fredrik
 
  • #8
Well, thermodynamic entropy is equivalent to Shannon entropy when you set k to 1.
Then heat has the same units as information, and heat = information in the (0,1) basis.
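A quick sketch of that identification in Python (the distribution is arbitrary): the Gibbs form reduces to Shannon entropy for k = 1 and to thermodynamic units for k = k_B.

```python
from math import log

# Gibbs form S = -k * sum(p ln p): with k = 1 it is Shannon entropy in
# nats; divide by ln 2 for bits; with k = k_B it carries units of J/K.

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

def gibbs_entropy(probs, k=1.0):
    return -k * sum(p * log(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]                 # an arbitrary distribution
print(gibbs_entropy(p))               # ~1.040 nats (k = 1)
print(gibbs_entropy(p) / log(2))      # 1.5 bits of information
print(gibbs_entropy(p, k=k_B))        # ~1.4e-23 J/K, thermodynamic units
```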
 
  • #9
sirchasm said:
Well, thermodynamic entropy is equivalent to Shannon entropy when you set k to 1.
Then heat has the same units as information, and heat = information in the (0,1) basis.

Yes, there are a few different issues I see here. First, some entropies, like the Gibbs entropy, are constructed from a probability space. Boltzmann's is instead constructed from a combinatorial approach to microstates; it effectively constructs a kind of discrete probability space.

The microstructure of a physical system is the set of possible microstates the system can have, its various degrees of freedom. Aside from that, the other major difference between the physical entropies is, as you note, Boltzmann's constant.
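A quick check in Python of how the two constructions line up (W is an arbitrary illustrative count): for W equiprobable microstates, the Gibbs sum collapses to Boltzmann's k ln W.

```python
from math import log

# For W equiprobable microstates, p = 1/W for each, so the Gibbs sum
# -k * sum(p ln p) collapses to Boltzmann's combinatorial k * ln W.

def gibbs(probs, k=1.0):
    return -k * sum(p * log(p) for p in probs if p > 0)

def boltzmann(W, k=1.0):
    return k * log(W)

W = 1024                              # an arbitrary microstate count
print(gibbs([1 / W] * W))             # ln 1024 = 6.931...
print(boltzmann(W))                   # the same, up to float rounding
```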

When relating these measures, what is the conceptual meaning of Boltzmann's constant?

As I see it, its meaning becomes obvious only when you put the entropy measure in a larger context. For example, why do we need a measure of disorder in the first place? Well, it's because it helps us figure out what is more, or less, probable. Ultimately, entropies are linked to probabilities. There are also entropies (like the relative Kullback-Leibler entropy) that relate to transition probabilities.

If you play with these expressions, the constant in place of Boltzmann's typically represents a total "mass" of the probability distribution. That is, it scales the weight of the entire measure itself; it is a kind of measure of the measure.

This is where I think it gets more interesting. It relates to probabilities in the sense that the probability of something "specifically unlikely" still depends on the total confidence _in the measure itself_, which is pretty much the function of Boltzmann's constant IMHO.
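One textbook place where Boltzmann's constant plays this scale-setting role between entropy and probability, close in spirit to the above, is Einstein's fluctuation formula, P ∝ exp(ΔS/k_B). A minimal sketch in Python (the numbers are purely illustrative):

```python
from math import exp

# Einstein's fluctuation formula: the relative probability of a
# fluctuation with entropy change dS scales as exp(dS / k_B), so k_B
# sets how strongly an entropy-lowering fluctuation is suppressed.

k_B = 1.380649e-23  # J/K

def relative_weight(delta_S, k=k_B):
    """Relative probability weight of a fluctuation changing entropy by delta_S."""
    return exp(delta_S / k)

# Lowering entropy by a mere 100 k_B is already suppressed by e^-100:
print(relative_weight(-100 * k_B))    # ~3.7e-44
```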

This can also be exploited when you consider dynamical, competing microstructures or probability spaces; the constant in front is then a weight between the structures.

So the full relevance of Boltzmann's constant enters the picture only when the entropy measure is used in probability calculations, or when the measure is weighed against alternative measures of disorder.

/Fredrik
 

What is string theory?

String theory is a theoretical framework that aims to reconcile quantum mechanics and general relativity by describing all fundamental particles as tiny vibrating strings. It suggests that the universe is made up of these strings rather than point-like particles.

What is entropy?

Entropy is a measure of the disorder or randomness in a system. In physics, it is often associated with the second law of thermodynamics, which states that the total entropy of an isolated system never decreases over time.

How do string theory and entropy relate?

In string theory, entropy is associated with the number of different ways that strings can vibrate, and thus the number of possible states of the universe. It is thought that string theory may help explain the origin of entropy and the second law of thermodynamics.

What is the role of entropy in the black hole information paradox?

The black hole information paradox is a problem in physics that arises when trying to reconcile the laws of quantum mechanics with the existence of black holes. Entropy plays a crucial role in this paradox, as it is thought that the information about particles that fall into a black hole is preserved in the form of the black hole's entropy.

Is string theory and entropy supported by evidence?

At this time, there is no direct experimental evidence for string theory and its predictions about entropy. However, many physicists believe that it is a promising theory that may one day be supported by experimental data. Currently, string theory and entropy are still areas of active research and debate in the scientific community.
