
String Theory and Entropy

  1. Feb 19, 2009 #1
    Are changes in the frequencies of vibrating strings the cause of entropy? And conversely, does entropy apply to changes in the frequency of vibrating strings?
     
    Last edited: Feb 19, 2009
  3. Feb 20, 2009 #2
    You mean, do strings have modes that increase informational entropy anywhere?

    ...yeah, that means: do they inform the universe?
     
  4. Feb 20, 2009 #3
    What do you mean by "the cause of entropy"? I would guess that you are talking about a cause of the 2nd law of thermodynamics, which states that the entropy in an isolated system never decreases.

    Then my answer would be that we do not need to use string theory or even quantum mechanics to explain this law, as it is already well explained in classical physics.

    Entropy is a property of an ensemble of strings which can pass energy to each other, where a higher energy means a higher frequency. The strings exchange energy until they arrive at a maximal-entropy equilibrium state.
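    To make the equilibration picture concrete, here is a minimal toy simulation (my own sketch - generic oscillators trading indivisible energy quanta at random, nothing specific to string modes). Starting from a low-entropy state with all the energy on one oscillator, the Shannon entropy of the occupation histogram climbs and then hovers near its equilibrium value:

[code]
import math
import random
from collections import Counter

N, TOTAL_QUANTA, STEPS = 200, 1000, 50000

def entropy(quanta):
    # Shannon entropy (in nats) of the histogram of occupation numbers
    counts = Counter(quanta)
    return -sum((c / N) * math.log(c / N) for c in counts.values())

# low-entropy start: all energy concentrated on oscillator 0
quanta = [TOTAL_QUANTA] + [0] * (N - 1)

for step in range(STEPS):
    donor = random.randrange(N)
    if quanta[donor] == 0:
        continue                      # nothing to give away
    receiver = random.randrange(N)
    quanta[donor] -= 1                # move one quantum at random
    quanta[receiver] += 1
    if step % 10000 == 0:
        print(f"step {step:6d}   S = {entropy(quanta):.3f}")

print(f"final S = {entropy(quanta):.3f}")
[/code]

    At equilibrium the occupation numbers settle into a roughly geometric (Boltzmann-like) distribution, which is the maximal-entropy state referred to above.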
     
  5. Feb 20, 2009 #4
    When is informational entropy equal to thermodynamic entropy? What dimensions does information have?
     
  6. Feb 20, 2009 #5

    Fra


    personal view on this

    I personally think the whole notion of information and entropy is really more complicated than what most basic treatments admit.

    To construct a measure of information is an ambiguous task in general. There are different entropy measures around, and all of them aim to somehow be a measure of information, or of missing information, depending on how you see it. But to construct a measure, how do you know (measure??) when you have the "best" measure?

    Most constructions start either with axioms (such as Cox's axioms) that the author feels are obvious enough not to be questioned by most readers, or with various "desired properties". But these constructions are not innocent or universal.

    Regarding the problem of the "ergodic hypothesis": most measures of information start with a choice of microstructure, whose states supposedly encode the information. But the funny part is that you often equip it with an equiprobability hypothesis, or some otherwise ad hoc prior probability for each state, and that choice affects the measure you construct. The point is that you can choose a microstructure so as to tweak your measure as desired.
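    As a trivial illustration of that dependence (my own example, not part of the argument above): the very same die roll gets assigned a different entropy depending on whether you count the six faces or only the parity as your microstates.

[code]
import math

def shannon(p):
    # Shannon entropy in nats of a discrete distribution
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# one and the same physical situation, two choices of microstructure:
faces  = [1 / 6] * 6      # microstates = the six faces of a fair die
parity = [1 / 2] * 2      # microstates = just "even" vs "odd"

print(shannon(faces))     # ln 6, about 1.792 nats
print(shannon(parity))    # ln 2, about 0.693 nats
[/code]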

    That is, unless you can infer a naturally preferred microstructure and prior from some first principles. In string theory, the basic microstructure implied by the idea that particles are really strings is also such an ambiguous choice.

    I personally think one solution, yet to be elaborated, is an evolutionary and relational approach to the very construction of these measures. This would suggest that there really is no preferred fixed microstructure; the microstructures also emerge dynamically in relation to their environment. In particular, the notion of a degree of freedom is emergent when one system (= one observer) relates to something unknown (= its unknown environment, in the general case). Not only does an image emerge; the screen on which the image lives must also emerge by a related logic.

    /Fredrik
     
  7. Feb 20, 2009 #6

    Fra


    In computer science you have an unquestionably given universal microstructure - digital memory. But such a simple escape doesn't do in this case IMO (i.e. when pondering the future of physics and QG); it is much more complicated. Rather than hoping to find a universal Hilbert space, I think we must consider the idea of dynamical microstructures and Hilbert spaces as well, not ONLY dynamical microstates and state vectors moving around in a fixed background Hilbert space (~microstructure).

    /Fredrik
     
  8. Feb 20, 2009 #7

    Fra


    Re: personal view on this

    To clarify my comments - the problem of constructing intrinsic measures is pretty much the same as, or at worst related to, the problem of choosing/constructing what is to be considered an observable, because I think the ideal is that, since we can only make sensible statements about observables, our model should treat and predict observables. Therefore, any kind of "statistical treatment" stands and falls with the choice of its microstructure; or, put differently, the entire treatment is _relative_ to this choice.

    /Fredrik
     
  9. Feb 20, 2009 #8
    Well, thermodynamic entropy is equivalent to Shannon entropy when you set k to 1 (and use natural logarithms). Entropy is then dimensionless, just like information, and heat per unit temperature can be read as information in a binary (0,1) basis.
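    Spelling that out with the standard formulas (added here for reference):

    [tex]S_{\mathrm{thermo}} = -k_B \sum_i p_i \ln p_i , \qquad H = -\sum_i p_i \log_2 p_i ,[/tex]

    so S_thermo = (k_B ln 2) H; setting k = 1 and using natural logarithms makes the two numerically identical. One concrete bridge between heat and information is Landauer's bound: erasing one bit dissipates at least k_B T ln 2 of heat.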
     
  10. Feb 21, 2009 #9

    Fra


    Yes, there are a few different issues I see here. The first is that some entropies, like the Gibbs entropy, are constructed from a probability space. Boltzmann's is instead constructed from a combinatorial approach to microstates, which effectively builds a kind of discrete probability space.

    The microstructure of a physical system consists of the possible microstates the system can have - its various degrees of freedom. But aside from that, the other major difference between the physical entropies is, as you note, Boltzmann's constant.
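    For reference, the two textbook forms being compared here are

    [tex]S_{\mathrm{Gibbs}} = -k_B \sum_i p_i \ln p_i , \qquad S_{\mathrm{Boltzmann}} = k_B \ln W ,[/tex]

    and the Gibbs form reduces to the Boltzmann form when all W microstates are taken as equiprobable (p_i = 1/W) - exactly the kind of equiprobability assumption questioned earlier in the thread.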

    When relating these measures, what is the conceptual meaning of Boltzmann's constant?

    As I see it, its meaning becomes obvious only when you put the entropy measure in a larger context. For example, why do we need a measure of disorder in the first place? Because it helps us figure out what is more, or less, probable. Ultimately, entropies are linked to probabilities. There are also entropies (like the relative KL entropy) that relate to transition probabilities.
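    For concreteness, the relative (KL) entropy between a distribution p and a prior q is

    [tex]D_{KL}(p\|q) = \sum_i p_i \ln \frac{p_i}{q_i} ,[/tex]

    and one standard way it ties entropy to probability (Sanov's theorem - my gloss on the remark above) is that the chance of observing an empirical distribution p after N independent samples from q falls off roughly as exp(-N D_KL(p||q)). The sample size N acts as an overall weight on the measure, much like the constant discussed next.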

    If you play with these expressions, the constant in place of Boltzmann's typically represents a total "mass" of the probability distribution. That is, it scales the weight of the entire measure itself - it is a kind of measure of the measure.

    This is where I think it gets more interesting, and it relates to probabilities in the sense that the probability of something "specifically unlikely" still depends on the total confidence _in the measure itself_, which is pretty much the function of Boltzmann's constant IMHO.

    This can also be exploited when you consider dynamical, competing microstructures or probability spaces; the constant in front is then a weight between the structures.

    So the full relevance of Boltzmann's constant only enters the picture when the entropy measure is used in probability calculations, or when the measure is weighed against alternative measures of disorder.

    /Fredrik
     