Danger Will Robinson!
Fra said:
I am not an expert in any particular topic; I just try to follow my own strategy to answer my questions, and I like to learn what's relevant along the way. It sure is easy to diverge away into your own thoughts, but I find it similarly risky to be persuaded by existing formalisms. I try to find the balance to both stay on track but not waste the intrinsic motivation and creativity... I have plenty of time to reflect properly over things and not just bury myself in mathematics.
Some of this seems to reflect the myth (common among "armchair scientists") that delving into the literature will stifle creativity. The truth is quite the opposite. By studying good textbooks and expository papers about a field you would like to contribute to, you avoid repeating, one after another, the common beginner's mistakes, which makes your progress toward attaining some level of mastery much more efficient. Furthermore, reading really good ideas from those who are already experts in the field makes it much more likely that your own creativity will lead to something genuinely novel and possibly interesting to others.
Fra said:
The problem with information is also that it ultimately builds on probability theory
Once again, one of the major points I tried to make is "it ain't necessarily so", while at the same time urging you all to stop posting here and read Shannon 1948 and some other good sources of information about Shannon's information theory (the first such theory, by far the most highly developed, and in many ways the most impressive, but nonetheless not the most suitable for every phenomenon of possible interest in which "information" appears to play a role).
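For anyone who does go to the source: the central quantity there is the entropy of a discrete probability distribution. Here is a minimal sketch (the function name and the choice of bits are mine, not Shannon's):

[code]
import math

def shannon_entropy(probs, base=2.0):
    """H(p) = -sum_i p_i log p_i; by convention 0 log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit per toss
print(shannon_entropy([0.99, 0.01]))  # heavily biased coin: about 0.08 bits per toss
[/code]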
Fra said:
The notion of probability is strictly speaking uncertain; we can only find what we think is the probability. We can measure the frequency, but what is the "true probability"?
That is only one of the issues I mentioned concerning "uncertainties of probability".
Fra said:
Like Chris noted, this brings you down to the axioms of probability and the interpretations.
You're doing it again. That is [i]not[/i] what I said!
Mike2 said:
My assumptions seem too esoteric to give up lightly
[i]Too[/i] esoteric to give up lightly?
Mike2 said:
I understand that entropy/information can be measured by observation.
That is [i]not[/i] what I said!
Mike2 said:
But I suspect that we will eventually find a mathematical model for everything, and we will want to also describe entropy in terms of that model.
I trust you mean "model of something other than communication" (or "information"). Or perhaps "theory" of something?
One of my points was that Shannon 1948 is "the very model of a mathematical theory". Therefore, it behooves anyone seeking to build a mathematical theory of [i]anything[/i] to learn what Shannon did.
Mike2 said:
Now I suppose that some mathematical models may have multiple solutions.
Are you sure you are not confusing [i]mathematical model[/i] with [i]field equation[/i]?

The Schwarzschild perfect fluid matched to a Schwarzschild vacuum exterior is a [i]mathematical model[/i] of an isolated nonrotating object, formulated in a certain [i]physical theory[/i], gtr. It is also a (global, exact) [i]solution[/i] to the Einstein field equation which lies at the heart of that theory.
The Markov chains discussed by Shannon in his 1948 paper form a sequence of [i]mathematical models[/i] of natural language production. As he is careful to stress, this kind of model cannot possibly capture aspects of natural language other than statistics. His ultimate point is that, for the [i]mathematical theory[/i] he constructs, motivated by this sequence of Markov chains (which provide more and more accurate models of the purely statistical aspects of natural language production), statistical structure turns out to be the only kind which is needed. Which is why I was careful to stress that in Shannon's theory, a nonzero [i]mutual information[/i] between two Markov chains does not imply any direction of causality, only a statistical correlation in behavior.
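To make that last point concrete, here is a minimal sketch (my own toy construction, not anything from Shannon 1948): two symbol streams which are both noisy readouts of a common hidden Markov state have strictly positive mutual information, even though neither stream drives the other. The particular transition and noise probabilities are invented purely for illustration.

[code]
import math
import random
from collections import Counter

random.seed(0)

# Two observed symbol streams X and Y, each a noisy copy of a common hidden
# "sticky" Markov state; they are correlated, but neither causes the other.
state = 0
xs, ys = [], []
for _ in range(100000):
    if random.random() > 0.9:          # hidden state flips about 10% of the time
        state = 1 - state
    xs.append(state if random.random() < 0.8 else 1 - state)
    ys.append(state if random.random() < 0.8 else 1 - state)

def entropy(counts):
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values() if c)

mi = entropy(Counter(xs)) + entropy(Counter(ys)) - entropy(Counter(zip(xs, ys)))
print("I(X;Y) =", round(mi, 3), "bits")   # clearly positive, yet no causation
[/code]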
Mike2 said:
One question is if there are multiple solutions, then does this always imply an underlying symmetry group?
Can you clarify what you mean by "solution" and "model"?
Mike2 said:
Another question is can the alternatives always be normalized into a probability distribution for the various possibilities?
You should be able to answer that yourself, I think. (This comes up in any good textbook on quantum mechanics, for example.)
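If a hint helps, here is a minimal sketch of the normalization step I have in mind (the numbers are invented for illustration, not taken from any particular textbook): attach a nonnegative weight to each alternative and divide by the sum of the weights; in elementary quantum mechanics the weights are the squared moduli of the amplitudes.

[code]
# Hypothetical amplitudes for three alternatives (invented for illustration).
amplitudes = [1 + 1j, 2 + 0j, 0 + 1j]
weights = [abs(a) ** 2 for a in amplitudes]     # approximately 2, 4, 1
total = sum(weights)
probs = [w / total for w in weights]            # approximately [2/7, 4/7, 1/7]
assert abs(sum(probs) - 1.0) < 1e-12            # a genuine probability distribution
print(probs)
[/code]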
Mike2 said:
Or is it more the case that just because there are alternatives doesn't mean we can know how probable one solution is over another? Or is there a natural measure of how likely one solution is in terms of how much of the underlying set is occupied by each solution?
I really, truly, deeply urge you to study Shannon 1948.
http://www.math.uni-hamburg.de/home/gunesch/Entropy/infcode.html
Mike2 said:
If information is so broadly defined that it need not even be describable with a number,
I didn't say that!
Mike2 said:
If alternatives can be normalized into a probability distribution, would that mean that the size of the underlying symmetry group relates to entropy? Or do we need more than just the size of the group to form a distribution?
I discussed a number of quite different theories of information. The whole point was that there are many quite different ways of defining notions of information. Some use very little structure (e.g. Boltzmann's theory), some require the presence of a probability measure (Shannon's theory) or a group action (theory of Planck's "complexions"). So I think you may be mixing up at least two theories and two or three levels of mathematical structure.
Mike2 said:
And could the needed information be gotten also from group properties in order to form a distribution?
In a situation in which the "entropies" defined in two or more information theories all make sense, because the requisite mathematical structures (a probability measure, a group action) are present, it is reasonable to ask how these quantities are related. As I said, in general they are not numerically the same, but they may approximate each other or even approach each other in some limit.
If you do the exercises I suggested, you should be able to answer your own question.
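One such relation, in miniature (my own illustration, not one of those exercises verbatim): for a uniform distribution over W equally likely microstates, Shannon's entropy collapses to log W, which is Boltzmann's S = k log W up to the constant k and the choice of logarithm base.

[code]
import math

W = 1024                                                      # number of equally likely microstates
shannon = -sum((1 / W) * math.log(1 / W) for _ in range(W))   # Shannon entropy, in nats
boltzmann = math.log(W)                                       # Boltzmann's log W (with k set to 1)
print(shannon, boltzmann)                                     # both approximately 6.931
[/code]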
Mike2 said:
This sounded intriguing, of course, but perhaps I read more into that than warranted, my apologies. To me this seemed to contain the seeds for a generalized surface entropy formula for any underlying set X. My understanding of simplicial complexes is that they can approximate any manifold. Or am I misunderstanding your use of complexion?
Are you perhaps confusing the so-called holography principle with something you think I said?
Mike2 said:
If I'm reading you right, it would seem to mean that we have a fundamental definition of entropy (maybe not information) for any mathematical model with symmetry properties.
I said that whenever we have a group action, we have complexions, and these obey essentially the same formal properties as Shannon's entropies, in particular the quotient law. Thus, any structural invariant of these will also respect the formal properties of Shannon's entropies, and thus will admit an interpretation in terms of "information". I briefly mentioned two cases in which such "Galois entropies" are obvious (actions on finite sets, and finite dimensional Lie groups of diffeomorphisms), but I did [i]not[/i] imply that such quantities can be found for any group action whatever.
(Regarding axiomatics: note that Shannon's statement of the formal properties he takes as axiomatic is given in the context of probability. This is why what I just said doesn't contradict his famous unicity theorem. The formal properties of which I speak can however be expressed in a more general context than probability theory, namely what I called (in "What is Information?") [i]join-sets[/i], a kind of weakening of [i]lattice[/i] as in [i]lattice theory[/i].)
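To show the flavour of the quotient law in the very simplest case, here is a minimal sketch (my own illustration, not the construction from "What is Information?"; the phrase "complexion entropy" in the comments is mine): take the log of the orbit size of a point under a finite group action. The orbit-stabilizer theorem, log |orbit(x)| = log |G| - log |Stab(x)|, then has the same additive shape as Shannon's conditioning law.

[code]
import math
from itertools import permutations

G = list(permutations(range(3)))        # S_3 acting on {0, 1, 2}; g is the map i -> g[i]
x = 0
orbit = {g[x] for g in G}               # the orbit of x (all of {0, 1, 2} here)
stab = [g for g in G if g[x] == x]      # the stabilizer of x (two permutations)

# "Complexion entropy" of x: log of the orbit size, which by orbit-stabilizer
# equals log|G| - log|Stab(x)|, echoing the additive form of Shannon's quotient law.
print(math.log(len(orbit)), math.log(len(G)) - math.log(len(stab)))   # equal
[/code]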