Information Theory and Mathematics

In (Shannon) information theory, the information content of an event is said to be the log of the inverse of its probability. What, then, is the information content of 1+1=2, or for that matter of any fundamental axiom?
The information content (measured in Shannon's sense) of the message "1+1=2", or of any other proven theorem, is 0. You have to define what all the possible messages are first, then figure out the probability of each message, and only then can you ask what the information content is.
That follows very simply from what you give: the probability that "1+1=2" (or any axiom) is true is 1.0, and its logarithm is 0.
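The relationship above can be sketched in a few lines of Python (the function name here is my own, chosen for illustration): self-information is I(x) = -log2 p(x), so a certain event yields exactly 0 bits, while less probable events yield more.

```python
import math

def information_content(p: float) -> float:
    """Shannon self-information in bits: I(x) = -log2(p(x))."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A fair coin flip carries 1 bit of surprise.
print(information_content(0.5))    # 1.0
# A certain event, such as a proven theorem, carries none.
print(information_content(1.0))    # 0.0 (log of 1 is 0)
# Rarer events carry more information.
print(information_content(0.125))  # 3.0
```

Note that the function is only defined relative to a probability assignment: as the reply above says, you must fix the space of possible messages and their probabilities before the measure means anything.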


Doesn't this seem to use a relative sense of probability? For example, could we have said at one time that this was in fact a very improbable event, or do we rely on mathematical realism to tell us it was never improbable?

To give another example, can we say that structure in mathematics is what gives this a probability of 1?

I really appreciate your responses.
