Information Theory and Mathematics

Summary
In information theory, the information content of a statement like "1+1=2" is considered to be zero because its probability of being true is 1.0, leading to a logarithmic value of zero. The discussion highlights the necessity of defining possible messages and their probabilities before assessing information content. It raises questions about the relativity of probability in mathematical truths and whether mathematical structure contributes to a probability of 1. However, the conversation notes that vague philosophical interpretations of "information" lack specificity and are often redirected to other sections of the forum. Ultimately, without a defined context, the concept of "information" remains ambiguous in mathematics.
wheelersbit
In (Shannon) information theory, the information content of a message is defined as the log of the inverse of its probability. What, then, is the information content of "1+1=2"? Or, for that matter, of any fundamental axiom?
 
The information content (measured in Shannon's sense) of the message "1+1=2", or of any other proven theorem, is 0.

You have to define what all of the possible messages are first, then figure out the probability of each message; only then can you start asking what the information content is.
 
That follows very simply from the definition you give. The probability that "1+1=2" (or any axiom) is true is 1.0, and log(1/1.0) = 0.
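The point above can be sketched numerically. This is a minimal illustration, not from the thread: the message ensemble below is a made-up toy example, chosen only to contrast a certain message (a proven theorem, probability 1.0) with an uncertain one (a fair coin flip).

```python
import math

def self_information(p: float) -> float:
    # Shannon self-information in bits: I(m) = log2(1 / p(m))
    return math.log2(1.0 / p)

# Hypothetical message ensemble: relative to the axioms, a proven
# theorem such as "1+1=2" is a certain message, so p = 1.0.
theorem = {"1+1=2": 1.0}
coin = {"heads": 0.5, "tails": 0.5}

print(self_information(theorem["1+1=2"]))  # 0.0 bits: no surprise
print(self_information(coin["heads"]))     # 1.0 bit: one fair coin flip
```

A certain message contributes zero bits precisely because log2(1/1) = 0, which is the calculation made in the post above.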
 
Doesn't this seem to use a relative sense of probability? For example, could we have said at one time that this was in fact a very improbable event, or do we rely on mathematical realism to tell us it was never improbable?

To give another example, can we say that structure in mathematics is what gives this a probability of 1?

I really appreciate your responses.
 
wheelersbit said:
To give another example, can we say that structure in mathematics is what gives this a probability of 1?

You could say that, but it doesn't have a specific meaning, and vague philosophical threads tend to get locked or moved to the PF Lounge section.

If you are looking for some universal interpretation of "information" in the physical world, you'll have to go to the physics sections. In mathematics, the concept of "information" is no more universal than the concept of "rate of change". If you define a specific variable (such as the position of an object, or the price of a stock), then you can talk about a specific "rate of change". If you define a specific probability distribution, then you can talk about its information. If you don't get specific, then "information" isn't a specific thing.
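To make the analogy concrete: once a specific distribution is defined, its information (entropy) is a definite number. The two coin distributions below are hypothetical examples of my own, not from the thread.

```python
import math

def entropy(dist: dict) -> float:
    # Shannon entropy in bits of a discrete distribution {outcome: probability}.
    # H = sum over outcomes of p * log2(1/p), skipping zero-probability outcomes.
    return sum(p * math.log2(1.0 / p) for p in dist.values() if p > 0)

# "Information" becomes a specific quantity only once a specific
# distribution is specified, just as "rate of change" needs a variable.
fair_coin = {"H": 0.5, "T": 0.5}
biased_coin = {"H": 0.9, "T": 0.1}

print(entropy(fair_coin))    # 1.0 bit
print(entropy(biased_coin))  # roughly 0.469 bits: less uncertain, less information
```

Without such a distribution there is simply nothing for the entropy formula to act on, which is the poster's point.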
 
I understand, thank you.
 
Here is a little puzzle from the book 100 Geometric Games by Pierre Berloquin. The side of a small square is one meter long, and the side of a larger square is one and a half meters long. One vertex of the large square is at the center of the small square. The sides of the large square cut two sides of the small square into one-third and two-thirds parts. What is the area where the squares overlap?
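A quick coordinate check of the puzzle (my own sketch, not Berloquin's solution): put the small square's center at the origin with vertices at (±1/2, ±1/2), and the large square's vertex at the origin. With the stated 1/3 : 2/3 cuts, the large square's sides meet the small square's boundary at, say, (1/2, 1/6) and (-1/6, 1/2), so the overlap is the quadrilateral whose area the shoelace formula gives below.

```python
def shoelace(pts):
    # Polygon area via the shoelace formula: |sum(x_i*y_{i+1} - x_{i+1}*y_i)| / 2
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

# Overlap quadrilateral: origin, the cut point on the right side,
# the small square's corner, and the cut point on the top side.
overlap = [(0.0, 0.0), (0.5, 1 / 6), (0.5, 0.5), (-1 / 6, 0.5)]
print(shoelace(overlap))  # about 0.25 square meters
```

The area comes out to a quarter of the small square, and the symmetry of the construction suggests this holds however the large square is rotated about the center.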
