Information Theory and Mathematics


Discussion Overview

The discussion revolves around the concept of information in the context of Shannon information theory, particularly focusing on the information content of mathematical statements such as "1+1=2" and the implications of probability in defining information content. The scope includes theoretical exploration and philosophical considerations regarding the nature of mathematical truths and their information content.

Discussion Character

  • Exploratory
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants assert that the information content of "1+1=2" is 0: since its probability of being true is 1, its self-information is log(1/1) = 0.
  • Others question whether this interpretation relies on a relative sense of probability, suggesting that historical context might have influenced perceptions of probability in mathematical truths.
  • A participant proposes that the structure in mathematics could be what gives statements like "1+1=2" a probability of 1, although this assertion is challenged for its vagueness.
  • There is a suggestion that without a specific definition of "information," the concept remains ambiguous, similar to the notion of "rate of change" in mathematics.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the interpretation of information content in mathematical statements, with multiple competing views regarding the role of probability and the meaning of information in mathematics.

Contextual Notes

Limitations include the dependence on definitions of probability and information, as well as the unresolved philosophical implications of mathematical truths.

wheelersbit
In (Shannon) information theory, information is defined as the log of the inverse of the probability. What, then, is the information content of "1+1=2", or for that matter of any fundamental axiom?
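To make the quoted definition concrete, here is a minimal sketch of Shannon self-information, I(x) = log2(1/p(x)), in Python; the example probabilities are illustrative, not taken from the thread:

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information in bits: I = log2(1/p)."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log2(1 / p)

print(self_information(0.5))  # a fair coin flip carries 1.0 bit
print(self_information(1.0))  # a certain event carries 0.0 bits
```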
 
The information content (measured in the sense of Shannon) of the message "1+1=2", or of any other proven theorem, is 0.

You have to define what all of the possible messages are first, then figure out the probability of each message, and only then can you ask what the information content is.
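The recipe above (enumerate the message space, assign a probability to each message, then measure) can be sketched with a toy source; the messages and probabilities are made up for illustration:

```python
import math

# A hypothetical message source with assumed probabilities (must sum to 1).
messages = {"sunny": 0.5, "cloudy": 0.25, "rain": 0.125, "snow": 0.125}

# Self-information of each individual message, in bits.
for msg, p in messages.items():
    print(f"{msg}: {math.log2(1 / p):.2f} bits")

# Average information per message: the Shannon entropy of the source.
entropy = sum(p * math.log2(1 / p) for p in messages.values())
print(f"entropy: {entropy:.2f} bits")  # 1.75
```

A message that occurs with probability 1 would contribute 1 * log2(1/1) = 0 to this sum, which is the sense in which a certainty carries no information.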
 
That follows very simply from the definition you give. The probability that "1+1=2" (or any axiom) is true is 1.0, and its logarithm is 0.
 
Doesn't this seem to use a relative sense of probability? For example, could we have said at one time that this was in fact a very improbable event, or do we rely on mathematical realism to tell us it was never improbable?

To give another example can we say structure in mathematics is what gives this a probability of 1?

I really appreciate your responses.
 
wheelersbit said:
To give another example can we say structure in mathematics is what gives this a probability of 1?

You could say that, but it doesn't have a specific meaning! - and all the vague philosophical threads get locked or moved to the PF Lounge section.

If you are looking for some universal interpretation of "information" in the physical world, you'll have to go to the physics sections. In mathematics, the concept of "information" is no more universal than the concept of "rate of change". If you define a specific variable (such as the position of an object, or the price of a stock) then you can talk about a specific "rate of change". If you define a specific probability distribution, then you can talk about its information. If you don't get specific, then "information" isn't a specific thing.
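The point that "information" only becomes specific once a distribution is fixed can be illustrated by assigning the same statement two different hypothetical distributions:

```python
import math

# The same statement gets a different information content depending on
# which (hypothetical) distribution it is assumed to be drawn from.
p_theorem = 1.0  # "1+1=2" treated as a proven theorem: probability 1
p_guess = 0.1    # treated as one of ten equally likely candidate answers

print(math.log2(1 / p_theorem))  # 0.0 bits: certainty carries no information
print(math.log2(1 / p_guess))    # about 3.32 bits
```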
 
I understand, thank you.
 
