Information Theory and Mathematics

In summary, Shannon's information theory defines the information content of a message as the logarithm of the inverse of its probability. On this definition, the information content of a message such as "1+1=2", or of any proven theorem, is 0, because its probability is 1. To determine information content, one must first define the set of all possible messages and their probabilities. This relies on a relative sense of probability, which may differ depending on one's interpretation. In mathematics, the concept of information is not universal; it must be defined relative to a specific probability distribution.
  • #1
wheelersbit
In (Shannon) information theory, information is said to be the log of the inverse of the probability. What, then, is the information content of 1+1=2? Or, for that matter, of any fundamental axiom?
 
  • #2
The information content (measured in the sense of Shannon) of the message "1+1=2", or of any other proven theorem, is 0.

You have to define what all of the possible messages are first, then figure out the probability of each message, and only then can you ask what the information content of any particular message is.
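
To make that recipe concrete, here is a minimal Python sketch; the probabilities assigned to the hypothetical messages are made up purely for illustration:

    import math

    def self_information_bits(p):
        """Self-information: the log (base 2) of the inverse of the probability, in bits."""
        return math.log2(1.0 / p)

    # Illustrative probabilities for some hypothetical messages (not from the thread):
    for p in [0.5, 0.25, 0.001, 1.0]:
        print(f"P(message) = {p:<6} -> {self_information_bits(p):6.3f} bits")

    # A message that is certain, such as a proven theorem like "1+1=2" (P = 1),
    # carries log2(1/1) = 0 bits.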
 
  • #3
That follows very simply from what you give. The probability that "1 + 1 = 2" (or any axiom) is true is 1.0, and its logarithm is 0.
 
  • #4
Doesn't this seem to use a relative sense of probability? For example, could we have said at one time that this was in fact a very improbable event, or do we rely on mathematical realism to tell us it was never improbable?

To give another example, can we say that structure in mathematics is what gives this a probability of 1?

I really appreciate your responses.
 
  • #5
wheelersbit said:
To give another example, can we say that structure in mathematics is what gives this a probability of 1?

You could say that, but it doesn't have a specific meaning - and all the vague philosophical threads get locked or moved to the PF Lounge section.

If you are looking for some universal interpretation of "information" in the physical world, you'll have to go to the physics sections. In mathematics, the concept of "information" is no more universal than the concept of "rate of change". If you define a specific variable (such as the position of an object, or the price of a stock), then you can talk about a specific "rate of change". If you define a specific probability distribution, then you can talk about its information. If you don't get specific, then "information" isn't a specific thing.
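
As a concrete illustration of "define a specific probability distribution, then talk about its information", here is a small Python sketch; the distribution over weather outcomes is assumed purely for the example:

    import math

    def entropy_bits(dist):
        """Shannon entropy H = -sum(p * log2(p)), in bits per outcome."""
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # An assumed, illustrative distribution over four outcomes:
    weather = {"sun": 0.5, "cloud": 0.25, "rain": 0.125, "snow": 0.125}
    print(f"H(weather) = {entropy_bits(weather):.3f} bits")   # 1.750 bits

    # A distribution that puts all of its mass on one outcome has zero entropy:
    certain = {"1+1=2 is true": 1.0}
    print(f"H(certain) = {entropy_bits(certain):.3f} bits")   # 0.000 bits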
 
  • #6
I understand, thank you.
 

1. What is information theory and how is it related to mathematics?

Information theory is a branch of applied mathematics that deals with the quantification, storage, and communication of information. It relies on mathematical tools, especially probability theory and logarithms, to study the properties and limitations of information and communication systems.

2. What are some key concepts in information theory?

Some key concepts in information theory include entropy, channel capacity, and data compression. Entropy measures the uncertainty or randomness of a source or random variable, while channel capacity is the maximum rate at which information can be reliably transmitted over a channel. Data compression involves reducing the size of data without losing important information.
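
As a toy illustration of how entropy bounds compression (the symbol probabilities below are assumed for the example): a four-symbol source needs 2 bits per symbol with a fixed-length code, but a variable-length prefix code can approach the source's entropy:

    import math

    # Assumed probabilities for a skewed four-symbol source (illustrative only):
    source = {"A": 0.7, "B": 0.15, "C": 0.10, "D": 0.05}

    entropy = -sum(p * math.log2(p) for p in source.values())
    fixed_length = math.ceil(math.log2(len(source)))   # 2 bits/symbol, no compression

    # A simple prefix code, chosen by hand for illustration: A->0, B->10, C->110, D->111
    code_lengths = {"A": 1, "B": 2, "C": 3, "D": 3}
    average_length = sum(source[s] * code_lengths[s] for s in source)

    print(f"Fixed-length code : {fixed_length} bits/symbol")
    print(f"Prefix code       : {average_length:.3f} bits/symbol")   # ~1.450
    print(f"Entropy (limit)   : {entropy:.3f} bits/symbol")          # ~1.319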

3. How is information theory used in real-world applications?

Information theory has many practical applications, such as in data compression algorithms used in computer science, error-correcting codes used in telecommunication systems, and cryptography used in securing sensitive information. It is also used in fields such as neuroscience, biology, and economics to study the flow of information in complex systems.

4. What are some famous theorems in information theory?

Some famous theorems in information theory include the Shannon-Hartley theorem, which gives the maximum capacity of a band-limited channel for a given signal-to-noise ratio, and Shannon's source coding theorem, which states that data can be losslessly compressed to an average rate arbitrarily close to its entropy, but no further. Other notable results include the data processing inequality, the channel coding theorem, and rate-distortion theory.
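
A short Python sketch of the Shannon-Hartley formula C = B log2(1 + S/N); the bandwidth and signal-to-noise values below are assumed purely for illustration:

    import math

    def shannon_hartley_capacity(bandwidth_hz, snr_linear):
        """Channel capacity C = B * log2(1 + S/N), in bits per second."""
        return bandwidth_hz * math.log2(1 + snr_linear)

    # Assumed example: a 3 kHz telephone-like channel with a 30 dB signal-to-noise ratio.
    snr_db = 30
    snr_linear = 10 ** (snr_db / 10)                 # 30 dB -> a linear ratio of 1000
    capacity = shannon_hartley_capacity(3000, snr_linear)
    print(f"C = {capacity:.0f} bit/s")               # roughly 29,900 bit/s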

5. How has information theory evolved over time?

Information theory was first introduced by Claude Shannon in the late 1940s and has since evolved into a multidisciplinary field with applications in various industries. In recent years, advancements in technology and the rise of big data have led to new developments in information theory, such as the study of information in complex networks and the application of information theory to machine learning and artificial intelligence.
