Why Is Information Theory So Confusing?

  • #1
pr0xibus
Right guys, I am in my 3rd year at uni doing Computer Network Management & Design. I have passed everything apart from Network Design and am having to take a resit.

I don't understand information theory AT ALL.

They went over it in class but it made no sense, and I asked the tutor to explain it to me, but it still doesn't make any sense.

Can anyone help with this? A website, PDF, anything would be helpful.

Cheers
 
  • #2
In order to understand information theory you first need to understand what a random variable is. The entropy of a random variable is the average number of bits required to transmit the value of that random variable from one point to another.

That's the basic idea; for more detail, try your textbook or Wikipedia.
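To make that definition concrete, here is a minimal sketch in Python (the probability values are made-up examples):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average number of bits needed
    to transmit one value of the random variable."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely outcomes -> 1 bit per toss
print(entropy([0.5, 0.5]))    # 1.0

# A biased coin is more predictable, so each toss carries less information
print(entropy([0.9, 0.1]))    # about 0.469
```

The more predictable the variable, the lower its entropy, and the fewer bits you need on average to communicate it.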
 
  • #3
Information theory asks questions like:
A] What does it mean for information to be useful? How can I measure it?
B] Given that this information must be communicated through a noisy medium, what are the prerequisites for that medium?
C] How can I tell whether one piece of information depends on another? Can I measure that dependence?
D] Can I transform one piece of information into another and back?
etc.

As you can see, these questions are very general. The answers depend heavily on how information is defined, and the definition itself is very broad. For example,
1] the text in this reply is information being passed from me to you, OR
2] when Alice sang to Bob about Einstein, the song was the information being passed from Alice to Bob, OR
3] when I hit "post quick reply", a bunch of binary data is passed from my computer to the PF server, enabling the message to appear on the board. That binary data is the information being passed from my computer to the PF server.

The 3rd example shows the kind of information that you are interested in.

In your case, information can be non-rigorously defined as a set of binary bits passed from one computer to another over a medium, which we simply call a network.

Now that we have defined information and its medium, we can start asking the questions that we gave earlier.

Your book should cover the answers to the above questions. Entropy should answer (A), channel capacity should answer (B), mutual information should answer (C) etc.
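For instance, the quantities that answer (A) and (C) can be computed directly from a probability distribution; a sketch in Python using a made-up joint distribution of two binary variables:

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution of two binary variables X and Y
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions of X and of Y
px = [sum(p for (x, y), p in joint.items() if x == v) for v in (0, 1)]
py = [sum(p for (x, y), p in joint.items() if y == v) for v in (0, 1)]

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y): how much knowing
# Y reduces uncertainty about X -- question (C) above
mi = entropy(px) + entropy(py) - entropy(list(joint.values()))
print(mi)    # about 0.278 bits
```

If X and Y were independent, the mutual information would be zero; here they mostly agree, so observing one tells you a fair amount about the other.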

As for books, a short internet search for "Information Theory Lecture Notes" should yield abundant material.

-- AI
P.S. -> I have taken certain liberties above (e.g., I say entropy answers [A], where I should probably say entropy is one way of answering [A]), but I took them to avoid confusing you (any further than I might already have).
 
  • #4
In a nutshell:

aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa contains less information than
sneuircheuictgduocrfuigfeuyoguetdoeuciah, because the former can be replaced by
"a*35".
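The same point can be made executable. A sketch using Python's zlib (a general-purpose compressor, so sizes are illustrative rather than exact entropy values):

```python
import zlib

uniform = b"a" * 35                       # highly predictable
random_ish = b"sneuircheuictgduocrfuigfeuyoguetdoeuciah"

# The predictable string compresses far below its original length;
# the jumbled string barely compresses at all
print(len(zlib.compress(uniform)))
print(len(zlib.compress(random_ish)))
```

Low-information data is compressible precisely because a short description (like "a*35") reproduces it exactly.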
 

1. What is information theory?

Information theory is a branch of mathematics and computer science that deals with the quantification, storage, and communication of information. It provides a framework for understanding how information is transmitted and processed, and has applications in fields such as communication systems, data compression, and cryptography.

2. What are the key concepts of information theory?

The key concepts of information theory include entropy, information content, and coding. Entropy refers to the amount of uncertainty or randomness in a message, while information content measures the amount of information contained in a message. Coding is the process of transforming information into a more efficient representation for storage or transmission.
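A small sketch of the "information content" idea, using the standard self-information formula (the probabilities are illustrative):

```python
import math

def information_content(p):
    """Self-information of an outcome with probability p, in bits.
    Rare events carry more information than common ones."""
    return -math.log2(p)

print(information_content(0.5))    # 1.0 bit  (a fair coin flip)
print(information_content(1/8))    # 3.0 bits (one of eight equally likely outcomes)
```

Entropy is simply the average of this quantity over all possible outcomes.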

3. How is information theory used in communication systems?

Information theory is used in communication systems to optimize the transmission and reception of information. It allows for the design of efficient coding schemes, modulation techniques, and error correction methods to ensure reliable communication over noisy channels. Information theory also provides a measure of channel capacity, which is the maximum rate at which information can be transmitted with a given level of reliability.
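The channel-capacity idea can be illustrated with the textbook formula for a binary symmetric channel, C = 1 - H(p), where p is the probability that a bit is flipped; a sketch:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a coin that lands heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with bit-flip probability p,
    in bits per channel use: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0: a noiseless channel carries 1 bit per use
print(bsc_capacity(0.11))   # about 0.5: half the bits are overhead for error correction
print(bsc_capacity(0.5))    # 0.0: pure noise, nothing gets through
```

Shannon's channel coding theorem says reliable communication is possible at any rate below this capacity, and impossible above it.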

4. What is the relationship between information theory and data compression?

Information theory and data compression are closely related, as both deal with the efficient representation and transmission of information. Information theory provides a theoretical framework for understanding the fundamental limits of data compression, while data compression techniques use coding methods based on information theory to reduce the size of data files.
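As a rough illustration (the sample text is made up), one can compare the order-0 empirical entropy of some bytes with what a real compressor achieves. Note that zlib also exploits repetition across symbols, so on repetitive input it can beat the per-symbol entropy bound:

```python
import math
import zlib
from collections import Counter

text = b"the quick brown fox jumps over the lazy dog " * 50

# Order-0 empirical entropy: the limit for coding each byte independently
counts = Counter(text)
n = len(text)
h = -sum(c / n * math.log2(c / n) for c in counts.values())

compressed = zlib.compress(text, 9)
print(h)                        # bits per byte the symbol statistics allow
print(8 * len(compressed) / n)  # bits per byte zlib actually used
```

Because the text repeats, zlib's dictionary matching lands well below the per-symbol figure; on text with no repeated structure, the entropy would be the floor.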

5. How does information theory apply to cryptography?

Information theory is essential in cryptography, the study of secure communication and data protection. It provides a way to quantify the uncertainty of a message and determine the strength of a cryptographic system. Information theory is also used in the design and analysis of encryption algorithms, which are used to protect sensitive information from unauthorized access.
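The classic link between the two fields is the one-time pad, which Shannon proved gives perfect secrecy when the key is truly random and as long as the message; a sketch:

```python
import os

# One-time pad: XOR the message with a truly random key of equal length.
# The ciphertext then carries zero information about the message
# to anyone who lacks the key.
message = b"attack at dawn"
key = os.urandom(len(message))    # 8 bits of entropy per key byte

ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
print(recovered)    # b'attack at dawn'
```

The catch, also due to information theory, is that perfect secrecy requires the key to have at least as much entropy as the message, which is why practical systems settle for computational rather than information-theoretic security.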
