Why Is Information Theory So Confusing?

Discussion Overview

The discussion centers on the challenges of understanding information theory, particularly in the context of computer network management and design. Participants explore foundational concepts, definitions, and resources to aid comprehension.

Discussion Character

  • Conceptual clarification
  • Debate/contested
  • Homework-related

Main Points Raised

  • One participant expresses difficulty in grasping information theory despite classroom instruction and seeks additional resources.
  • Another participant suggests that understanding random variables is essential for grasping information theory, noting that entropy represents the average bits needed to transmit a random variable.
  • A third participant outlines key questions in information theory, emphasizing the variability in defining "information" and providing examples of different contexts in which information can be understood.
  • This participant also indicates that concepts like entropy, channel capacity, and mutual information relate to the earlier posed questions, suggesting that textbooks and lecture notes could provide further insights.
  • A later post humorously illustrates a point about information content by comparing two strings, implying that redundancy affects the amount of information conveyed.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the best approach to understanding information theory, with varying perspectives on foundational concepts and definitions. The discussion remains unresolved regarding the most effective resources or explanations.

Contextual Notes

Participants acknowledge the complexity of defining information and its implications for understanding information theory. There is an indication that definitions may vary based on context, which could affect comprehension.

pr0xibus
Right guys, I'm in my 3rd year at uni doing Computer Network Management & Design. I have passed everything apart from Network Design, and I'm having to take a resit.

I don't understand information theory AT ALL.

They went over it in class but it made no sense, and I asked the tutor to explain it to me, but it still doesn't make any sense.

Can anyone help with this? A website, a PDF, anything would be helpful.

Cheers
 
In order to understand information theory you first need to understand what a random variable is. The entropy of a random variable is the average number of bits required to transmit the value of that random variable from one point to another.

That's the basic idea; for more detail, try your textbook or Wikipedia.
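To make that definition concrete, here is a minimal sketch in Python of Shannon entropy: given the probabilities of a random variable's outcomes, it returns the average number of bits needed per outcome. (This code is an illustration I've added, not from the thread.)

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the average number of bits needed
    to encode one outcome of a random variable with these probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per flip on average.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so it carries less information:
# about 0.469 bits per flip on average.
print(entropy([0.9, 0.1]))
```

The more predictable the variable, the lower its entropy, which is exactly why predictable data takes fewer bits to transmit.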
 
Information theory attempts to answer certain questions, such as:
A] What does it mean for information to be useful? How can I measure it?
B] Given that this information must be communicated through a noisy medium, what are the prerequisites for that medium?
C] How can I find out whether one piece of information depends on another? Can I measure that dependence?
D] Can I transform one piece of information into another and back?
etc.

As you can see, these questions are very general. The answers depend heavily on how information is defined, and the definition of information is itself very diverse. For example:
1] the text in this reply is information being passed from me to you, OR
2] when Alice sang to Bob about Einstein, the song was the information being passed from Alice to Bob, OR
3] when I hit "post quick reply", a bunch of binary data is passed from my computer to the PF server, enabling the message to appear on the board. That binary data is the information being passed from my computer to the PF server.

The third example is the kind of information you are interested in.

In your case, information can be non-rigorously defined as a set of binary bits passed from one computer to another over a medium, which we simply call a network.

Now that we have defined information and its medium, we can start asking the questions posed earlier.

Your book should cover the answers to these questions: entropy should answer (A), channel capacity (B), mutual information (C), etc.
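To show how two of those quantities relate, here is a small sketch in Python (my own illustration, not from the thread, and assuming a discrete joint distribution given as a 2-D list): mutual information measures how much knowing one variable tells you about another, computed from entropies via I(X;Y) = H(X) + H(Y) - H(X,Y).

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), where `joint` is a 2-D list
    of probabilities p(x, y) summing to 1."""
    px = [sum(row) for row in joint]               # marginal of X
    py = [sum(col) for col in zip(*joint)]         # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# Independent fair bits: knowing X tells you nothing about Y.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0

# Perfectly correlated bits: X determines Y completely.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
```

This directly addresses question (C): mutual information of zero means the two variables are independent, and larger values mean stronger dependence.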

As for books, a short internet search for "Information Theory Lecture Notes" should yield abundant material.

-- AI
P.S. -> I have taken certain liberties above (e.g., I say entropy answers [A], where I should probably say entropy is one way of answering [A]), but I have taken them to avoid confusing you (any further than I might already have).
 
in a nutshell:

aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa contains less information than
sneuircheuictgduocrfuigfeuyoguetdoeuciah because it can be replaced by
a*35
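The nutshell point above can be demonstrated with a general-purpose compressor: the redundant string shrinks dramatically, while the unpredictable one barely compresses at all. (A sketch I've added using Python's standard zlib module; the exact compressed sizes depend on the compressor.)

```python
import zlib

redundant = "a" * 35
scrambled = "sneuircheuictgduocrfuigfeuyoguetdoeuciah"

# The repetitive string can be described as "a repeated 35 times",
# so it compresses to far fewer bytes: it carries less information.
print(len(zlib.compress(redundant.encode())))

# The unpredictable string has no pattern to exploit, so its
# compressed size stays close to (or above) its original length.
print(len(zlib.compress(scrambled.encode())))
```

Compressed size is a practical proxy for information content: the fewer bits a string can be squeezed into, the less information it conveys.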
 
