
Information Theory

  1. Jun 1, 2009 #1
    Right guys, I am in my 3rd year at uni doing computer network management & design. I have passed everything apart from network design and I'm having to take a resit.

    I don't understand information theory AT ALL.

    They went over it in class but it made no sense; I also asked the tutor to explain it to me, but it still doesn't make any sense.

    Can anyone help with this? A website, a PDF, anything would be helpful.

  3. Jun 1, 2009 #2
    In order to understand information theory you first need to understand what a random variable is. The entropy of a random variable is the average number of bits required to transmit the value of that random variable from one point to another.

    That's the basic idea, but you could try your book or Wikipedia for more information.
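    To make the definition above concrete, here is a minimal Python sketch (not from the thread) that computes the Shannon entropy of a random variable from its probability distribution. The function name and example distributions are my own for illustration.

    ```python
    from math import log2

    def entropy(probs):
        """Shannon entropy in bits: the average number of bits
        needed to transmit one outcome of the random variable."""
        return -sum(p * log2(p) for p in probs if p > 0)

    # A fair coin is maximally unpredictable: 1 bit per toss.
    print(entropy([0.5, 0.5]))   # 1.0

    # A biased coin is more predictable, so each toss carries
    # less information (about 0.47 bits here).
    print(entropy([0.9, 0.1]))
    ```

    Note how the entropy drops as the distribution becomes more skewed: predictability means fewer bits are needed on average.
    
    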
  4. Jun 7, 2009 #3
    Information theory asks questions like:
    A] What does it mean for information to be useful? How can I measure it?
    B] Given that the information must be communicated through a noisy medium, what are the requirements on that medium?
    C] How can I tell whether one piece of information depends on another? Can I measure that dependence?
    D] Can I transform one piece of information into another and back?

    As you can see, these questions are very general. The answers depend heavily on how information gets defined, and the definition itself is very diverse. For example,
    1] the text in this reply is information being passed from me to you, OR
    2] when Alice sang to Bob about Einstein, the song was the information being passed from Alice to Bob, OR
    3] when I hit "post quick reply", a bunch of binary data is passed from my computer to the PF server, enabling the message to appear on the board. That binary data is information being passed from my computer to the PF server.

    The third example is the kind of information you are interested in.

    In your case, information can be non-rigorously defined as a set of binary bits passed from one computer to another over a medium, which we simply call a network.

    Now that we have defined information and its medium, we can start asking the questions given earlier.

    Your book should cover the answers to the above questions: entropy answers (A), channel capacity answers (B), mutual information answers (C), and so on.
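    As a small illustration of (B), here is a sketch (my own, not from the thread) of the capacity of the textbook binary symmetric channel, which flips each transmitted bit with probability p. Its capacity is C = 1 - H(p), where H is the binary entropy function.

    ```python
    from math import log2

    def binary_entropy(p):
        """Entropy in bits of a coin that lands heads with probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_capacity(p):
        """Capacity in bits per channel use of a binary symmetric
        channel that flips each bit with probability p."""
        return 1 - binary_entropy(p)

    print(bsc_capacity(0.0))   # 1.0 -- noiseless: every bit gets through
    print(bsc_capacity(0.5))   # 0.0 -- pure noise: nothing gets through
    print(bsc_capacity(0.1))   # partial noise: somewhere in between
    ```

    The point is that noise does not just slow a channel down; it puts a hard ceiling on how much information can cross it per use.
    
    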

    As for books, a short internet search for "Information Theory Lecture Notes" should yield abundant material.

    -- AI
    P.S. -> I have taken certain liberties above (e.g., I say entropy answers [A], whereas I should probably say entropy is one way of answering [A]), but I have taken those to avoid confusing you (any further than I might already have).
  5. Jun 7, 2009 #4
    In a nutshell:

    aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa contains less information than
    sneuircheuictgduocrfuigfeuyoguetdoeuciah, because the first string can be replaced by a much shorter description ("35 a's"), while the jumbled second string cannot.
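    You can check the nutshell example above with an off-the-shelf compressor; this quick Python sketch (my own illustration) uses zlib to compare the two strings from the post.

    ```python
    import zlib

    repetitive = b"a" * 35
    jumbled = b"sneuircheuictgduocrfuigfeuyoguetdoeuciah"

    # The repetitive string compresses to far fewer bytes because it
    # admits a short description ("35 copies of 'a'"); the jumbled
    # string has no such regularity for the compressor to exploit.
    print(len(zlib.compress(repetitive)))
    print(len(zlib.compress(jumbled)))
    ```

    The compressed size is a rough, practical stand-in for information content: less compressible means more information per symbol.
    
    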