
Conditional Entropy, H(g(X)|X)

  1. Sep 4, 2009 #1
    Hi

    I'm trying to convince myself that the conditional entropy of a function of a random variable X, given X itself, is 0.

    H(g(X)|X)=0

    The book I'm reading (Elements of Information Theory) says the statement is true because, for any particular value of X, g(X) is fixed. But I don't understand why that makes the conditional entropy 0.
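
    For reference, the standard definition of conditional entropy that the book uses (writing Y in place of g(X) for my case, and p for the usual probability mass functions) is

    $$H(Y \mid X) \;=\; \sum_x p(x)\, H(Y \mid X = x) \;=\; -\sum_x \sum_y p(x,y)\, \log p(y \mid x)$$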

    Obviously I don't understand conceptually what conditional entropy really is...

    Can anyone please provide some "gentle" insight into this?
     
  2. Sep 4, 2009 #2
    If we know X = x, then we can infer g(X) = g(x) with probability 1 (and any other value with probability 0). Hence there is no uncertainty left about g(X) once we know X, and that statement, written in symbols, is exactly H(g(X)|X) = 0.
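
    If it helps to see the bookkeeping, here is a tiny numerical sketch (the pmf and the choice of g below are made-up toy values, not from the book): it computes H(g(X)|X) as the p(x)-weighted average of H(g(X)|X = x), and every inner term is the entropy of a point-mass distribution, hence zero.

    Code:
    from math import log2

    # Toy example (values chosen arbitrarily): a pmf for X and a deterministic g.
    p = {0: 0.25, 1: 0.5, 2: 0.25}

    def g(x):
        return x ** 2

    def entropy(dist):
        """Shannon entropy in bits of a pmf given as {value: probability}."""
        return -sum(q * log2(q) for q in dist.values() if q > 0)

    # H(g(X)|X) = sum_x p(x) * H(g(X) | X = x).
    # Given X = x, g(X) equals g(x) with probability 1, so each inner entropy is 0.
    h_cond = sum(px * entropy({g(x): 1.0}) for x, px in p.items())
    print(h_cond)  # prints 0.0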
     
  3. Sep 4, 2009 #3
    Ah, that makes very good conceptual sense. Thank you for the short but insightful explanation.

    Cheers

    Phillip
     