Why Is the Conditional Entropy of g(X) Given X Zero?

AI Thread Summary
The conditional entropy of a function g(X) given the random variable X is zero because knowing the value of X determines g(X) exactly: once X is known, g(X) is a fixed value, so no uncertainty about it remains. This relationship is expressed mathematically as H(g(X) | X) = 0.
PhillipKP
Hi

I'm trying to convince myself that the conditional entropy of a function of a random variable X, given X, is 0.

H(g(X)|X)=0

The book I'm reading (Elements of Information Theory) says the statement is true because, for any particular value of X, g(X) is fixed. But I don't understand why this makes the conditional entropy 0.

Obviously I don't understand conceptually what conditional entropy really is...

Can anyone please provide some "gentle" insight into this?
 
If we know X = x, then we can infer g(X) = g(x) with probability 1, and any other value has probability 0. Hence there is no uncertainty about g(X) once we know X. Written in symbols, that is H(g(X) | X) = 0.
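To connect that intuition to the definition (filling in the step the reply leaves implicit), write the conditional entropy as an average over the values of X. Given X = x, the conditional distribution of g(X) puts all its mass on the single point g(x), and a point mass has zero entropy, so

$$H(g(X)\mid X)=\sum_x p(x)\,H\big(g(X)\mid X=x\big)=\sum_x p(x)\cdot 0=0,$$

since each inner entropy is ##-1\log 1 = 0##.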
 
Ah that makes very good conceptual sense. Thank you for the short but insightful explanation.

Cheers

Phillip
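For anyone who wants to check this numerically, here is a minimal sketch (not from the original thread; the function name and the example distributions are made up for illustration) that computes H(Y | X) from a joint pmf and confirms it is 0 when Y = g(X):

```python
import numpy as np

def conditional_entropy(joint):
    """H(Y|X) in bits, for a joint pmf given as a 2-D array p[x, y]."""
    px = joint.sum(axis=1, keepdims=True)            # marginal p(x)
    with np.errstate(divide="ignore", invalid="ignore"):
        cond = np.where(px > 0, joint / px, 0.0)     # conditional p(y|x)
        terms = np.where(cond > 0, cond * np.log2(cond), 0.0)
    return float(-(px * terms).sum())                # -sum_x p(x) sum_y p(y|x) log p(y|x)

# X uniform on {0, 1, 2} and Y = g(X) = X mod 2, so each row of the joint
# puts all of p(x) on the single column y = g(x).
p = np.zeros((3, 2))
for x in range(3):
    p[x, x % 2] = 1 / 3
print(conditional_entropy(p))   # 0.0: knowing X removes all uncertainty about g(X)

# Contrast: Y independent of X and uniform on {0, 1} gives H(Y|X) = 1 bit.
q = np.full((3, 2), 1 / 6)
print(conditional_entropy(q))   # 1.0
```

Running it prints 0.0 for the deterministic case and 1.0 for the independent case, matching the argument above.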
 