Why Is the Conditional Entropy of g(X) Given X Zero?

PhillipKP
Hi

I'm trying to convince myself that the conditional entropy of a function of random variable X given X is 0.

H(g(X)|X)=0

The book I'm reading (Elements of Information Theory) says that since g(X) is fixed for any particular value of X, the statement is true. But I don't understand why this makes the conditional entropy 0.

Obviously I don't understand conceptually what conditional entropy really is...

Can anyone please provide some "gentle" insight into this?
 
If we know X = x, then g(X) = g(x) with probability 1, and any other value has probability 0. So there is no uncertainty left about g(X) once we know X. In symbols: H(g(X)|X) = Σ_x p(x) H(g(X)|X = x), and each term H(g(X)|X = x) is the entropy of a point mass at g(x), which is 0, so the whole sum is 0.
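
If it helps, here's a minimal numerical sketch of that sum. The distribution p(x) and the function g below are made-up examples, not anything from the book; the point is just that every term H(g(X)|X = x) is the entropy of a point mass and so contributes 0.

```python
import math

p = {1: 0.2, 2: 0.5, 3: 0.3}   # assumed example distribution p(x) for X
g = lambda x: x % 2            # assumed example function g

def entropy(dist):
    """Shannon entropy in bits of a dict mapping value -> probability."""
    return -sum(q * math.log2(q) for q in dist.values() if q > 0)

h = 0.0
for x, px in p.items():
    # Given X = x, g(X) equals g(x) with probability 1, so the conditional
    # distribution of g(X) is a point mass and its entropy is 0.
    cond_dist = {g(x): 1.0}
    h += px * entropy(cond_dist)

print(h)  # 0.0 -- every term H(g(X) | X = x) vanishes
```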
 
Ah, that makes very good conceptual sense. Thank you for the short but insightful explanation.

Cheers

Phillip
 