Formula for entropy conditional on >1 variables?

Inquirer
Hello,

I want to compute the conditional entropy H(A | B,C), given probability distributions for each of the variables.

It would be nice to have a right-hand side not involving conditionals. I know H(A|B) = H(A,B) - H(B), but how does this generalize when there is more than one conditioning variable?

I tried, as a dumb guess, H(A,B,C) - H(B,C), but from using it further down the line, I doubt that's correct.

Thanks!
I worked through it again and discovered an error or two in my calculations. In the end it looks like

H(A|B,C) = H(A,B,C) - H(B,C)

is correct after all.

This is quite clear when illustrated with a Venn diagram, similar to the one at https://secure.wikimedia.org/wikipedia/en/wiki/Conditional_entropy.
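The identity is easy to sanity-check numerically. Below is a minimal sketch (with a hypothetical random joint distribution over three binary variables) that computes H(A|B,C) directly from its definition and compares it against H(A,B,C) - H(B,C):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical joint distribution p(a, b, c) over three binary variables.
p = rng.random((2, 2, 2))
p /= p.sum()

def entropy(q):
    """Shannon entropy in bits of a probability array (any shape)."""
    q = np.asarray(q).ravel()
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

# Marginal p(b, c), obtained by summing out A (axis 0).
p_bc = p.sum(axis=0)

# Direct computation: H(A|B,C) = sum_{b,c} p(b,c) * H(A | B=b, C=c)
h_cond = 0.0
for b in range(2):
    for c in range(2):
        cond = p[:, b, c] / p_bc[b, c]   # p(a | b, c)
        h_cond += p_bc[b, c] * entropy(cond)

# The identity H(A|B,C) = H(A,B,C) - H(B,C) should hold exactly
# (up to floating-point tolerance).
print(np.isclose(h_cond, entropy(p) - entropy(p_bc)))
```

The same check works for any number of conditioning variables: treat (B, C, ...) as one composite variable and the two-variable chain rule applies unchanged.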