
Formula for entropy conditional on more than one variable?

Oct6-11, 06:34 PM
P: 4

I want to compute the conditional entropy H(A | B,C), given probability distributions for each of the variables.

It would be nice to have a right-hand side not involving conditionals. I know H(A|B) = H(A,B) - H(B), but how does this generalize when there is more than one conditioning variable?

I tried, as a guess, H(A,B,C) - H(B,C), but after using it further down the line, I doubted it was correct.

Oct7-11, 11:25 AM
P: 4
I worked through it again and found an error or two in my calculations. In the end,

H(A|B,C) = H(A,B,C) - H(B,C)

is correct after all. It follows from the chain rule: treating (B,C) as a single joint variable, H(A,B,C) = H(B,C) + H(A|B,C), and rearranging gives the formula above.

This is quite clear when illustrated with a Venn diagram similar to the one here.
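The identity is easy to check numerically. Below is a minimal sketch (not from the thread; the distribution and variable names are made up for illustration) that draws a random joint distribution p(a,b,c) over three binary variables and compares H(A,B,C) - H(B,C) against the conditional entropy computed directly from its definition:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a random joint distribution p(a, b, c)
# over three binary variables, normalized to sum to 1.
p_abc = rng.random((2, 2, 2))
p_abc /= p_abc.sum()

def entropy(p):
    """Shannon entropy in bits of a (possibly multi-dimensional) distribution."""
    p = p[p > 0]  # convention: 0 * log(0) = 0
    return -np.sum(p * np.log2(p))

# Marginal p(b, c), obtained by summing out A (axis 0).
p_bc = p_abc.sum(axis=0)

# H(A|B,C) via the identity from the thread: H(A,B,C) - H(B,C).
h_cond_identity = entropy(p_abc) - entropy(p_bc)

# H(A|B,C) directly from the definition:
#   -sum_{a,b,c} p(a,b,c) * log2( p(a,b,c) / p(b,c) )
# (p_bc broadcasts over the A axis here)
h_cond_direct = -np.sum(p_abc * np.log2(p_abc / p_bc))

print(h_cond_identity, h_cond_direct)
```

The two numbers agree to machine precision, which is what the chain rule guarantees for any joint distribution, not just this random one.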

