Formula for entropy conditional on >1 variables?

AI Thread Summary
To compute the conditional entropy H(A|B,C), the correct formula is H(A|B,C) = H(A,B,C) - H(B,C). An initial attempt at checking this formula led to confusion, but after recalculating, the result was confirmed. The relationship is easiest to see with a Venn diagram. The formula gives the conditional entropy with a right-hand side that contains no conditional terms, which is useful when working with more than one conditioning variable.
Inquirer
Hello,

I want to compute the conditional entropy H(A | B,C), given probability distributions for each of the variables.

It would be nice to have a right-hand side that doesn't involve conditionals. I know that H(A|B) = H(A,B) - H(B), but how does it work out when there is more than one conditioning variable?

I tried, as a dumb guess, H(A,B,C) - H(B,C), but from how it behaved further down the line, I doubt that's correct.

Thanks!
 
I worked through it again and discovered an error or two in my calculations. In the end it looks like

H(A|B,C) = H(A,B,C) - H(B,C)

is correct after all.
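
For the record, spelling it out from the definition is a quick sanity check. Treating the pair (B,C) as a single conditioning variable and using p(a|b,c) = p(a,b,c)/p(b,c),

$$H(A\mid B,C) = -\sum_{a,b,c} p(a,b,c)\,\log p(a\mid b,c) = -\sum_{a,b,c} p(a,b,c)\,\bigl[\log p(a,b,c) - \log p(b,c)\bigr] = H(A,B,C) - H(B,C).$$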

This is quite clear when illustrated with a Venn diagram similar to the one at https://secure.wikimedia.org/wikipedia/en/wiki/Conditional_entropy.
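
If it helps, here is a small Python sketch of a numerical check. The joint distribution p_abc below is just a made-up 2x2x2 example, not anything from this thread; it computes H(A|B,C) directly from the definition and compares it with H(A,B,C) - H(B,C):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries are skipped)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution p(a, b, c), indexed [a, b, c]; entries sum to 1.
p_abc = np.array([[[0.10, 0.05],
                   [0.15, 0.10]],
                  [[0.05, 0.20],
                   [0.20, 0.15]]])

p_bc = p_abc.sum(axis=0)              # marginal p(b, c), summing out A

# Chain-rule form: H(A|B,C) = H(A,B,C) - H(B,C)
h_cond_chain = entropy(p_abc) - entropy(p_bc)

# Direct definition: H(A|B,C) = -sum_{a,b,c} p(a,b,c) log2 p(a|b,c)
p_a_given_bc = p_abc / p_bc           # p(a|b,c), broadcasting over the A axis
h_cond_direct = -np.sum(p_abc * np.log2(p_a_given_bc))

print(h_cond_chain, h_cond_direct)
```

The two printed values agree up to floating-point rounding.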
 