Formula for entropy conditional on more than one variable?

In summary, the thread discusses how to compute the conditional entropy H(A|B,C) from the probability distributions of the variables involved. The original poster asks how the identity H(A|B) = H(A,B) - H(B) generalizes to more than one conditioning variable and tentatively tries H(A,B,C) - H(B,C), which they at first doubt. After rechecking the calculations, the formula H(A|B,C) = H(A,B,C) - H(B,C) turns out to be correct after all, and the result is illustrated with a Venn diagram.
  • #1
Inquirer
Hello,

I want to compute the conditional entropy H(A | B,C), given probability distributions for each of the variables.

It would be nice to have a right-hand side not involving conditionals. For a single conditioning variable, H(A|B) = H(A,B) - H(B), but how does it work out when there is more than one conditioning variable?

I tried, as a dumb guess, H(A,B,C) - H(B,C), but from using it further down the line, I doubt that's correct.

Thanks!
 
  • #2
I worked through it again and discovered an error or two in my calculations. In the end it looks like

H(A|B,C) = H(A,B,C) - H(B,C)

is correct after all.

This is quite clear when illustrated with a Venn diagram similar to the one in the Wikipedia article on conditional entropy: https://secure.wikimedia.org/wikipedia/en/wiki/Conditional_entropy
 
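Not part of the original reply, but here is a minimal Python sketch of that check, assuming the joint distribution p(a,b,c) is given as a dictionary mapping outcomes to probabilities (all names are illustrative):

```python
import itertools
import math
import random

random.seed(0)

# An arbitrary joint distribution p(a, b, c) over binary A, B, C.
outcomes = list(itertools.product([0, 1], repeat=3))
weights = [random.random() for _ in outcomes]
total = sum(weights)
joint_abc = {k: w / total for k, w in zip(outcomes, weights)}

def entropy(p):
    """Shannon entropy in bits of a distribution given as {outcome: prob}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

# Marginalize p(a, b, c) onto (b, c).
joint_bc = {}
for (a, b, c), q in joint_abc.items():
    joint_bc[(b, c)] = joint_bc.get((b, c), 0.0) + q

# Chain rule: H(A|B,C) = H(A,B,C) - H(B,C).
h_chain = entropy(joint_abc) - entropy(joint_bc)

# Direct definition: H(A|B,C) = -sum p(a,b,c) * log2 p(a|b,c).
h_direct = -sum(q * math.log2(q / joint_bc[(b, c)])
                for (a, b, c), q in joint_abc.items() if q > 0)

print(h_chain, h_direct)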

1. What is the formula for entropy conditional on more than one variable?

The formula for entropy conditional on more than one variable is H(X|Y,Z) = -∑∑∑ p(x,y,z) log p(x|y,z), where X, Y, and Z are the variables, p(x,y,z) is the joint probability distribution, p(x|y,z) = p(x,y,z)/p(y,z) is the conditional distribution, and the triple sum runs over all values of x, y, and z.
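As a check (standard algebra, not part of the original answer), this definition reduces to the chain-rule form confirmed in the thread:

```latex
\begin{aligned}
H(X \mid Y,Z)
  &= -\sum_{x,y,z} p(x,y,z)\,\log p(x \mid y,z) \\
  &= -\sum_{x,y,z} p(x,y,z)\,\bigl[\log p(x,y,z) - \log p(y,z)\bigr] \\
  &= H(X,Y,Z) + \sum_{y,z}\Bigl(\sum_{x} p(x,y,z)\Bigr)\log p(y,z) \\
  &= H(X,Y,Z) - H(Y,Z).
\end{aligned}
```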

2. What is the difference between conditional and unconditional entropy?

Conditional entropy measures the uncertainty that remains in one variable once the values of one or more other variables are known, whereas unconditional entropy measures the uncertainty of a single variable's distribution on its own. On average, conditioning never increases uncertainty: H(X|Y) ≤ H(X).

3. How is conditional entropy used in information theory?

Conditional entropy is used to measure the amount of uncertainty or randomness in a system given some prior knowledge or conditions. It is commonly used in information theory to quantify the amount of information needed to describe a random variable when some other variable is known.

4. Can conditional entropy be negative?

No, not for discrete random variables. The conditional Shannon entropy H(X|Y) = ∑ p(y) H(X|Y=y) is a weighted average of ordinary entropies, each of which is non-negative, so H(X|Y) ≥ 0. (Only the differential entropy of continuous variables can be negative.) What can happen is that a particular observation y increases the uncertainty about X, i.e. H(X|Y=y) > H(X); on average, though, conditioning never increases entropy, so H(X|Y) ≤ H(X).

5. What are some applications of conditional entropy in real-world problems?

Conditional entropy has many applications in various fields such as data analysis, machine learning, and information retrieval. It can be used for feature selection, clustering, and classification tasks. In finance, it is used to measure the risk of a portfolio given the performance of other assets. In natural language processing, it is used to evaluate the relevance of a word in a document given the context.
