# Concavity of entropy

1. Jul 23, 2009

### emma83

1. The problem statement, all variables and given/known data
Shannon entropy is a concave function defined as follows:
$$H(X)=-\sum_{x}p(x)\log p(x)$$

Conditional Shannon entropy is defined as follows:
$$H(X|Y)=\sum_{y} p(y) H(X|Y=y)=-\sum_{y} p(y)\sum_{x}p(x|Y=y)\log p(x|Y=y)$$

Can we deduce that:
$$\sum_{y} p(y)H(X|Y=y)\geq H(X|Y=y)$$

2. Relevant equations

3. The attempt at a solution
I would say yes because of concavity, but I am confused by how the two random variables interact.
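
For reference, the concavity property presumably being invoked here is Jensen's inequality applied to $$H$$. Writing $$p_{X|Y=y}$$ for the conditional distribution of $$X$$ given $$Y=y$$ (notation introduced here for clarity), the marginal $$p(x)=\sum_y p(y)\,p(x|y)$$ is a convex combination of the conditional distributions, so concavity gives:

$$H(X) = H\Big(\sum_{y} p(y)\, p_{X|Y=y}\Big) \geq \sum_{y} p(y)\, H(X|Y=y) = H(X|Y)$$

Note this bounds the weighted average $$H(X|Y)$$ by $$H(X)$$; it does not compare the average to an individual term $$H(X|Y=y)$$ as the expression above seems to.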

2. Jul 24, 2009

### jmb

I'm assuming the RHS of your final expression is also meant to be summed over $$y$$, otherwise it doesn't make much sense...

The way to go about this is to think about the nature of $$p(y)$$. What constraints do you know about the values that $$p(y)$$ can take?

PS Do you really mean $$\geq$$ in the last line, or do you mean $$\leq$$?
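
A quick numerical sanity check, assuming the intended claim is the standard "conditioning cannot increase entropy" result, i.e. $$H(X|Y) \leq H(X)$$. The joint distribution below is hypothetical, chosen only for illustration; any valid joint distribution works:

```python
import math

# Hypothetical joint distribution p(x, y) over x in {0, 1}, y in {0, 1}
# (illustrative values only; they sum to 1).
p_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    """Shannon entropy H = -sum p * log2(p), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y)
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# H(X) from the marginal
H_X = entropy(p_x)

# H(X|Y) = sum_y p(y) * H(X | Y = y), using p(x|y) = p(x, y) / p(y)
H_X_given_Y = 0.0
for y, py in p_y.items():
    cond = {x: p_xy[(x, y)] / py for x in p_x}
    H_X_given_Y += py * entropy(cond)

print(H_X, H_X_given_Y)  # H(X) = 1.0, H(X|Y) ≈ 0.875
assert H_X_given_Y <= H_X  # conditioning did not increase entropy
```

The key constraints jmb alludes to are that the weights $$p(y)$$ are non-negative and sum to one, which is exactly what makes $$\sum_y p(y) H(X|Y=y)$$ a convex combination.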