Algebra: proving an inequality

vers

Homework Statement


Let
$$f(x) = \left(\tfrac{1}{2}+x\right)\log\frac{1+x}{x} + \left(\tfrac{3}{2}-x\right)\log\frac{1-x}{2-x}$$
where log is natural logarithm and 0 < x <= 1/2. Show that f(x) >= 0 for all x.

Homework Equations


The only inequality that I can think of is the log sum inequality:
http://en.wikipedia.org/wiki/Log_sum_inequality

The Attempt at a Solution


I tried plotting f (see the attached figure), and the statement appears to be correct. In fact, the plot suggests that f is decreasing on (0, 1/2] and equals 0 at x = 1/2. However, I can't figure out an analytical proof of this.
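As a numerical sanity check (not a proof), here is a short script that evaluates f on a few sample points of my own choosing; it should show positive values that shrink toward 0 at x = 1/2:

```python
import math

def f(x):
    """f(x) = (1/2 + x) log((1+x)/x) + (3/2 - x) log((1-x)/(2-x))."""
    return ((0.5 + x) * math.log((1 + x) / x)
            + (1.5 - x) * math.log((1 - x) / (2 - x)))

# Sample the interval (0, 1/2]: f should be positive and vanish at x = 1/2.
for x in [0.01, 0.1, 0.25, 0.4, 0.5]:
    print(f"f({x}) = {f(x):.6f}")
```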
 

Attachments

  • f_x.jpg
I haven't done this out, but, if you find f(1/2) = 0 and that f(x) is decreasing on (0,1/2], what does that tell you about f(x) on (0,1/2]? How would you show that f(1/2) = 0 and f(x) is decreasing on (0,1/2]?

EDIT: On second thought, it may not be completely straightforward to show that f(x) is decreasing on (0,1/2]. If so, let us know what you've tried.
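(The endpoint value itself is immediate: substituting x = 1/2 into the definition of f gives

$$f\!\left(\tfrac{1}{2}\right) = 1\cdot\log\frac{3/2}{1/2} + 1\cdot\log\frac{1/2}{3/2} = \log 3 - \log 3 = 0.$$

The decreasing part is the real work.)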
 
Certainly if I can show that f'(x) <= 0, then (since f(1/2) = 0) the inequality is proved. However, it seems to me that showing f' is nonpositive is even harder than showing f is nonnegative directly.
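For what it's worth, the derivative can be written out by hand. The closed form below is my own algebra (so worth double-checking, which the script does against a central finite difference), and the sample points are arbitrary; the printed values should all be negative on (0, 1/2]:

```python
import math

def f(x):
    return ((0.5 + x) * math.log((1 + x) / x)
            + (1.5 - x) * math.log((1 - x) / (2 - x)))

def fprime(x):
    # Hand-computed derivative of f (assumption: my algebra):
    # f'(x) = log((1+x)/x) - log((1-x)/(2-x))
    #         - (1/2+x)/(x(1+x)) - (3/2-x)/((1-x)(2-x))
    return (math.log((1 + x) / x) - math.log((1 - x) / (2 - x))
            - (0.5 + x) / (x * (1 + x))
            - (1.5 - x) / ((1 - x) * (2 - x)))

# Cross-check against a finite difference, then inspect the sign.
h = 1e-6
for x in [0.05, 0.1, 0.25, 0.4, 0.49]:
    fd = (f(x + h) - f(x - h)) / (2 * h)
    print(f"x={x}: f'={fprime(x):.6f}  (finite diff {fd:.6f})")
```

Of course, sampled negativity is only evidence, not a proof that f' <= 0 on the whole interval.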

Tedjn said:
I haven't done this out, but, if you find f(1/2) = 0 and that f(x) is decreasing on (0,1/2], what does that tell you about f(x) on (0,1/2]? How would you show that f(1/2) = 0 and f(x) is decreasing on (0,1/2]?
 
It turns out that the function f can be simplified as
$$f(x) = d\left(\frac{1+x}{2} \parallel \frac{x}{2}\right) - d\left(\frac{x}{2} \parallel \frac{1+x}{2}\right)$$
where
$$d(p \parallel q) = p \log \frac{p}{q} + (1 - p) \log \frac{1-p}{1-q}$$
is the KL divergence between two Bernoulli random variables with respective expectations p and q. So all I need to do now is to show that the difference of these two KL divergences is nonnegative when x <= 1/2. Any idea how to do that?
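This rewriting can be sanity-checked numerically (a spot check on arbitrary sample points, not a derivation):

```python
import math

def kl(p, q):
    """Binary KL divergence d(p || q) between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def f(x):
    return ((0.5 + x) * math.log((1 + x) / x)
            + (1.5 - x) * math.log((1 - x) / (2 - x)))

# Check the identity f(x) = d((1+x)/2 || x/2) - d(x/2 || (1+x)/2) on a grid.
for x in [0.05, 0.1, 0.25, 0.4, 0.5]:
    lhs = f(x)
    rhs = kl((1 + x) / 2, x / 2) - kl(x / 2, (1 + x) / 2)
    print(f"x={x}: f={lhs:.6f}, KL difference={rhs:.6f}")
```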
 