Proving Inequality for Linear Functions: |h(h(x))+h(h(1/x))|>2

utkarshakash
Gold Member

Homework Statement


If f and g are two distinct linear functions defined on R such that they map [-1,1] onto [0,2], and h : R - {-1,0,1} → R is defined by h(x) = f(x)/g(x), then show that |h(h(x)) + h(h(1/x))| > 2.

Homework Equations



The Attempt at a Solution


I assume f(x) to be ax+b and g(x) to be lx+m, so that h(x) = (ax+b)/(lx+m). From here I can write out h(h(x)) and h(h(1/x)), but I don't see anything that will help me prove this inequality. Any ideas?
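Not part of the thread's argument, but the compositions can be spot-checked numerically with plain Python closures; the coefficient values below are arbitrary placeholders, not values determined by the problem:

```python
# Build h(x) = f(x)/g(x) for generic linear f(x) = a*x + b and g(x) = l*x + m.
def make_h(a, b, l, m):
    """Return h(x) = (a*x + b) / (l*x + m)."""
    return lambda x: (a * x + b) / (l * x + m)

# Arbitrary sample coefficients, just to exercise the compositions:
# here h(x) = (x + 1) / (1 - x).
h = make_h(1.0, 1.0, -1.0, 1.0)

x = 0.5
print(h(h(x)))      # h(h(0.5)) -> -2.0
print(h(h(1 / x)))  # h(h(2.0)) -> -0.5
```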
 
utkarshakash said:

If f and g are two distinct linear functions defined on R such that they map [-1,1] onto [0,2], and h : R - {-1,0,1} → R is defined by h(x) = f(x)/g(x), then show that |h(h(x)) + h(h(1/x))| > 2.

I assume f(x) to be ax+b and g(x) to be lx+m, so that h(x) = (ax+b)/(lx+m). From here I can write out h(h(x)) and h(h(1/x)), but I don't see anything that will help me prove this inequality. Any ideas?
You can be more specific about the functions f and g.

There are only two distinct linear functions which map [-1,1] onto [0,2] .

What are they?
 
SammyS said:
You can be more specific about the functions f and g.

There are only two distinct linear functions which map [-1,1] onto [0,2] .

What are they?

y=x+1 and y=-x+1. Are these correct?
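A quick numeric sketch (not part of the proof) confirming that these two candidates send [-1,1] onto [0,2]; for a linear function it suffices to check the endpoint images:

```python
# A linear function maps an interval onto the interval between the images
# of its endpoints, so checking x = -1 and x = 1 is enough.
f1 = lambda x: x + 1    # increasing: -1 -> 0, 1 -> 2
f2 = lambda x: -x + 1   # decreasing: -1 -> 2, 1 -> 0

assert {f1(-1), f1(1)} == {0, 2}
assert {f2(-1), f2(1)} == {0, 2}
print("both map [-1, 1] onto [0, 2]")
```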
 
utkarshakash said:
y=x+1 and y=-x+1. Are these correct?
Yes. Equivalently, y=1+x and y=1-x .

So, there are only two cases to consider.

Choosing f(x) = 1+x and g(x) = 1-x, for now, what is h(1/x) ?
 
SammyS said:
Yes. Equivalently, y=1+x and y=1-x .

So, there are only two cases to consider.

Choosing f(x) = 1+x and g(x) = 1-x, for now, what is h(1/x) ?

(x+1)/(x-1), which can be rewritten as -f(x)/g(x)

After simplifying further, I am left with proving this inequality:

|x-(1/x)|>2
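For this case (f(x) = 1+x, g(x) = 1-x), the identity h(1/x) = -h(x) can be spot-checked at a few sample points; a quick sketch rather than a proof:

```python
h = lambda x: (1 + x) / (1 - x)  # h = f/g with f(x) = 1 + x, g(x) = 1 - x

# Sample points avoiding the excluded values -1, 0, 1.
for x in [0.5, 2.0, -3.0, 10.0]:
    assert abs(h(1 / x) + h(x)) < 1e-12, x  # h(1/x) == -h(x)
print("h(1/x) == -h(x) at all sample points")
```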
 
utkarshakash said:
(x+1)/(x-1), which can be rewritten as -f(x)/g(x)
It looks like you get h(1/x) = -f(x)/g(x), for either way of assigning 1-x and 1+x to f(x) and g(x).
And after simplifying further I am left with proving this inequality

|x-(1/x)|>2

I got something similar to x-(1/x), but it is different.

Check your algebra.

It seems to me that one could also do this more abstractly using some calculus. (Maybe the calculus part comes in a later step.)
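A numeric spot check of the full expression for this case (f(x) = 1+x, g(x) = 1-x) may help with checking the algebra; it samples a few points and is a sanity check, not a proof:

```python
h = lambda x: (1 + x) / (1 - x)

for x in [0.5, 2.0, -3.0, 0.1]:       # avoid the excluded values -1, 0, 1
    s = h(h(x)) + h(h(1 / x))
    # Numerically, h(h(x)) agrees with -1/x here, so s agrees with -(x + 1/x).
    assert abs(s + (x + 1 / x)) < 1e-9
    assert abs(s) > 2                  # the required inequality at each sample
print("checks passed")
```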
 