Comparing sets of stationary points

tehno
If f(x), g(x), h(x) are nonnegative real functions,[§] consider the functions:
A(x)=\sqrt{f(x)} + \sqrt{g(x)} + \sqrt{h(x)}
B(x)=f(x)+g(x)+h(x)

What can be said concerning the comparison of the total numbers of stationary points of A(x) and B(x)?
If B(x) has, let us say, 3 stationary points (and extrema), does that mean that A(x) has AT MOST 3 stationary points as well?



[§] = example functions f(x), g(x), h(x):

f(x)=|x^3 - 5x^2 + 3x - 9|
g(x)=(\sin x)^2
h(x)=x^2 + 1
 
I think very little can be said. Certainly, it's not true that if B has 3 stationary points, then A has at most 3, nor is anything analogous true. Consider f(x) = g(x) = h(x) = x^2/9. Then A(x) = |x|, but B(x) = x^2/3. B has only one stationary point, 3, but A has infinitely many stationary points - every nonnegative real is a stationary point of A.
 
AKG said:
I think very little can be said. Certainly, it's not true that if B has 3 stationary points, then A has at most 3, nor is anything analogous true. Consider f(x) = g(x) = h(x) = x^2/9. Then A(x) = |x|, but B(x) = x^2/3. B has only one stationary point, 3, but A has infinitely many stationary points - every nonnegative real is a stationary point of A.

What do YOU mean by "stationary point"? I would mean a point at which the derivative is 0, but |x| never has derivative 0.
 
Sorry, I guess I was thinking about "fixed point". My bad.
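For what it's worth, under the fixed-point reading on the positive reals:
|x| = x \ \text{for every}\ x>0, \qquad \frac{x^2}{3} = x \iff x = 3,
so A = |x| has infinitely many fixed points there while B = x^2/3 has just one.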
 
HallsofIvy said:
What do YOU mean by "stationary point"? I would mean a point at which the derivative is 0, but |x| never has derivative 0.
Yes, exactly that was meant in my post.
The stationary points are defined as the solutions of the equations:
A'(x)=0,
B'(x)=0
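Written out in terms of f, g, h (wherever f, g, h are positive, so the square roots are differentiable), these conditions read:
A'(x)=\frac{f'(x)}{2\sqrt{f(x)}}+\frac{g'(x)}{2\sqrt{g(x)}}+\frac{h'(x)}{2\sqrt{h(x)}}=0, \qquad B'(x)=f'(x)+g'(x)+h'(x)=0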

I'm particularly interested in the case when f(x), g(x), h(x) are three polynomials.
HallsofIvy, what can be said then?
 
Well then? ^^

I proved the claim in the case where f(x), g(x), h(x) are all three polynomials of the second order (i.e. quadratics), and where A(x), B(x) are restricted to the first Cartesian quadrant.
Can someone give a proof (also with the first-quadrant restriction) in the general case, when f(x), g(x), h(x) are polynomials of arbitrary order?
 
What you're trying to prove is false in general. Note that a polynomial like x^d has normally only one stationary point. But there are infinitely many ways to express this as a sum of three polynomials f, g, and h such that the sum of the square roots of f, g, and h has some stationary points. Think about the "other way", i.e. let A = |p| + |q| + |r|, and say B = p^2 + q^2 + r^2. Then the requirement that f, g, and h be non-negative polynomials becomes the requirement that p^2, q^2, and r^2 be non-negative polynomials. To meet this condition, it suffices to find p, q, and r polynomials. Aside from the strangeness you might get because of the absolute value signs, A is basically just like a polynomial. Suppose it is like a degree d polynomial. Then B will be a degree 2d polynomial. Now since A is like a degree d polynomial, it can have d-1 stationary points. However, if you find a clever way of breaking A up into parts |p|, |q| and |r|, then B will not just be any old polynomial of degree 2d, but it will be something like x^{2d}, which despite its degree of 2d, has only 1 stationary point.
 
AKG said:
What you're trying to prove is false in general. Note that a polynomial like x^d has normally only one stationary point. But there are infinitely many ways to express this as a sum of three polynomials f, g, and h such that the sum of the square roots of f, g, and h has some stationary points. Think about the "other way", i.e. let A = |p| + |q| + |r|, and say B = p^2 + q^2 + r^2. Then the requirement that f, g, and h be non-negative polynomials becomes the requirement that p^2, q^2, and r^2 be non-negative polynomials. To meet this condition, it suffices to find p, q, and r polynomials. Aside from the strangeness you might get because of the absolute value signs, A is basically just like a polynomial. Suppose it is like a degree d polynomial. Then B will be a degree 2d polynomial. Now since A is like a degree d polynomial, it can have d-1 stationary points. However, if you find a clever way of breaking A up into parts |p|, |q| and |r|, then B will not just be any old polynomial of degree 2d, but it will be something like x^{2d}, which despite its degree of 2d, has only 1 stationary point.

Hi AKG,
Not "in general".
I agree with your point, but rereading my post I think I wasn't clear about
the positivity restrictions.
Consider:
A(x)=|\sqrt{|a_{m}x^m+a_{m-1}x^{m-1}+...+a_{0}|}|+|\sqrt{|b_{n}x^n +b_{n-1}x^{n-1}+...+b_{0}|}|+|\sqrt{|c_{p}x^p+c_{p-1}x^{p-1}+...+c_{0}|}|
and
B(x)=|a_{m}x^m+a_{m-1}x^{m-1}+..+a_{0}|+|b_{n}x^n+b_{n-1}x^{n-1}+..+b_{0}|+|c_{p}x^p+c_{p-1}x^{p-1}+..+c_{0}|

Consideration to be taken only with nonnegative x and if sums of coefficients that belong to terms associating same powers aren't zero.
In that case I think there is no way that the number of stationary points of
A(x) can surpass that of B(x).
Do you agree?
 
Consideration to be taken only with nonnegative x and if sums of coefficients that belong to terms associating same powers aren't zero.
This sentence makes no sense. I can't make sense of what you wrote, but I'll tell you what I know:

There exist polynomials f, g, and h defined from the non-negative reals to the non-negative reals such that B = f + g + h has fewer stationary points than A = f^{1/2} + g^{1/2} + h^{1/2}. Example:

let d be any odd number greater than 2
let f(x) = ((x-1)^{2d} - 0.5)^2
let g(x) = (x-1)^{2d}
let h(x) = 0

Then B(x) = (x-1)^{4d} + 1/4, and has only one stationary point at x = 1. A(x) = |(x-1)^{2d} - 0.5| + |x-1|^d, which has stationary points at x = 1 and at x = 1 - (1/2)^{1/d}. EDIT: looking at the graph of A(x) for d = 3, I see that it also has another stationary point at 1 + (1/2)^{1/d}, and perhaps this holds for any odd d. Whether it does or not, we know for sure that A has at least the two stationary points I mentioned before the edit, and that is still one more than B has.
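A rough numerical check of the d = 3 case (a quick sketch of mine, assuming NumPy and the arbitrary window [0, 2]; the derivatives below are hand-derived, and the two outermost sign changes of A' come from the kinks of the absolute value, where the derivative jumps rather than vanishes):

import numpy as np

d = 3
t = lambda x: x - 1.0

def dA(x):
    # derivative of A(x) = |(x-1)^{2d} - 0.5| + |x-1|^d for odd d (valid away from the kinks)
    return (np.sign(t(x)**(2*d) - 0.5) * 2*d * t(x)**(2*d - 1)
            + d * np.abs(t(x))**(d - 1) * np.sign(t(x)))

def dB(x):
    # derivative of B(x) = (x-1)^{4d} + 1/4
    return 4*d * t(x)**(4*d - 1)

x = np.linspace(0.0, 2.0, 20001)
for name, deriv in (("A'", dA), ("B'", dB)):
    s = np.sign(deriv(x))
    flips = np.flatnonzero(s[:-1] != s[1:])
    locs = sorted({round(float(v), 2) for v in x[flips]})
    print(name, "changes sign near x =", locs)

For d = 3 this should report sign changes of A' near roughly x = 0.11, 0.21, 1, 1.79 and 1.89 (the middle three being the stationary points 1 - (1/2)^{1/d}, 1 and 1 + (1/2)^{1/d}), and of B' only near x = 1, which matches the count above.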
 
AKG said:
This sentence makes no sense. I can't make sense of what you wrote.
Why not?
The restriction means that I consider the polynomial functions only
for x > 0, and only cases where the sums of the corresponding coefficients of f(x), g(x), h(x) are nonzero.
For example:
Let m=n=p=2.
Then
f(x)=a_{2}x^2+a_{1}x + a_{0};g(x)=b_{2}x^2+b_{1}x+b_{0};h(x)=c_{2}x^2+c_{1}x+c_{0}
Assume:
a_{2}+b_{2}+c_{2}\neq 0,\quad a_{1}+b_{1}+c_{1}\neq 0,\quad a_{0}+b_{0}+c_{0}\neq 0
 
Okay, so your question can be restated: Does there exist a natural d, and a polynomial B(x) = a_d x^d + ... + a_0 defined on the non-negative reals such that none of the a_i are zero, and such that there exist three non-negative polynomials f(x), g(x), and h(x) defined on the non-negative reals such that f(x) + g(x) + h(x) = B(x), and such that the function A(x) = f(x)^{1/2} + g(x)^{1/2} + h(x)^{1/2} has more stationary points than B(x)? You think the answer is "No." Do you have a proof?
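In symbols (just restating the same conjecture compactly), the claimed answer "No" says that for every such decomposition
\#\{x>0 : A'(x)=0\} \;\le\; \#\{x>0 : B'(x)=0\}.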
 
AKG said:
Okay, so your question can be restated: Does there exist a natural d, and a polynomial B(x) = a_d x^d + ... + a_0 defined on the non-negative reals such that none of the a_i are zero, and such that there exist three non-negative polynomials f(x), g(x), and h(x) defined on the non-negative reals such that f(x) + g(x) + h(x) = B(x), and such that the function A(x) = f(x)^{1/2} + g(x)^{1/2} + h(x)^{1/2} has more stationary points than B(x)? You think the answer is "No." Do you have a proof?
Finally, we understand each other! :smile:
No, I don't have a proof. I asked for one and would like to see what the PF experts can say about it!

BTW, note that every elementary function in analysis can be described by a series (i.e. in the form of an infinite Taylor polynomial). That's the reason I'm interested in a proof for polynomials, under the restriction given above.
 
With so many conditions, it seems hard to decide on a fruitful way to start. I would recommend expressing the functions f(x)^{1/2} as power series using Newton's generalized binomial theorem. See if that gets you anywhere.
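For instance, assuming f(0) = a_0 > 0 (just to illustrate the suggestion), one has for every x with |f(x) - a_0| < a_0:
\sqrt{f(x)} = \sqrt{a_0}\left(1+\frac{f(x)-a_0}{a_0}\right)^{1/2} = \sqrt{a_0}\sum_{k=0}^{\infty}\binom{1/2}{k}\left(\frac{f(x)-a_0}{a_0}\right)^{k}, \qquad \binom{1/2}{k}=\frac{\frac{1}{2}(\frac{1}{2}-1)\cdots(\frac{1}{2}-k+1)}{k!},
and similarly for g and h.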
 
Not so many conditions.
Simply put: the whole thing is considered only in the first Cartesian quadrant, with the general requirement of nonzero coefficient sums.
:smile:
 
To me, the nonzero coefficient sums are quite an unwieldy restriction. Anyway, did the binomial theorem get you anywhere?
 
The binomial theorem didn't, but I solved (proved) it using other methods.
Anyway, thank you for the suggestion and the correspondence, and above all thank you, AKG.

cheers
 