
Zeros of f(x) vs. Zeros of f'(x)

  1. Aug 30, 2012 #1
    Is there any similarity between the zeros of a function and the zeros of its derivative? That is, if

    A = set of all x such that f(x) = 0
    B = set of all x such that f'(x) = 0

    then is there any pattern to finding A if B is known (or vice versa)?

  3. Aug 30, 2012 #2


    Science Advisor

    I don't know a definitive answer, but here are some thoughts:

    The zeros coincide for functions of the form f(x)=(x-a)^n with n > 1, since f'(x) = n(x-a)^(n-1) also vanishes at x = a, but there is no necessary dependence for all functions: look at, e.g., f(x)=sinx , which is 0 at k*Pi , but f'(x)=cosx is 0 at (2k+1)*Pi/2 . If f(x)=f'(x)=0 , then f has both a 0 and a critical point at x . But shift your function by a constant and the relationship does not exist any more: the critical point stays put while the zero is shifted away.
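    A minimal numeric sketch of both cases above (the sample values of a and n are my own choices):

    ```python
    import math

    # Zeros of sin(x) are k*Pi; zeros of its derivative cos(x) are (2k+1)*Pi/2.
    # Confirm the two sets are disjoint for a few values of k:
    sin_zeros = [k * math.pi for k in range(-3, 4)]
    cos_zeros = [(2 * k + 1) * math.pi / 2 for k in range(-3, 4)]
    assert all(abs(s - c) > 1e-9 for s in sin_zeros for c in cos_zeros)

    # By contrast, f(x) = (x - a)^n and f'(x) = n*(x - a)^(n-1) share the zero x = a:
    a, n = 2.0, 5
    f = lambda x: (x - a) ** n
    fp = lambda x: n * (x - a) ** (n - 1)
    assert f(a) == 0 and fp(a) == 0
    ```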
  4. Aug 31, 2012 #3


    Science Advisor

    Hey drewfstr314 and welcome to the forums.

    IMO there are two important classes to think about: the first is where you have a true polynomial and the second is when you don't.

    The non-polynomial examples include anything with a proper power series where all the coefficients are non-zero (or even some kind of shifted power series, like a Taylor expansion with a non-zero center).

    In the finite case you should use the relationship for the anti-derivative of a*x^n and the fact that the original equation can be written as (x - c)(x - d)...(x - last_root) = 0, after which you relate the original expression to the new expression (i.e. the anti-derivative) in any way you can.

    In the infinite case you can't quite do the same thing, but one suggestion is to look at the exponential function, use a composition of functions, and see how the derivatives change.

    For the finite case, you might want to try the following: suppose you have a factored expression for the roots of the integral (in other words, (x - a)(x - b)...(x - n) = 0 for the integral, not the derivative). Now you can use the product rule to differentiate this expression and get something in terms of the derivative that should equal 0.
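    Here is a rough numeric sketch of that product-rule manipulation, assuming simple roots; the helper functions (poly_mul, poly_from_roots, etc.) and the sample roots are my own, not anything standard:

    ```python
    import math

    def poly_mul(p, q):
        # multiply two coefficient lists, highest degree first
        out = [0.0] * (len(p) + len(q) - 1)
        for i, a in enumerate(p):
            for j, b in enumerate(q):
                out[i + j] += a * b
        return out

    def poly_from_roots(roots):
        # expand (x - r1)(x - r2)... into a coefficient list
        p = [1.0]
        for r in roots:
            p = poly_mul(p, [1.0, -r])
        return p

    def poly_deriv(p):
        n = len(p) - 1
        return [c * (n - i) for i, c in enumerate(p[:-1])]

    def poly_eval(p, x):
        v = 0.0
        for c in p:
            v = v * x + c
        return v

    roots = [1.0, 2.0, 4.0]          # hypothetical roots of the anti-derivative
    fp = poly_deriv(poly_from_roots(roots))

    # Product rule: differentiating (x-a)(x-b)(x-c) gives a sum of products,
    # each omitting one factor.  Check both forms agree at a sample point:
    x = 0.5
    prod_rule = sum(math.prod(x - s for j, s in enumerate(roots) if j != i)
                    for i in range(len(roots)))
    assert abs(poly_eval(fp, x) - prod_rule) < 1e-9
    ```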

    You also have the reverse situation, where you start from the roots of the derivative (i.e. (x - a)(x - b)...(x - n) = 0, but with one fewer factor and different coefficients).

    So basically the point I'm making is that you can generate quite a few identities for both equations and then equate the right ones to get an expression for one in terms of the other.

    It's not an obvious answer, but the idea of mathematics is to use as many independent kinds of data as you can and then bring them all together: the more linearly independent pieces of information you have, the more choices you have for analyzing the problem and the more you can do with it.

    Having at least two interchangeable descriptions of the same thing is what you need, and in some cases this is referred to as a duality. If you have dualities that have their own dualities, you get more and more independent expressions of the same thing, and this gives a systematic way to analyze something, since you are generating many independent ways of describing it.
  5. Aug 31, 2012 #4
    If f is a polynomial with all roots of multiplicity one, then the derivative has no zero in common with f. To see this, write f in factored form and apply the product rule: at each root r, the only surviving term of f'(r) is the product of (r - s) over the other roots s, which is non-zero because the roots are distinct.
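    A quick sketch of this argument, with hypothetical sample roots of my own choosing:

    ```python
    import math

    # f(x) = (x - 1)(x - 2)(x - 3) has simple roots.  By the product rule, at a
    # root r the only surviving term of f'(r) is the product of (r - s) over the
    # other roots s, which is non-zero because the roots are distinct.
    roots = [1.0, 2.0, 3.0]

    def fprime_at(r, roots):
        return math.prod(r - s for s in roots if s != r)

    for r in roots:
        assert fprime_at(r, roots) != 0
    ```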
  6. Sep 2, 2012 #5


    Science Advisor

    One can show that if f is a differentiable function and f(a)= f(b)= 0, there must be at least one 0 of f' between a and b; this is Rolle's theorem.
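    A small numeric illustration of this; the example function and the bisection loop are my own choices:

    ```python
    # Rolle's theorem sketch: f(x) = x*(x - 2) vanishes at 0 and 2, so
    # f'(x) = 2x - 2 must vanish somewhere in (0, 2).  Locate it by bisection:
    f = lambda x: x * (x - 2)
    fp = lambda x: 2 * x - 2
    assert f(0) == 0 and f(2) == 0

    lo, hi = 0.0, 2.0
    for _ in range(60):
        mid = (lo + hi) / 2
        if fp(lo) * fp(mid) <= 0:
            hi = mid
        else:
            lo = mid
    crit = (lo + hi) / 2
    assert 0 < crit < 2 and abs(fp(crit)) < 1e-9   # critical point near x = 1
    ```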
  7. Sep 3, 2012 #6
    I guess what I was looking for was something like this:

    [itex]f(x) = x^3 - 4x + 2 \Rightarrow f'(x) = 3x^2 - 4[/itex]

    and the solutions of f'(x)=0 are [itex]x = \pm \frac{2\sqrt3}{3}[/itex]

    Based on this, is there any way to find the zeros of f(x)?

  8. Sep 3, 2012 #7
    I don't think you can use this to find the zeros of f(x). A graph shows that f has three zeros. So all you can say (in this case) is that you have three zeros a, b and c, and these must satisfy [itex]a\leq - \frac{2\sqrt{3}}{3}\leq b\leq \frac{2\sqrt{3}}{3}\leq c[/itex].

    I don't think you can do much more.
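    The interleaving can be checked numerically; the bracketing intervals and the bisection helper below are my own sketch, not a general method:

    ```python
    # f(x) = x^3 - 4x + 2; its critical points are x = +-2/sqrt(3).
    f = lambda x: x**3 - 4*x + 2

    def bisect(g, lo, hi, iters=80):
        # assumes g changes sign on [lo, hi]
        for _ in range(iters):
            mid = (lo + hi) / 2
            if g(lo) * g(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return (lo + hi) / 2

    a = bisect(f, -3.0, -2.0)   # f(-3) = -13 < 0 < 2 = f(-2)
    b = bisect(f, 0.0, 1.0)     # f(0) = 2 > 0 > -1 = f(1)
    c = bisect(f, 1.0, 2.0)     # f(1) = -1 < 0 < 2 = f(2)
    crit = 2 / 3 ** 0.5         # positive zero of f'(x) = 3x^2 - 4
    assert a < -crit < b < crit < c
    ```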
  9. Sep 4, 2012 #8


    Science Advisor

    Short answer: no, for the simple reason that ##g(x) = x^3 -4x + 102## has the same derivative as ##f##, but they don't have the same roots.
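    A quick check of this point, comparing numerical derivatives of both functions (the step size and sample points are my own choices):

    ```python
    # f and g differ by a constant, so they have the same derivative fp,
    # yet g(x) - f(x) = 100 != 0 wherever f(x) = 0, so they share no roots.
    f = lambda x: x**3 - 4*x + 2
    g = lambda x: x**3 - 4*x + 102
    fp = lambda x: 3*x**2 - 4

    h = 1e-6  # central-difference step
    for x in (-2.0, 0.0, 1.5):
        assert abs((f(x + h) - f(x - h)) / (2 * h) - fp(x)) < 1e-4
        assert abs((g(x + h) - g(x - h)) / (2 * h) - fp(x)) < 1e-4
        assert g(x) - f(x) == 100
    ```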
  10. Sep 7, 2012 #9
    If ##f## is a polynomial, there are results bounding the zeros of ##f'## given the zeros of ##f##. The Grace-Heawood theorem says that if ##z_1## and ##z_2## are distinct zeros of ##f## of degree ##n##, then ##f'## has a zero in the disk with center ##\frac12(z_1+z_2)## and radius ##\frac12|z_1-z_2|\cot(\pi/n)##. I also remember a theorem of Gauss saying that the zeros of ##f'## are precisely the multiple zeros of ##f## together with the equilibrium points of a gravity-like field generated by the zeros of ##f##, each with mass proportional to its multiplicity. The intensity of this gravity-like field is inversely proportional to the distance, rather than to its square.
    If the ordinary gravitational field falls off with the square of the distance because space is three-dimensional, I think the inverse-first-power law here is explained by the two-dimensionality of the complex plane.
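    That field description corresponds to the logarithmic-derivative identity f'/f = Σ m_i/(x - r_i); a sketch with a hypothetical cubic of my own choosing:

    ```python
    import math

    # f(x) = x(x - 1)(x - 3) = x^3 - 4x^2 + 3x has simple zeros 0, 1, 3, so
    # f'/f = sum over zeros r of 1/(x - r).  At each zero w of
    # f'(x) = 3x^2 - 8x + 3, the "field" sum therefore vanishes:
    roots = [0.0, 1.0, 3.0]
    w1 = (8 + math.sqrt(28)) / 6      # quadratic formula on 3x^2 - 8x + 3
    w2 = (8 - math.sqrt(28)) / 6
    for w in (w1, w2):
        field = sum(1.0 / (w - r) for r in roots)
        assert abs(field) < 1e-9
    ```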