
Can Taylor series be used to get the roots of a polynomial?

  1. Feb 21, 2017 #1
    I'm using this method:

    First, write the polynomial in this form:
    $$a_nx^n+a_{n-1}x^{n-1}+\dots+a_2x^2+a_1x=c$$
    Let the LHS of this expression be the function ##f(x)##. I'm going to write the Taylor series of ##f^{-1}(x)## around ##x=0## and then put ##x=c## in it to get ##f^{-1}(c)##, which will be the value of ##x##.

    Since ##f^{-1}(0)=0## here, the first term of our Taylor series is 0.

    Now, the only thing that remains is calculating the derivatives of ##f^{-1}(x)## at ##x=0##.

    I'm using the fact that $$\frac{d(f^{-1}(x))}{dx}=\frac{1}{f'(f^{-1}(x))}$$

    By differentiating this equation, we can get the second derivative of ##f^{-1}(x)## as:
    $$\frac{d^2(f^{-1}(x))}{dx^2}=-\frac{f''(f^{-1}(x))}{\left(f'(f^{-1}(x))\right)^2}\,(f^{-1})'(x)$$

    Similarly, we can get the other derivatives by further differentiation of this equation. Then we can evaluate all the derivatives at ##x=0## to get the Taylor series of ##f^{-1}(x)## and evaluate it at ##x=c## to get the value of ##x##.

    1.Is this method correct?

    2.Can something be done to make it better and remove the limitations?
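    To make the idea concrete, here is a quick numerical sketch (the cubic and its coefficients below are just an illustrative example, not from the thread): it sums the first three terms of the inverse series of ##f(x)=x^3+2x^2+3x##, using the standard series-reversion coefficients, and checks that the result nearly solves ##f(x)=c## for a small ##c##.

```python
# A numerical sketch of the idea (example polynomial chosen arbitrarily):
# f(x) = a3*x^3 + a2*x^2 + a1*x with a1 = 3, a2 = 2, a3 = 1.
a1, a2, a3 = 3.0, 2.0, 1.0

def f(x):
    return a3 * x**3 + a2 * x**2 + a1 * x

def inverse_series(c):
    """First three terms of the Taylor series of f^{-1} around 0,
    using the standard series-reversion coefficients."""
    b1 = 1 / a1
    b2 = -a2 / a1**3
    b3 = (2 * a2**2 - a1 * a3) / a1**5
    return b1 * c + b2 * c**2 + b3 * c**3

c = 0.1                   # small c, well inside the region where this works
x = inverse_series(c)     # approximate root of f(x) = c
print(x, f(x))            # f(x) should be very close to c
```

    For small ##c## the truncated series already lands very close to a true root; the interesting question is how far this can be pushed.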
     
    Last edited: Feb 21, 2017
  3. Feb 21, 2017 #2

    Stephen Tashi

    Science Advisor

    ##\frac{d(f^{-1}(x))}{dx} = \frac{1}{f '(f^{-1}(x))}##
     
  4. Feb 21, 2017 #3
    Oh, I got it wrong. I'm gonna edit it. Is there still some hope in this approach?
     
  5. Feb 21, 2017 #4

    Stephen Tashi

    Science Advisor

    It appears that in order to find ##f^{-1}(x)## by that method, you'd need to know ##f^{-1}(x)## to begin with.

    So, if there is any hope for that method, you need to discover a way to evaluate expressions like ##f'(f^{-1}(x))## at ##x = 0## without knowing the explicit formula for ##f^{-1}(x)##.

    There is also the technicality that ##f^{-1}(x)## doesn't necessarily exist (as a function). A polynomial of degree n can have up to n distinct roots. Of course, if you could develop an algorithm that found just one of the roots, that would be progress.
     
  6. Feb 21, 2017 #5
    But I know the value of ##f^{-1}(x)## at ##x=0##. After differentiating all the formulas, what I have to do at the end is evaluate the expressions at ##x=0##, and I do have the value of ##f^{-1}## at ##x=0##.
    For example, $$(f^{-1})'(x)=\frac{d(f^{-1}(x))}{dx}=\frac{1}{f'(f^{-1}(x))}$$
    $$=\frac{1}{n\,a_{n}(f^{-1}(x))^{n-1}+(n-1)a_{n-1}(f^{-1}(x))^{n-2}+\dots+2a_2f^{-1}(x)+a_1}$$
    which gives $$\frac{1}{a_1}$$ at ##x=0## since $$f^{-1}(0)=0$$
    Similarly,
    $$\frac{d^2(f^{-1}(x))}{dx^2}=-\frac{f''(f^{-1}(x))}{\left(f'(f^{-1}(x))\right)^2}\,(f^{-1})'(x)$$
    We already have the value of the first derivative at ##x=0## so we can substitute that here, so
    $$\frac{d^2(f^{-1}(x))}{dx^2}=-\frac{1}{a_1^2}*2a_2*\frac{1}{a_1}$$
    $$=-\frac{2a_2}{a_1^3}$$
    I think this process can be continued to get more derivatives by the product rule.
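    These two values can be checked numerically (the coefficients ##a_1=5##, ##a_2=2## below are just a test case): invert ##f(x)=2x^2+5x## near 0 by bisection and take finite differences, which should reproduce ##1/a_1=0.2## and ##-2a_2/a_1^3=-0.032##.

```python
# Numerical check of (f^{-1})'(0) = 1/a1 and (f^{-1})''(0) = -2*a2/a1^3
# for the example f(x) = 2x^2 + 5x (a1 = 5, a2 = 2).
a1, a2 = 5.0, 2.0

def f(x):
    return a2 * x**2 + a1 * x

def finv(y):
    """Invert f near 0 by bisection (f is increasing on [-0.5, 0.5])."""
    lo, hi = -0.5, 0.5
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

h = 1e-4
d1 = (finv(h) - finv(-h)) / (2 * h)              # ~ (f^{-1})'(0)
d2 = (finv(h) - 2 * finv(0) + finv(-h)) / h**2   # ~ (f^{-1})''(0)
print(d1, d2)   # expect roughly 0.2 and -0.032
```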
     
    Last edited: Feb 21, 2017
  7. Feb 21, 2017 #6

    Stephen Tashi

    Science Advisor

    I think what you are doing is equivalent to: https://en.wikipedia.org/wiki/Lagrange_inversion_theorem.

    The resulting infinite series has some radius of convergence. So an important question is whether x = c is a value in that radius of convergence.

    Try the method on a quadratic equation and see if you get a series that terminates - or perhaps a series whose sum has a closed form expression.
     
  8. Feb 21, 2017 #7
    I looked up the Lagrange inversion theorem. I didn't understand much, but it was also about getting Taylor series of inverse functions. Are the coefficients obtained the same in the Lagrange inversion theorem and my method?
    Also, I'm new to Taylor series. I've not yet studied them in school; I've just read about them on the internet. So I still don't understand what this radius of convergence means. We have steadily increasing factorials in the denominators of the Taylor series, so I think the series will always converge. Even if ##(x-c)## is large, then after a large number of terms the factorials will become dominant and the denominators will be so large that even a large power of ##(x-c)## won't contribute much to the answer. So I think the Taylor series will always converge, even if ##c## is not in some required range.
     
    Last edited: Feb 21, 2017
  9. Feb 21, 2017 #8

    Stephen Tashi

    Science Advisor

    A series expressed as a sum of terms that are functions of a variable ##x## may not converge for all possible values of ##x##. If it converges for only those values of ##x## in an interval ##(a,b)##, it is customary to write that interval as ##(c - r, c + r)## where ##r## is the "radius of convergence" and ##c## is the center of the interval. The term "radius" makes more sense when we are dealing with functions of a complex variable. Then there can be a circle of convergence in the complex plane. That circle has a radius.

    Taylor series are an example of a "power series" since the terms involve sums of powers of the variable ##x##. Power series have some interval or circle of convergence. (If they converge for all values of ##x##, we count that as an "infinite" radius of convergence.) A non-power series might not have an interval or circle of convergence. The values at which it converges might form some irregular shape.

    No, the series won't always converge. I suggest you make a post on the topic of "Are there smooth functions whose Taylor series diverges?". That is a specific mathematical question - and maybe someone will answer your post besides me!

    There are also examples of functions ##f## whose Taylor series converges at a value ##x=a##, but does not converge to ##f(a)##.


    More is involved than the factorials and the powers of ##(x-c)##; you also have to consider the other factors in the terms.
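    A classic example is ##1/(1+x^2)##: the function is smooth on the whole real line, yet its Taylor series at 0 only converges for ##|x|<1## (which becomes natural in the complex plane, where the function has poles at ##\pm i##). A quick check of the partial sums:

```python
def partial_sum(x, n_terms):
    """Partial sums of the Taylor series of 1/(1+x^2) at 0:
    sum over n of (-1)^n * x^(2n)."""
    return sum((-1)**n * x**(2 * n) for n in range(n_terms))

# Inside the radius of convergence (|x| < 1) the partial sums settle down:
print(partial_sum(0.5, 50), 1 / (1 + 0.5**2))   # both ~0.8

# Outside it (x = 2) the terms themselves grow without bound, so the
# series cannot converge even though 1/(1+x^2) is perfectly smooth there:
print(abs((-1)**30 * 2.0**60))   # a single term of the series at x = 2
```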
     
  10. Feb 21, 2017 #9

    Stephen Tashi

    Science Advisor

    That's a good question. I don't know the answer. You like examples. Try the quadratic equation both ways.
     
  11. Feb 21, 2017 #10
    I evaluated the first and second coefficients for ##f(x)=px^2+qx##. They were the same, so I think both the methods give the same series.
    Except that the Lagrange formula for the coefficients seems to be a cleaner version of mine. I think my formula can be simplified to get that one. So, I re-discovered this Lagrange inversion on my own!
    Thanks. I'll stop asking these types of questions now if they resemble crackpotism.
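    For what it's worth, for the quadratic ##px^2+qx=c## the inverse-series coefficients appear to come out as Catalan numbers, so the agreement with the closed-form root can be checked numerically (the values ##p=1##, ##q=3##, ##c=0.5## below are just a test case, chosen so the series converges):

```python
import math

def series_root(p, q, c, n_terms=30):
    """Sum the inverse series for p*x^2 + q*x = c. The coefficient of
    c^(n+1) is C_n * (-p)^n / q^(2n+1), where C_n is the n-th Catalan
    number (C_0 = 1, C_1 = 1, C_2 = 2, C_3 = 5, ...)."""
    total, catalan = 0.0, 1.0
    for n in range(n_terms):
        total += catalan * (-p)**n * c**(n + 1) / q**(2 * n + 1)
        catalan = catalan * 2 * (2 * n + 1) / (n + 2)   # C_{n+1} from C_n
    return total

p, q, c = 1.0, 3.0, 0.5          # |4pc/q^2| < 1, so the series converges
closed_form = (-q + math.sqrt(q**2 + 4 * p * c)) / (2 * p)
print(series_root(p, q, c), closed_form)   # should agree closely
```

    The first few coefficients (##1/q##, ##-p/q^3##, ##2p^2/q^5##, ...) match the ones computed by hand above.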
     
  12. Feb 22, 2017 #11

    Stephen Tashi

    Science Advisor

    Yes, the real variable version.

    Is there a way the series expansion can reveal two unequal roots of a quadratic?

    They only resemble crackpotism if you assert things are true just because of numerical examples.

    When people first learn some concepts of calculus, this can result in a kind of mental intoxication. The possibilities for doing algebraic manipulations suddenly explode. One day it seems hard just to solve quadratic equations and then suddenly the sky's the limit. You should enjoy the intoxication while it lasts.

    What you will eventually encounter is that algebraic manipulations are not completely reliable - especially when the subject matter deals with "infinitely" small or large things. In particular, you need to be wary of thinking of a function as an algorithm that can be expressed by a concise formula. The definition of a function is permitted to have all sorts of complicated "if...then..." conditions.

    There is a branch of mathematics called "The Calculus of Finite Differences" where algebra is less inhibited. (e.g.
    https://www.physicsforums.com/threa...sequences-recursive-reps.892720/#post-5619687
    https://www.physicsforums.com/threa...interpolation-polynomial.903217/#post-5688309 )
     
  13. Feb 24, 2017 #12
    Can we first use Newton's method to get an approximate root, then apply Lagrange inversion around that point? I don't know Newton's method in detail, but I've heard that it's also a method for finding polynomial roots. So, what would be better: to use Newton's method all along, or to use Newton's method first to get an approximate root and then apply Lagrange inversion?
     
  14. Feb 25, 2017 #13

    epenguin

    Homework Helper
    Gold Member

    I think that in general there can be no such algorithm; that is, you cannot find just one root unless there is something special about the polynomial, or you have been given some extra fact about it, like knowing for some reason that one root is twice another - which is tantamount to having two equations to eliminate between. Obviously, if you could find one root by a general algorithm, you could find them all. But in general, trying to pick out one root is like trying to pin down a globule of mercury. Every expression you come up with involves all the roots in an indistinguishable, i.e. symmetrical, fashion.
     
  15. Feb 25, 2017 #14

    Stephen Tashi

    Science Advisor

    Newton's method requires an initial estimate (which can be a guess) for a root. So Newton's method does not locate all the roots of an equation.

    Are you interested in computer programming? Solving problems with numerical methods utilizes many approximations and you seem to be interested in numerical approximations. Programming also makes it easier to study methods like Newton's method because putting the steps of the algorithm into computer code makes the steps of the algorithm memorable.
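    Newton's method itself fits in a few lines of code - a minimal sketch (in Python for brevity; the equation ##x^2+x=5## and the starting guess are just an example):

```python
def newton_root(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: repeatedly replace x by x - f(x)/f'(x),
    starting from the initial guess x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: a root of x^2 + x - 5 = 0, starting from the guess x0 = 1.
root = newton_root(lambda x: x * x + x - 5, lambda x: 2 * x + 1, 1.0)
print(root)   # ~1.7913, i.e. (-1 + sqrt(21)) / 2
```

    Note how the answer depends on the starting guess: a guess near a different root would converge to that root instead.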
     
  16. Feb 25, 2017 #15
    Yeah, I've been learning it for a month now. I can only do something like prime factorization in C++ so far. I'll try to develop my skills.
     
  17. Feb 25, 2017 #16

    Stephen Tashi

    Science Advisor

    If you started with C++ without first learning C, you began your swimming lessons by jumping into deep water!
     
  18. Feb 25, 2017 #17
    I searched on the internet before I started learning, and many people said that you need not learn C first and that they're much the same thing. I'm learning from an eBook and I've had no problem learning C++ so far. I've gotten up to writing functions.
     