# Can Taylor series be used to get the roots of a polynomial?

#### Kumar8434

I'm using this method:

First, write the polynomial in this form:
$$a_nx^n+a_{n-1}x^{n-1}+\dots+a_2x^2+a_1x=c$$
Let the LHS of this expression be the function $f(x)$. I'm gonna write the Taylor series of $f^{-1}(x)$ around $x=0$ and then put $x=c$ in it to get $f^{-1}(c)$ which will be the value of $x$.

Since $f^{-1}(0)=0$ here, the first term of our Taylor series is 0.

Now, the only thing that remains is calculating the derivatives of $f^{-1}(x)$ at $x=0$.

I'm using the fact that $$\frac{d(f^{-1}(x))}{dx}=\frac{1}{f'(f^{-1}(x))}$$

By differentiating this equation, we can get the second derivative of $f^{-1}(x)$ as:
$$\frac{d^2(f^{-1}(x))}{dx^2}=-\frac{1}{\left(f'(f^{-1}(x))\right)^2}\cdot f''(f^{-1}(x))\cdot (f^{-1})'(x)$$

Similarly, we can get the other derivatives by further differentiation of this equation. Then we can evaluate all the derivatives at $x=0$ to get the Taylor series of $f^{-1}(x)$ and evaluate it at $x=c$ to get the value of $x$.
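The whole procedure can be sketched numerically. Here is a minimal Python sketch (the helper names are my own, not from the thread) that builds the inverse-series coefficients order by order from the condition $f(f^{-1}(y))=y$ — an equivalent way of doing the derivative bookkeeping described above:

```python
def poly_mul(p, q, N):
    """Multiply two truncated power series (coefficient lists, index = power),
    keeping terms up to degree N."""
    r = [0.0] * (N + 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            if i + j <= N:
                r[i + j] += pi * qj
    return r

def invert_series(a, N):
    """Taylor coefficients b[1..N] of f^{-1} around 0, where
    f(x) = a[1]*x + a[2]*x^2 + ...  (requires a[0] = 0 and a[1] != 0)."""
    b = [0.0, 1.0 / a[1]]                  # b_1 = 1/a_1
    for m in range(2, N + 1):
        g = b + [0.0]                      # partial inverse; b_m unknown, treated as 0
        comp = [0.0] * (m + 1)             # f(g(y)) truncated at y^m
        power = [1.0]                      # g(y)^0
        for k in range(1, min(len(a) - 1, m) + 1):
            power = poly_mul(power, g, m)  # g(y)^k
            for i, c in enumerate(power):
                comp[i] += a[k] * c
        # the coefficient of y^m in f(g(y)) must vanish: a[1]*b_m + comp[m] = 0
        b.append(-comp[m] / a[1])
    return b

# Example: f(x) = x^2 + x, solve f(x) = 0.1
b = invert_series([0.0, 1.0, 1.0], 30)
root = sum(b[m] * 0.1**m for m in range(1, 31))
print(root)  # close to the root (-1 + sqrt(1.4))/2 of x^2 + x = 0.1
```

For this example the series converges because $c=0.1$ happens to lie inside the radius of convergence; for larger $c$ the partial sums need not settle down at all.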

1. Is this method correct?

2. Can something be done to make it better and remove the limitations?


#### Stephen Tashi

I'm using the fact that $$\frac{d(f^{-1}(x))}{dx}=\frac{1}{f'(f(x))}$$
$\frac{d(f^{-1}(x))}{dx} = \frac{1}{f '(f^{-1}(x))}$

#### Kumar8434

$\frac{d(f^{-1}(x))}{dx} = \frac{1}{f '(f^{-1}(x))}$
Oh, I got it wrong. I'm gonna edit it. Is there still some hope in this approach?

#### Stephen Tashi

Is there still some hope in this approach?
It appears that in order to find $f^{-1}(x)$ by that method, you'd need to know $f^{-1}(x)$ to begin with.

So, if there is any hope for that method, you need to discover a way to evaluate expressions like $f'(f^{-1}(x))$ at $x = 0$ without knowing the explicit formula for $f^{-1}(x)$.

There is also the technicality that $f^{-1}(x)$ doesn't necessarily exist (as a function). A polynomial of degree n can have up to n distinct roots. Of course, if you could develop an algorithm that found just one of the roots, that would be progress.

#### Kumar8434

It appears that in order to find $f^{-1}(x)$ by that method, you'd need to know $f^{-1}(x)$ to begin with.

So, if there is any hope for that method, you need to discover a way to evaluate expressions like $f'(f^{-1}(x))$ at $x = 0$ without knowing the explicit formula for $f^{-1}(x)$.

There is also the technicality that $f^{-1}(x)$ doesn't necessarily exist (as a function). A polynomial of degree n can have up to n distinct roots. Of course, if you could develop an algorithm that found just one of the roots, that would be progress.
But I know the value of $f^{-1}(x)$ at $x=0$. After applying all the formulas, what I have to do in the end is evaluate each expression at $x=0$, and I already have the value there.
For example, $$(f^{-1})'(x)=\frac{d(f^{-1}(x))}{dx}=\frac{1}{f'(f^{-1}(x))}$$
$$=\frac{1}{n\,a_n(f^{-1}(x))^{n-1}+(n-1)a_{n-1}(f^{-1}(x))^{n-2}+\dots+2a_2f^{-1}(x)+a_1}$$
which gives $$\frac{1}{a_1}$$ at $x=0$ since $$f^{-1}(0)=0$$
Similarly,
$$\frac{d^2(f^{-1}(x))}{dx^2}=-\frac{1}{\left(f'(f^{-1}(x))\right)^2}\cdot f''(f^{-1}(x))\cdot (f^{-1})'(x)$$
We already have the value of the first derivative at $x=0$ so we can substitute that here, so
$$\frac{d^2(f^{-1}(x))}{dx^2}=-\frac{1}{a_1^2}\cdot 2a_2\cdot\frac{1}{a_1}$$
$$=-\frac{2a_2}{a_1^3}$$
I think this process can be continued to get more derivatives by the product rule.
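The two hand-derived values, $(f^{-1})'(0)=\frac{1}{a_1}$ and $(f^{-1})''(0)=-\frac{2a_2}{a_1^3}$, can be spot-checked numerically. A small Python sketch, using a quadratic of my own choosing that has a known explicit inverse:

```python
import math

# Check (f^{-1})'(0) = 1/a_1 and (f^{-1})''(0) = -2*a_2/a_1^3 by finite
# differences on an explicit inverse.  Example polynomial: f(x) = 3x^2 + 2x.
a1, a2 = 2.0, 3.0

def f_inv(y):
    # the inverse branch of f(x) = a2*x^2 + a1*x passing through the origin
    return (-a1 + math.sqrt(a1**2 + 4*a2*y)) / (2*a2)

h = 1e-4
d1 = (f_inv(h) - f_inv(-h)) / (2*h)                   # central difference
d2 = (f_inv(h) - 2*f_inv(0.0) + f_inv(-h)) / h**2     # second difference
print(d1, 1/a1)              # both ≈ 0.5
print(d2, -2*a2/a1**3)       # both ≈ -0.75
```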


#### Stephen Tashi

I think what you are doing is equivalent to: https://en.wikipedia.org/wiki/Lagrange_inversion_theorem.

The resulting infinite series has some radius of convergence. So an important question is whether x = c is a value in that radius of convergence.

Try the method on a quadratic equation and see if you get a series that terminates - or perhaps a series whose sum has a closed-form expression.

#### Kumar8434

I think what you are doing is equivalent to: https://en.wikipedia.org/wiki/Lagrange_inversion_theorem.

The resulting infinite series has some radius of convergence. So an important question is whether x = c is a value in that radius of convergence.

Try the method on a quadratic equation and see if you get a series that terminates - or perhaps a series whose sum has a closed-form expression.
I looked up the Lagrange inversion theorem. I didn't understand much, but it was also about getting the Taylor series of inverse functions. Are the coefficients obtained by the Lagrange inversion theorem and by my method the same?
And I'm new to Taylor series. I haven't studied them in school yet; I've just read about them on the internet. So, I still don't understand what this radius of convergence means. I mean, we have steadily increasing factorials in the denominators of the Taylor series, so I think the series will always converge. Even if $(x-c)$ is large, then after a large number of terms the factorials will become dominant and the denominators will be so large that even the large powers of $(x-c)$ won't contribute much to the answer. I think the Taylor series will always converge even if $c$ is not in some required range.


#### Stephen Tashi

So, I still don't understand what this radius of convergence means. What does it mean?
A series expressed as a sum of terms that are functions of a variable $x$ may not converge for all possible values of $x$. If it converges for only those values of $x$ in an interval $(a,b)$, it is customary to write that interval as $(c - r, c + r)$ where $r$ is the "radius of convergence" and $c$ is the center of the interval. The term "radius" makes more sense when we are dealing with functions of a complex variable. Then there can be a circle of convergence in the complex plane. That circle has a radius.

Taylor series are an example of a "power series" since the terms involve sums of powers of the variable $x$. Power series have some interval or circle of convergence. (If they converge for all values of $x$, we count that as an "infinite" radius of convergence.) A non-power series might not have an interval or circle of convergence. The values at which it converges might form some irregular shape.

I mean, we have regularly increasing factorials in the denominators of the Taylor series, so I think the series will always converge.
No, it won't. I suggest you make a post on the topic of "Are there smooth functions whose Taylor series diverges?". That is a specific mathematical question - and maybe someone will answer your post besides me!

There are also examples of functions $f$ whose Taylor series converges at a value $x=a$, but does not converge to $f(a)$.

Even if $(x-c)$ is large, then after a large number of terms, I think factorials will become dominating and the denominators will be so large that even the large exponent of $(x-c)$ won't contribute much to the answer.
More is involved than the factorials and the powers of $(x-c)$; you also have to consider the other factors in the terms.
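A standard example of this point is $f(x)=\frac{1}{1-x}$: its $n$-th derivative at $0$ is exactly $n!$, so the factorial in the Taylor denominator cancels and the terms are just $x^n$, giving a radius of convergence of $1$. A quick Python sketch:

```python
# Taylor series of f(x) = 1/(1-x) about 0: the n-th derivative at 0 is n!,
# so each term f^(n)(0)/n! * x^n is just x^n -- the factorials cancel.
def taylor_partial_sum(x, terms):
    return sum(x**n for n in range(terms))

print(taylor_partial_sum(0.5, 20))  # inside radius 1: approaches 1/(1 - 0.5) = 2
print(taylor_partial_sum(2.0, 20))  # outside radius 1: partial sums blow up
```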

#### Stephen Tashi

Are the coefficients obtained also the same in the Lagrange inversion theorem and my method?
That's a good question. I don't know the answer. You like examples. Try the quadratic equation both ways.
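For what it's worth, the Lagrange formula $b_n=\frac{1}{n}[x^{n-1}]\left(\frac{x}{f(x)}\right)^n$ can be checked numerically against the hand-derived values $b_1=\frac{1}{a_1}$ and $b_2=-\frac{a_2}{a_1^3}$. A Python sketch (the helper names are mine), tried on the quadratic $f(x)=2x+3x^2$:

```python
def series_mul(p, q, N):
    """Multiply truncated power series (coefficient lists), keep degree <= N."""
    r = [0.0] * (N + 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            if i + j <= N:
                r[i + j] += pi * qj
    return r

def lagrange_coeffs(a, N):
    """b[1..N] with f^{-1}(y) = sum b[n] y^n, via Lagrange inversion:
    b_n = (1/n) * [x^(n-1)] (x/f(x))^n, where f(x) = a[1]x + a[2]x^2 + ..."""
    fa = a[1:]                       # f(x)/x = a[1] + a[2]x + ...
    h = [1.0 / fa[0]]                # reciprocal series of f(x)/x, built term by term
    for n in range(1, N):
        s = sum(fa[k] * h[n - k] for k in range(1, min(n, len(fa) - 1) + 1))
        h.append(-s / fa[0])
    b, hn = [0.0], [1.0]
    for n in range(1, N + 1):
        hn = series_mul(hn, h, N)    # h(x)^n
        b.append(hn[n - 1] / n)
    return b

print(lagrange_coeffs([0.0, 2.0, 3.0], 4))  # b_1 = 0.5 = 1/a_1, b_2 = -0.375 = -a_2/a_1^3
```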

#### Kumar8434

That's a good question. I don't know the answer. You like examples. Try the quadratic equation both ways.
I evaluated the first and second coefficients for $f(x)=px^2+qx$. They were the same, so I think both methods give the same series.
The Lagrange formula for the coefficients just seems to be a cleaner version of mine; I think my formula can be simplified to get it. So, I re-discovered Lagrange inversion on my own!
Thanks. I'll stop asking these types of questions now if they resemble crackpotism.

#### Stephen Tashi

So, I re-discovered Lagrange inversion on my own!
Yes, the real variable version.

Is there a way the series expansion can reveal two unequal roots of a quadratic?

Thanks. I'll stop asking these types of questions now if they resemble crackpotism.
They only resemble crackpotism if you assert things are true just because of numerical examples.

When people first learn some concepts of calculus, this can result in a kind of mental intoxication. The possibilities for doing algebraic manipulations suddenly explode. One day it seems hard just to solve quadratic equations and then suddenly the sky's the limit. You should enjoy the intoxication while it lasts.

What you will eventually encounter is that algebraic manipulations are not completely reliable - especially when the subject matter deals with "infinitely" small or large things. In particular, you need to be wary of thinking of a function as an algorithm that can be expressed by a concise formula. The definition of a function is permitted to have all sorts of complicated "if...then..." conditions.

There is a branch of mathematics called "The Calculus of Finite Differences" where algebra is less inhibited.

#### Kumar8434

So an important question is whether x = c is a value in that radius of convergence.
Can we first use Newton's method to get an approximate root, and then apply Lagrange inversion around that point? I don't know much about Newton's method, but I've heard that it's also a method to get polynomial roots. So, what would be better: to use Newton's method all along, or to use Newton's method first to get an approximate root and then apply Lagrange inversion?

#### epenguin

Homework Helper
Gold Member
Of course, if you could develop an algorithm that found just one of the roots, that would be progress.
I think that in general there can be no such algorithm; that is, you cannot find just one root unless there is something special about the polynomial, or you have been given some extra fact about it, like knowing for some reason that one root is twice another. Which is tantamount to having two equations to eliminate between. Obviously, if you could find one root by a general algorithm, you could find them all. But in general, trying to pick out one root is like trying to pin down a globule of mercury: every expression you come up with involves all the roots in an indistinguishable, i.e. symmetrical, fashion.

#### Stephen Tashi

Can we first use Newton's method to get an approximate root, then apply Lagrange inversion around that point?
Newton's method requires an initial estimate (which can be a guess) for a root. So Newton's method does not locate all the roots of an equation.

Are you interested in computer programming? Solving problems with numerical methods involves many approximations, and you seem to be interested in numerical approximation. Programming also makes it easier to study methods like Newton's method, because putting the steps of the algorithm into computer code makes them memorable.
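Since Newton's method keeps coming up, here is a minimal Python sketch of it (the function names are my own), applied to the running example $x^2+x=0.1$:

```python
def newton_root(coeffs, x0, tol=1e-12, max_iter=100):
    """Newton's method for one root of p(x) = sum(coeffs[k] * x**k),
    starting from the initial guess x0."""
    dcoeffs = [k * c for k, c in enumerate(coeffs)][1:]     # coefficients of p'(x)
    peval = lambda cs, x: sum(c * x**k for k, c in enumerate(cs))
    x = x0
    for _ in range(max_iter):
        fx = peval(coeffs, x)
        if abs(fx) < tol:
            break
        x -= fx / peval(dcoeffs, x)                         # Newton step
    return x

# x^2 + x - 0.1 = 0: root near 0.0916, reached from the guess x0 = 0
print(newton_root([-0.1, 1.0, 1.0], 0.0))
```

Which root Newton's method finds depends entirely on the starting guess, which is exactly the limitation mentioned above.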

#### Kumar8434

Newton's method requires an initial estimate (which can be a guess) for a root. So Newton's method does not locate all the roots of an equation.

Are you interested in computer programming? Solving problems with numerical methods utilizes many approximations and you seem to be interested in numerical approximations. Programming also makes it easier to study methods like Newton's method because putting the steps of the algorithm into computer code makes the steps of the algorithm memorable.
Yeah, I've been learning it for a month now. I can only do something like prime factorization in C++ so far. I'll try to develop my skills.

#### Stephen Tashi

Yeah, I've been learning it for a month now. I can only do something like prime factorization in C++ so far. I'll try to develop my skills.
If you started with C++ without first learning C, you began your swimming lessons by jumping into deep water!

#### Kumar8434

If you started with C++ without first learning C, you began your swimming lessons by jumping into deep water!
I searched on the internet before I started learning, and many people said that you need not learn C first and that they're basically the same thing. I'm learning from an eBook and I've had no problem with C++ so far. I've gotten up to writing functions.

"Can Taylor series be used to get the roots of a polynomial?"

### Physics Forums Values

We Value Quality
• Topics based on mainstream science
• Proper English grammar and spelling
We Value Civility
• Positive and compassionate attitudes
• Patience while debating
We Value Productivity
• Disciplined to remain on-topic
• Recognition of own weaknesses
• Solo and co-op problem solving