# Euler's formula

## Main Question or Discussion Point

Hi,
I was trying to prove Euler's formula for complex numbers, which states that $e^{ix} = \cos x + i\sin x$. I tried to prove it by Taylor series, but I don't know whether my proof is right. I think a Taylor series only works when x approaches some number a, so it doesn't apply to all values of f(x). Thus when I write $1 + ix + (ix)^2/2! + \cdots$, it seems I'm proving the formula only when x approaches zero. Can anyone correct me if I'm wrong, or confirm that I'm right? And does anyone know a rigorous proof of the formula?

thanks


HallsofIvy
Homework Helper
> I was trying to prove Euler's formula for complex numbers, which states that $e^{ix} = \cos x + i\sin x$. I tried to prove it by Taylor series, but I don't know whether my proof is right. I think a Taylor series only works when x approaches some number a, so it doesn't apply to all values of f(x). Thus when I write $1 + ix + (ix)^2/2! + \cdots$, it seems I'm proving the formula only when x approaches zero.
There is no limit involved here, so I don't know what you mean by "approaches zero". It is true that, for a general analytic function, the Taylor series about x = a is only guaranteed to converge to the function in some neighborhood of a. (If the function is not analytic in any neighborhood of a, the series might converge only at a.)

However, it is relatively easy to prove that for the functions you need, $e^x$, sin(x), and cos(x), the Taylor series about any x = a converges to the function for all x. Simply expanding $e^{ix}$, sin(x), and cos(x) in their Taylor series about x = 0 and comparing the terms is an accurate proof of the formula.
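As a quick numerical sanity check (not a proof), one can compare a truncated Taylor series for $e^{ix}$ against cos x + i sin x. This is a minimal Python sketch; the truncation order of 30 terms is an arbitrary choice:

```python
import math

def exp_taylor(z, terms=30):
    """Partial sum of the Taylor series sum z^n / n! about 0."""
    total = 0 + 0j
    term = 1 + 0j  # z^0 / 0!
    for n in range(terms):
        total += term
        term *= z / (n + 1)  # next term: z^(n+1) / (n+1)!
    return total

x = 2.0
lhs = exp_taylor(1j * x)                 # truncated series for e^{ix}
rhs = complex(math.cos(x), math.sin(x))  # cos x + i sin x
print(abs(lhs - rhs))  # difference is tiny for a modest number of terms
```

Of course, the agreement at one sampled point only illustrates the identity; the proof is the term-by-term comparison of the series.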

> Can anyone correct me if I'm wrong, or confirm that I'm right? And does anyone know a rigorous proof of the formula?
>
> thanks

> There is no limit involved here, so I don't know what you mean by "approaches zero". It is true that, for a general analytic function, the Taylor series about x = a is only guaranteed to converge to the function in some neighborhood of a. (If the function is not analytic in any neighborhood of a, the series might converge only at a.)
>
> However, it is relatively easy to prove that for the functions you need, $e^x$, sin(x), and cos(x), the Taylor series about any x = a converges to the function for all x. Simply expanding $e^{ix}$, sin(x), and cos(x) in their Taylor series about x = 0 and comparing the terms is an accurate proof of the formula.
Okay, so let me rephrase my question this way: how can you prove that the series we need converge for all values of x? It's fairly easy to prove convergence when x is in some neighborhood of some a, but how can you generalize it to all x in the domain of the function?

> Okay, so let me rephrase my question this way: how can you prove that the series we need converge for all values of x? It's fairly easy to prove convergence when x is in some neighborhood of some a, but how can you generalize it to all x in the domain of the function?
The ratio test.

> The ratio test.
How do I do that? I only know that if the ratio test gives L < 1 then the series converges, but I don't know how to find the radius of convergence with the ratio test.

analytic function?

Mute
Homework Helper
> How do I do that? I only know that if the ratio test gives L < 1 then the series converges, but I don't know how to find the radius of convergence with the ratio test.
The terms of your series are a function of x, so generally the ratio of successive terms will depend on x. So, when you say the ratio test guarantees the series converges if L < 1, L depends on x: really the series converges if |L(x)| < 1, and there is a range of x for which this condition holds. The radius of convergence, R, is thus the value |x| = R at which |L(x)| = 1. If |L(x)| < 1 for all x, the radius of convergence is infinite. (Similarly, if |L(x)| > 1 for all x ≠ 0, the radius of convergence is zero.)

As an example, take the series $1 + x + x^2 + \cdots = \sum_{n=0}^\infty x^n$. The ratio test gives $\lim_{n\rightarrow \infty} |x^{n+1}/x^n| \equiv |L(x)| = |x|$, so the series converges only if |x| < 1.
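That |x| < 1 threshold is easy to see numerically. Here is a minimal Python sketch (the cutoff of 50 terms is an arbitrary choice) comparing partial sums of the geometric series with its closed form 1/(1 - x):

```python
def geometric_partial_sum(x, terms=50):
    """Partial sum 1 + x + x^2 + ... + x^(terms-1)."""
    total = 0.0
    power = 1.0  # x^0
    for _ in range(terms):
        total += power
        power *= x
    return total

# Inside the radius of convergence the partial sums approach 1/(1 - x):
print(geometric_partial_sum(0.5), 1 / (1 - 0.5))

# Outside it (x = 2) the partial sums blow up instead of settling down:
print(geometric_partial_sum(2.0))
```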

> The terms of your series are a function of x, so generally the ratio of successive terms will depend on x. So, when you say the ratio test guarantees the series converges if L < 1, L depends on x: really the series converges if |L(x)| < 1, and there is a range of x for which this condition holds. The radius of convergence, R, is thus the value |x| = R at which |L(x)| = 1. If |L(x)| < 1 for all x, the radius of convergence is infinite. (Similarly, if |L(x)| > 1 for all x ≠ 0, the radius of convergence is zero.)
>
> As an example, take the series $1 + x + x^2 + \cdots = \sum_{n=0}^\infty x^n$. The ratio test gives $\lim_{n\rightarrow \infty} |x^{n+1}/x^n| \equiv |L(x)| = |x|$, so the series converges only if |x| < 1.
Thank you, now I'm convinced. Is there a general way to find the radius of convergence for a wide range of series? I know a bunch of tests to check whether a series converges, but what if the series is so tricky that those tests don't work? Is there a general theorem that covers a large category of series, for example a theorem on some specified space with a Euclidean metric?

HallsofIvy
Homework Helper
First, the whole notion of "radius of convergence" only applies to power series, so we can assume a series of the form $\sum a_n (z- a)^n$.

Now the ratio test requires looking at
$$\left|\frac{a_{n+1}(z- a)^{n+1}}{a_n(z- a)^n}\right|= \left|\frac{a_{n+1}}{a_n}\right||z- a|$$
so it becomes just a matter of the limit
$$\lim_{n\to\infty} \frac{a_{n+1}}{a_n}$$
If that converges to some non-zero number A, then we will have A|z - a| < 1 for |z - a| < 1/A, and the radius of convergence is 1/A. If it converges to 0, of course, the radius of convergence is infinite: the power series converges for all z. If it does not converge, then the radius of convergence is 0. For $e^z$, sin(z), and cos(z), the ratio test reduces to n!/(n+1)! = 1/(n+1), which goes to 0.
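To illustrate that last point, here is a minimal Python sketch (the sampled values of n are arbitrary) of the coefficient ratio $a_{n+1}/a_n = n!/(n+1)! = 1/(n+1)$ for the exponential series:

```python
import math

def coeff_ratio(n):
    """Ratio a_{n+1}/a_n for the series sum z^n / n!, i.e. n!/(n+1)!."""
    return math.factorial(n) / math.factorial(n + 1)

for n in (1, 10, 100, 1000):
    print(n, coeff_ratio(n))  # equals 1/(n+1), shrinking toward 0
```

Since the ratio tends to 0, |L(z)| = 0 · |z| < 1 for every z, which is exactly the infinite radius of convergence described above.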

But don't confuse "converging" with "converging to that particular function". For example, if $f(x)= e^{-1/x^2}$ for x not equal to 0, and f(0) = 0, then it is fairly easy to prove that f is infinitely differentiable at x = 0 and that its derivative of every order is 0 there. That means that the Taylor series for f at x = 0 is $0 + 0x + 0x^2 + \cdots$. That certainly converges for all x, but it converges to 0, and so it converges to f(x) only at x = 0.
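A minimal Python sketch of that counterexample: the function is clearly nonzero away from 0, yet its Taylor series at 0 (identically zero) matches it only at the origin.

```python
import math

def f(x):
    """e^{-1/x^2} for x != 0, with f(0) = 0."""
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

def taylor_at_zero(x):
    """Every Taylor coefficient of f at 0 vanishes, so the series sums to 0."""
    return 0.0

for x in (0.5, 0.2, 0.0):
    print(x, f(x), taylor_at_zero(x))  # f(x) > 0 for x != 0, series always 0
```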

trambolin's two-word post, "analytic function?", is what you need: the basic definition of "analytic" at a point is that the Taylor series for the function exists and converges to the function in some neighborhood of that point. But how do you prove that $e^x$, $\sin(x)$, and $\cos(x)$ are analytic?

There are many different ways to do that, but the simplest is to use the initial value problems that characterize these functions. Since their Taylor series have infinite radius of convergence (whatever function they sum to!), we can differentiate term by term. The derivative of
$$f(z)= \sum_{n=0}^\infty \frac{z^n}{n!}$$
is
$$f'(z)= \sum_{n=1}^\infty \frac{z^{n-1}}{(n-1)!}$$
which, with the change of index m = n - 1, is just
$$f'(z)= \sum_{m=0}^\infty \frac{z^m}{m!}= f(z)$$
That is, f(z) satisfies f'(z) = f(z) as well as f(0) = 1. The existence and uniqueness theorem for initial value problems shows that $e^z$ is the only function satisfying those conditions, so the Taylor series does, in fact, converge to $e^z$ for all z.

Similarly, you can show that the Taylor series for sin(z) satisfies f''(z) = -f(z), f(0) = 0, f'(0) = 1, and the Taylor series for cos(z) satisfies f''(z) = -f(z), f(0) = 1, f'(0) = 0, and then show that only sin(z) and cos(z), respectively, satisfy those conditions.
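The practical upshot is that enough terms of the series about 0 reproduce sin even far from the expansion point. A minimal Python sketch (the 40-term cutoff and the evaluation point x = 10 are arbitrary choices):

```python
import math

def sin_taylor(x, terms=40):
    """Partial sum of sum (-1)^k x^(2k+1) / (2k+1)! about 0."""
    total = 0.0
    term = x  # k = 0 term: x
    for k in range(terms):
        total += term
        # next term: multiply by -x^2 / ((2k+2)(2k+3))
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))
    return total

x = 10.0  # far from the expansion point 0
print(sin_taylor(x), math.sin(x))  # agree closely despite the distance
```
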
That's an impressive explanation.
I find series an interesting topic in mathematics, but unfortunately we cannot study all series this way. You say we can find the radius of convergence only for power series; what do we do for other series? Is there a similar definition or theorem that tells us the conditions under which a non-power series converges?