Proving Euler's Formula with Complex Numbers

  • Thread starter AdrianZ
In summary, the conversation discusses the proof of Euler's formula in complex numbers and the use of Taylor's series to prove the formula. The participants also discuss the ratio test and its application in finding the radius of convergence for a series. They also mention the existence of a general theorem that covers a wide range of series.
  • #1
AdrianZ
Hi
I was trying to prove Euler's formula for complex numbers, which states that e^(ix) = cos(x) + i*sin(x). I tried to prove it by Taylor series, but the problem is that I don't know whether my proof is right. I think a Taylor series works only when x approaches some number a, so it doesn't apply to all values of f(x). Thus, when I write 1 + ix + (ix)^2/2! + ..., it seems I'm proving the formula only when x approaches zero. Can anyone correct me if I'm wrong, or confirm that I'm right? And does anyone know a rigorous proof of the formula?

thanks
 
  • #2
AdrianZ said:
Hi
I was trying to prove Euler's formula for complex numbers, which states that e^(ix) = cos(x) + i*sin(x). I tried to prove it by Taylor series, but the problem is that I don't know whether my proof is right. I think a Taylor series works only when x approaches some number a, so it doesn't apply to all values of f(x). Thus, when I write 1 + ix + (ix)^2/2! + ..., it seems I'm proving the formula only when x approaches zero.
There is no limit involved here, so I don't know what you mean by "approaches zero". It is true that, for a general analytic function, the Taylor's series about x= a is only guaranteed to converge to the function in some neighborhood of a. (If the function is not analytic in some neighborhood of a, the series might converge only at a.)

However, it is relatively easy to prove that for the functions you need, e^x, sin(x), and cos(x), the Taylor's series about any x= a converges to the function for all x. Simply expanding e^(ix), sin(x), and cos(x) in their Taylor's series about x= 0 and comparing them is an accurate proof of the formula.
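As a quick numerical sanity check of that comparison (illustrative only, not a proof; the helper `exp_series` is made up for this sketch), a truncated Maclaurin series for e^(ix) can be compared against cos(x) + i*sin(x):

```python
import math

def exp_series(z, terms=30):
    """Partial sum of the Maclaurin series sum_{n>=0} z^n / n!."""
    total, term = 0j, 1 + 0j
    for n in range(terms):
        total += term
        term *= z / (n + 1)  # turn z^n/n! into z^(n+1)/(n+1)!
    return total

x = 1.7  # an arbitrary real test point
lhs = exp_series(1j * x)                 # series for e^(ix)
rhs = complex(math.cos(x), math.sin(x))  # cos(x) + i*sin(x)
print(abs(lhs - rhs))  # tiny: limited only by truncation and rounding
```

Agreement at test points is of course only evidence; the actual proof is the term-by-term comparison of the three series.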

 
  • #3
HallsofIvy said:
There is no limit involved here, so I don't know what you mean by "approaches zero". It is true that, for a general analytic function, the Taylor's series about x= a is only guaranteed to converge to the function in some neighborhood of a. (If the function is not analytic in some neighborhood of a, the series might converge only at a.)

However, it is relatively easy to prove that for the functions you need, e^x, sin(x), and cos(x), the Taylor's series about any x= a converges to the function for all x. Simply expanding e^(ix), sin(x), and cos(x) in their Taylor's series about x= 0 and comparing them is an accurate proof of the formula.

Okay, so let me rephrase my question this way: how can you prove that the series we need converge for all values of x? It's easy enough to prove when x is in some neighborhood of some a, but how can you generalize it to all x's in the domain of the function?
 
  • #4
AdrianZ said:
Okay, so let me rephrase my question this way: how can you prove that the series we need converge for all values of x? It's easy enough to prove when x is in some neighborhood of some a, but how can you generalize it to all x's in the domain of the function?

The ratio test.
 
  • #5
l'Hôpital said:
The ratio test.

How do I do that? I only know that if the ratio test gives L < 1 then the series converges, but I don't know how to find the radius of convergence with the ratio test.
 
  • #6
analytic function?
 
  • #7
AdrianZ said:
How do I do that? I only know that if the ratio test gives L < 1 then the series converges, but I don't know how to find the radius of convergence with the ratio test.

The terms of your series are a function of x, so generally the ratio of successive terms will depend on x. So, when you say the ratio test guarantees the series converges if L < 1, L depends on x; really the series converges if |L(x)| < 1, and there is a range of x for which this condition is true. The radius of convergence, R, is thus the value of |x| such that |L(R)| = 1. If |L(x)| < 1 for all x, the radius of convergence is infinite. (Similarly, if |L(x)| > 1 for all x, the radius of convergence is zero.)

As an example, take the series [itex]1 + x + x^2 + ... = \sum_{n=0}^\infty x^n[/itex]. The ratio test gives [itex]\lim_{n\rightarrow \infty} |x^{n+1}/x^n|\equiv |L(x)| = |x|[/itex], so the series only converges if |x| < 1.
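That behavior is easy to see numerically (an illustrative sketch, with a made-up helper name): inside |x| < 1 the partial sums of the geometric series settle down to 1/(1 - x), and outside they blow up.

```python
def geom_partial(x, n):
    """Partial sum 1 + x + x^2 + ... + x^(n-1) of the geometric series."""
    return sum(x**k for k in range(n))

# Inside the radius of convergence, the partial sums approach 1/(1 - x):
print(geom_partial(0.5, 50))   # very close to 1/(1 - 0.5) = 2
# Outside it, they grow without bound:
print(geom_partial(2.0, 50))   # roughly 1.1e15 and still growing with n
```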
 
  • #8
Mute said:
The terms of your series are a function of x, so generally the ratio of successive terms will depend on x. So, when you say the ratio test guarantees the series converges if L < 1, L depends on x; really the series converges if |L(x)| < 1, and there is a range of x for which this condition is true. The radius of convergence, R, is thus the value of |x| such that |L(R)| = 1. If |L(x)| < 1 for all x, the radius of convergence is infinite. (Similarly, if |L(x)| > 1 for all x, the radius of convergence is zero.)

As an example, take the series [itex]1 + x + x^2 + ... = \sum_{n=0}^\infty x^n[/itex]. The ratio test gives [itex]\lim_{n\rightarrow \infty} |x^{n+1}/x^n|\equiv |L(x)| = |x|[/itex], so the series only converges if |x| < 1.

Thank you, now I'm convinced. Is there a general way to find the radius of convergence for a wide range of series? I mean, I know a bunch of tests to check whether a series converges, but what if the series were so tricky that those tests didn't work; what would I do then? Is there a general theorem or something that covers a large category of series? For example, a theorem on some specified space with a Euclidean metric, or something.
 
  • #9
First, the whole notion of "radius of convergence" only applies to power series, so we can assume a series of the form [itex]\sum a_n (z- a)^n[/itex].

Now the ratio test requires looking at
[tex]\left|\frac{a_{n+1}(z- a)^{n+1}}{a_n(z- a)^n}\right|= \left|\frac{a_{n+1}}{a_n}\right||z- a|[/tex]
so it becomes just a matter of the limit
[tex]\lim_{n\to\infty} \frac{a_{n+1}}{a_n}[/tex]
If that converges to some non-zero number A, then we will have A|z- a|< 1 for |z- a|< 1/A and the radius of convergence is 1/A. If it converges to 0, of course, the radius of convergence is infinite- the power series converges for all z. If it does not converge, then the radius of convergence is 0. For [itex]e^z[/itex], sin(z), and cos(z), the ratio of coefficients reduces to n!/(n+1)!= 1/(n+1), which goes to 0.

But don't confuse "converging" with "converging to that particular function". For example, if [itex]f(x)= e^{-1/x^2}[/itex] for x not equal to 0, and f(0)= 0, then it is fairly easy to prove that f is infinitely differentiable at x= 0 and that the derivative of every order is 0 there. That means that the Taylor's series for f, at x= 0, is 0+ 0x+ 0x^2+ ... That certainly converges for all x- but it converges to 0 and so only converges to f(x) at x= 0.
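This classic counterexample is easy to probe numerically (an illustrative sketch; the function name is arbitrary): the difference quotients of f at 0 collapse extremely fast, consistent with every derivative vanishing there, yet f is positive at every nonzero point, so its identically-zero Taylor series misses it.

```python
import math

def f(t):
    """The classic smooth-but-not-analytic function e^(-1/t^2), with f(0) = 0."""
    return math.exp(-1.0 / t**2) if t != 0 else 0.0

# One-sided difference quotients at 0 shrink faster than any power of h,
# consistent with f'(0) = 0 (and likewise for higher derivatives):
for h in (0.5, 0.2, 0.1):
    print(f(h) / h)

# Yet f is positive at every nonzero point, so the zero series converges
# everywhere but equals f only at x = 0:
print(f(0.5) > 0)  # True
```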

trambolin's two word post, "analytic function?" is what you need: the basic definition of "analytic" at a point is that the Taylor's series for the function exists and converges to the function in some neighborhood of that point. But how do you prove that [itex]e^x[/itex], [itex]sin(x)[/itex], and [itex]cos(x)[/itex] are analytic?

There are many different ways to do that but the simplest is to use the initial value problems that characterize the functions. Since the Taylor series for them do have infinite radius of convergence (whatever function they sum to!), we can differentiate term by term. The derivative of
[tex]f(z)= \sum_{n=0}^\infty \frac{z^n}{n!}[/tex]
is
[tex]f'(z)= \sum_{n=1}^\infty \frac{z^{n-1}}{(n-1)!}[/tex]
which, with a change of index m= n-1, is just
[tex]f'(z)= \sum_{m=0}^\infty \frac{z^m}{m!}= f(z)[/tex]
That is, f(z) satisfies f'(z)= f(z) as well as f(0)= 1. The "existence and uniqueness theorem" for initial value problems shows that [itex]e^z[/itex] is the only function that satisfies those, so the Taylor's series does, in fact, converge to [itex]e^z[/itex] for all z.

Similarly, you can show that the Taylor's series for sin(z) satisfies f''(z)= -f(z), f(0)= 0, f'(0)= 1, and the Taylor's series for cos(z) satisfies f''(z)= -f(z), f(0)= 1, f'(0)= 0, then show that only sin(z) and cos(z), respectively, satisfy those.
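The closing step can be sanity-checked numerically (an illustrative sketch with a made-up helper; a difference quotient stands in for the term-by-term derivative): the partial sums of [itex]\sum z^n/n![/itex] behave like a function satisfying f'(z) = f(z) with f(0) = 1.

```python
def exp_series(z, terms=40):
    """Partial sum of the power series sum_{n>=0} z^n / n!."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= z / (n + 1)
    return total

z, h = 1.3, 1e-5
# A centred difference quotient of the partial sum agrees with the partial
# sum itself, consistent with f'(z) = f(z):
deriv = (exp_series(z + h) - exp_series(z - h)) / (2 * h)
print(abs(deriv - exp_series(z)))  # small: f' = f to within quotient error
print(exp_series(0.0))             # 1.0, the initial condition
```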
 
  • #10
HallsofIvy said:
First, the whole notion of "radius of convergence" only applies to power series, so we can assume a series of the form [itex]\sum a_n (z- a)^n[/itex].

Now the ratio test requires looking at
[tex]\left|\frac{a_{n+1}(z- a)^{n+1}}{a_n(z- a)^n}\right|= \left|\frac{a_{n+1}}{a_n}\right||z- a|[/tex]
so it becomes just a matter of the limit
[tex]\lim_{n\to\infty} \frac{a_{n+1}}{a_n}[/tex]
If that converges to some non-zero number A, then we will have A|z- a|< 1 for |z- a|< 1/A and the radius of convergence is 1/A. If it converges to 0, of course, the radius of convergence is infinite- the power series converges for all z. If it does not converge, then the radius of convergence is 0. For [itex]e^z[/itex], sin(z), and cos(z), the ratio of coefficients reduces to n!/(n+1)!= 1/(n+1), which goes to 0.

But don't confuse "converging" with "converging to that particular function". For example, if [itex]f(x)= e^{-1/x^2}[/itex] for x not equal to 0, and f(0)= 0, then it is fairly easy to prove that f is infinitely differentiable at x= 0 and that the derivative of every order is 0 there. That means that the Taylor's series for f, at x= 0, is 0+ 0x+ 0x^2+ ... That certainly converges for all x- but it converges to 0 and so only converges to f(x) at x= 0.

trambolin's two word post, "analytic function?" is what you need: the basic definition of "analytic" at a point is that the Taylor's series for the function exists and converges to the function in some neighborhood of that point. But how do you prove that [itex]e^x[/itex], [itex]sin(x)[/itex], and [itex]cos(x)[/itex] are analytic?

There are many different ways to do that but the simplest is to use the initial value problems that characterize the functions. Since the Taylor series for them do have infinite radius of convergence (whatever function they sum to!), we can differentiate term by term. The derivative of
[tex]f(z)= \sum_{n=0}^\infty \frac{z^n}{n!}[/tex]
is
[tex]f'(z)= \sum_{n=1}^\infty \frac{z^{n-1}}{(n-1)!}[/tex]
which, with a change of index m= n-1, is just
[tex]f'(z)= \sum_{m=0}^\infty \frac{z^m}{m!}= f(z)[/tex]
That is, f(z) satisfies f'(z)= f(z) as well as f(0)= 1. The "existence and uniqueness theorem" for initial value problems shows that [itex]e^z[/itex] is the only function that satisfies those, so the Taylor's series does, in fact, converge to [itex]e^z[/itex] for all z.

Similarly, you can show that the Taylor's series for sin(z) satisfies f''(z)= -f(z), f(0)= 0, f'(0)= 1, and the Taylor's series for cos(z) satisfies f''(z)= -f(z), f(0)= 1, f'(0)= 0, then show that only sin(z) and cos(z), respectively, satisfy those.

That's an impressive explanation.
I find series an interesting topic in mathematics, but unfortunately we can't handle all series this way. You say we can find the radius of convergence only for power series; what do we do for other series? Is there a similar definition or something that tells us the conditions under which a non-power series converges?
 

1. What is Euler's Formula?

Euler's Formula is a mathematical formula that relates the values of the trigonometric functions to the complex exponential function. It states that e^(ix) = cos(x) + i*sin(x), where e is the base of the natural logarithm, i is the imaginary unit, and x is any real number.
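The formula is easy to check numerically with Python's standard library (a sanity check, not a derivation):

```python
import cmath
import math

x = 2.0  # any real number
lhs = cmath.exp(1j * x)                  # e^(ix)
rhs = complex(math.cos(x), math.sin(x))  # cos(x) + i*sin(x)
print(abs(lhs - rhs))  # 0 up to floating-point rounding
print(abs(lhs))        # essentially 1: e^(ix) lies on the unit circle
```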

2. How is Euler's Formula related to complex numbers?

Euler's Formula is related to complex numbers because it involves the imaginary unit, i, which is a key component of complex numbers. In fact, one interpretation of Euler's Formula is that it represents a point on the unit circle in the complex plane, where the real part is the cosine value and the imaginary part is the sine value.

3. What is the significance of proving Euler's Formula with complex numbers?

Proving Euler's Formula with complex numbers is significant because it provides a deeper understanding of the relationship between trigonometric functions and the complex exponential function. It also allows for the use of complex numbers in solving mathematical problems that involve trigonometric functions.

4. What are the steps to prove Euler's Formula with complex numbers?

The steps to prove Euler's Formula with complex numbers involve using the Maclaurin series expansion of e^x, converting the sine and cosine functions into their respective Taylor series expansions, and then combining these series to arrive at the desired result, e^(ix) = cos(x) + i*sin(x).
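Those steps can be sketched in code (illustrative helpers, not a formal proof): grouping the Maclaurin series of e^(ix) into even powers (the real part) and odd powers (the imaginary part) reproduces the series for cos(x) and sin(x).

```python
import math

def cos_series(x, terms=25):
    """Even-power part of e^(ix): sum (-1)^k x^(2k) / (2k)!."""
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(terms))

def sin_series(x, terms=25):
    """Odd-power part of e^(ix): sum (-1)^k x^(2k+1) / (2k+1)!."""
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(terms))

x = 0.9
print(abs(cos_series(x) - math.cos(x)))  # ~0: even powers give cos(x)
print(abs(sin_series(x) - math.sin(x)))  # ~0: odd powers give sin(x)
```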

5. What are some real-world applications of Euler's Formula with complex numbers?

Euler's Formula with complex numbers has many real-world applications, including in electrical engineering, signal processing, and quantum mechanics. It is used to analyze alternating current circuits, decompose signals into their frequency components, and describe the behavior of particles in quantum systems.
