Proving Euler's Formula using infinite series.

Bruce3

Homework Statement


I need to show that the series for both sin(x) and cos(x) are absolutely convergent.

Here's my work so far:

Theorem:
e^(ix) = cos(x) + i*sin(x) (1)
Proof:
This proof uses power series. Note that the powers of i cycle with period 4:
i^1 = i, i^2 = -1, i^3 = -i, i^4 = 1, i^5 = i, i^6 = -1, i^7 = -i, i^8 = 1, and so on for all positive integer exponents,
and that the Taylor series for cos(x) and sin(x) are as follows:
cos(x) = 1 - x^2/2! + x^4/4! - x^6/6! + x^8/8! - x^10/10! + …
sin(x) = x/1! - x^3/3! + x^5/5! - x^7/7! + x^9/9! - x^11/11! + ...
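Equivalently, in sigma notation:

$$\cos(x) = \sum_{n=0}^{\infty} (-1)^n \frac{x^{2n}}{(2n)!}, \qquad \sin(x) = \sum_{n=0}^{\infty} (-1)^n \frac{x^{2n+1}}{(2n+1)!}$$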
Using the power series definition for e^z, we can write:
e^(z) = 1 + z/1! + z^2/2! + z^3/3! + z^4/4! + … = the sum from n = 0 to infinity of z^n/n! (2)
Following Euler, we substitute ix for the exponent in equation (2), treating the product of i with the real number x just like any other number. This gives:
e^(ix) = 1/0! + ix/1! + (ix)^2/2! + (ix)^3/3! + (ix)^4/4! + … = the sum from n = 0 to infinity of (ix)^n/n!
Expanding the powers of i gives:
e^(ix) = 1 + ix/1! - x^2/2! - ix^3/3! + x^4/4! + ix^5/5! - x^6/6! - ix^7/7! + x^8/8! + ...
We can group the real terms and the imaginary terms:
e^(ix) = (1 - x^2/2! + x^4/4! - x^6/6! + x^8/8! - …) + (ix/1! - ix^3/3! + ix^5/5! - ix^7/7! + ... )
Factoring i out of the second group, we see
e^(ix) = (1 - x^2/2! + x^4/4! - x^6/6! + x^8/8! - …) + i(x/1! - x^3/3! + x^5/5! - x^7/7! + ... )
We can now see that the series multiplied by i is exactly the series for sin(x), and the other is the series for cos(x). The only thing we need to prove is that the two series are absolutely convergent.
From calculus we know there is a test for absolute convergence:
By comparing with the series for e^x we can determine whether cos(x) and sin(x) converge absolutely. To do that we have the following rule.


...
A series is absolutely convergent if the sum of the absolute values of the terms is also convergent.
Which means that we can express e^(ix) as:
e^(ix) = cos(x) + i*sin(x)
which is what we were trying to prove.


Homework Equations



All forms of convergence tests (e.g. the Ratio Test, Integral Test, Alternating Series Test, etc.)

The Attempt at a Solution



I have tried to use all three tests, but have failed. Any help is very much appreciated.
I know that e^(x) is absolutely convergent; I'm just trying to make the connection.
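As a quick numerical sanity check of the identity itself (just a sketch; the number of terms and the test value x = 1.7 are arbitrary choices):

import cmath
import math

def series_exp_ix(x, n_terms=30):
    # Partial sum of the series for e^(ix); its real and imaginary parts
    # should match cos(x) and sin(x) respectively.
    return sum((1j * x) ** n / math.factorial(n) for n in range(n_terms))

x = 1.7
approx = series_exp_ix(x)
exact = cmath.exp(1j * x)           # cos(x) + i*sin(x) by Euler's formula
print(approx, exact)
print(abs(approx - exact) < 1e-12)  # the truncated series should agree closely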
 
So just to see if I'm understanding your problem, you're just having trouble showing that sine and cosine are absolutely convergent?

Well, let's look at this.

$$\sum_{n=0}^\infty \left|(-1)^n \frac{x^{2n}}{(2n)!}\right| = \sum_{n=0}^\infty \frac{|x|^{2n}}{(2n)!}$$

The next step is to realize that the right-hand side is simply the even-indexed terms of the series below.

$$\sum_{n=0}^\infty \frac{|x|^n}{n!}$$

From here, I hope you can finish making the last two connections you need!
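If it helps, one way (not the only one) to write out that comparison explicitly:

$$\sum_{n=0}^{N} \frac{|x|^{2n}}{(2n)!} \;\le\; \sum_{m=0}^{2N} \frac{|x|^m}{m!} \;\le\; e^{|x|},$$

so the partial sums on the left are increasing and bounded above, which is all you need; the odd-power series for sine works the same way.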
 
MarneMath said:
So just to see if I'm understanding your problem, you're just having trouble showing that sine and cosine are absolutely convergent?

Well, let's look at this.

$$\sum_{n=0}^\infty \left|(-1)^n \frac{x^{2n}}{(2n)!}\right| = \sum_{n=0}^\infty \frac{|x|^{2n}}{(2n)!}$$

The next step is to realize that the right-hand side is simply the even-indexed terms of the series below.

$$\sum_{n=0}^\infty \frac{|x|^n}{n!}$$

From here, I hope you can finish making the last two connections you need!

You do not need the absolute convergence of sine and cosine for this proof; the absolute convergence of the exponential function suffices. The proof would hold even if the sine and cosine series were only conditionally convergent (which they are not).
 
I'm aware; I was mostly focusing on this aspect of the OP's problem:

"The only thing we need to prove is that the two series are absolutely convergent."

Of course, it's easy to show by the ratio test that exp[x] is absolutely convergent, but why go the easy route?
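For completeness, that ratio-test computation is just

$$\lim_{n\to\infty}\frac{|x|^{n+1}/(n+1)!}{|x|^{n}/n!} = \lim_{n\to\infty}\frac{|x|}{n+1} = 0 < 1,$$

which holds for every x (and the same computation with |z| works for any complex exponent).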

Edit:

"The proof holds true even if sine and cosine were conditionally convergent "

Actually, I'm not sure that is true in general. I'm willing to say that, by the definition of absolute convergence, it can be shown to be false: a series of non-negative terms cannot converge if it has a non-negative subseries that does not converge.
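In symbols, the fact I'm using: if a_k ≥ 0 for every k and the full series converges, then any subseries satisfies

$$\sum_{j=1}^{N} a_{k_j} \;\le\; \sum_{k=0}^{\infty} a_k < \infty,$$

so its partial sums are increasing and bounded, hence it converges. Equivalently, a series of non-negative terms with a divergent subseries cannot converge.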
 