
Euler's formula

  1. Sep 7, 2010 #1
    Hi
    I was trying to prove Euler's formula for complex numbers, which states that [itex]e^{ix} = \cos(x) + i\sin(x)[/itex]. I tried to prove it with Taylor series, but the problem is that I don't know whether my proof is right. I think a Taylor series represents a function only near some point a, so it doesn't apply to all values of f(x). Thus when I write [itex]1 + ix + (ix)^2/2! + \cdots[/itex], it seems I'm proving the formula only when x is near zero. Can anyone correct me if I'm wrong, or confirm that I'm right? And does anyone know a rigorous proof of the formula?

    thanks
     
  3. Sep 7, 2010 #2

    HallsofIvy

    Staff Emeritus
    Science Advisor

    There is no limit involved here, so I don't know what you mean by "approaches zero". It is true that, for a general analytic function, the Taylor's series about x= a is only guaranteed to converge to the function in some neighborhood of a. (If the function is not analytic in any neighborhood of a, the series might converge only at a itself.)

    However, it is relatively easy to prove that for the functions you need, [itex]e^x[/itex], sin(x), and cos(x), the Taylor's series, about any x= a, converge to the required function for all x. Simply expanding [itex]e^{ix}[/itex], sin(x), and cos(x) in their Taylor's series about x= 0 and comparing is an accurate proof of the formula.
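
    Spelled out, the comparison (using [itex]i^2 = -1[/itex] to sort the terms into real and imaginary parts) is
    [tex]e^{ix} = \sum_{n=0}^\infty \frac{(ix)^n}{n!} = \left(1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \cdots\right) + i\left(x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots\right) = \cos(x) + i\sin(x)[/tex]
    where the two parenthesized series are exactly the Taylor's series of cos(x) and sin(x) about x= 0.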

     
  4. Sep 7, 2010 #3
    Okay, so let me rephrase my question this way: how can you prove that the series we need converge for all values of x? It's fairly easy to prove when x is in some neighborhood of some a, but how can you generalize it to all x in the domain of the function?
     
  5. Sep 9, 2010 #4
    The ratio test.
     
  6. Sep 9, 2010 #5
    How do I do that? I only know that if the ratio test gives L < 1 then the series converges, but I don't know how to find the radius of convergence with the ratio test.
     
  7. Sep 9, 2010 #6
    analytic function?
     
  8. Sep 9, 2010 #7

    Mute

    Homework Helper

    The terms of your series are functions of x, so in general the ratio of successive terms will depend on x. So when you say the ratio test guarantees the series converges if L < 1, that L depends on x: really the series converges if |L(x)| < 1, and there is a range of x for which this condition holds. The radius of convergence, R, is thus the value of |x| at which |L(x)| = 1, i.e. |L(R)| = 1. If |L(x)| < 1 for all x, the radius of convergence is infinite. (Similarly, if |L(x)| > 1 for all nonzero x, the radius of convergence is zero.)

    As an example, take the series [itex]1 + x + x^2 + \cdots = \sum_{n=0}^\infty x^n[/itex]. The ratio test gives [itex]\lim_{n\rightarrow \infty} |x^{n+1}/x^n| \equiv |L(x)| = |x|[/itex], so the series converges only if |x| < 1.
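
    To see the difference numerically (a quick illustration, not part of the proof), compare the term ratios of the geometric series with those of the exponential series [itex]\sum x^n/n![/itex]:
    [code]
from math import factorial

x = 0.5

def geom_term(n):
    # n-th term of the geometric series: x^n
    return x ** n

def exp_term(n):
    # n-th term of the exponential series: x^n / n!
    return x ** n / factorial(n)

for n in (5, 10, 50):
    print(n,
          abs(geom_term(n + 1) / geom_term(n)),  # stays at |x| = 0.5 for every n
          abs(exp_term(n + 1) / exp_term(n)))    # equals |x|/(n+1), heading to 0
    [/code]
    The geometric ratio never drops below |x|, so convergence requires |x| < 1, while the exponential ratio goes to 0 no matter what x is, which is why that series converges everywhere.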
     
  9. Sep 10, 2010 #8
    Thank you, now I'm convinced. is there a general way to find the radius of convergence for a wide range of series? I mean I know a bunch of tests to check if a series converges or not, but what if the series was so tricky and those tests didn't work, then what would I do? is there a general theorem or something that covers a large category of series? for example a theorem on some specified space with euclidean metric or something
     
  10. Sep 11, 2010 #9

    HallsofIvy

    Staff Emeritus
    Science Advisor

    First, the whole notion of "radius of convergence" applies only to power series, so we can assume a series of the form [itex]\sum a_n (z- a)^n[/itex].

    Now the ratio test requires looking at
    [tex]\left|\frac{a_{n+1}(z- a)^{n+1}}{a_n(z- a)^n}\right|= \left|\frac{a_{n+1}}{a_n}\right||z- a|[/tex]
    so it becomes just a matter of the limit
    [tex]\lim_{n\to\infty} \frac{a_{n+1}}{a_n}[/tex]
    If that converges to some non-zero number A, then we will have A|z- a| < 1 for |z- a| < 1/A, and the radius of convergence is 1/A. If it converges to 0, of course, the radius of convergence is infinite: the power series converges for all z. If the ratio diverges, then the radius of convergence is 0. For [itex]e^z[/itex], sin(z), and cos(z), the ratio test reduces to n!/(n+1)!, which goes to 0.
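
    Spelled out for [itex]e^z[/itex], where [itex]a_n = 1/n![/itex]:
    [tex]\lim_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right| = \lim_{n\to\infty}\frac{n!}{(n+1)!} = \lim_{n\to\infty}\frac{1}{n+1} = 0[/tex]
    so the radius of convergence is infinite.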

    But don't confuse "converging" with "converging to that particular function". For example, if [itex]f(x)= e^{-1/x^2}[/itex] for x not equal to 0, and f(0)= 0, then it is fairly easy to prove that f is infinitely differentiable at x= 0 and that its derivative of every order is 0 there. That means that the Taylor's series for f, at x= 0, is [itex]0 + 0x + 0x^2 + \cdots[/itex]. That certainly converges for all x, but it converges to 0 and so converges to f(x) only at x= 0.
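
    To see why all those derivatives vanish, look at the first one directly:
    [tex]f'(0) = \lim_{h\to 0}\frac{e^{-1/h^2} - 0}{h} = 0[/tex]
    since [itex]e^{-1/h^2}[/itex] goes to 0 faster than any power of h; essentially the same argument handles every higher derivative.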

    trambolin's two-word post, "analytic function?", is what you need: the basic definition of "analytic" at a point is that the Taylor's series for the function exists and converges to the function in some neighborhood of that point. But how do you prove that [itex]e^x[/itex], [itex]\sin(x)[/itex], and [itex]\cos(x)[/itex] are analytic?

    There are many different ways to do that, but the simplest is to use the initial value problems that characterize the functions. Since the Taylor series for them do have infinite radius of convergence (whatever function they sum to!), we can differentiate term by term. The derivative of
    [tex]f(z)= \sum_{n=0}^\infty \frac{z^n}{n!}[/tex]
    is
    [tex]f'(z)= \sum_{n=1}^\infty \frac{z^{n-1}}{(n-1)!}[/tex]
    which, with the change of index m= n-1, is just
    [tex]f'(z)= \sum_{m=0}^\infty \frac{z^m}{m!}= f(z)[/tex]
    That is, f(z) satisfies f'(z)= f(z) as well as f(0)= 1. The existence and uniqueness theorem for initial value problems shows that [itex]e^z[/itex] is the only function that satisfies those, so the Taylor's series does, in fact, converge to [itex]e^z[/itex] for all z.
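
    (One quick way to see the uniqueness here: if f and g both satisfy y' = y with y(0) = 1, then wherever g is nonzero,
    [tex]\left(\frac{f}{g}\right)' = \frac{f'g - fg'}{g^2} = \frac{fg - fg}{g^2} = 0[/tex]
    so f/g is constant, and evaluating at 0 gives f/g = 1, i.e. f = g.)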

    Similarly, you can show that the Taylor's series for sin(z) satisfies f''(z)= -f(z), f(0)= 0, f'(0)= 1, and the Taylor's series for cos(z) satisfies f''(z)= -f(z), f(0)= 1, f'(0)= 0, and then show that only sin(z) and cos(z), respectively, satisfy those.
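
    For the sine series, for instance, writing [itex]s(z) = \sum_{m=0}^\infty (-1)^m \frac{z^{2m+1}}{(2m+1)!}[/itex] and differentiating term by term twice gives
    [tex]s''(z) = \sum_{m=1}^\infty (-1)^m \frac{z^{2m-1}}{(2m-1)!} = \sum_{k=0}^\infty (-1)^{k+1} \frac{z^{2k+1}}{(2k+1)!} = -s(z)[/tex]
    (with k = m-1), and s(0)= 0, s'(0)= 1 can be read off the first terms.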
     
  11. Sep 11, 2010 #10
    That's an impressive explanation.
    I find series an interesting topic in mathematics, but unfortunately we can't study all of them. You say the radius of convergence is defined only for power series; what do we do for other series? Is there a similar notion, or a result that tells us under what conditions a non-power series converges?
     