Prove periodicity of exp/sin/cos from Taylor series?

SUMMARY

The periodicity of exp(iφ), sin(x), and cos(x) is established starting from their Taylor series. Differentiating the series term by term shows that sin(x) and cos(x) satisfy the second-order linear differential equation y'' + y = 0. Combining this with the Sturm-Liouville theorem that the eigenvalues of y'' + λy = 0 with boundary conditions y(0) = 0, y(1) = 0 form an increasing unbounded sequence, one can prove that sine and cosine are periodic with period 2π. The identity e^(ix) = cos(x) + i sin(x) then gives the periodicity of the exponential. An alternative argument shows that exp(ix) is a unit-speed parameterization of the unit circle, so its period is the circle's circumference, 2π.

PREREQUISITES
  • Understanding of Taylor series expansions for functions
  • Knowledge of differential equations, specifically linear second-order equations
  • Familiarity with the Sturm-Liouville problem and its eigenvalue theory
  • Basic concepts of periodic functions and their properties
NEXT STEPS
  • Study the Taylor series for exp(ix), sin(x), and cos(x) in detail
  • Learn about the Sturm-Liouville theory and its applications in solving differential equations
  • Explore the implications of boundary conditions in differential equations
  • Investigate the relationship between eigenvalues and periodic functions in mathematical analysis
USEFUL FOR

Mathematicians, physics students, and anyone interested in the analysis of periodic functions and their properties through differential equations and Taylor series.

Gerenuk
How is it possible to see that exp(i\phi) is periodic with period 2\pi from the Taylor series?

So basically it boils down to whether it is easy to see that
\sum_{n=0}^\infty \frac{(-1)^n}{(2n)!}(2\pi)^{2n}=1
? Or any other suggestions?
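One can at least check that sum numerically. A minimal Python sketch (the helper name cos_series is mine, not from the thread) sums the cosine series term by term, using the ratio between consecutive terms to avoid large factorials:

```python
import math

def cos_series(x, terms=60):
    """Partial sum of sum_{n>=0} (-1)^n x^(2n) / (2n)!, the Taylor series of cos."""
    total = 0.0
    term = 1.0  # the n = 0 term
    for n in range(terms):
        total += term
        # ratio of consecutive terms: -x^2 / ((2n+1)(2n+2))
        term *= -x * x / ((2 * n + 1) * (2 * n + 2))
    return total

print(cos_series(2 * math.pi))  # close to 1.0
```

The partial sums oscillate with terms as large as about 85 before settling down, so the value 1 is far from obvious from the series itself, which is the point of the question.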
 
How have you defined pi?
If you have defined it as half the period of exp(ix) it should be easy.
 
lurflurf said:
How have you defined pi?
If you have defined it as half the period of exp(ix) it should be easy.
Good point. Actually I like the idea of defining pi this way.
So I only have to prove that such a number exists?
Could I reverse the series to get a series for pi defined this way?
 
From the Taylor series for sine and cosine it is easy to show, differentiating term by term, that (sin(x))"= -sin(x) and (cos(x))"= -cos(x). That is, sin(x) satisfies the differential equation y"= -y with the initial conditions y(0)= 0, y'(0)= 1 and cos(x) satisfies the differential equation y"= -y with the initial conditions y(0)= 1, y'(0)= 0.

y"= -y is a linear second order differential equation and it is easy to show that any solution, y(x), to that differential equation is of the form y(x)= A cos(x)+ B sin(x) where A= y(0) and B= y'(0).

From that, and the theorem
"The set of all eigenvalues of the Sturm-Liouville problem
\frac{d}{dt}\left(p(t)\frac{dy}{dt}\right)+ (\lambda+ q(t))y= 0
with boundary conditions y(0)= 0, y(1)= 0, form an increasing unbounded sequence", with p(t)= 1, q(t)= 0, one can prove the periodicity of sine and cosine. And then, of course, the fact that e^{ix}= cos(x)+ i sin(x) gives the periodicity of e^{ix}.
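The claim that y"= -y with those initial conditions reproduces sine and cosine can be illustrated numerically. A small Python sketch (hand-rolled RK4 on the equivalent first-order system; the function name is mine):

```python
import math

def integrate(y0, yp0, x_end, steps=10000):
    """RK4 for the system y' = v, v' = -y (i.e. y'' = -y), from x = 0 to x_end."""
    h = x_end / steps
    y, v = y0, yp0
    for _ in range(steps):
        k1y, k1v = v, -y
        k2y, k2v = v + h / 2 * k1v, -(y + h / 2 * k1y)
        k3y, k3v = v + h / 2 * k2v, -(y + h / 2 * k2y)
        k4y, k4v = v + h * k3v, -(y + h * k3y)
        y += h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
    return y

# y(0) = 0, y'(0) = 1 reproduces sin; y(0) = 1, y'(0) = 0 reproduces cos
print(integrate(0.0, 1.0, 1.0), math.sin(1.0))
print(integrate(1.0, 0.0, 1.0), math.cos(1.0))
```

This is only an illustration of the uniqueness claim, of course, not a proof.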
 
HallsofIvy said:
"The set of all eigenvalues of the Sturm-Liouville problem
\frac{d}{dt}\left(p(t)\frac{dy}{dt}\right)+ (\lambda+ q(t))y= 0
with boundary conditions y(0)= 0, y(1)= 0, form an increasing unbounded sequence", with p(t)= 1, q(t)= 0, one can prove the periodicity of sine and cosine. And then, of course, the fact that e^{ix}= cos(x)+ i sin(x) gives the periodicity of e^{ix}.

Can you help me with the reasoning? I know that for y=exp(ix)
y''+y=0
I know that in
y''+a*y=0
the eigenvalues a form an unbounded sequence. How do I know that the boundary condition y(1)=0 is satisfied?
What to conclude?
 
I have been hoping for someone to ask!

Theorem: The set of all eigenvalues of d^2y/dx^2+ \lambda y= 0, with boundary conditions y(0)= 0, y(1)= 0, form an increasing, unbounded sequence.

(Notice that I am starting from that theorem: I am not showing that eix satisfies it.)

The first thing that tells us is that the problem has eigenvalues. What are they?

If we were to try \lambda= 0, the problem becomes y"= 0 and integrating twice, y= Ax+ B so y(0)= B= 0 and y(1)= A+ B= A= 0. Both A and B are 0: there is no non-trivial function satisfying the differential equation and the boundary conditions, so 0 is not an eigenvalue.

Let's try \lambda< 0. To make that explicit, write \lambda= -\alpha^2 where \alpha can be any positive real number. Now the differential equation is y"- \alpha^2y= 0 and we know, from elementary differential equations, that y(t)= Ae^{\alpha t}+ Be^{-\alpha t}. y(0)= A+ B= 0 and y(1)= Ae^{\alpha}+ Be^{-\alpha}= 0. From the first equation, B= -A. Putting that into the second equation and factoring out A, A(e^{\alpha}- e^{-\alpha})= 0. Since e^x is a one-to-one function, those two exponentials cannot be the same and e^{\alpha}- e^{-\alpha} cannot be 0: A= 0 and B= -A= 0. Since A and B are both 0, y is identically 0 and -\alpha^2 is not an eigenvalue. No negative number is an eigenvalue.

Since we know this problem has eigenvalues and they are neither 0 nor negative, the eigenvalues must be positive. Further, the set of eigenvalues forms an increasing sequence. Since every sequence has a first member, and this sequence is increasing, the first member in the sequence is the smallest positive eigenvalue.

Let \lambda_1 be the smallest eigenvalue for this problem. Change to a new variable: x= \sqrt{\lambda_1}t, so that d^2y/dt^2= \lambda_1 d^2y/dx^2 and the equation becomes
d^2y/dt^2+ \lambda_1 y= \lambda_1 d^2y/dx^2+ \lambda_1 y= 0
so
d^2y/dx^2+ y= 0
Of course, we also have to change the boundary values. When t= 0, x= \sqrt{\lambda_1}(0)= 0 and when t= 1, x= \sqrt{\lambda_1}(1)= \sqrt{\lambda_1}.

The general solution to y"+ y= 0 is, as we have already seen, y(x)= A cos(x)+ B sin(x). y(0)= A= 0 and then y(\sqrt{\lambda_1})= B sin(\sqrt{\lambda_1})= 0.

That looks a lot like what happened above with the exponentials but there is an important difference: we know that this "smallest eigenvalue" exists so there must be a non-trivial solution: y(x) is not identically 0 which means the two constants, A and B, cannot both be 0. Since A obviously is 0 and Bsin(\sqrt{\lambda_1})= 0 we must have sin(\sqrt{\lambda_1})= 0.

Okay, that proves that sine is NOT one-to-one, sin(0)= sin(\sqrt{\lambda_1}), but what about all of the numbers between 0 and \sqrt{\lambda_1}? Is this enough to prove that sine is periodic?

No, it is not: we need to look at cosine also. We don't have to repeat all of this for the cosine: sin^2(\sqrt{\lambda_1})+ cos^2(\sqrt{\lambda_1})= 1 so, since sin(\sqrt{\lambda_1})= 0, we have cos^2(\sqrt{\lambda_1})= 1 and then cos(\sqrt{\lambda_1})= -1.

Yes, that's right, -1. When I first did this calculation I wrote, automatically, "1" and got myself into a terrible mess. Since cos^2(\sqrt{\lambda_1})= 1, cos(\sqrt{\lambda_1}) must be either -1 or 1. To see that it can't be 1, look at the half angle formula: sin(\sqrt{\lambda_1}/2)= \sqrt{(1/2)(1- cos(\sqrt{\lambda_1}))}. If cos(\sqrt{\lambda_1})= 1 then sin(\sqrt{\lambda_1}/2)= 0. But that would mean that y(x)= sin(x) is a non-trivial function satisfying y"+ y= 0, y(0)= 0 and y(\sqrt{\lambda_1}/2)= 0; rescaling the interval [0, \sqrt{\lambda_1}/2] back to [0, 1] as before, that makes \lambda_1/4 an eigenvalue, contradicting the fact that \lambda_1 is the smallest eigenvalue. cos(\sqrt{\lambda_1}) cannot be 1 so it must be -1.

Now use the double angle formulas: cos(2\sqrt{\lambda_1})= cos^2(\sqrt{\lambda_1})- sin^2(\sqrt{\lambda_1})= (-1)^2- 0^2= 1
sin(2\sqrt{\lambda_1})= 2sin(\sqrt{\lambda_1})cos(\sqrt{\lambda_1})= 2(0)(-1)= 0

That is, sin(x) is 0 at 0 and at 2\sqrt{\lambda_1} and cos(x) is 1 at 0 and 2\sqrt{\lambda_1}. Is that enough to prove that sine and cosine are periodic? Yes, it is!

Use the sum formulas: for any x, sin(x+ 2\sqrt{\lambda_1})= sin(x)cos(2\sqrt{\lambda_1})+ cos(x)sin(2\sqrt{\lambda_1})= sin(x)(1)+ cos(x)(0)= sin(x).
cos(x+ 2\sqrt{\lambda_1})= cos(x)cos(2\sqrt{\lambda_1})- sin(x)sin(2\sqrt{\lambda_1})= cos(x)(1)- sin(x)(0)= cos(x).

But what is \sqrt{\lambda_1}? Since sin^2(t)+ cos^2(t)= 1 and both functions are periodic with period 2\sqrt{\lambda_1}, we can use x= Rcos(t), y= Rsin(t), 0\le t\le 2\sqrt{\lambda_1}, as parametric equations for a circle of radius R. The circumference of that circle is given by the arclength integral:
\int_0^{2\sqrt{\lambda_1}} \sqrt{(dx/dt)^2+ (dy/dt)^2}dt= \int_0^{2\sqrt{\lambda_1}} R dt= 2\sqrt{\lambda_1} R. Of course, that circumference is "\pi times the diameter" or "2\pi R", so \sqrt{\lambda_1}= \pi and sine and cosine are periodic with period 2\pi.
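The conclusion \sqrt{\lambda_1}= \pi can also be checked numerically by discretizing the eigenvalue problem. A rough Python sketch (central finite differences; the grid size is arbitrary):

```python
import numpy as np

# Discretize -y'' = lambda * y on (0, 1), y(0) = y(1) = 0, with central
# differences on m interior grid points; the matrix for -d^2/dx^2 is
# tridiagonal with 2/h^2 on the diagonal and -1/h^2 off it.
m = 400
h = 1.0 / (m + 1)
A = (2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
lam1 = np.linalg.eigvalsh(A)[0]  # smallest eigenvalue
print(np.sqrt(lam1), np.pi)      # sqrt(lambda_1) approximates pi
```

The computed eigenvalues also come out as an increasing sequence, matching the theorem (for the continuous problem they are n^2 pi^2).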
 
HallsofIvy said:
Theorem: The set of all eigenvalues of d^2y/dx^2+ \lambda y= 0, with boundary conditions y(0)= 0, y(1)= 0, form an increasing, unbounded sequence.

...

That looks really complicated. I think it's equivalent to argue that the solution to the differential equation is y(x)=A\sin(\sqrt{\lambda} x) and by the boundary conditions one can see that the function has to repeat at least one value.

Is Sturm-Liouville really this way? With my first search attempt I couldn't find anything about these imposed boundary conditions (which basically define the periodicity). I would suspect that to prove this boundary-condition version of SL one would actually have to assume the periodicity of exp(ix).
 
Since you asked about proving periodicity, it would be really silly to assume periodicity! The Sturm-Liouville theorem I quoted has nothing to do with periodicity. Certainly, saying that y(0)= y(1)= 0 does not require periodicity. In particular y= x^2- x satisfies y(0)= y(1)= 0 without being periodic.
 
HallsofIvy said:
The Sturm-Liouville theorem I quoted has nothing to do with periodicity.
Could you point me to an internet link with this theorem, where they have the same boundary conditions? I heard about SL before, but not about the boundary condition part. That would be an important ingredient.
What I said is that SL with these boundary conditions should be checked, to see whether it assumes the periodicity of exp(ix). Otherwise you can't use it as a proof.

HallsofIvy said:
Certainly, saying that y(0)= y(1)= 0 does not require periodicity.
Once you know that sin(x) is the solution to the above equation, then saying y(0)=y(1)=0 is equivalent to saying that the function is periodic (which in fact is what you used in the proof).
 
  • #10
Gerenuk said:
Could you point me to an internet link with this theorem, where they have the same boundary conditions? I heard about SL before, but not about the boundary condition part. That would be an important ingredient.
What I said is that SL with these boundary conditions should be checked, to see whether it assumes the periodicity of exp(ix). Otherwise you can't use it as a proof.
If you have heard about S-L problems then you know that they always involve boundary conditions. I have no idea why you say it must assume the periodicity of e^{it} in order to use that theorem.


Once you know that sin(x) is the solution to the above equation, then saying y(0)=y(1)=0 is equivalent to saying that the function is periodic (which in fact is what you used in the proof).
"Equivalent" in the sense that it can be proven that the function is periodic, yes. That's exactly what I did. Isn't that what you wanted?

If you meant, in your original post, "Please give me some trivial proof of the periodicity of sine and cosine from their Taylor series definitions", I am afraid I can't help you.
 
  • #11
HallsofIvy said:
??If you have heard about S-L problems then you know that they always involve boundary conditions. I have no idea why you say it must assume the periodicity of eit in order to use that theorem.

I cannot remember all of SL. I'll check that again.
It's weird that you didn't understand what I wrote in the previous reply. One needs to check the derivation of SL to make sure it doesn't already assume periodicity of exp(ix) to deduce results.

But that's what I do next. Maybe then the issue would be resolved.

HallsofIvy said:
"Equivalent" in the sense that it can be proven that the function is periodic, yes. That's exactly what I did. Isn't that what you wanted?
I mean you wouldn't need fancy SL to prove it. Just say I know the sum rule for sin(x).
Let's assume y(0)=y(1)=0 and y(x)=sin(ax). But hey, that assumption(!) already proves that sin(x) is periodic.
 
  • #12
NO! Knowing that a function is 0 at two different points does NOT prove it is periodic.

Use the example I gave before. Would you say: "Let's assume y(0)=y(1)=0 and y(x)= x^2- x.
But hey, that assumption(!) already proves that x^2- x is periodic"?

In any case, I am not assuming that y(0)= y(1)= 0 for y(x) a trig function. I used the theorem to prove that sine of a certain number is 0, which, as I said, does not, by itself, prove that sine is periodic.

If this proof is too complicated for you, I guess you will just have to go with one of the other, simpler, proofs given in response to your question.
 
  • #13
I appreciate your suggestion with SL, but I advise you to read the posts fully before objecting.

I wrote that knowing y(0)=y(a) and knowing the sum rules proves periodicity.
In fact you used it yourself in your proof.
 
  • #14
I have corrected a Latex expression that may have caused the difficulty. However, the point is that I prove that "we must have sin(\sqrt{\lambda_1})= 0", I do not assume it. That is the whole point of the proof.
 
  • #15
HallsofIvy said:
I have corrected a Latex expression that may have caused the difficulty. However, the point is that I prove that "we must have sin(\sqrt{\lambda_1})= 0", I do not assume it. That is the whole point of the proof.

Yes, I know.
I haven't had time to check yet, but are you sure that the proof of the SL theorem you use doesn't make use of the periodicity of exp(ix)?
 
  • #16
This is actually kind of related to another recent thread. If you show that exp( i x ) satisfies the DE g' = ig, g(0)=1 (easy to do term by term), then letting g(x)=u(x)+iv(x) for some real u and v, so that u' = -v and v' = u, you can compute
\frac{d}{dx}|g|^2= \frac{d}{dx}(u^2+v^2)= 2uu'+ 2vv'= -2uv+ 2vu= 0.

Since the derivative of |g|^2 is zero and g(0)=1, we have |exp( i x )| = |g(x)| = constant = 1. This shows g lies on the unit circle in the complex plane.

We also know that |g'| = |g|, so
\frac{d}{dx}|g'(x)|^2 = \frac{d}{dx}|g(x)|^2 = 0

Since g'(0) = ig(0) = i, we have |g'(x)| = constant = 1.

Then g is a unit speed parameterization of the unit circle, so the period of g is the arc-length of a circle: 2\pi.
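This argument can be illustrated numerically: integrate g' = ig and verify that |g| stays 1 and that g returns to 1 at t = 2\pi. A Python sketch (RK4 on the complex ODE; function name and step count are mine):

```python
import math

def g_at(t_end, steps=20000):
    """RK4 for g' = i*g, g(0) = 1, returning g(t_end) as a complex number."""
    h = t_end / steps
    g = 1 + 0j
    for _ in range(steps):
        k1 = 1j * g
        k2 = 1j * (g + h / 2 * k1)
        k3 = 1j * (g + h / 2 * k2)
        k4 = 1j * (g + h * k3)
        g += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return g

print(abs(g_at(1.0)))     # stays on the unit circle
print(g_at(2 * math.pi))  # back to 1 after one period
```

Since |g'| = 1, the parameter t is itself the arc length, which is why the first return to g = 1 happens at the circumference 2\pi.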
 
  • #17
maze said:
Since the derivative of |g|^2 is zero and g(0)=1, we have |exp( i x )| = |g(x)| = constant = 1. This shows g lies on the unit circle in the complex plane.

That seems an OK argument to me. I'd even argue
|\exp(\mathrm{i}x)|^2=\exp(\mathrm{i}x)\exp(\mathrm{i}x)^*=\exp(\mathrm{i}x)\exp(-\mathrm{i}x)=1
assuming some basic rules.

That in the end shows the existence of pi. Is it possible to derive some sort of series for this constant pi defined this way? (see my initial question)
 
  • #18
^ Must it be a series? You have
\pi=\lim_{n\to\infty} -\mathrm{i}n\left[-1+(-1)^{1/n}\right]
and the series for arctan, arcsin, etc.
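That limit can be checked with Python's complex arithmetic, which uses the principal branch (-1)^{1/n} = exp(i\pi/n) (the helper name is mine):

```python
import math

def pi_approx(n):
    """-i * n * (-1 + (-1)**(1/n)); with the principal branch
    (-1)**(1/n) = exp(i*pi/n), this tends to pi as n -> infinity."""
    return -1j * n * (-1 + (-1 + 0j) ** (1.0 / n))

print(pi_approx(10**6), math.pi)  # real part near pi, small imaginary part
```

Expanding exp(i\pi/n) - 1 = i\pi/n - \pi^2/(2n^2) + ... shows the error is O(1/n), so convergence is slow compared with a good arctan series.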
 
