Is L a Self-Adjoint Operator with Non-Negative Eigenvalues?

brkomir

Homework Statement


We have a linear differential operator ##Ly=-y''## acting on all ##y## that are at least twice differentiable on ##[-\pi ,\pi ]## and satisfy ##y(-\pi )=y(\pi )## and ##y'(-\pi )=y'(\pi )##.
a) Is ##0## an eigenvalue of ##L##?
b) Is ##L## symmetric? (I think the right English term is self-adjoint.)
c) Find all positive eigenvalues and the corresponding eigenfunctions. What is the dimension of the eigenspace?
d) Does ##L## have any negative eigenvalues?

Homework Equations


The Attempt at a Solution



a) No idea how to do this one. Here's my version:

##Ly=-y''=\lambda y##, which gives me ##y''+\lambda y=0##. The solution of this differential equation is obviously ##y(x)=A\cos(\sqrt{\lambda }x)+B\sin(\sqrt{\lambda }x)##.

Using this we find out that the boundary conditions for ##A\neq 0## and ##B\neq 0## are fulfilled only when ##\lambda =0##. Therefore the answer to question a) is YES.

b)

##L## is symmetric if ##[P\,W(y,z)]\mid _a^b =0##. Note that here the DE is written in the form ##P(x)y''+Q(x)y'+R(x)y=0##, and ##W(y,z)## is the Wronskian determinant of two solutions ##y## and ##z##.

Obviously ##P=-1##.

##[P\,W(y,z)]\mid _a^b =\left[-\begin{vmatrix} y & z\\ y' & z' \end{vmatrix}\right]\mid _a^b##

After a short calculation, and knowing that ##a=-\pi## and ##b=\pi##, we find that the expression above is equal to ##0##, and therefore the linear operator ##L## is symmetric.
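Spelling that short calculation out, with ##W(y,z)=yz'-y'z## and the periodic conditions holding for both ##y## and ##z##:

##\left[-W(y,z)\right]\mid _{-\pi }^{\pi }=-\big(y(\pi )z'(\pi )-y'(\pi )z(\pi )\big)+\big(y(-\pi )z'(-\pi )-y'(-\pi )z(-\pi )\big)=0,##

since every value at ##\pi## equals the corresponding value at ##-\pi##.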

c)

##Ly=-y''=\lambda^2 y##, where I used the notation ##\lambda^2## for the eigenvalue instead of ##\lambda## just because I don't want to write square roots every time.

Anyhow, the solution to this DE is ##y(x)=A\cos(\lambda x)+B\sin(\lambda x)##. Using the boundary conditions:

##y(\pi )-y(-\pi )=2B\sin(\lambda \pi )=0##, and of course for non-trivial solutions ##B\neq 0##, then:

##\lambda =n## for ##n\in \mathbb{Z}^{+}##, where I suspect these are all the positive eigenvalues.

And accordingly, I assume that ##y(x)=A\cos(\lambda x)+B\sin(\lambda x)## are eigenfunctions, where for each ##\lambda## the dimension of the eigenspace increases by ##1##; therefore the dimension of the eigenspace is ##n##.

(or do I have to consider ##\lambda =0## here too?)

d) Hmmm, if ##\lambda <0## then the differential equation ##y''+\lambda y=0## changes to ##y''-\lambda y=0##.

Now the ##y## that solves the equation above is ##y(x)=Ae^{\sqrt{\lambda }x}+Be^{-\sqrt{\lambda }x}##.

Again, bearing in mind that ##y(-\pi )=y(\pi )## leaves me with

##(B-A)(e^{-\sqrt{\lambda }\pi }-e^{\sqrt{\lambda }\pi })=0##, which is only true if ##(B-A)=0##.

Therefore the answer is NO, ##L## does not have negative eigenvalues.
 
brkomir said:

Homework Statement


We have a linear differential operator ##Ly=-y''## acting on all ##y## that are at least twice differentiable on ##[-\pi ,\pi ]## and satisfy ##y(-\pi )=y(\pi )## and ##y'(-\pi )=y'(\pi )##.
a) Is ##0## an eigenvalue of ##L##?
b) Is ##L## symmetric? (I think the right English term is self-adjoint.)
c) Find all positive eigenvalues and the corresponding eigenfunctions. What is the dimension of the eigenspace?
d) Does ##L## have any negative eigenvalues?


Homework Equations





The Attempt at a Solution



a) No idea how to do this one. Here's my version:

##Ly=-y''=\lambda y##, which gives me ##y''+\lambda y=0##. The solution of this differential equation is obviously ##y(x)=A\cos(\sqrt{\lambda }x)+B\sin(\sqrt{\lambda }x)##.

Using this we find out that the boundary conditions for ##A\neq 0## and ##B\neq 0## are fulfilled only when ##\lambda =0##. Therefore the answer to question a) is YES.

The correct observation is that setting ##\lambda = 0## gives you ##y(x) = A \cos 0 + B \sin 0 = A##. But why not just solve ##Ly = -y'' = 0## directly to get ##y(x) = Cx + D##, where ##y(-\pi) = y(\pi)## requires ##C = 0## but ##D## can be anything?
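If you want to double-check that, here is a minimal sympy sketch (the library choice and the variable names are mine, not part of the problem) that solves ##-y''=0## and imposes ##y(-\pi)=y(\pi)##:

```python
# Sketch: solve -y'' = 0 and impose the periodic condition y(-pi) = y(pi).
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# General solution of -y'' = 0 is C1 + C2*x
sol = sp.dsolve(sp.Eq(-y(x).diff(x, 2), 0), y(x)).rhs
C1, C2 = sorted(sol.free_symbols - {x}, key=lambda s: s.name)

# y(-pi) = y(pi) forces the coefficient of x to vanish; the constant stays free
bc = sp.Eq(sol.subs(x, -sp.pi), sol.subs(x, sp.pi))
print(sp.solve(bc, C2))  # -> [0], so y = C1: any constant is an eigenfunction for eigenvalue 0
```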

b)

##L## is symmetric if ##[P\,W(y,z)]\mid _a^b =0##. Note that here the DE is written in the form ##P(x)y''+Q(x)y'+R(x)y=0##, and ##W(y,z)## is the Wronskian determinant of two solutions ##y## and ##z##.

Obviously ##P=-1##.

##[P\,W(y,z)]\mid _a^b =\left[-\begin{vmatrix} y & z\\ y' & z' \end{vmatrix}\right]\mid _a^b##

After a short calculation, and knowing that ##a=-\pi## and ##b=\pi##, we find that the expression above is equal to ##0##, and therefore the linear operator ##L## is symmetric.

##L## is self-adjoint with respect to an inner product, presumably ##\langle f,g \rangle = \int_{-\pi}^{\pi} f(x)g(x)\,dx## if we're dealing only with real-valued twice-differentiable functions, if and only if ##\langle Lf, g \rangle = \langle f, Lg \rangle## for all relevant ##f## and ##g##. So you need to show that for all twice-differentiable ##f## and ##g## which satisfy the boundary conditions, you have ##\int_{-\pi}^{\pi} -f''(x) g(x) \,dx = \int_{-\pi}^{\pi} -f(x) g''(x) \,dx##.

c)

##Ly=-y''=\lambda^2 y##, where I used the notation ##\lambda^2## for the eigenvalue instead of ##\lambda## just because I don't want to write square roots every time.

Best then to set ##\lambda = k^2## for ##k > 0##.

Anyhow, the solution to this DE is ##y(x)=A\cos(\lambda x)+B\sin(\lambda x)##. Using the boundary conditions:

##y(\pi )-y(-\pi )=2B\sin(\lambda \pi )=0##, and of course for non-trivial solutions ##B\neq 0##, then:

##\lambda =n## for ##n\in \mathbb{Z}^{+}##, where I suspect these are all the positive eigenvalues.

And accordingly, I assume that ##y(x)=A\cos(\lambda x)+B\sin(\lambda x)## are eigenfunctions, where for each ##\lambda## the dimension of the eigenspace increases by ##1##; therefore the dimension of the eigenspace is ##n##.

The eigenvalue is now ##\lambda^2## rather than ##\lambda##, so the eigenvalues are ##\lambda_n^2 = n^2 > 0## for ##n \in \mathbb{Z}^{+}##.

Each eigenspace is two-dimensional: the linearly independent eigenfunctions corresponding to the eigenvalue ##n^2## are ##\cos nx## and ##\sin nx##.
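A quick symbolic check of both claims (just a sympy sketch I'm adding for illustration; nothing in it beyond the operator and the boundary conditions comes from the problem itself):

```python
# Sketch: check that cos(n x) and sin(n x) are eigenfunctions of L y = -y''
# with eigenvalue n^2, and that both satisfy the periodic boundary conditions.
import sympy as sp

x = sp.symbols('x')
n = sp.symbols('n', integer=True, positive=True)

for f in (sp.cos(n*x), sp.sin(n*x)):
    Lf = -sp.diff(f, x, 2)
    print(sp.simplify(Lf - n**2 * f))                            # -> 0, so L f = n^2 f
    print(sp.simplify(f.subs(x, sp.pi) - f.subs(x, -sp.pi)))     # -> 0, y(pi) = y(-pi)
    df = sp.diff(f, x)
    print(sp.simplify(df.subs(x, sp.pi) - df.subs(x, -sp.pi)))   # -> 0, y'(pi) = y'(-pi)
```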

(or do I have to consider ##\lambda =0## here too?)

d) Hmmm, if ##\lambda <0## then the differential equation ##y''+\lambda y=0## changes to ##y''-\lambda y=0##.

If you want to do that, then you need to write ##y'' + \lambda y = y'' - |\lambda|y = 0##. It would be easier to define ##\lambda = -k^2## for ##k > 0##.

Now the ##y## that solves the equation above is ##y(x)=Ae^{\sqrt{\lambda }x}+Be^{-\sqrt{\lambda }x}##.

Again, bearing in mind that ##y(-\pi )=y(\pi )## leaves me with

##(B-A)(e^{-\sqrt{\lambda }\pi }-e^{\sqrt{\lambda }\pi })=0##, which is only true if ##(B-A)=0##.

Therefore the answer is NO, ##L## does not have negative eigenvalues.

So far you've only shown that you must have ##B = A##; you have yet to show that ##B = A = 0##.
 
pasmith said:
The correct observation is that setting ##\lambda = 0## gives you ##y(x) = A \cos 0 + B \sin 0 = A##. But why not just solve ##Ly = -y'' = 0## directly to get ##y(x) = Cx + D##, where ##y(-\pi) = y(\pi)## requires ##C = 0## but ##D## can be anything?

That is a lot easier, I agree. So the conclusion is that if ##\lambda =0## we can find a function that satisfies the boundary conditions. In this case ##y=D## is a constant function, therefore ##\lambda## can indeed be ##0##.

b) Is ##L## self-adjoint?

##\langle f,g\rangle =\int _{-\pi }^{\pi }f(x)g(x)\,dx## for real-valued functions.

##\langle Lf,g\rangle =\langle f,Lg\rangle##

##\int _{-\pi }^{\pi }-f''g\,dx=\int _{-\pi }^{\pi }-fg''\,dx##

##\int _{-\pi }^{\pi }(fg''-f''g)\,dx=0## This still has to be shown: we are trying to prove that the LHS of this equation is ##0##.

##\int _{-\pi }^{\pi }(fg''-f''g)\,dx=\int _{-\pi }^{\pi }[(g'f)'-f'g'-(f'g)'+f'g']\,dx=[g'f-f'g]\mid _{-\pi }^{\pi }##

##[g'f-f'g]\mid _{-\pi }^{\pi }=0## for the given boundary conditions. Nice, so yes, ##L## is self-adjoint.
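As an extra sanity check (a small sympy sketch with one pair of test functions I picked myself; the pair is arbitrary as long as both functions satisfy the boundary conditions):

```python
# Sketch: verify <Lf, g> = <f, Lg> for one concrete pair of 2*pi-periodic test functions.
import sympy as sp

x = sp.symbols('x')
f = sp.cos(2*x) + sp.sin(x)   # arbitrary test functions satisfying the
g = sp.cos(2*x)               # periodic boundary conditions on [-pi, pi]

lhs = sp.integrate(-sp.diff(f, x, 2) * g, (x, -sp.pi, sp.pi))  # <Lf, g>
rhs = sp.integrate(-f * sp.diff(g, x, 2), (x, -sp.pi, sp.pi))  # <f, Lg>
print(lhs, rhs, sp.simplify(lhs - rhs))  # both 4*pi, difference 0
```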

c)

pasmith said:
Best then to set ##\lambda = k^2## for ##k > 0##.

Ok, I will do that.

pasmith said:
The eigenvalue is now ##\lambda^2## rather than ##\lambda##, so the eigenvalues are ##\lambda_n^2 = n^2 > 0## for ##n \in \mathbb{Z}^{+}##.

Each eigenspace is two-dimensional: the linearly independent eigenfunctions corresponding to the eigenvalue ##n^2## are ##\cos nx## and ##\sin nx##.

So for ##n## eigenfunctions the eigenspace is ##2n##-dimensional? I assume that is because ##\sin## and ##\cos## are already linearly independent, therefore for each ##\lambda \neq 0## I get two additional linearly independent functions.

But is it really ##\lambda _n^2=n^2##? Because, if I am not mistaken, for ##\lambda = k^2## with ##k > 0## I get

##2B\sin(k\pi )=0##, therefore ##k=n## and, if anything, then ##\lambda _n=k^2=n^2##. Or not?

d)

pasmith said:
So far you've only shown that you must have ##B = A##; you have yet to show that ##B = A = 0##.

The second part of course comes from the second condition, ##y'(-\pi )=y'(\pi )##, which gives me ##(A+B)(e^{\sqrt{\lambda }\pi }-e^{-\sqrt{\lambda }\pi })=0##.

Now we have both conditions saying that ##A=B## and ##A=-B##, which is only possible if ##A=B=0##.
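For what it's worth, a short sympy sketch (using the ##\lambda=-k^2##, ##k>0## substitution suggested above; the script itself is just my own illustration) confirms that the two conditions together leave only the trivial solution:

```python
# Sketch: for lambda = -k^2 < 0 write y = A*exp(k*x) + B*exp(-k*x) and impose
# the periodic conditions y(-pi) = y(pi) and y'(-pi) = y'(pi).
import sympy as sp

x = sp.symbols('x')
k = sp.symbols('k', positive=True)
A, B = sp.symbols('A B')

y = A*sp.exp(k*x) + B*sp.exp(-k*x)
dy = sp.diff(y, x)

eqs = [sp.Eq(y.subs(x, -sp.pi), y.subs(x, sp.pi)),
       sp.Eq(dy.subs(x, -sp.pi), dy.subs(x, sp.pi))]
print(sp.solve(eqs, [A, B]))  # only the trivial solution A = B = 0, so no negative eigenvalues
```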
 