Proving the Constant Wronskian Theorem for Scalar ODEs

heinerL
Hello

I'm trying to solve the following problem: given the scalar ODE x''+q(t)x=0 with a continuous function q.

x(t) and y(t) are two solutions of the ODE, and the Wronskian is:

W(t) := x(t)y'(t) - x'(t)y(t). x(t) and y(t) are linearly independent if W(t) \neq 0.

I want to show that W(t) is constant, and that if x(t_1) = 0 then x'(t_1) \neq 0 and y(t_1) \neq 0 (for linearly independent x and y).

I am completely lost, can you help me?

Thx
 
Differentiate x(t)y'(t) - x'(t)y(t):
x'y' + xy'' - x''y - x'y' = xy'' - x''y
Now, y' = -qy and x'' = -qx so that becomes x(-qy) - (-qx)y = 0 for all t. That implies that the Wronskian itself is a constant.
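
(Not part of the original exchange: a quick numerical sanity check of the constancy. The coefficient q(t) = 1 + t^2 and the initial data below are arbitrary choices made just for this sketch; any continuous q and any two solutions will do.)

import numpy as np
from scipy.integrate import solve_ivp

def q(t):
    # illustrative continuous coefficient; any continuous q works
    return 1.0 + t**2

def rhs(t, u):
    # u = [x, x']; first-order system equivalent to x'' + q(t) x = 0
    return [u[1], -q(t) * u[0]]

t = np.linspace(0.0, 5.0, 200)
sol_x = solve_ivp(rhs, (t[0], t[-1]), [1.0, 0.0], t_eval=t, rtol=1e-10, atol=1e-12)
sol_y = solve_ivp(rhs, (t[0], t[-1]), [0.0, 1.0], t_eval=t, rtol=1e-10, atol=1e-12)

x, xp = sol_x.y
y, yp = sol_y.y
W = x * yp - xp * y        # W(t) = x y' - x' y
print(W.max() - W.min())   # ~0 up to integration error: W is constant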
 
HallsofIvy said:
Differentiate x(t)y'(t) - x'(t)y(t):
x'y' + xy'' - x''y - x'y' = xy'' - x''y
Now, y' = -qy and x'' = -qx so that becomes x(-qy) - (-qx)y = 0 for all t. That implies that the Wronskian itself is a constant.

Do you mean y''=-qy and x''=-qx? How do you get this?

And for the second part, can I say: x(t) and y(t) are linearly independent if W(t) \neq 0 for all t.
So let's say x(t) and y(t) are linearly independent and x(t_1) = 0 \Rightarrow W(t_1) = -x(t_1)y(t_1), and suppose x'(t_1) = 0 \Rightarrow W(t_1) = 0, which is a contradiction. That means that if x(t_1) = 0 then x'(t_1) \neq 0 and y(t_1) \neq 0.

Is this correct?
 
heinerL said:
Do you mean y''=-qy and x''=-qx? How do you get this?

Yeah, this is what Halls meant. It's true, of course, because x and y both satisfy the differential equation: plugging each into x'' + q(t)x = 0 gives x'' = -qx and y'' = -qy.

Your argument for the second part is correct, except that where you wrote W(t_1) = -x(t_1)y(t_1) you meant W(t_1) = -x'(t_1)y(t_1).
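
Written out in full, using x(t_1) = 0 and the constancy of W, the corrected computation is:

W(t_1) = x(t_1)y'(t_1) - x'(t_1)y(t_1) = 0 \cdot y'(t_1) - x'(t_1)y(t_1) = -x'(t_1)y(t_1).

If x'(t_1) = 0 or y(t_1) = 0, this gives W(t_1) = 0; since W is constant, W \equiv 0 everywhere, contradicting the linear independence of x and y. Hence x'(t_1) \neq 0 and y(t_1) \neq 0.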
 
Ah okay, thanks, I get it! And yes, I meant W(t_1) = -x'(t_1)y(t_1).

And I found another statement, but without a proof. Do you know how to prove it?

If x(t_1) = x(t_2) = 0 and x(t) \neq 0 for t \in (t_1, t_2), then it follows that y(t) has exactly one root in (t_1, t_2).
 
I found a way. I'll give a brief sketch for you to work out; a numerical illustration follows the list.
1) Compare the signs of x'(t_1) and x'(t_2).
2) Use this to conclude that y(t) has at least one root in (t_1, t_2).
3) If it had two roots, explain why x(t) would have a root inside (t_1, t_2).
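
(A numerical illustration of the claimed interlacing of zeros, which is Sturm's separation theorem. The coefficient q(t) = 2 + sin t, the initial data, and the grid-based root bracketing are assumptions made just for this sketch.)

import numpy as np
from scipy.integrate import solve_ivp

def q(t):
    return 2.0 + np.sin(t)        # arbitrary continuous coefficient

def rhs(t, u):
    return [u[1], -q(t) * u[0]]   # u = [x, x']

t = np.linspace(0.0, 20.0, 4001)
# two linearly independent solutions: W(0) = 1 != 0
x = solve_ivp(rhs, (t[0], t[-1]), [1.0, 0.0], t_eval=t, rtol=1e-10).y[0]
y = solve_ivp(rhs, (t[0], t[-1]), [0.0, 1.0], t_eval=t, rtol=1e-10).y[0]

def roots(f):
    # midpoints of grid intervals where f changes sign
    i = np.where(f[:-1] * f[1:] < 0)[0]
    return (t[i] + t[i + 1]) / 2

print("roots of x:", np.round(roots(x), 2))
print("roots of y:", np.round(roots(y), 2))
# merging the two lists shows they strictly alternate:
# exactly one root of y between consecutive roots of x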
 
So I'll try to write down my thoughts:

t_1 and t_2 are roots of x.
So
W(t_k) = -x'(t_k)y(t_k) \neq 0 for k = 1, 2 \Rightarrow y(t_k), x'(t_k) \neq 0.

Suppose x'(t_1) > 0 (if not, replace x by -x). Then x(t) > 0 just to the right of t_1, and since x has no root in (t_1, t_2), continuity gives x(t) > 0 on all of (t_1, t_2).
Then if x'(t_2) > 0, x(t) < 0 just to the left of t_2, which is a contradiction, because x would have another root between t_1 and t_2. Therefore x'(t_2) < 0.

Since W is constant, -x'(t_1)y(t_1) = W = -x'(t_2)y(t_2), so x'(t_1)y(t_1) = x'(t_2)y(t_2). With x'(t_1) > 0 and x'(t_2) < 0, this forces y(t_1) and y(t_2) to have opposite signs.

So y has a root in (t_1, t_2) by the intermediate value theorem.

And there is exactly one root, because if y had two roots in (t_1, t_2), one could switch the roles of x and y and use the argument above to show that x has a root between them, contradicting x(t) \neq 0 on (t_1, t_2).

Is this correct?
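
For reference, here is the whole chain in one display, with the step that uses constancy of W made explicit (assuming, without loss of generality, x > 0 on (t_1, t_2); recall that at a root of x, W = -x'y):

x(t_1) = x(t_2) = 0, \quad x > 0 \text{ on } (t_1, t_2)
\Rightarrow x'(t_1) \geq 0 \text{ and } x'(t_2) \leq 0, \text{ both nonzero since } W \neq 0
\Rightarrow x'(t_1) > 0 > x'(t_2)
\Rightarrow x'(t_1)\,y(t_1) = -W = x'(t_2)\,y(t_2) \Rightarrow \operatorname{sign} y(t_1) = -\operatorname{sign} y(t_2)
\Rightarrow y \text{ has a root in } (t_1, t_2).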
 