gomunkul51 said:
This was the question:
I. Prove that if y_1 is a solution of the ODE with y_1(0) = y_1'(0) = 0,
then y_1 = 0 for all x ∈ (-1, 1).
Can someone non-cryptically explain why this is so? :)
HallsofIvy, thank you for your response, but it's a semi-theoretical question that I'm trying to understand; it's not homework. Also, I'm done with my ODE course,
and if anyone is kind enough I would love to have a full explanation.
So what you are really asking for is a
proof of the "existence and uniqueness" theorem! You would start by proving:
If f(x,y) is continuous in both variables and "Lipschitz" in y (differentiable with respect to y is sufficient but not necessary) in some neighborhood of (x_0, y_0), then there exists a unique function y(x) satisfying dy/dx = f(x,y) with y(x_0) = y_0.
That's a fairly complicated theorem in its own right, but any good differential equations textbook should give it. Look for "existence" or "uniqueness" in the index.
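The idea of the proof is Picard iteration: rewrite the initial value problem as the integral equation y(x) = y_0 + \int_{x_0}^x f(t, y(t)) dt and show that, thanks to the Lipschitz condition, repeatedly applying the right-hand side is a contraction, so it has exactly one fixed point. Here is a minimal numerical sketch of that iteration; the grid, the trapezoid quadrature, and the test equation dy/dx = y are my own choices, just for illustration:

```python
import numpy as np

# Picard iteration for dy/dx = f(x, y), y(0) = y0, on a grid over [0, 1].
# Each step maps phi to y0 + integral_0^x f(t, phi(t)) dt; the Lipschitz
# condition on f is what makes this map a contraction in the actual proof.
def picard(f, y0, x, n_iter=10):
    phi = np.full_like(x, y0, dtype=float)  # start from the constant guess y0
    for _ in range(n_iter):
        g = f(x, phi)
        # cumulative trapezoid rule: integral of f(t, phi(t)) from 0 to each x
        integral = np.concatenate(([0.0],
            np.cumsum(0.5 * (g[1:] + g[:-1]) * np.diff(x))))
        phi = y0 + integral
    return phi

x = np.linspace(0.0, 1.0, 201)
approx = picard(lambda t, y: y, 1.0, x)    # dy/dx = y, y(0) = 1
print(np.max(np.abs(approx - np.exp(x))))  # small: iterates converge to e^x
```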
To extend that to second order equations, you do this:
Let u = y' so that u' = y'', and the equation y'' + p(x)y' + q(x)y = 0 becomes u' + p(x)u + q(x)y = 0, or u' = -p(x)u - q(x)y. Together with y' = u, that gives you two coupled first-order differential equations. Now let
Y(x)= \begin{pmatrix}u(x) \\ y(x)\end{pmatrix}
so we can write
\begin{pmatrix}u'(x)\\ y'(x)\end{pmatrix}= Y'= \begin{pmatrix}-p(x) & -q(x) \\1 & 0\end{pmatrix}\begin{pmatrix}u \\ y\end{pmatrix}= AY
so the second order differential equation becomes a single first-order equation in the vector-valued function Y(x).
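To see the reduction in action, here is a short sketch using scipy's general-purpose initial value solver; the concrete coefficients p(x) = 0, q(x) = 1 (so the equation is y'' + y = 0, with known solution sin x for these initial data) are my own example, not anything from the thread:

```python
import numpy as np
from scipy.integrate import solve_ivp

# The substitution u = y' turns y'' + p(x) y' + q(x) y = 0 into the
# first-order system Y' = A(x) Y with Y = (u, y).
def p(x): return 0.0   # example coefficient (assumption for illustration)
def q(x): return 1.0   # example coefficient (assumption for illustration)

def rhs(x, Y):
    u, y = Y
    return [-p(x) * u - q(x) * y,  # u' = -p(x) u - q(x) y
            u]                     # y' = u

# Initial data y(0) = 0, y'(0) = u(0) = 1: the exact solution is sin(x).
sol = solve_ivp(rhs, (0.0, np.pi), [1.0, 0.0], dense_output=True)
xs = np.linspace(0.0, np.pi, 5)
print(sol.sol(xs)[1])  # second component is y; matches sin(xs) closely
```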
As long as p(x) and q(x) are continuous, the right-hand side A(x)Y is Lipschitz in Y on any closed subinterval, so this equation satisfies the existence and uniqueness theorem above (of course, you have to prove that the theorem extends to vector-valued functions; that's not hard). The condition that y(0) = 0, y'(0) = u(0) = 0 becomes
Y(0)= \begin{pmatrix}u(0) \\ y(0)\end{pmatrix}= \begin{pmatrix}0 \\ 0 \end{pmatrix}.
Since the function y(x) = 0 for all x satisfies y'' + p(x)y' + q(x)y = 0 as well as y(0) = 0, y'(0) = 0, it follows from uniqueness that it is the only function satisfying that equation with those conditions.
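A quick numerical illustration of that uniqueness: start any standard solver at Y(0) = (0, 0) and it never leaves the zero solution. The coefficients p(x) = x, q(x) = 1 below are again my own example:

```python
import numpy as np
from scipy.integrate import solve_ivp

# With y(0) = y'(0) = 0 the only solution is y = 0; check numerically
# for the example equation y'' + x y' + y = 0 (coefficients assumed).
def rhs(x, Y):
    u, y = Y
    return [-x * u - y, u]

sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0])
print(np.max(np.abs(sol.y)))  # 0.0 -- the solver stays on the zero solution
```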
Of more interest, in my opinion, is the fact that there exists a unique function, say y_1(x), satisfying that equation and the conditions y_1(0) = 1, y_1'(0) = 0, and there exists a unique function, y_2(x), satisfying the equation and y_2(0) = 0, y_2'(0) = 1.
Suppose y is any function satisfying the differential equation. Then y(0) is some number, say A, and y'(0) is some number, say B. It follows that y(x) = Ay_1(x) + By_2(x)! Why? Because y'(x) = Ay_1'(x) + By_2'(x) and y''(x) = Ay_1''(x) + By_2''(x), so that

y'' + p(x)y' + q(x)y = (Ay_1'' + By_2'') + p(x)(Ay_1' + By_2') + q(x)(Ay_1 + By_2) = A(y_1'' + p(x)y_1' + q(x)y_1) + B(y_2'' + p(x)y_2' + q(x)y_2) = A(0) + B(0) = 0,

while Ay_1(0) + By_2(0) = A(1) + B(0) = A and Ay_1'(0) + By_2'(0) = A(0) + B(1) = B.
That is, y(x) and Ay_1(x) + By_2(x) satisfy the same differential equation as well as the same "initial conditions" and so, by the uniqueness property, they are equal.
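That superposition argument is easy to check numerically as well: build y_1 and y_2 from their initial conditions and compare Ay_1 + By_2 against the solution started at y(0) = A, y'(0) = B. The coefficients and the values of A and B below are my own example:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Example equation y'' + x y' + y = 0 (coefficients assumed for illustration).
def rhs(x, Y):
    u, y = Y
    return [-x * u - y, u]  # u' = -p(x) u - q(x) y, y' = u

xs = np.linspace(0.0, 1.0, 50)

def solve(y0, u0):
    # state vector is (u, y), so the initial value is [u0, y0]; return y on xs
    return solve_ivp(rhs, (0.0, 1.0), [u0, y0], dense_output=True).sol(xs)[1]

A, B = 2.0, -3.0
y1 = solve(1.0, 0.0)  # y_1(0) = 1, y_1'(0) = 0
y2 = solve(0.0, 1.0)  # y_2(0) = 0, y_2'(0) = 1
y  = solve(A, B)      # y(0) = A,  y'(0) = B
print(np.max(np.abs(y - (A * y1 + B * y2))))  # tiny: y = A*y1 + B*y2 up to solver tolerance
```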
Further, it is easy to show that y_1(x) and y_2(x) are "independent": if Ay_1(x) + By_2(x) = 0 (meaning equal to 0 for all x), then taking x = 0 gives A(1) + B(0) = 0, so A = 0. Differentiating both sides gives Ay_1'(x) + By_2'(x) = 0, and again taking x = 0, A(0) + B(1) = 0, so B = 0.
That tells us that y_1(x) and y_2(x) form a basis for the set of all solutions to that equation, and so the set of all solutions is a vector space of dimension 2.
That can be extended to show that the set of all solutions to an n^{th} order, linear, homogeneous differential equation is a vector space of dimension n.
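For the record, the extension uses the same substitution: stack y and its first n-1 derivatives into a vector (written here with the highest derivative on top, to match the 2x2 case above; spelling this out is my own addition). For the equation y^{(n)} + p_{n-1}(x)y^{(n-1)} + \cdots + p_0(x)y = 0, set

Y(x)= \begin{pmatrix} y^{(n-1)} \\ \vdots \\ y' \\ y \end{pmatrix}, \qquad Y'= \begin{pmatrix} -p_{n-1}(x) & \cdots & -p_1(x) & -p_0(x) \\ 1 & \cdots & 0 & 0 \\ \vdots & \ddots & \vdots & \vdots \\ 0 & \cdots & 1 & 0 \end{pmatrix} Y

The vector existence and uniqueness theorem then gives n solutions y_1, ..., y_n, one for each standard-basis choice of initial data at x = 0, and the same arguments as above show they are independent and span all solutions.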