No. of solutions of 1st and 2nd order ODE

  • #1
Trave11er

Main Question or Discussion Point

Well, surely there is one unique solution to a linear 1st order ODE and two linearly independent ones for a 2nd order linear ODE, but can someone share the proof of this?
 

Answers and Replies

  • #2
HallsofIvy
Science Advisor
Homework Helper
It seems to me I just did this on a different forum! Did you ask there also?

Strictly speaking, it is not necessarily true: you have to assume nice properties of the coefficients, which may be functions of x. In particular, we need the coefficients to be continuous and the leading coefficient nonzero. You also need the equation to be "homogeneous"; that is, there are no terms in the independent variable, x, that do not multiply the dependent variable, y, or one of its derivatives.
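For example, if the leading coefficient is allowed to vanish, the count can fail: on the whole real line the first order equation xy'- 2y= 0 has the two linearly independent solutions
[tex]y(x)= \begin{cases}x^2, & x\ge 0\\ 0, & x< 0\end{cases} \qquad\text{and}\qquad y(x)= \begin{cases}0, & x\ge 0\\ x^2, & x< 0\end{cases}[/tex]
so its solution space is two-dimensional, not one-dimensional.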

Assuming that, a first order linear equation can be written dy/dx= f(x, y). As long as f is continuous in x and "Lipschitz" in y, the existence and uniqueness theorem applies ("Lipschitz" is midway between "continuous" and "differentiable", so many texts take "differentiable" as a sufficient condition). Requiring the coefficients to be continuous and the leading coefficient nonzero guarantees that.
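For instance, the homogeneous first order equation py'+ qy= 0, with p(x) never 0, has exactly that form:
[tex]\frac{dy}{dx}= f(x, y)= -\frac{q(x)}{p(x)}\, y,[/tex]
and on a closed interval where q/p is continuous we get |f(x, u)- f(x, v)| ≤ M|u- v| with M the maximum of |q/p|, which is the Lipschitz condition in y.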

Now, it is easy to show that the set of all solutions to a linear, homogeneous differential equation forms a vector space. To do that, assume f and g are solutions, put y(x)= af(x)+ bg(x), where a and b are numbers, into the equation, and then, using (af+ bg)'= af'+ bg', etc., show that everything separates into a(... terms in f...)+ b(... terms in g...)= a(0)+ b(0)= 0.
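Spelled out for the first order equation py'+ qy= 0, that computation is just
[tex]p(af+ bg)'+ q(af+ bg)= a(pf'+ qf)+ b(pg'+ qg)= a(0)+ b(0)= 0.[/tex]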

Now, as long as the coefficients are as I have said, we can use the "existence and uniqueness" theorem to show that there exists a unique solution y1 to the equation satisfying y1(a)= 1 for any specific a. Now, let y be any solution to the first order equation and let Y= y(a). It is easy to see that the function Yy1(x) satisfies the same differential equation: p(Yy1)'+ q(Yy1)= Y(py1'+ qy1)= Y(0)= 0, and it takes the same value at a, since Yy1(a)= Y(1)= y(a). But the "uniqueness" theorem then requires that y(x)= Yy1(x).
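As a concrete illustration, take y'+ y= 0 with a= 0. The solution with y1(0)= 1 is
[tex]y_1(x)= e^{-x},[/tex]
and any solution y satisfies y(x)= y(0)e^{-x}= Yy1(x), so the solution space is one-dimensional.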

Similarly with second order differential equations, with some variations. Given the second order linear, homogeneous d.e. py''+ qy'+ ry= 0, with p, q, and r continuous functions and p nonzero, we can let Y1= y', Y2= y and write this as a pair of first order equations. Since Y1= y', y''= Y1', so we have pY1'+ qY1+ rY2= 0 and Y2'= Y1. We can then write that as a "vector" equation, taking our vector as Y= (Y1, Y2), so that the two equations become the single vector equation [tex]\frac{dY}{dx}= \frac{d}{dx}\begin{pmatrix}Y_1 \\ Y_2\end{pmatrix}= \begin{pmatrix}-q(x)/p(x) & -r(x)/p(x) \\ 1 & 0\end{pmatrix}\begin{pmatrix}Y_1 \\ Y_2\end{pmatrix}[/tex]
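For a concrete case, y''+ y= 0 (p= 1, q= 0, r= 1) becomes, with Y1= y' and Y2= y,
[tex]\frac{d}{dx}\begin{pmatrix}Y_1 \\ Y_2\end{pmatrix}= \begin{pmatrix}0 & -1 \\ 1 & 0\end{pmatrix}\begin{pmatrix}Y_1 \\ Y_2\end{pmatrix}.[/tex]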

One can, without too much trouble, cast the "existence and uniqueness" theorem for first order problems into "vector" form; the proof follows the same logic. The main difference is that the "additional condition" you need for uniqueness becomes
[tex]Y(a)= \begin{pmatrix}Y_1(a) \\ Y_2(a)\end{pmatrix}= \begin{pmatrix}y'(a) \\ y(a)\end{pmatrix}= \begin{pmatrix}B \\ A\end{pmatrix},[/tex]
the standard "initial value" condition y(a)= A, y'(a)= B for such a problem.

Now, one can show that there exists a unique solution y1 to the differential equation satisfying y1(a)= 1, y1'(a)= 0, and another solution y2 satisfying y2(a)= 0, y2'(a)= 1.
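In the example y''+ y= 0 with a= 0, these two basis solutions are
[tex]y_1(x)= \cos x \quad (y_1(0)= 1,\ y_1'(0)= 0), \qquad y_2(x)= \sin x \quad (y_2(0)= 0,\ y_2'(0)= 1).[/tex]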

Finally, given any solution Y(x) to the differential equation, let A= Y(a) and B= Y'(a), and show that y(x)= Ay1(x)+ By2(x) satisfies the differential equation with y(a)= A, y'(a)= B, so that, by "uniqueness", Y(x)= Ay1(x)+ By2(x).
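
If you want to see this numerically rather than prove it, here is a quick sketch (assuming you have numpy and scipy; the equation y''+ y= 0, the interval, and the numbers A, B are just illustrative choices). It integrates the first order system, builds y1 and y2 from the two initial conditions above, and checks that a solution with arbitrary initial data agrees with Ay1+ By2 up to integration error:
[code]
# Numerical illustration (not a proof): the solution space of y'' + y = 0
# is two-dimensional, spanned by the solutions with initial data (1, 0) and (0, 1).
import numpy as np
from scipy.integrate import solve_ivp

def rhs(x, Y):
    # First order system for y'' + y = 0, using the ordering Y = (y, y').
    y, yp = Y
    return [yp, -y]

xs = np.linspace(0.0, 10.0, 201)

# Basis solutions: y1 with y(0)=1, y'(0)=0 and y2 with y(0)=0, y'(0)=1.
y1 = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0], t_eval=xs, rtol=1e-9, atol=1e-9).y[0]
y2 = solve_ivp(rhs, (0.0, 10.0), [0.0, 1.0], t_eval=xs, rtol=1e-9, atol=1e-9).y[0]

# Any other solution, here with y(0)=A, y'(0)=B, should equal A*y1 + B*y2.
A, B = 2.5, -0.7
y = solve_ivp(rhs, (0.0, 10.0), [A, B], t_eval=xs, rtol=1e-9, atol=1e-9).y[0]

print(np.max(np.abs(y - (A * y1 + B * y2))))  # tiny (integration error only)
[/code]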


A general extension of that, to n dimensions, shows that the set of all solutions to an nth order linear, homogeneous d.e. forms an n-dimensional vector space, so every solution can be written as a linear combination of n linearly independent solutions.
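
Explicitly (a sketch, keeping the same convention of listing the highest derivative first): for p_ny^(n)+ ...+ p_1y'+ p_0y= 0 with p_n never 0, take Y= (y^(n-1), ..., y', y); then the equation becomes the vector equation Y'= M(x)Y with
[tex]M(x)= \begin{pmatrix} -p_{n-1}/p_n & -p_{n-2}/p_n & \cdots & -p_1/p_n & -p_0/p_n \\ 1 & 0 & \cdots & 0 & 0 \\ 0 & 1 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 1 & 0 \end{pmatrix},[/tex]
and the n solutions whose initial vectors Y(a) are the standard basis vectors are linearly independent and span the solution space.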
 
