Uniqueness Theorem for homogeneous linear ODEs

Crosson
Consider the system of linear differential equations:

X' = AX, where X is a column vector (of functions) and A is the coefficient matrix. We could just as well consider the specific first order case: y'(x) = C(x)y.
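For concreteness, in the 2x2 case this notation unpacks to
$$\begin{pmatrix} x_1' \\ x_2' \end{pmatrix} = \begin{pmatrix} a_{11}(t) & a_{12}(t) \\ a_{21}(t) & a_{22}(t) \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix},$$
that is, a pair of coupled first order equations in x_1(t) and x_2(t).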

We know that the set of solutions S will be a subset of the vector space of continuous functions. The function f(x) = 0 (the additive identity) is contained in S. Any scalar multiple of an element of S is also in S, as is any linear combination of elements (all due to the linearity of the differential operator). Therefore, because S is a subset of the space of continuous functions and is closed under addition and scalar multiplication, S is itself a vector space.
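To spell out the linearity step in the first order case: writing the equation as L[y] = 0 with L[y] = y' - C(x)y, we have
$$L[af + bg] = (af + bg)' - C(x)(af + bg) = a\bigl(f' - C(x)f\bigr) + b\bigl(g' - C(x)g\bigr) = aL[f] + bL[g],$$
so any linear combination of solutions is again a solution.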

What is the dimension of S? The dimension of S is the number of components of the column vector X in X' = AX, so a second order equation (rewritten as a first order system) has a solution space of dimension two, and so on. Therefore, the solution space of an nth order ODE is spanned by a basis of n linearly independent vectors.
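For example, the second order equation y'' + p(x)y' + q(x)y = 0 becomes a two-component system by taking X = (y, y')^T:
$$X = \begin{pmatrix} y \\ y' \end{pmatrix}, \qquad X' = \begin{pmatrix} 0 & 1 \\ -q(x) & -p(x) \end{pmatrix} X,$$
so its solution space has dimension two.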

Then if we find two solutions F(x) and G(x) of a second order equation, and we can show that they pass the Wronskian test for linear independence, is this sufficient to show that:

S = {aF(x) + bG(x) : a, b ∈ R}

And thereby show uniqueness?
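For a concrete instance of the test: take y'' + y = 0 with F(x) = cos x and G(x) = sin x. Then
$$W(F, G)(x) = \begin{vmatrix} \cos x & \sin x \\ -\sin x & \cos x \end{vmatrix} = \cos^2 x + \sin^2 x = 1 \neq 0,$$
so the pair is linearly independent, and granting dim S = 2, the general solution is a cos x + b sin x.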
 
Crosson said:
is this sufficient

Yeap.
 
Crosson said:
Consider the system of linear differential equations:

X' = AX, where X is a column vector (of functions) and A is the coefficient matrix. We could just as well consider the specific first order case: y'(x) = C(x)y.

We know that the set of solutions S will be a subset of the vector space of continuous functions. The function f(x) = 0 (the additive identity) is contained in S. Any scalar multiple of an element of S is also in S, as is any linear combination of elements (all due to the linearity of the differential operator). Therefore, because S is a subset of the space of continuous functions and is closed under addition and scalar multiplication, S is itself a vector space.

What is the dimension of S? The dimension of S is the number of components of the column vector X in X' = AX, so a second order equation (rewritten as a first order system) has a solution space of dimension two, and so on. Therefore, the solution space of an nth order ODE is spanned by a basis of n linearly independent vectors.
But the proof of that requires "existence and uniqueness". Once you have the d.e. written X' = AX, you can use the standard existence and uniqueness proof for first order differential equations (after proving that it extends to vectors, of course).
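A sketch of how existence and uniqueness pin down dim S = n: consider the evaluation map sending a solution of X' = AX to its initial value,
$$\Phi : S \to \mathbb{R}^n, \qquad \Phi(X) = X(x_0).$$
Φ is linear; existence says it is onto (every initial vector is attained by some solution), and uniqueness says it is one-to-one (two solutions agreeing at x_0 coincide). Hence Φ is an isomorphism and dim S = n.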

 
I don't think he can answer... he's banned! lol
 
Reb said:
I don't think he can answer... he's banned! lol

...and this question is over 4 years old!
 
robphy said:
...and this question is over 4 years old!

I kind of like replying to old but interesting questions.

As I answered an earlier accusation, it's like going on a date with a middle-aged virgin. :P
 