Differential Equations and a vector analogy (weird question)


Discussion Overview

The discussion revolves around the relationship between solutions of third order differential equations and concepts from linear algebra, particularly linear independence and the Wronskian. Participants explore the analogy between differential equations and vector spaces, examining how these mathematical structures relate to one another.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • One participant suggests that the Wronskian's non-zero value indicates linear independence of solutions to a differential equation, drawing parallels to the determinant used for vectors.
  • Another participant elaborates on the implications of the Existence and Uniqueness Theorem, stating that if the Wronskian is non-zero, the solutions can generate all possible solutions to the differential equation.
  • A participant acknowledges the connection between linear independence of solutions and the concept of a basis in vector spaces, noting that three fundamental solutions can represent all other solutions.
  • One participant raises a question about the implications of a zero Wronskian, wondering if it means that solutions cannot be definitively classified as linearly dependent.
  • Another participant discusses the analogy of linear independence in two-dimensional vector spaces, suggesting that two solutions are linearly independent unless one is a multiple of the other, similar to vectors.
  • A participant introduces the concept of the volume of an n-dimensional parallelepiped formed by vectors, relating it to the determinant and independence of vectors.
  • One participant reflects on their learning experience with vectors and how it influences their understanding of mathematical concepts.

Areas of Agreement / Disagreement

Participants generally agree on the connections between the concepts of linear independence in differential equations and vector spaces. However, there remains uncertainty regarding the implications of a zero Wronskian and whether it definitively indicates linear dependence or independence.

Contextual Notes

The discussion includes assumptions about the nature of solutions to differential equations and the conditions under which the Wronskian is evaluated. There are unresolved questions regarding the implications of the Wronskian being zero.

1MileCrash
Suppose I have a third order differential equation, and have three solutions, y1, y2, y3.

I can check whether they are linearly independent as follows: if their Wronskian is non-zero, they are linearly independent.

But the Wronskian is just the determinant of a matrix:

y1   y2   y3
y1'  y2'  y3'
y1'' y2'' y3''

This is extremely analogous to vectors, because if I have two vectors in some basis (e1, e2, e3), I can check whether they are linearly independent by taking their cross product. If the cross product is non-zero, the vectors are linearly independent.
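As a quick sanity check of that determinant test, here is a small sympy sketch. The three exponentials (solutions of the illustrative equation y''' - 6y'' + 11y' - 6y = 0, my own choice, not from the thread) have a Wronskian that is never zero:

```python
import sympy as sp

t = sp.symbols('t')
# Three illustrative solutions of y''' - 6y'' + 11y' - 6y = 0
y = [sp.exp(t), sp.exp(2*t), sp.exp(3*t)]

# Wronskian: determinant of the matrix of the functions and their derivatives
W = sp.Matrix([[sp.diff(f, t, k) for f in y] for k in range(3)]).det()
W = sp.simplify(W)
print(W)  # 2*exp(6*t), never zero => the three solutions are independent
```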

But the cross product is just the determinant of a matrix as well:

e1 e2 e3
a1 a2 a3
b1 b2 b3

So if I'm doing the exact same thing, these two ideas must be related, no?

If we let some basis be (y1, y2, y3) (the idea of a basis made of these functions seems weird and kind of circular, since in order to form a basis they would have to be linearly independent in the first place, but I'll keep typing anyway...)

Then showing that the vectors

(y1', y2', y3') and (y1'', y2'', y3'') are linearly independent becomes the exact same task as showing that y1, y2, and y3 are linearly independent solutions to a differential equation. Why?

(I realize that it could be the case that the two "linearly independent" labels are not related for vectors and solutions to DEs in any way, but I chose to assume that there must be some correlation.)
 
You are right, they are closely related. Here is a breakdown of the logical relationship.

1. The space of functions you are talking about is the space of solutions of a homogeneous, linear ordinary differential equation (of order 3 in your example).

2. You have 3 particular solutions and you want to know whether c1y1+c2y2+c3y3 is the "general solution." That means you want to know whether every possible solution can be written this way. In other words, do the functions y1, y2, y3 form a basis for the solutions of the given homogeneous linear ODE?

3. The Existence and Uniqueness Theorem for linear equations says that a solution is uniquely determined by its initial data y(0), y'(0), y''(0). This part is essential. It gives us a way to check whether we have found all possible solutions to the equation. We just have to check whether c1y1+c2y2+c3y3 can cover all possible initial data. In other words, y1, y2, y3 generate all solutions if we can solve for c1, c2, c3 given any values for y(0), y'(0), y''(0).

4. Now, it is a linear algebra problem. For any triple (y_0, y_0', y_0''), can we solve for (c_1,c_2,c_3)?:
\left(\begin{array}{ccc}
y_1(0) & y_2(0) & y_3(0) \\
y_1'(0) & y_2'(0) & y_3'(0) \\
y_1''(0) & y_2''(0) & y_3''(0)
\end{array}\right)
\left(\begin{array}{c} c_1 \\ c_2 \\ c_3 \end{array}\right)
=
\left(\begin{array}{c} y_0 \\ y_0' \\ y_0'' \end{array}\right)

5. The answer is yes, as long as the Wronskian is not zero. We conclude that if the Wronskian is not zero, the functions y1, y2, y3 generate all possible solutions to the equation.
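A minimal numpy sketch of step 4, using the same illustrative solutions e^t, e^{2t}, e^{3t} (my choice) evaluated at t = 0: the Wronskian matrix is invertible, so any initial data determines the coefficients.

```python
import numpy as np

# Wronskian matrix at t=0 for the illustrative solutions e^t, e^{2t}, e^{3t}
M = np.array([[1., 1., 1.],
              [1., 2., 3.],
              [1., 4., 9.]])

y0 = np.array([1., 0., 0.])   # arbitrary initial data y(0), y'(0), y''(0)

assert np.linalg.det(M) != 0  # Wronskian non-zero => system always solvable
c = np.linalg.solve(M, y0)
print(c)  # [ 3. -3.  1.]: the coefficients c1, c2, c3 matching the data
```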

6. It can be shown that the Wronskian has the form W(t) = Ae^{P(t)} for some function P(t) (this is Abel's identity). Because the exponential is never zero, W can only vanish if A = 0. In other words, the Wronskian is either identically zero or never zero. Therefore, if y1, y2, y3 are independent at one point, they are independent at every point.
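A sympy sketch of that exponential form, under my own illustrative choice of a second order equation, y'' - 3y' + 2y = 0 with solutions e^t and e^{2t} (for order 2, Abel's identity reads W' = -p(t)W for y'' + p(t)y' + q(t)y = 0):

```python
import sympy as sp

t = sp.symbols('t')
# Illustrative equation y'' - 3y' + 2y = 0, so p(t) = -3
y1, y2 = sp.exp(t), sp.exp(2*t)

W = sp.simplify(y1*sp.diff(y2, t) - sp.diff(y1, t)*y2)
print(W)  # exp(3*t): the A*e^{P(t)} form with P(t) = 3t, never zero
assert sp.simplify(sp.diff(W, t) - 3*W) == 0  # checks W' = -p*W with p = -3
```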
 
Wow, thanks for the fantastic reply!

I didn't make the very clear connection: if y1, y2, and y3 are linearly independent, then any solution is (or can be expressed as) a linear combination of those three. That is exactly the relationship between a vector basis and any vector in that space.

So these are all the same idea: a group of three fundamental solutions spans a kind of "vector space" containing all other solutions.

That is just awesome.

Also, if I understand step 5 correctly, it seems to imply that if the Wronskian is zero, we can't really say anything about (c1, c2, c3). Does this mean that if the Wronskian is 0, we can't conclude that the solutions are linearly dependent? (The converse is not true?)
 
The reason I used a 3rd order DE is that the cross product is not defined in 2D (so I had no way to relate it to the Wronskian), but now I see that doesn't matter. It's still the same idea.

For the two-dimensional case (a second order DE and a 2D vector space), two vectors in R^2 are always linearly independent provided they are not parallel. There would be an analog when considering y1 and y2 as possibly linearly independent solutions.

The Wronskian being zero would imply that:

y1y2' = y1'y2

which (dividing through) gives

y1'/y1 = y2'/y2

and integrating both sides yields ln|y1| = ln|y2| + C, which implies that:

y1 = ky2

where k is some constant.

Which I would say is just as specific as two vectors being parallel: if two vectors are parallel, it is also true that the ratios of their components are the same.

So for a second order DE, two solutions are almost always linearly independent, unless one of them is a multiple of the other, just like vectors.
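The converse direction of that little derivation is easy to check symbolically. A sympy sketch (the choice y1 = sin t is purely illustrative): if y2 is a constant multiple of y1, the 2x2 Wronskian vanishes identically, just as the cross product of parallel vectors is zero.

```python
import sympy as sp

t, k = sp.symbols('t k')
y1 = sp.sin(t)   # any illustrative function
y2 = k * y1      # a constant multiple, like a parallel vector

# Second order Wronskian: y1*y2' - y1'*y2
W = y1*sp.diff(y2, t) - sp.diff(y1, t)*y2
assert sp.simplify(W) == 0  # vanishes identically for dependent solutions
```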

So many ideas in math are really the same idea with infinitely many applications and interpretations. :)
 
Yeah, there are a lot of connections. Cool, isn't it?

Since you are talking in terms of cross products and parallel vectors, I thought I would mention one other thing in case you didn't know it already. If you have n vectors v1, ..., vn in n-dimensional space (\mathbb{R}^n), then those vectors generate an n-dimensional parallelepiped (like a parallelogram, but n-dimensional). Its volume is equal to (plus or minus) the determinant of the matrix formed by placing the vectors as columns. Those vectors are independent if that volume is not zero. If the volume is 0, it is because the parallelepiped is degenerate and does not fully occupy n dimensions. In other words, the test for independence of vectors is whether the determinant of this matrix is 0 or not.
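A numpy sketch of the volume-as-determinant test (both sets of vectors are my own illustrative choices): a non-degenerate parallelepiped has non-zero volume, while a set containing a dependent vector collapses to zero volume.

```python
import numpy as np

# Columns are the vectors spanning the parallelepiped
V = np.column_stack([[1., 0., 0.],
                     [1., 1., 0.],
                     [1., 1., 1.]])

volume = abs(np.linalg.det(V))
print(volume)  # nonzero (approximately 1.0) => the three vectors are independent

# Degenerate case: third column is the sum of the first two
D = np.column_stack([[1., 0., 0.],
                     [0., 1., 0.],
                     [1., 1., 0.]])
assert abs(np.linalg.det(D)) < 1e-12  # zero volume => dependent vectors
```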

Anyway, that explains why the Wronskian is always just a determinant no matter what the order of the equation.
 
Yes, we used the idea of the parallelepiped for something... I believe the scalar triple product (AxB) * C gave us its volume. Zero volume meant A, B, C are coplanar, which in 3D just means they are not linearly independent.
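That scalar triple product is itself the 3x3 determinant from before; a quick numpy sketch (vectors chosen for illustration, with C deliberately placed in the plane of A and B):

```python
import numpy as np

A = np.array([1., 0., 0.])
B = np.array([0., 1., 0.])
C = np.array([1., 1., 0.])   # lies in the plane of A and B

# Scalar triple product (A x B) . C equals det of the column matrix [A B C]
triple = np.dot(np.cross(A, B), C)
assert np.isclose(triple, np.linalg.det(np.column_stack([A, B, C])))
assert np.isclose(triple, 0.0)  # zero volume: A, B, C are coplanar
```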

We never really went into a parallelepiped of dimensions higher than 3.

That class is really the reason why I tend to see a lot of things in terms of vectors.
 
