Something that can't be right about eigenvectors: where is my mistake?

marschmellow

Xs are eigenvectors, lambdas are eigenvalues, and Cs are constants of integration.

If we rewrite a homogeneous higher-order ODE as a first-order matrix equation, we get the first equation in the attached Word document. The solution to that equation is the second equation in the document. If we set the initial conditions so that every constant except C1 is zero, the solution becomes the third equation, with the components of the eigenvector written out and indexed by m. Each component of Y is the derivative of the previous component, which implies that each component of any eigenvector equals the component it follows multiplied by the corresponding eigenvalue. But almost any particular case you try seems to be a counterexample to this claim. I brought this up with my teacher, and he reasoned that it couldn't be true but also couldn't find any mistake in my logic. Where did I go wrong?
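The attachment isn't visible in this copy, so here is my reading of the three equations being referred to (a reconstruction with distinct eigenvalues assumed, not the original document). Converting an n-th order homogeneous ODE into a first-order system gives

$$\mathbf{Y}'=A\mathbf{Y},\qquad \mathbf{Y}=\begin{pmatrix}y\\ y'\\ \vdots\\ y^{(n-1)}\end{pmatrix}.$$

Its general solution is

$$\mathbf{Y}(t)=C_1e^{\lambda_1 t}\mathbf{X}_1+C_2e^{\lambda_2 t}\mathbf{X}_2+\cdots+C_ne^{\lambda_n t}\mathbf{X}_n.$$

With initial conditions chosen so that every constant except C1 vanishes, the m-th component reads

$$Y_m(t)=C_1e^{\lambda_1 t}X_{1,m}.$$

Because Y_{m+1} is the derivative of Y_m, and differentiating this expression just multiplies it by lambda_1, the components would have to satisfy X_{1,m+1} = lambda_1 X_{1,m}, which is the claim at issue.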
 



marschmellow said:
Each component of Y is the derivative of the previous component

Only if you have derived your matrix by taking a differential equation in one variable and converting it to a system of first-order equations. Try to find a counterexample that satisfies the requirements of such a converted matrix. A general matrix, on the other hand, will not give eigenvectors of that form.

*edit*

If the converted matrix is what you are talking about, remember how it was derived: you would expect the eigenvectors of that matrix to behave exactly that way. In practice, though, you don't care, because you only want one component of the solution, the first one, and that component is the solution of your original ODE.
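As a sketch of why the converted matrix forces that structure (using the standard companion-matrix convention, which may differ in ordering or sign from the attachment): for

$$y^{(n)}+a_{n-1}y^{(n-1)}+\cdots+a_1y'+a_0y=0$$

the converted system is Y' = AY with

$$A=\begin{pmatrix}0&1&0&\cdots&0\\ 0&0&1&\cdots&0\\ \vdots&&&\ddots&\vdots\\ 0&0&0&\cdots&1\\ -a_0&-a_1&-a_2&\cdots&-a_{n-1}\end{pmatrix}.$$

The first n-1 rows of (A - lambda I)X = 0 say exactly that X_{m+1} = lambda X_m, so every eigenvector is, up to scaling,

$$\mathbf{X}=\begin{pmatrix}1\\ \lambda\\ \lambda^2\\ \vdots\\ \lambda^{n-1}\end{pmatrix},$$

with lambda a root of the characteristic polynomial (that is what the last row enforces). A matrix without this companion structure has no reason to produce eigenvectors in geometric progression.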
 


Sethric said:
Only if you have derived your matrix by taking a differential equation in one variable and converting it to a system of first-order equations. Try to find a counterexample that satisfies the requirements of such a converted matrix. A general matrix, on the other hand, will not give eigenvectors of that form.

*edit*

If the converted matrix is what you are talking about, remember how it was derived: you would expect the eigenvectors of that matrix to behave exactly that way. In practice, though, you don't care, because you only want one component of the solution, the first one, and that component is the solution of your original ODE.


Hmm, you're right. I just tried a few second-order ODEs out, and the claim did hold. I could have sworn that I had found counterexamples from matrices derived from an ODE, but maybe I didn't; maybe they were just ordinary matrices. Well, thanks for your help!
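A minimal worked instance of such a check (my own example, not taken from the thread): for y'' - 3y' + 2y = 0 the converted matrix is

$$A=\begin{pmatrix}0&1\\-2&3\end{pmatrix},$$

whose eigenvalues are lambda = 1 and lambda = 2, with eigenvectors

$$\begin{pmatrix}1\\1\end{pmatrix}\quad\text{and}\quad\begin{pmatrix}1\\2\end{pmatrix},$$

so in each eigenvector the second component is indeed lambda times the first.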
 


I don't think there is any error in your logic, but "setting the initial conditions such that only C_1 is non-zero" is a rather artificial thing to do.

Your "A matrix" for the DE is not unique. You are choosing one particular A matrix, and then choosing boundary conditions derived from its eigenpairs, so the boundary conditions are by definition "self consistent" with the matrix in the way you describe.

But I don't think that leads to anything very interesting, because you could pick a different A matrix with different eigenpairs.
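A small illustration of that non-uniqueness (my own example, assuming a second-order equation y'' + by' + cy = 0): instead of the state vector (y, y'), one could use Z = (y, 2y'), which satisfies

$$\mathbf{Z}'=\begin{pmatrix}0&\tfrac{1}{2}\\ -2c&-b\end{pmatrix}\mathbf{Z}.$$

This matrix has the same characteristic polynomial, lambda^2 + b lambda + c, so the same eigenvalues, but its eigenvector for lambda is proportional to (1, 2 lambda), and the "each component is lambda times the previous one" property no longer holds even though the matrix describes the same ODE.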
 