Verify that a vector is an eigenvector of a matrix

Summary
To verify whether a vector is an eigenvector of a matrix without calculating eigenvalues, one can perform a single matrix multiplication. Specifically, for a matrix M and vector v, compute Mv and check whether the result is a scalar multiple of v. If Mv equals a scalar times v, that scalar is the eigenvalue associated with v. In the discussed example, Mv is the zero vector, so v is an eigenvector with eigenvalue zero. This approach simplifies the verification considerably.
Benny
Hi could someone explain to me how to verify that a vector is an eigenvector of a matrix without explicitly carrying out the calculations which give the eigenvalues of the matrix? Here is an example to illustrate my problem.

Q. Let M = \left[ \begin{array}{cccc} -3 & 1 & -2 & 4 \\ -2 & 2 & 3 & -3 \\ 1 & -7 & 7 & -1 \\ 3 & 0 & -1 & -2 \end{array} \right] and v = \left[ \begin{array}{c} 1 \\ 1 \\ 1 \\ 1 \end{array} \right].

Verify that v is an eigenvector of the matrix M and find its associated eigenvalue.

[Hint: DO NOT find all eigenvectors and eigenvalues of M]

I can't really think of a way to go about doing this question without carrying out the time-consuming procedure of solving \det \left( {M - \lambda I} \right) = 0 for \lambda. Perhaps there's a definition I need to recall to do this question?

Also, I've been working through some problems from various sources and the definition of eigenvector seems to differ, which is confusing me. As far as I know, solving det(A - \lambda I) = 0, where I is the identity matrix, for \lambda gives the eigenvalues of the matrix A. Solving (A - \lambda I)x = 0 for the vector x results in either a single vector or infinitely many vectors (i.e. parameters pop up).

In the case of a single vector resulting from the matrix equation, the eigenvector is just that vector, isn't it? What about in the case of infinitely many vectors? For example, x = (s, 2t, t) = s(1,0,0) + t(0,2,1), where s and t are parameters? Are the eigenvectors all of the vectors represented by (s, 2t, t)?

Help with any of the questions would be appreciated thanks.
 
Benny said:
Hi could someone explain to me how to verify that a vector is an eigenvector of a matrix without explicitly carrying out the calculations which give the eigenvalues of the matrix?

You're thinking WAY too hard! :biggrin:

Go back to the eigenvalue problem itself: A\vec{x}=\lambda\vec{x}, where A is a square matrix and \lambda is a scalar. If \vec{x} is an eigenvector of A then a simple matrix multiplication will show it.
 
As you say correctly at the end, eigenvectors are only determined up to a scalar, so if v is an eigenvector, mv is one too for any nonzero scalar m.

As for your question: compute Mv and see if the result is a multiple of the original vector; the scalar it was multiplied by is then the eigenvalue.
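As a quick numerical sanity check, the multiplication suggested above can be sketched in NumPy (assuming NumPy is available; this just carries out Mv for the matrix and vector from the question):

```python
import numpy as np

# The matrix M and vector v from the question.
M = np.array([[-3.0,  1.0, -2.0,  4.0],
              [-2.0,  2.0,  3.0, -3.0],
              [ 1.0, -7.0,  7.0, -1.0],
              [ 3.0,  0.0, -1.0, -2.0]])
v = np.array([1.0, 1.0, 1.0, 1.0])

# Each row of M sums to zero, so Mv is the zero vector, i.e. Mv = 0*v.
print(M @ v)  # [0. 0. 0. 0.]
```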
 
Thanks for the help Tom and TD.

Mv = (0,0,0,0)^T = 0(1,1,1,1)^T, so the eigenvalue is zero. :biggrin:
 
That is correct :cool:
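On Benny's earlier side question about infinitely many eigenvectors: any nonzero vector in the solution set of (A - \lambda I)x = 0 is an eigenvector, so "parameters popping up" just means the eigenspace for that eigenvalue has dimension greater than one. A small NumPy sketch (the 3×3 matrix below is a made-up illustration, not the M from this thread; it is constructed so that \lambda = 2 has the eigenspace spanned by (1,0,0) and (0,2,1)):

```python
import numpy as np

# Hypothetical 3x3 matrix: lambda = 2 is a repeated eigenvalue whose
# eigenspace is spanned by (1,0,0) and (0,2,1), so (A - 2I)x = 0
# has two free parameters, as in the (s, 2t, t) example.
A = np.array([[2.0, 0.0,  0.0],
              [0.0, 5.0, -6.0],
              [0.0, 0.0,  2.0]])

# Every nonzero combination x = s(1,0,0) + t(0,2,1) is an eigenvector
# for lambda = 2 -- the "infinitely many vectors" case.
for s, t in [(1.0, 0.0), (0.0, 1.0), (3.0, -2.0)]:
    x = s * np.array([1.0, 0.0, 0.0]) + t * np.array([0.0, 2.0, 1.0])
    assert np.allclose(A @ x, 2 * x)
```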
 
