1 Eigenvalue and 2 Eigenvectors

In summary, the thread covers diagonalizable matrices and the fact that they have a complete set of eigenvectors. For a 2x2 symmetric matrix A with only one eigenvalue, there must exist 2 independent eigenvectors corresponding to that eigenvalue. The thread also covers the factorization of A and how it can be used to find the eigenvalues and eigenvectors: set up the characteristic equation and solve for all possible values of d. This concept is commonly used in linear algebra.
  • #1
jakey
Hi all,

Let's say we have a symmetric matrix A with its corresponding diagonal matrix D. If A has only 1 eigenvalue, how do we show that there exists 2 eigenvectors?

thanks!
 
  • #2
What you have written makes no sense. If a matrix has an eigenvalue, then there exist an infinite number of eigenvectors. Do you mean "2 independent eigenvectors"? And are you talking about a 2 by 2 matrix?

A matrix is "diagonalizable" if and only if it has a "complete set of eigenvectors"- that is, there is a basis for the vector space consisting of eigenvalues of the matrix. If A is an n by n matrix, then it must have n independent eigenvectors. If it has only one eigenvalue, then there must exist n independent eigenvectors corresponding to that one eigenvalue.
 
  • #3
HallsofIvy said:
What you have written makes no sense. If a matrix has an eigenvalue, then there exist an infinite number of eigenvectors. Do you mean "2 independent eigenvectors"? And are you talking about a 2 by 2 matrix?

A matrix is "diagonalizable" if and only if it has a "complete set of eigenvectors"- that is, there is a basis for the vector space consisting of eigenvalues of the matrix. If A is an n by n matrix, then it must have n independent eigenvectors. If it has only one eigenvalue, then there must exist n independent eigenvectors corresponding to that one eigenvalue.

Hi HallsofIvy,

I'm sorry, I forgot to mention that it's a 2x2 matrix. Yes, I mean 2 independent eigenvectors. Is there a general method to find them?
 
  • #4
In this case (a symmetric 2x2 matrix with only one eigenvalue), A is that eigenvalue times the identity, so every nonzero vector is an eigenvector. Pick one. Then find another vector orthogonal to it.

If you need to, you can normalize the vectors.
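
As a hedged illustration of this recipe (the example matrix and the use of NumPy are assumptions, not part of the post), here is a minimal sketch for the 2x2 case where A is a multiple of the identity:

```python
import numpy as np

# A 2x2 symmetric matrix with a single (repeated) eigenvalue is a scalar
# multiple of the identity, so every nonzero vector is an eigenvector.
A = 3 * np.eye(2)                   # example matrix with the single eigenvalue 3

v1 = np.array([1.0, 2.0])           # pick any nonzero vector
v2 = np.array([-v1[1], v1[0]])      # rotate 90 degrees to get an orthogonal vector

# Normalize both to get an orthonormal eigenbasis
v1, v2 = v1 / np.linalg.norm(v1), v2 / np.linalg.norm(v2)

print(np.allclose(A @ v1, 3 * v1), np.allclose(A @ v2, 3 * v2))  # True True
print(np.isclose(v1 @ v2, 0))                                    # True (orthogonal)
```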
 
  • #5
Given:
A is symmetric

Conclusion:
if A is symmetric then A equals its transpose, so it is square, of size nxn (THIS IS ALWAYS TRUE)
if A is symmetric then it has n independent eigenvectors (THIS IS ALWAYS TRUE, by the spectral theorem)

The matrix factorization of A is SDS^(-1).
The n columns of S are the n independent eigenvectors. For a symmetric matrix, eigenvectors belonging to different eigenvalues are automatically orthogonal, and within the eigenspace of a repeated eigenvalue they can be made orthonormal via Gram-Schmidt; with an orthonormal S the inverse is just S^T, which makes working with the factorization a lot easier.
The diagonal entries of the diagonal matrix D are the eigenvalues associated with those eigenvectors.
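
As a hedged illustration (the example matrix and NumPy are assumptions, not part of the post), here is a small sketch checking the factorization A = SDS^(-1) for a symmetric matrix with an orthonormal S, so that S^(-1) = S^T:

```python
import numpy as np

# Diagonalize a small symmetric matrix and check A = S D S^T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric 2x2 example

eigenvalues, S = np.linalg.eigh(A)  # eigh returns orthonormal eigenvectors for symmetric A
D = np.diag(eigenvalues)

# Because the columns of S are orthonormal, S^(-1) is just S^T.
print(np.allclose(A, S @ D @ S.T))      # True
print(np.allclose(S.T @ S, np.eye(2)))  # True
```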

So, to find those diagonal entries and those independent eigenvectors, the general form is as follows, where A is an nxn matrix, x is an n-dimensional vector, d is a scalar, 0 is the n-dimensional zero vector, and I is the nxn identity matrix.

Ax = dx
Ax - dx = 0
(A - dI)x = 0
det(A - dI) = 0, and solve for all values of d

This is called the characteristic equation; it gives a degree-n polynomial whose roots are the values of d (the eigenvalues) that satisfy the equation. Then, for each eigenvalue d, solve (A - dI)x = 0 to find the corresponding eigenvectors.

Edit: Hope this helps. I'm taking intro linear algebra this semester, so if anything is wrong please let me know.
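
Here is a hedged sketch of that procedure for a small symmetric matrix (the example matrix and NumPy usage are assumptions): form the characteristic polynomial of A, solve det(A - dI) = 0 for d, then solve (A - dI)x = 0 for each d.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

coeffs = np.poly(A)          # coefficients of the characteristic polynomial of A
d_values = np.roots(coeffs)  # the values of d satisfying det(A - dI) = 0

for d in d_values:
    # Solve (A - dI) x = 0 by taking the null-space direction from an SVD.
    _, _, Vt = np.linalg.svd(A - d * np.eye(2))
    x = Vt[-1]               # right-singular vector for the (near-)zero singular value
    print(f"d = {d:.3f}, eigenvector = {x}, check Ax = dx: {np.allclose(A @ x, d * x)}")
```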
 

What is an eigenvalue?

An eigenvalue is a scalar value that represents the amount by which a given linear transformation stretches or contracts a vector in a particular direction.

What is an eigenvector?

An eigenvector is a non-zero vector whose direction is unchanged (or at most reversed, when the eigenvalue is negative) by the transformation: it is only scaled, not rotated. It corresponds to an eigenvalue and represents a direction along which the transformation stretches or contracts.

How do you find eigenvalues and eigenvectors?

To find eigenvalues and eigenvectors, you can solve the characteristic equation for a given matrix or use numerical methods such as the power method or QR decomposition.
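
As a hedged illustration of one such numerical method (the code and example matrix are assumptions, not taken from this thread), here is a minimal power-method sketch in Python that approximates the dominant eigenvalue and eigenvector:

```python
import numpy as np

def power_method(A, num_iters=100):
    """Approximate the dominant eigenvalue/eigenvector of A by repeated multiplication."""
    x = np.random.default_rng(0).standard_normal(A.shape[0])
    for _ in range(num_iters):
        x = A @ x
        x /= np.linalg.norm(x)   # renormalize each step to avoid overflow
    eigenvalue = x @ A @ x       # Rayleigh quotient estimate
    return eigenvalue, x

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
val, vec = power_method(A)
print(val, vec)   # close to 3 and (1, 1)/sqrt(2) for this example
```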

Why are eigenvalues and eigenvectors important?

Eigenvalues and eigenvectors are important because they provide a way to decompose a linear transformation into simpler components, making it easier to understand and analyze. They are also used in many applications, such as image processing, data compression, and physics.

Can a matrix have more than one eigenvalue and eigenvector?

Yes, a matrix can have multiple eigenvalues and eigenvectors. An nxn matrix has n eigenvalues when they are counted with multiplicity. However, some matrices have repeated eigenvalues, which can result in fewer independent eigenvectors.
