How Can a Symmetric Matrix Have 2 Eigenvectors with Only 1 Eigenvalue?


Discussion Overview

The discussion revolves around the properties of symmetric matrices, specifically addressing the existence of multiple eigenvectors corresponding to a single eigenvalue. Participants explore the implications of having a symmetric 2x2 matrix with one eigenvalue and the conditions under which independent eigenvectors can be identified.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant questions the initial premise, suggesting that if a matrix has an eigenvalue, it must have an infinite number of eigenvectors, and seeks clarification on whether "2 independent eigenvectors" is intended.
  • Another participant states that for an n by n matrix, having one eigenvalue implies the existence of n independent eigenvectors, which may be relevant for the discussion of a 2x2 matrix.
  • A different viewpoint suggests that in the case of a symmetric matrix, every vector can be considered an eigenvector, proposing a method to find two orthogonal vectors as eigenvectors.
  • One participant outlines the properties of symmetric matrices, emphasizing that they have n independent eigenvectors and that these eigenvectors can be orthogonal and made orthonormal, which aids in matrix factorization.
  • Another participant provides a general method for finding eigenvalues and eigenvectors, detailing the characteristic equation and its role in determining eigenvalues.

Areas of Agreement / Disagreement

Participants express differing views on the nature of eigenvectors associated with a single eigenvalue in symmetric matrices. There is no consensus on the interpretation of the initial question or the implications of having one eigenvalue.

Contextual Notes

The discussion includes assumptions about the properties of symmetric matrices and the definitions of eigenvalues and eigenvectors, which may not be universally agreed upon. The mathematical steps and implications of the characteristic equation are also presented without resolution.

jakey
Hi all,

Let's say we have a symmetric matrix A with its corresponding diagonal matrix D. If A has only 1 eigenvalue, how do we show that there exists 2 eigenvectors?

thanks!
 
What you have written makes no sense. If a matrix has an eigenvalue, then there exist infinitely many eigenvectors. Do you mean "2 independent eigenvectors"? And are you talking about a 2 by 2 matrix?

A matrix is "diagonalizable" if and only if it has a "complete set of eigenvectors"- that is, there is a basis for the vector space consisting of eigenvectors of the matrix. If A is a diagonalizable n by n matrix, then it must have n independent eigenvectors. If it has only one eigenvalue, then there must exist n independent eigenvectors corresponding to that one eigenvalue.
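The 2x2 case can be checked directly: for a symmetric matrix [[a, b], [b, c]], the characteristic polynomial's discriminant is (a - c)^2 + 4b^2, which vanishes only when a = c and b = 0, i.e. the matrix is a multiple of the identity. A minimal numerical sketch (not from the thread; the function name and example values are illustrative):

```python
import numpy as np

# Symmetric 2x2 matrix [[a, b], [b, c]] has characteristic polynomial
# t^2 - (a + c) t + (ac - b^2), with discriminant (a - c)^2 + 4 b^2 >= 0.
# A repeated (single) eigenvalue forces the discriminant to zero,
# i.e. a == c and b == 0, so the matrix must be a multiple of the identity.

def has_repeated_eigenvalue(a, b, c):
    return (a - c) ** 2 + 4 * b ** 2 == 0

print(has_repeated_eigenvalue(3, 0, 3))   # a == c, b == 0 -> True (repeated)
print(has_repeated_eigenvalue(3, 1, 3))   # b != 0 -> False (two distinct values)

A = 3 * np.eye(2)                         # the only symmetric shape: A = 3I
print(np.linalg.eigvalsh(A))              # both eigenvalues equal 3
```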
 
HallsofIvy said:
What you have written makes no sense. If a matrix has an eigenvalue, then there exist infinitely many eigenvectors. Do you mean "2 independent eigenvectors"? And are you talking about a 2 by 2 matrix?

A matrix is "diagonalizable" if and only if it has a "complete set of eigenvectors"- that is, there is a basis for the vector space consisting of eigenvectors of the matrix. If A is a diagonalizable n by n matrix, then it must have n independent eigenvectors. If it has only one eigenvalue, then there must exist n independent eigenvectors corresponding to that one eigenvalue.

Hi HallsofIvy,

I'm sorry, i forgot to mention that it's a 2x2 matrix. Yes, is there a general method to find 2 independent eigenvectors?
 
In this case, every vector is an eigenvector: a symmetric 2x2 matrix with a single (repeated) eigenvalue d must equal dI, so Ax = dx for every x. Pick one vector. Then find another vector orthogonal to it.

If you need to, you can normalize the vectors.
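This recipe can be sketched numerically (a minimal illustration, not from the thread; the eigenvalue 5 and the chosen vectors are arbitrary):

```python
import numpy as np

lam = 5.0
A = lam * np.eye(2)          # symmetric 2x2 matrix with the single eigenvalue 5

v1 = np.array([1.0, 2.0])    # pick any nonzero vector
v2 = np.array([-2.0, 1.0])   # any vector orthogonal to v1 (rotate 90 degrees)

# Both are eigenvectors: A v = lam * v, and they are orthogonal
assert np.allclose(A @ v1, lam * v1)
assert np.allclose(A @ v2, lam * v2)
assert np.isclose(v1 @ v2, 0.0)

# Normalize if needed, giving an orthonormal eigenbasis
q1 = v1 / np.linalg.norm(v1)
q2 = v2 / np.linalg.norm(v2)
```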
 
Given:
A is symmetric

Conclusion:
if A is symmetric then A equals its transpose, which requires A to be square, of size nxn (THIS IS ALWAYS TRUE)
if A is symmetric then it has n independent eigenvectors (THIS IS ALWAYS TRUE)

The matrix factorization of A is SDS^(-1)
The n columns of S are the n independent eigenvectors; for a symmetric matrix those eigenvectors can be chosen orthogonal and made orthonormal (via Gram-Schmidt within each eigenspace), in which case S^(-1) = S^T, which makes finding the factorization of A a lot easier
The diagonal entries of the Diagonal matrix D are the eigenvalues associated with the eigenvectors

So, to find those diagonal entries and those independent eigenvectors, the general method is as follows, where A is an nxn matrix, x is an n-dimensional vector, d is a constant, 0 is the n-dimensional zero vector, and I is the nxn identity matrix.

Ax = dx
Ax - dx = 0
(A - dI)x = 0
det(A - dI) = 0, solve for all values of d

(this is called the characteristic equation, which gives the n-degree polynomial used to determine the values of d, i.e. the eigenvalues that satisfy the equation)

Edit: Hope this helps. I'm taking intro linear algebra this semester, so if anything is wrong please let me know.
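The steps above can be carried out numerically for a concrete symmetric example (a sketch; the example matrix is illustrative, and numpy.linalg.eigh is used because it is designed for symmetric matrices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric; eigenvalues are 1 and 3

# Characteristic equation for a 2x2: det(A - dI) = d^2 - trace(A) d + det(A) = 0
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
d = np.sort(np.roots(coeffs))
print(d)                              # the two eigenvalues, 1 and 3

# eigh returns the eigenvalues and an orthonormal matrix S of eigenvectors,
# so A = S D S^T (and S^(-1) = S^T since the columns are orthonormal)
vals, S = np.linalg.eigh(A)
D = np.diag(vals)
assert np.allclose(A, S @ D @ S.T)    # the factorization A = S D S^(-1)
assert np.allclose(S.T @ S, np.eye(2))  # columns of S are orthonormal
```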
 
