Simple proof for the existence of eigenvectors

In summary, the thread discusses whether the existence of eigenvectors for a symmetric matrix can be proved without going into eigenvalues, characteristic equations, and determinants. It examines an attempted short proof, explores alternative approaches, and clarifies the relationship between eigenvalues and eigenvectors. It concludes that the two notions are inseparable: an eigenvalue exists if and only if a corresponding eigenvector does, since each is defined in terms of the other.
  • #1
Seydlitz
Hello,

My question is this: is it possible to prove that there exists an eigenvector for a symmetric matrix without discussing what eigenvalues are and without going into the details of characteristic equations, determinants, and so on? Here is my short proof (the only assumption is that ##A## is symmetric):

Suppose there doesn't exist any vector ##v## such that ##Av = \lambda v##. Then write ##Av = b_{1}## and ##A^{T}v = b_{2}##. Clearly ##b_{1} \neq b_{2}##, and thus ##Av \neq A^{T}v##. But this implies ##A^{T} \neq A##, whereas we assumed ##A## is symmetric. Thus ##v## must exist, as required.

Is this proof legitimate? It may be too simple, but I'm not certain. The alternative would be to show that if ##\det(A - \lambda I) = 0## then ##v## must exist, but then there's no mention of the symmetric property of the matrix.
 
  • #2
Why does [itex]Av = b_1[/itex], [itex]A^T v = b_2[/itex] immediately imply that [itex]b_1 \ne b_2[/itex]? What does that have to do with eigenvalues? I think the difficulty is that you are not really clear on what you are trying to prove! If you are thinking of the vector space over the complex numbers, every matrix has an eigenvalue because of the fundamental theorem of algebra: every polynomial equation has at least one (complex) root. What is true here is that the eigenvalues of a symmetric matrix are real numbers.

Without going into "detail" about determinants, if A is an n by n matrix, then [itex]\det(A - \lambda I) = 0[/itex] is an nth-degree polynomial equation and so has n (not necessarily distinct) complex roots.

I presume you are talking about a space over the real numbers, since otherwise we are done: every matrix has at least one complex eigenvalue. To show that the eigenvalues of a symmetric matrix are real, let [itex]\lambda[/itex] be a (possibly complex) eigenvalue of the matrix A. Then there exists a unit vector v such that [itex]Av = \lambda v[/itex].

Now, writing [itex]\langle u, v \rangle[/itex] for the inner product of vectors u and v, [itex]\lambda = \lambda \cdot 1 = \lambda \langle v, v \rangle = \langle \lambda v, v \rangle = \langle Av, v \rangle = \langle v, Av \rangle = \langle v, \lambda v \rangle = \overline{\langle \lambda v, v \rangle} = \overline{\lambda} \langle v, v \rangle = \overline{\lambda}[/itex], so [itex]\lambda = \overline{\lambda}[/itex] and [itex]\lambda[/itex] is real.
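
As a numerical sanity check of both points (a minimal sketch added for illustration, assuming NumPy; the matrix is an arbitrary symmetric example): the characteristic polynomial of an n by n matrix has n complex roots, and for a symmetric matrix those roots come out real.

```python
import numpy as np

# An arbitrary real symmetric matrix (A = A^T).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# np.poly(A) returns the coefficients of det(A - lambda*I), an
# nth-degree polynomial, and np.roots finds its n complex roots.
coeffs = np.poly(A)
eigenvalues = np.roots(coeffs)
print("roots of the characteristic polynomial:", eigenvalues)

# For a symmetric matrix the roots are real (up to rounding error).
assert np.allclose(eigenvalues.imag, 0.0, atol=1e-8)
```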
 
  • #3
HallsofIvy said:
Why does [itex]Av = b_1[/itex], [itex]A^T v = b_2[/itex] immediately imply that [itex]b_1 \ne b_2[/itex]? What does that have to do with eigenvalues? I think the difficulty is that you are not really clear on what you are trying to prove! If you are thinking of the vector space over the complex numbers, every matrix has an eigenvalue because of the fundamental theorem of algebra: every polynomial equation has at least one (complex) root. What is true here is that the eigenvalues of a symmetric matrix are real numbers.

Without going into "detail" about determinants, if A is an n by n matrix, then [itex]\det(A - \lambda I) = 0[/itex] is an nth-degree polynomial equation and so has n (not necessarily distinct) complex roots.

I presume you are talking about a space over the real numbers, since otherwise we are done: every matrix has at least one complex eigenvalue. To show that the eigenvalues of a symmetric matrix are real, let [itex]\lambda[/itex] be a (possibly complex) eigenvalue of the matrix A. Then there exists a unit vector v such that [itex]Av = \lambda v[/itex].

Now, writing [itex]\langle u, v \rangle[/itex] for the inner product of vectors u and v, [itex]\lambda = \lambda \cdot 1 = \lambda \langle v, v \rangle = \langle \lambda v, v \rangle = \langle Av, v \rangle = \langle v, Av \rangle = \langle v, \lambda v \rangle = \overline{\langle \lambda v, v \rangle} = \overline{\lambda} \langle v, v \rangle = \overline{\lambda}[/itex], so [itex]\lambda = \overline{\lambda}[/itex] and [itex]\lambda[/itex] is real.

Yeah, you're right, I don't think I'm sure myself what I'm trying to prove. I already have a proof similar to yours showing that a symmetric matrix has real eigenvalues. The thing is, I didn't know whether proving that also shows that an eigenvector exists. Now I'm sure the two are equivalent, because clearly if a real ##\lambda## doesn't exist then a suitable ##v## cannot exist either.
 
  • #4
You already know the proof that eigenvalues exist, but you don't know about eigenvectors?

How can you possibly have an eigenvalue without a corresponding eigenvector? That's pretty much what the definition of "eigenvalue" says, isn't it?
 
  • #5
Seydlitz said:
Yeah, you're right, I don't think I'm sure myself what I'm trying to prove. I already have a proof similar to yours showing that a symmetric matrix has real eigenvalues. The thing is, I didn't know whether proving that also shows that an eigenvector exists. Now I'm sure the two are equivalent, because clearly if a real ##\lambda## doesn't exist then a suitable ##v## cannot exist either.

The whole point of the concept is that [itex]\lambda[/itex] is an eigenvalue of [itex]A[/itex] if and only if there exists [itex]v \neq 0[/itex] such that [itex]Av = \lambda v[/itex].

Thus eigenvalues exist if and only if eigenvectors exist.
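
To make the equivalence concrete (a minimal sketch added for illustration, assuming NumPy; the matrix is an arbitrary example): given an eigenvalue [itex]\lambda[/itex], a corresponding eigenvector is any nonzero vector in the null space of [itex]A - \lambda I[/itex], which the SVD exposes.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Pick any eigenvalue of A (symmetric, so they are real)...
lam = np.linalg.eigvals(A).real.max()

# ...then an eigenvector is a null vector of A - lam*I: the right
# singular vector belonging to the zero singular value.
_, s, Vt = np.linalg.svd(A - lam * np.eye(2))
v = Vt[-1]  # singular values are sorted descending, so the last is ~0

# v is nonzero and satisfies Av = lambda*v, i.e. it is an eigenvector.
assert np.allclose(A @ v, lam * v)
print("lambda =", lam, ", v =", v)
```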
 
  • #6
HallsofIvy said:
You already know the proof that eigenvalues exist, but you don't know about eigenvectors?

How can you possibly have an eigenvalue without a corresponding eigenvector? That's pretty much what the definition of "eigenvalue" says, isn't it?

Yes, I'm sorry, I didn't analyze the definition carefully.
 
  • #7
here is a standard proof from my web page:

"Spectral theorem" (real symmetric matrices are orthogonally diagonalizable).

Thm: If ##k = \mathbb{R}## and ##A = A^*##, then ##\mathbb{R}^n## has a basis of mutually orthogonal eigenvectors of ##A##.

Pf: The real-valued function ##f(x) = Ax \cdot x## has a maximum on the unit sphere in ##\mathbb{R}^n## at some point ##y##, where the gradient ##df## of ##f## is "zero", i.e. ##df(y)## is perpendicular to the tangent space of the sphere at ##y##. The tangent space at ##y## is the subspace of vectors in ##\mathbb{R}^n## perpendicular to ##y##, and ##df(y) = 2Ay##. Hence ##Ay## is perpendicular to the tangent space at ##y##, i.e. ##Ay = 0## or ##Ay## is parallel to ##y##, so ##Ay = cy## for some ##c##, and ##y## is an eigenvector of ##A##.

Now restrict ##A## to the subspace ##V## of vectors orthogonal to ##y##. If ##v \cdot y = 0##, then ##Av \cdot y = v \cdot Ay = v \cdot cy = c(v \cdot y) = 0##. Hence ##A## preserves ##V##. ##A## still has the property ##Av \cdot x = v \cdot Ax## on ##V##, so the restriction of ##A## to ##V## has an eigenvector in ##V##. (Although ##V## has no natural representation as ##\mathbb{R}^{n-1}##, the argument for producing an eigenvector depended only on the symmetry property ##Av \cdot x = v \cdot Ax##.) Repeating, ##A## has an eigenbasis. QED
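
A numerical rendering of this variational argument (an editorial sketch, not from the original post; it assumes NumPy, and the matrix, step size, and iteration count are arbitrary choices): maximize ##f(x) = Ax \cdot x## over the unit sphere by gradient ascent with projection back onto the sphere, then check that the maximizer ##y## satisfies ##Ay = cy##.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary real symmetric matrix (A = A^T).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Maximize f(x) = Ax.x on the unit sphere: ascend along the gradient
# df(x) = 2Ax, then project back onto the sphere by renormalizing.
y = rng.standard_normal(3)
y /= np.linalg.norm(y)
eta = 0.1  # step size
for _ in range(1000):
    y = y + eta * 2.0 * (A @ y)
    y /= np.linalg.norm(y)

# At the maximizer the gradient is normal to the sphere, so Ay is
# parallel to y: Ay = cy with c = Ay.y (the Rayleigh quotient).
c = y @ A @ y
assert np.allclose(A @ y, c * y, atol=1e-6)
print("eigenvalue c =", c, ", eigenvector y =", y)
```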
 

1. What is an eigenvector?

An eigenvector of a matrix is a non-zero vector that the matrix maps to a scalar multiple of itself. In other words, the direction of an eigenvector is unchanged (or exactly reversed, for a negative eigenvalue) when the matrix is applied to it.
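
For a concrete instance (a standard textbook example, added for illustration): ##\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 0 \end{pmatrix}##, so ##(1, 0)^T## is an eigenvector with eigenvalue 2: the matrix stretches it without turning it.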

2. Why is the existence of eigenvectors important?

Eigenvectors are important in many areas of mathematics and science, including physics, engineering, and computer science. They are used to solve systems of linear differential equations, analyze the behavior of dynamical systems, and perform data analysis and dimensionality reduction (e.g. principal component analysis).

3. How do you prove the existence of an eigenvector?

The existence of an eigenvector can be proven by finding a non-zero vector that satisfies the equation Av = λv, where A is the matrix and λ is the corresponding eigenvalue. In practice one solves the characteristic equation det(A - λI) = 0, where I is the identity matrix, and then solves the linear system (A - λI)v = 0 for v.
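
For a concrete instance (a standard worked example, added for illustration): for ##A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}##, the characteristic equation is ##\det(A - \lambda I) = (2 - \lambda)^2 - 1 = (\lambda - 1)(\lambda - 3) = 0##, giving eigenvalues ##\lambda = 1## and ##\lambda = 3##; solving ##(A - 3I)v = 0## then yields the eigenvector ##v = (1, 1)^T##.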

4. Does every matrix have an eigenvector?

Over the complex numbers, yes: every square matrix has at least one eigenvalue, and by definition every eigenvalue comes with a corresponding eigenvector. Over the real numbers a matrix may have no real eigenvectors; a rotation of the plane by 90 degrees, for instance, sends no non-zero vector to a scalar multiple of itself. (A non-zero determinant is not required: a singular matrix simply has 0 as an eigenvalue, with its null space as the eigenspace.)

5. Are eigenvectors unique?

Not as individual vectors: any non-zero scalar multiple of an eigenvector is again an eigenvector for the same eigenvalue. An eigenvalue can even have an eigenspace of dimension greater than one; for the identity matrix, every non-zero vector is an eigenvector for the eigenvalue 1. What is well defined is the eigenspace belonging to each eigenvalue, and different eigenvalues have different eigenvectors.
