Extracting eigenvectors from a matrix


Homework Help Overview

The discussion revolves around demonstrating that a real symmetric matrix has real eigenvalues and orthogonal eigenvectors. The specific matrix under consideration is the symmetric ##2\times 2## matrix $$ \begin{pmatrix} A & H\\ H & B \end{pmatrix} $$ with ##A, B, H \in \mathbb{R}##. Participants explore the characteristic equation and the implications of its discriminant, as well as the conditions under which eigenvectors are orthogonal.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • Participants discuss the characteristic equation and its discriminant to establish the reality of eigenvalues. They also examine the orthogonality of eigenvectors, questioning how to prove this without assuming the dot product is zero. Some participants clarify the distinction between eigenvalues and eigenvectors, particularly regarding the zero vector. Others consider the implications of distinct versus repeated eigenvalues on the orthogonality of eigenvectors.

Discussion Status

The discussion is active, with participants providing insights into the nature of eigenvalues and eigenvectors. Some guidance has been offered regarding the conditions under which eigenvectors are orthogonal, particularly when eigenvalues are distinct. There is also exploration of cases where eigenvalues may be equal, prompting further inquiry into the existence of independent eigenvectors.

Contextual Notes

Participants note that the definition of eigenvectors may vary across different texts, leading to a discussion about the implications of including or excluding the zero vector in definitions. There is also mention of the need to consider cases where the matrix is not symmetric, which raises additional questions about the properties of eigenvectors in those contexts.

Seydlitz
Hello,

Homework Statement


I want to show that a real symmetric matrix will have real eigenvalues and orthogonal eigenvectors.

$$
\begin{pmatrix}
A & H\\
H & B
\end{pmatrix}
$$

The Attempt at a Solution


For the matrix shown above it's clear that the characteristic equation will be
##\lambda^2-\lambda(A+B)+AB-H^2=0##

I can show that the discriminant of the quadratic equation is non-negative, implying that the eigenvalues must be real.
##b^2-4ac=(A+B)^2-4(AB-H^2)=A^2+2AB+B^2-4AB+4H^2##
##=(A-B)^2+4H^2##
Since ##A, B, H \in \mathbb{R}##, ##(A-B)^2+4H^2 \geq 0##

Hence ##\lambda## must be real for this matrix.
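
Explicitly, the quadratic formula gives
$$\lambda_{1,2}=\frac{(A+B)\pm\sqrt{(A-B)^2+4H^2}}{2},$$
which is real because the radicand is a sum of real squares.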

My only problem now is to show that the eigenvectors are orthogonal.

The matrix has eigenvalues of ##\lambda_1, \lambda_2##, and hence eigenvectors ##\lambda_1v_1, \lambda_2v_2##.

How can I show, writing ##v_1=(x_1,y_1)## and ##v_2=(x_2,y_2)##,

##\lambda_1\lambda_2x_1x_2+\lambda_1\lambda_2y_1y_2=0##?

I know ##\lambda_1\lambda_2=det(M)##

It could become,

##det(M)(x_1x_2+y_1y_2)=0##

Then it's clear the vectors are orthogonal because ##det(M)## cannot be 0. But the problem is this is not a proof, because I explicitly assume the dot product is 0 in the first place.

I tried substituting the full quadratic-formula expressions for ##\lambda## back into the matrix, but it could not be simplified in any simple manner and I got a mess real quick.
 
Seydlitz said:
Hello,

Homework Statement


I want to show that a real symmetric matrix will have real eigenvalues and orthogonal eigenvectors.

$$
\begin{pmatrix}
A & H\\
H & B
\end{pmatrix}
$$

The Attempt at a Solution


For the matrix shown above it's clear that the characteristic equation will be
##\lambda^2-\lambda(A+B)+AB-H^2=0##

I can show that the discriminant of the quadratic equation is non-negative, implying that the eigenvalues must be real.
##b^2-4ac=(A+B)^2-4(AB-H^2)=A^2+2AB+B^2-4AB+4H^2##
##=(A-B)^2+4H^2##
Since ##A, B, H \in \mathbb{R}##, ##(A-B)^2+4H^2 \geq 0##

Hence ##\lambda## must be real for this matrix.

My only problem now is to show that the eigenvectors are orthogonal.

The matrix has eigenvalues of ##\lambda_1, \lambda_2##, and hence eigenvectors ##\lambda_1v_1, \lambda_2v_2##.
The eigenvectors are ##v_1## and ##v_2##, not ##\lambda_1 v_1## and ##\lambda_2 v_2##. This matters because the eigenvalue could be 0 and ##\vec{0}## can't be an eigenvector by definition.

I know ##\lambda_1\lambda_2=det(M)##

It could become,

##det(M)(x_1x_2+y_1y_2)=0##

Then it's clear the vectors are orthogonal because ##det(M)## cannot be 0.
det(M) could be 0 if either of the eigenvalues is 0.


Assume ##v_1## and ##v_2## are eigenvectors corresponding to distinct eigenvalues, and then consider the dot products ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2##. Using the fact that M is symmetric, you can show the two products are equal.
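
If you want a quick numerical sanity check of that identity before proving it, here's a minimal sketch using numpy; the values of ##A##, ##H##, ##B## are arbitrary examples, and this is an illustration, not part of the proof:

```python
import numpy as np

# Arbitrary example entries for the symmetric matrix; any reals work.
A, H, B = 2.0, 3.0, -1.0
M = np.array([[A, H],
              [H, B]])

# eigh handles symmetric/Hermitian matrices: it returns real
# eigenvalues and orthonormal eigenvectors (the columns of V).
w, V = np.linalg.eigh(M)
v1, v2 = V[:, 0], V[:, 1]

# The two dot products from the hint agree because M is symmetric...
print(np.dot(v1, M @ v2), np.dot(M @ v1, v2))
# ...and the eigenvectors come out orthogonal (dot product ~ 0).
print(np.dot(v1, v2))
```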
 
vela said:
The eigenvectors are ##v_1## and ##v_2##, not ##\lambda_1 v_1## and ##\lambda_2 v_2##. This matters because the eigenvalue could be 0 and ##\vec{0}## can't be an eigenvector by definition.

det(M) could be 0 if either of the eigenvalues is 0.

Assume ##v_1## and ##v_2## are eigenvectors corresponding to distinct eigenvalues, and then consider the dot products ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2##. Using the fact that M is symmetric, you can show the two products are equal.

Ok, I'll keep the important notation in mind; it never occurred to me that ##\vec{0}## is not a valid eigenvector.

I also forgot about the case where an eigenvalue is 0.

So in matrix notation ##v_1 \cdot M v_2## can be written as ##v_1^{\top}Mv_2## and ##(M v_1)\cdot v_2## as ##(Mv_1)^{\top}v_2##.

By the transpose theorem and the fact that ##M## is symmetric, ##(Mv_1)^{\top}v_2 = v_1^{\top}M^{\top}v_2 = v_1^{\top}Mv_2##. Because ##v_2## is an eigenvector, we get the equality ##v_1^{\top}v_2=v_1^{\top}v_2##, which implies that the dot product is zero. Is this correct?

Edit: I forgot that applying the transformation matrix will result in an unknown factor of ##\lambda## instead of just 1.

So because ##v_1## and ##v_2## are eigenvectors, we get ##\lambda_1v_1^{\top}v_2## and ##\lambda_2v_1^{\top}v_2## after applying the transformation matrix, hence ##\lambda_1v_1^{\top}v_2=\lambda_2v_1^{\top}v_2##, and that implies the dot product is 0 because the two eigenvalues are different.
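
So the whole argument in one chain:
$$\lambda_1\,v_1^{\top}v_2=(Mv_1)^{\top}v_2=v_1^{\top}M^{\top}v_2=v_1^{\top}Mv_2=\lambda_2\,v_1^{\top}v_2 \;\Rightarrow\; (\lambda_1-\lambda_2)\,v_1^{\top}v_2=0,$$
and ##\lambda_1 \neq \lambda_2## then forces ##v_1^{\top}v_2 = 0##.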
 
vela said:
##\vec{0}## can't be an eigenvector by definition.

Seydlitz said:
it never occurred to me that ##\vec{0}## is not valid.
Actually, that's a bit of an argument, and I take the other side. Yes, the definition of "eigenvalue" is: ##\lambda## is an eigenvalue of the linear operator ##A## if and only if there exists a non-zero vector ##v## such that ##Av = \lambda v##.

But some textbooks define "eigenvector corresponding to eigenvalue ##\lambda##" as "a non-zero vector ##v## such that ##Av = \lambda v##", while other textbooks do NOT require "non-zero". I prefer the latter because with the former you have to keep saying "and the 0 vector" in statements about eigenvectors. For example, I think it is preferable to be able to say "the set of all eigenvectors corresponding to eigenvalue ##\lambda## forms a vector space" rather than "the set of all eigenvectors corresponding to eigenvalue ##\lambda##, together with the zero vector, forms a vector space".

In practice, of course, it doesn't make any difference. We still need to use non-zero eigenvectors to form a basis of that subspace.
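
Concretely, under the latter convention the set in question is exactly ##\ker(A - \lambda I) = \{v : (A-\lambda I)v = 0\}##, the null space of ##A - \lambda I##, which is automatically a subspace.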
 
HallsofIvy said:
Actually, that's a bit of an argument, and I take the other side. Yes, the definition of "eigenvalue" is: ##\lambda## is an eigenvalue of the linear operator ##A## if and only if there exists a non-zero vector ##v## such that ##Av = \lambda v##.

But some textbooks define "eigenvector corresponding to eigenvalue ##\lambda##" as "a non-zero vector ##v## such that ##Av = \lambda v##", while other textbooks do NOT require "non-zero". I prefer the latter because with the former you have to keep saying "and the 0 vector" in statements about eigenvectors. For example, I think it is preferable to be able to say "the set of all eigenvectors corresponding to eigenvalue ##\lambda## forms a vector space" rather than "the set of all eigenvectors corresponding to eigenvalue ##\lambda##, together with the zero vector, forms a vector space".

In practice, of course, it doesn't make any difference. We still need to use non-zero eigenvectors to form a basis of that subspace.

I hope I can fully appreciate this difference in definition as I go further with my study; it hasn't quite sunk in yet. Maybe it's because my textbook is not aimed at a rigorous study of linear algebra.

Additionally, if one of the mentors is still reading this thread, I'd like to know whether this method of comparing the dot products ##v_1 \cdot M v_2## and ##(M v_1)\cdot v_2## can be used to prove facts about eigenvectors in other cases, e.g. when ##M## is real but not symmetric (where I'd want to show the two products need not be equal), or when ##M## is Hermitian.
 
You've shown if the eigenvalues are distinct, the eigenvectors are orthogonal. Now you have to deal with the case where ##\lambda_1 = \lambda_2##. The first thing you need to consider is whether you can find two independent eigenvectors in this case.
 
vela said:
You've shown if the eigenvalues are distinct, the eigenvectors are orthogonal. Now you have to deal with the case where ##\lambda_1 = \lambda_2##. The first thing you need to consider is whether you can find two independent eigenvectors in this case.

If ##\lambda_1 = \lambda_2##, then ##H## must be 0 and ##A=B##. This will result in the matrix ##kI##, where ##k## is a constant and ##I## is the unit matrix, with ##\lambda_1 = \lambda_2 = k##. Because it's a multiple of the unit matrix, any non-zero vector in the space is an eigenvector, so in the two-dimensional case we can pick two orthogonal eigenvectors such as ##(1,0)## and ##(0,1)##.
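
To spell out why the repeated eigenvalue forces this shape: the discriminant is a sum of real squares, so it vanishes only when each term does,
$$(A-B)^2+4H^2=0 \;\Longleftrightarrow\; A=B \text{ and } H=0,$$
which reduces the matrix to ##kI## with ##k=A=B##.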
 
