Geometric multiplicity of an eigenvalue

Ted123
Say we have an eigenvalue \lambda and corresponding eigenvectors of the form (x,x,2x)^T.

What is the geometric multiplicity?
 
Ted123 said:
Say we have an eigenvalue \lambda and corresponding eigenvectors of the form (x,x,2x)^T.

What is the geometric multiplicity?

Well, what's the definition of geometric multiplicity?
 
Dick said:
Well, what's the definition of geometric multiplicity?

If we have a matrix A and eigenvalue \lambda, then by definition the geometric multiplicity of \lambda is the dimension of \text{Ker}(A-\lambda I), which is just the dimension of the eigenspace.

So if we have found an eigenvector, say \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, then what is the geometric multiplicity? That is, what is the dimension of the eigenspace? Is it the number of non-zero elements of an eigenvector, or the number of different non-zero elements?
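(Side note: by the rank-nullity theorem, for an n \times n matrix A the dimension above can be computed as

\dim \text{Ker}(A - \lambda I) = n - \text{rank}(A - \lambda I),

so one can row-reduce A - \lambda I and count the free variables.)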
 
Ted123 said:
If we have a matrix A and eigenvalue \lambda, then by definition the geometric multiplicity of \lambda is the dimension of \text{Ker}(A-\lambda I), which is just the dimension of the eigenspace.

So if we have found an eigenvector, say \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}, then what is the geometric multiplicity? That is, what is the dimension of the eigenspace? Is it the number of non-zero elements of an eigenvector, or the number of different non-zero elements?

It's neither of those. It's the number of elements in a basis for your eigenspace. You've figured out that all of the eigenvectors are multiples of a single vector. So what's the dimension?
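As a quick numerical illustration of that definition (a minimal NumPy sketch; the 3x3 matrix A below is a made-up example, not one from this thread), the geometric multiplicity is n - rank(A - \lambda I):

import numpy as np

# Made-up 3x3 example: lower triangular, so the eigenvalues are the
# diagonal entries 2, 2, 3; lam = 2 has algebraic multiplicity 2.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [1.0, 0.0, 3.0]])
lam = 2.0

# Geometric multiplicity = dim Ker(A - lam*I) = n - rank(A - lam*I).
n = A.shape[0]
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geo_mult)  # prints 2: the eigenspace of lam = 2 is two-dimensional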
 
Dick said:
It's neither of those. It's the number of elements in a basis for your eigenspace. You've figured out that all of the eigenvectors are multiples of a single vector. So what's the dimension?

How do I write a basis for the eigenspace? Won't the eigenspace for an eigenvalue always be multiples of an eigenvector?
 
Ted123 said:
How do I write a basis for the eigenspace? Won't the eigenspace for an eigenvalue always be multiples of an eigenvector?

You just wrote a basis for the eigenspace. It's [1,1,2]. It spans the eigenspace. You might want to review the definitions of basis and dimension. If the eigenspace had been given by [x,y,2x]^T, it would be two-dimensional. What would be a basis for that?
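Written out, that two-dimensional case decomposes as

\begin{bmatrix} x \\ y \\ 2x \end{bmatrix} = x \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} + y \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},

so one possible basis is \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} and \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}: two free parameters, two basis vectors, dimension 2.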
 
Dick said:
You just wrote a basis for the eigenspace. It's [1,1,2]. It spans the eigenspace. You might want to review the definitions of basis and dimension. If the eigenspace had been given by [x,y,2x]^T, it would be two-dimensional. What would be a basis for that?

Ack. I misread your post. I thought you'd written [1,1,2], but you actually wrote [1,1,0]^T, which isn't in this eigenspace. Still, all of the vectors of the form [x,x,2x]^T are multiples of the single vector [1,1,2]^T. Yes?
 
Dick said:
You just wrote a basis for the eigenspace. It's [1,1,2]. It spans the eigenspace. You might want to review the definitions of basis and dimension. If the eigenspace had been given by [x,y,2x]^T, it would be two-dimensional. What would be a basis for that?

A question I've just done has an eigenvector \begin{bmatrix} 1 \\ 4 \\ 2 \end{bmatrix} for an eigenvalue (from the equation x=\frac{1}{4}y=\frac{1}{2}z).

How do I know whether this is the only linearly independent eigenvector in a basis for the eigenspace? (I know the algebraic multiplicity is 2 and that the geometric multiplicity must be less than or equal to this).
 
Ted123 said:
A question I've just done has an eigenvector \begin{bmatrix} 1 \\ 4 \\ 2 \end{bmatrix} for an eigenvalue (from the equation x=\frac{1}{4}y=\frac{1}{2}z).

How do I know whether this is the only linearly independent eigenvector in a basis for the eigenspace? (I know the algebraic multiplicity is 2 and that the geometric multiplicity must be less than or equal to this).

It's one-dimensional again. All the eigenvectors have the form [x,4x,2x]^T. They are all multiples of a single vector, [1,4,2]^T. So that single vector is a basis.
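Written out from the defining equation: setting x = t in x = \frac{1}{4}y = \frac{1}{2}z gives y = 4t and z = 2t, so every eigenvector is

\begin{bmatrix} t \\ 4t \\ 2t \end{bmatrix} = t \begin{bmatrix} 1 \\ 4 \\ 2 \end{bmatrix}, \qquad t \neq 0,

a one-parameter family. One basis vector means geometric multiplicity 1, consistent with it being at most the algebraic multiplicity 2.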
 
Dick said:
It's one-dimensional again. All the eigenvectors have the form [x,4x,2x]^T. They are all multiples of a single vector, [1,4,2]^T. So that single vector is a basis.

In another question, all the eigenvector equations are multiples of 2x - y - 2z = 0.

In this case the geometric multiplicity is 2. How do you know there are 2 linearly independent eigenvectors?

\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} and \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix} are 2 linearly independent eigenvectors, but isn't \begin{bmatrix} 0 \\ 2 \\ -1 \end{bmatrix} a 3rd?
 
Ted123 said:
In another question, all the eigenvector equations are multiples of 2x - y - 2z = 0.

In this case the geometric multiplicity is 2. How do you know there are 2 linearly independent eigenvectors?

Solve for z in terms of x and y: z = x - y/2. That means if you know x and y, then z is determined. So it's a two-parameter solution: it's a plane. It's two-dimensional. Now you tell me a basis for it: two linearly independent solutions that span the plane.
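Written out with x = s and y = t as the free parameters:

\begin{bmatrix} s \\ t \\ s - \frac{1}{2}t \end{bmatrix} = s \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + t \begin{bmatrix} 0 \\ 1 \\ -\frac{1}{2} \end{bmatrix},

so those two vectors are one possible basis; any two linearly independent vectors satisfying 2x - y - 2z = 0 work equally well.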
 
Dick said:
Solve for z in terms of x and y: z = x - y/2. That means if you know x and y, then z is determined. So it's a two-parameter solution: it's a plane. It's two-dimensional. Now you tell me a basis for it: two linearly independent solutions that span the plane.

\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} and \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix} are 2 linearly independent eigenvectors, but isn't \begin{bmatrix} 0 \\ 2 \\ -1 \end{bmatrix} a 3rd?
 
Ted123 said:
\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} and \begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix} are 2 linearly independent eigenvectors, but isn't \begin{bmatrix} 0 \\ 2 \\ -1 \end{bmatrix} a 3rd?

[0,2,-1]=(1)*[1,2,0]+(-1)*[1,0,1]. No, the third vector isn't linearly independent of the first two.
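A quick NumPy check of the same dependence (a minimal sketch):

import numpy as np

# Stack the three vectors as rows; if all three were linearly
# independent, the rank would be 3.
V = np.array([[1, 0, 1],
              [1, 2, 0],
              [0, 2, -1]])
print(np.linalg.matrix_rank(V))  # prints 2: only two are independent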
 
Dick said:
[0,2,-1]=(1)*[1,2,0]+(-1)*[1,0,1]. No, the third vector isn't linearly independent of the first two.

So it is!

If you have a zero in an eigenvector, say \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}, then how does this affect things? Does it reduce the dimension by 1?
 
Ted123 said:
So it is!

If you have a zero in an eigenvector, say \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}, then how does this affect things? Does it reduce the dimension by 1?

I really think you should review vector spaces and linear independence. Why do you think having a zero component in a vector has anything to do with it?
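For instance, the multiples of \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} are exactly the vectors [x,0,x]^T, which still form a one-dimensional space. Dimension counts basis vectors, not non-zero entries.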
 