# Geometric multiplicity of an eigenvalue

Say we have an eigenvalue $\lambda$ and corresponding eigenvectors of the form $(x,x,2x)^T$.

What is the geometric multiplicity?

## Answers and Replies

**Dick** (Science Advisor, Homework Helper)
> Say we have an eigenvalue $\lambda$ and corresponding eigenvectors of the form $(x,x,2x)^T$.
>
> What is the geometric multiplicity?

Well, what's the definition of geometric multiplicity?

> Well, what's the definition of geometric multiplicity?

If we have a matrix $A$ and eigenvalue $\lambda$, then by definition the geometric multiplicity of $\lambda$ is the dimension of $\ker(A-\lambda I)$, which is just the dimension of the eigenspace.

So if we have found an eigenvector, say $\begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$, then what is the geometric multiplicity? That is, what is the dimension of the eigenspace? Is it the number of non-zero entries of an eigenvector, or the number of distinct non-zero entries?
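To make the definition concrete, here is a small NumPy sketch. The matrix is made up purely for illustration (it is not from the problem above): the geometric multiplicity of $\lambda$ is computed as $n - \operatorname{rank}(A - \lambda I)$.

```python
import numpy as np

# Illustrative matrix (made up, not from the problem): the eigenvalue 3
# has algebraic multiplicity 2 but its eigenspace is only 1-dimensional.
A = np.array([[3.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 5.0]])
lam = 3.0
n = A.shape[0]

# Geometric multiplicity = dim Ker(A - lam*I) = n - rank(A - lam*I).
geo_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geo_mult)  # 1
```

Note that this example also shows the geometric multiplicity can be strictly smaller than the algebraic multiplicity.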

**Dick**
> If we have a matrix $A$ and eigenvalue $\lambda$, then by definition the geometric multiplicity of $\lambda$ is the dimension of $\ker(A-\lambda I)$, which is just the dimension of the eigenspace.
>
> So if we have found an eigenvector, say $\begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$, then what is the geometric multiplicity? That is, what is the dimension of the eigenspace? Is it the number of non-zero entries of an eigenvector, or the number of distinct non-zero entries?

It's neither of those. It's the number of elements in a basis for your eigenspace. You've figured out that all of the eigenvectors are multiples of a single vector. So what's the dimension?

> It's neither of those. It's the number of elements in a basis for your eigenspace. You've figured out that all of the eigenvectors are multiples of a single vector. So what's the dimension?

How do I write a basis for the eigenspace? Won't the eigenspace for an eigenvalue always be multiples of an eigenvector?

**Dick**
> How do I write a basis for the eigenspace? Won't the eigenspace for an eigenvalue always be multiples of an eigenvector?

You just wrote a basis for the eigenspace. It's [1,1,2]. It spans the eigenspace. You might want to review the definition of basis and dimension. If the eigenspace had been given by [x,y,2x]^T, it would be two dimensional. What would be a basis for that?

**Dick**
> You just wrote a basis for the eigenspace. It's [1,1,2]. It spans the eigenspace. You might want to review the definition of basis and dimension. If the eigenspace had been given by [x,y,2x]^T, it would be two dimensional. What would be a basis for that?

Ack. I misread your post. I thought you'd written [1,1,2]. All of the vectors in [x,x,2x]^T are multiples of the single vector [1,1,2]^T. Yes?

> You just wrote a basis for the eigenspace. It's [1,1,2]. It spans the eigenspace. You might want to review the definition of basis and dimension. If the eigenspace had been given by [x,y,2x]^T, it would be two dimensional. What would be a basis for that?

A question I've just done has an eigenvector $$\begin{bmatrix} 1 \\ 4 \\ 2 \end{bmatrix}$$ for an eigenvalue (from the equation $x=\frac{1}{4}y=\frac{1}{2}z$).

How do I know whether this is the only linearly independent eigenvector in a basis for the eigenspace? (I know the algebraic multiplicity is 2 and that the geometric multiplicity must be less than or equal to this).

**Dick**
> A question I've just done has an eigenvector $$\begin{bmatrix} 1 \\ 4 \\ 2 \end{bmatrix}$$ for an eigenvalue (from the equation $x=\frac{1}{4}y=\frac{1}{2}z$).
>
> How do I know whether this is the only linearly independent eigenvector in a basis for the eigenspace? (I know the algebraic multiplicity is 2 and that the geometric multiplicity must be less than or equal to this.)

It's one dimensional again. All the eigenvectors have the form [x,4x,2x]^T. They are all multiples of a single vector [1,4,2]^T. So that single vector is a basis.
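As a quick numerical sanity check (a NumPy sketch, not part of the original exchange): stacking several eigenvectors of the form $[x,4x,2x]^T$ and taking the rank confirms they span only a one-dimensional space.

```python
import numpy as np

# Every eigenvector of the form (x, 4x, 2x)^T is a multiple of (1, 4, 2)^T.
v = np.array([1.0, 4.0, 2.0])

# Stack a few sample eigenvectors (arbitrary choices of x) as rows.
samples = np.array([x * v for x in (1.0, -2.0, 0.5, 7.0)])

# The rank of the stack is the dimension of the space the samples span.
print(np.linalg.matrix_rank(samples))  # 1
```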

> It's one dimensional again. All the eigenvectors have the form [x,4x,2x]^T. They are all multiples of a single vector [1,4,2]^T. So that single vector is a basis.

In another question all the eigenvector equations are multiples of $2x-y-2z=0$.

In this case the geometric multiplicity is 2. How do you know there are two linearly independent eigenvectors?

$$\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$$ and $$\begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}$$ are two linearly independent eigenvectors, but isn't $$\begin{bmatrix}0 \\ 2 \\ -1 \end{bmatrix}$$ a third?

**Dick**
> In another question all the eigenvector equations are multiples of $2x-y-2z=0$.
>
> In this case the geometric multiplicity is 2. How do you know there are two linearly independent eigenvectors?

Solve for z in terms of x and y: $z = x - \frac{y}{2}$. That means if you know x and y, then z is determined. So it's a two-parameter solution: it's a plane, and it's two-dimensional. Now you tell me a basis for it: two linearly independent solutions that span the plane.
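That parametrization can be checked numerically. The two basis vectors below are one possible choice of the free parameters (a NumPy sketch, not part of the original exchange):

```python
import numpy as np

# The eigenspace is the plane 2x - y - 2z = 0, i.e. z = x - y/2.
# Free parameters (x, y): picking (1, 0) and (0, 2) gives two basis vectors.
b1 = np.array([1.0, 0.0, 1.0])   # x=1, y=0  ->  z=1
b2 = np.array([0.0, 2.0, -1.0])  # x=0, y=2  ->  z=-1

# Both lie on the plane...
for b in (b1, b2):
    assert abs(2 * b[0] - b[1] - 2 * b[2]) < 1e-12

# ...and they are linearly independent, so the eigenspace is 2-dimensional.
print(np.linalg.matrix_rank(np.vstack([b1, b2])))  # 2
```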

> Solve for z in terms of x and y: $z = x - \frac{y}{2}$. That means if you know x and y, then z is determined. So it's a two-parameter solution: it's a plane, and it's two-dimensional. Now you tell me a basis for it: two linearly independent solutions that span the plane.

$$\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$$ and $$\begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}$$ are two linearly independent eigenvectors, but isn't $$\begin{bmatrix}0 \\ 2 \\ -1 \end{bmatrix}$$ a third?

**Dick**
> $$\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$$ and $$\begin{bmatrix} 1 \\ 2 \\ 0 \end{bmatrix}$$ are two linearly independent eigenvectors, but isn't $$\begin{bmatrix}0 \\ 2 \\ -1 \end{bmatrix}$$ a third?

[0,2,-1]=(1)*[1,2,0]+(-1)*[1,0,1]. No, the third vector isn't linearly independent of the first two.
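The dependence can also be verified with a least-squares solve (again a NumPy sketch, not part of the thread): a near-zero residual means the third vector lies in the span of the first two.

```python
import numpy as np

# Columns of B are the two independent eigenvectors already found.
B = np.column_stack([[1.0, 0.0, 1.0],
                     [1.0, 2.0, 0.0]])
v = np.array([0.0, 2.0, -1.0])

# Solve B @ c = v in the least-squares sense; an exact solution exists
# precisely when v is a linear combination of the columns of B.
coeffs, residuals, rank, _ = np.linalg.lstsq(B, v, rcond=None)
print(coeffs)  # -> [-1.  1.] (up to rounding): v = -1*b1 + 1*b2
```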

> [0,2,-1]=(1)*[1,2,0]+(-1)*[1,0,1]. No, the third vector isn't linearly independent of the first two.

Ah, I see: it's a combination of the other two.

If you have a zero in an eigenvector, say $$\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$$, then how does this affect things? Does it reduce the dimension by 1?

**Dick**
> Ah, I see: it's a combination of the other two.
>
> If you have a zero in an eigenvector, say $$\begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}$$, then how does this affect things? Does it reduce the dimension by 1?

I really think you should review vector spaces and linear independence. Why do you think having a zero component in a vector has anything to do with it? It doesn't: the dimension is the number of vectors in a basis, regardless of what the individual components of those vectors are.