# Eigenvalues: Matrix corresponding to projection

In summary: A matrix A corresponds to projection in 2 dimensions onto the line generated by a vector v. The eigenvalues of A are 1 and 0: since Av = v, the vector v is an eigenvector corresponding to lambda = 1, and any vector w perpendicular to v satisfies Aw = 0, so w is an eigenvector corresponding to lambda = 0. Lambda = −1 is not an eigenvalue of A, and a matrix does not necessarily have a nonzero eigenvalue.
shiri
Let A be a matrix corresponding to projection in 2 dimensions onto the line generated by a vector v.

A) lambda = −1 is an eigenvalue for A
B) The vector v is an eigenvector for A corresponding to the eigenvalue lambda = −1.
C) lambda = 0 is an eigenvalue for A
D) Any vector w perpendicular to v is an eigenvector for A corresponding to the eigenvalue lambda = −1.
E) Any vector w perpendicular to v is an eigenvector for A corresponding to the eigenvalue lambda = 0.

A) A matrix must have a non-zero eigenvalue.
B) The vector v is an eigenvector of A corresponding to the eigenvalue λ = −1.
C) See A
D) When a vector is perpendicular to v, its eigenvalue must be zero (if it had a nonzero eigenvalue, it would not be perpendicular).
E) See D

Thanks for giving reasons this time, but I don't believe A) is true. And I believe your reason even less. Let's just start from there. A matrix doesn't HAVE to have a nonzero eigenvalue, much less (-1). I will say your answers to D and E seem to be correct, but if you think E is true, then why do you think C is false? That's just crazy.

Dick said:
Thanks for giving reasons this time, but I don't believe A) is true. And I believe your reason even less. Let's just start from there. A matrix doesn't HAVE to have a nonzero eigenvalue, much less (-1). I will say your answers to D and E seem to be correct, but if you think E is true, then why do you think C is false? That's just crazy.

When I read the textbook, it stated that "a square matrix A is invertible if and only if lambda = 0 is not an eigenvalue of A."

Therefore, I assume C) was false.

After I read your message, it appears that B, C, E are correct for this problem?

Who said A was invertible? I'm not going to tell you what's correct until you convince me you know what's correct. Why do you think B is true?

Dick said:
Who said A was invertible? I'm not going to tell you what's correct until you convince me you know what's correct. Why do you think B is true?

I would say that an eigenvector corresponding to the eigenvalue λ = −1 is any multiple of the basis vector v. So all such multiples make up the eigenspace corresponding to that eigenvalue.

or

An eigenvector corresponding to the eigenvalue λ = −1 is a nonzero solution of the equation Ax = −x.

If you project v onto the line generated by the vector v (which is what A does to v), what do you get? They aren't talking about just any vector that happens to have eigenvalue -1. They are talking about that particular v.

Dick said:
If you project v onto the line generated by the vector v (which is what A does to v), what do you get? They aren't talking about just any vector that happens to have eigenvalue -1. They are talking about that particular v.

I would get linearly independent eigenvectors, Dick?

Do you know what a projection is? http://www.freewebs.com/xinyeeisme/ProjectionVectors_1000.gif That picture shows you the projections of some vectors onto the line generated by the vector w. What's proj_w(w)?

Dick said:
Do you know what a projection is? http://www.freewebs.com/xinyeeisme/ProjectionVectors_1000.gif That picture shows you the projections of some vectors onto the line generated by the vector w. What's proj_w(w)?

proj_w(w) = ((w · w)/||w||^2) w, isn't it?

shiri said:
proj_w(w) = ((w · w)/||w||^2) w, isn't it?

Right. So what is Av if A is the projection onto the line generated by v?

Dick said:
Right. So what is Av if A is the projection onto the line generated by v?

It will be parallel? So there are many vectors that could correspond to the given eigenvalue?

That means C and E are true, right?

Av is ((v · v)/||v||^2) v. (v · v) is the same as ||v||^2. Av is v!

Dick said:
Av is ((v · v)/||v||^2) v. (v · v) is the same as ||v||^2. Av is v!

Therefore C and E are true, right?

shiri said:
Therefore C and E are true, right?

Av=v has nothing to do with whether C and E are true or false. It's about a different part of the problem.
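The whole exchange can be checked numerically. Here is a minimal sketch (assuming NumPy, with an arbitrary example vector v = (3, 4)) that builds the projection matrix onto the line spanned by v and verifies that Av = v and Aw = 0 for a perpendicular w, so the eigenvalues are 1 and 0, not −1:

```python
import numpy as np

# Projection onto the line spanned by v: A = (v v^T) / (v . v)
v = np.array([3.0, 4.0])
A = np.outer(v, v) / v.dot(v)

# Av = v, so v is an eigenvector with eigenvalue 1 (not -1)
assert np.allclose(A @ v, v)

# Any w perpendicular to v projects to 0, so w is an eigenvector with eigenvalue 0
w = np.array([-4.0, 3.0])  # w . v = 0
assert np.allclose(A @ w, 0.0)

# The eigenvalues of A are exactly 0 and 1
assert np.allclose(np.sort(np.linalg.eigvals(A)), [0.0, 1.0])
```

This confirms the thread's conclusion: statements C and E are true, while A, B, and D are false.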

## What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are concepts in linear algebra that describe certain properties of a matrix. Eigenvalues are the numbers that represent the scaling factor of the eigenvectors when multiplied by the matrix. Eigenvectors are special vectors that, when multiplied by the matrix, result in a scalar multiple of themselves.
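As a concrete illustration (a sketch using NumPy; the diagonal matrix is an arbitrary choice), multiplying each eigenvector by the matrix simply scales it by the corresponding eigenvalue:

```python
import numpy as np

# Diagonal matrix, so the eigenvalues are just the diagonal entries 2 and 3
M = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(M)
for lam, x in zip(eigvals, eigvecs.T):
    # The defining property: M x = lambda x
    assert np.allclose(M @ x, lam * x)
```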

## How are eigenvalues and eigenvectors related to a projection matrix?

A projection matrix is a special type of matrix that projects a vector onto a subspace. The eigenvalues of a projection matrix are always either 0 or 1: the eigenvectors corresponding to an eigenvalue of 1 are the vectors that lie in the subspace being projected onto, and the eigenvectors corresponding to an eigenvalue of 0 are the vectors that the projection sends to zero.

## What is the significance of having eigenvalues of 1 in a projection matrix?

A projection matrix is idempotent, meaning that multiplying the matrix by itself gives back the same matrix (A² = A). Idempotence forces every eigenvalue to satisfy λ² = λ, so the only possible eigenvalues are 0 and 1. This property is important in applications such as data compression and image processing.
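A quick numerical check of idempotence (a sketch assuming NumPy; the vector v is an arbitrary choice):

```python
import numpy as np

v = np.array([1.0, 2.0])
P = np.outer(v, v) / v.dot(v)  # projection onto the line spanned by v

# Idempotent: projecting twice is the same as projecting once
assert np.allclose(P @ P, P)

# Consequently every eigenvalue lambda satisfies lambda^2 = lambda,
# i.e. lambda is 0 or 1
lams = np.linalg.eigvals(P)
assert np.allclose(lams**2, lams)
```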

## How can eigenvalues and eigenvectors be calculated for a projection matrix?

To calculate the eigenvalues and eigenvectors of a projection matrix, you can use the standard methods for finding eigenvalues and eigenvectors, such as solving the characteristic equation or using the power method. Alternatively, you can use the fact that the eigenvalues of a projection matrix are always 0 or 1 and find the corresponding eigenvectors directly: solve (A − I)x = 0 for the eigenvalue 1, and Ax = 0 for the eigenvalue 0.
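The two linear systems above can be solved as null-space computations. The sketch below (assuming NumPy; the `null_space` helper and the vector v = (3, 4) are illustrative choices, not part of the original thread) recovers both eigenspaces of the 2D projection:

```python
import numpy as np

v = np.array([3.0, 4.0])
A = np.outer(v, v) / v.dot(v)  # projection onto the line spanned by v

def null_space(M, tol=1e-10):
    """Return a matrix whose columns span the null space of M (via SVD)."""
    _, s, vt = np.linalg.svd(M)
    rank = np.sum(s > tol)   # singular values above tol count toward the rank
    return vt[rank:].T       # remaining right singular vectors span the kernel

# Eigenvectors for lambda = 1: solve (A - I)x = 0
e1 = null_space(A - np.eye(2))
assert np.allclose(A @ e1, e1)   # A fixes this subspace

# Eigenvectors for lambda = 0: solve Ax = 0
e0 = null_space(A)
assert np.allclose(A @ e0, 0.0)  # A annihilates this subspace
```

Each eigenspace comes out one-dimensional, matching the line being projected onto and its perpendicular direction.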

## What are some real-world applications of eigenvalues and eigenvectors in projection matrices?

Projection matrices and their corresponding eigenvalues and eigenvectors are used in a variety of fields, such as computer graphics, image processing, and data analysis. For example, in computer graphics, projection matrices are used to render 3D scenes onto a 2D screen. In data analysis, projection matrices can be used for dimensionality reduction and feature selection.
