Proving Ker P ⊆ Eigenspace for Eigenvalue 0 in a Linear Transformation

  • Thread starter Maths2468
  • Tags
    Kernel
In summary, the conversation discusses finding the eigenspace corresponding to the eigenvalue 0 for a linear transformation P. One suggestion is to substitute 0 for λ in the equation Ax=λx, but the accompanying claim that the components of x must all be 0 is deemed confusing. Instead, the definition of a vector being in the kernel of a matrix is discussed, and it is determined that the kernel of A is the set of all x such that A(x)=0. The conversation ends with the suggestion to learn the definitions of eigenvector and eigenvalue in order to better solve the problem at hand.
  • #1
Maths2468
Let's say you have a linear transformation P. The eigenvalues of the matrix are 0, 1 and 2.
How would you show that ker P belongs to the eigenspace corresponding to 0?

So you have an eigenvalue 0. Let A be the 3×3 matrix.
I was thinking of doing something like Ax=λx and substituting 0 for λ, and then showing that x, y, z are equal to 0 and hence the eigenspace is 0. Would this be a good idea?
 
  • #2
Maths2468 said:
Let's say you have a linear transformation P. The eigenvalues of the matrix are 0, 1 and 2.
How would you show that ker P belongs to the eigenspace corresponding to 0?

So you have an eigenvalue 0. Let A be the 3×3 matrix.
I was thinking of doing something like Ax=λx and substituting 0 for λ, and then showing that x, y, z are equal to 0 and hence the eigenspace is 0. Would this be a good idea?

Thinking a little more about it would be the best idea. Isn't the definition of x being in ker P the same as the definition of x being an eigenvector with eigenvalue 0?
 
  • #3
Dick said:
Thinking a little more about it would be the best idea. Isn't the definition of x being in ker P the same as the definition of x being an eigenvector with eigenvalue 0?

Yes, so I assume my original suggestion was bad.
 
  • #4
Maths2468 said:
Yes, so I assume my original suggestion was bad.

It was certainly confusing. If you meant that the vector x has components (x, y, z), then Ax=0 doesn't necessarily mean x = y = z = 0.
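Spelled out in the notation above, a minimal sketch of the substitution: setting ##\lambda = 0## in the eigenvector equation gives
$$Ax = \lambda x \;\longrightarrow\; Ax = 0,$$
which says only that ##x## lies in ##\ker A##; a nonzero solution of ##Ax = 0## need not have all of its components equal to zero.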
 
  • #5
Dick said:
It was certainly confusing. If you meant that the vector x has components (x, y, z), then Ax=0 doesn't necessarily mean x = y = z = 0.

I'm sorry, I am really, really bad at linear mathematics. What I meant was that A(x, y, z) = 0.
 
  • #6
Maths2468 said:
I'm sorry, I am really, really bad at linear mathematics. What I meant was that A(x, y, z) = 0.

If you are given a specific matrix A and you want to find ker A, then that's what you do, all right. But you don't have to find ker A to see that it's the same as the set of eigenvectors with eigenvalue 0.
 
  • #7
Dick said:
If you are given a specific matrix A and you want to find ker A, then that's what you do, all right. But you don't have to find ker A to see that it's the same as the set of eigenvectors with eigenvalue 0.

OK, cool. I am starting to understand a little better. How do you know that ker A is the same as the set of eigenvectors with eigenvalue 0? Where do I go from here?
 
  • #8
What equation does a vector in the kernel of A satisfy? What equation does an eigenvector with ##\lambda=0## satisfy?
 
  • #9
vela said:
What equation does a vector in the kernel of A satisfy? What equation does an eigenvector with ##\lambda=0## satisfy?

Are you asking for the original matrix?
 
  • #10
No. I'm asking you to tell us how to express the phrase "##\vec{x}## is in the kernel of a matrix A" mathematically. Similarly, how do you say "##\vec{x}## is an eigenvector of matrix A with eigenvalue 0" mathematically?
 
  • #11
vela said:
No. I'm asking you to tell us how to express the phrase "##\vec{x}## is in the kernel of a matrix A" mathematically. Similarly, how do you say "##\vec{x}## is an eigenvector of matrix A with eigenvalue 0" mathematically?

for "vector x is in the kernel of A" ker(A)={x belongs to X: T(x)=0}
I am not sure about the other one.
Great question by the way, really forcing me to think and understand.
 
  • #12
Maths2468 said:
for "vector x is in the kernel of A" ker(A)={x belongs to X: T(x)=0}
Your definition for the kernel of A isn't correct as there's no mention of A.

I am not sure about the other one.
Look up the definitions of eigenvector and eigenvalue. In math, you really should know the definitions.
 
  • #13
vela said:
Your definition for the kernel of A isn't correct, as there's no mention of A. Look up the definitions of eigenvector and eigenvalue. In math, you really should know the definitions.
What should it be?
To calculate an eigenvalue you use ##Ax = \lambda x##.
I know it is a relatively new topic we have started and I can't stand it. But I try.
 
  • #14
Maths2468 said:
What should it be?
To calculate an eigenvalue you use ##Ax = \lambda x##.
I know it is a relatively new topic we have started and I can't stand it. But I try.

Try to take the think-and-understand challenge. What's wrong with saying,
for "vector x is in the kernel of A", ##\ker(A) = \{x \in X : T(x) = 0\}##? What does T have to do with it? And, yes, x is an eigenvector if ##Ax = \lambda x##. What value of ##\lambda## are you interested in?
 
  • #15
Dick said:
Try to take the think-and-understand challenge. What's wrong with saying,
for "vector x is in the kernel of A", ##\ker(A) = \{x \in X : T(x) = 0\}##? What does T have to do with it? And, yes, x is an eigenvector if ##Ax = \lambda x##. What value of ##\lambda## are you interested in?

The value of ##\lambda## I am interested in is 0.

At the end, should it be T(A) = 0?
Is this stuff needed to answer the question?
 
  • #16
Maths2468 said:
The value of ##\lambda## I am interested in is 0.

At the end, should it be T(A) = 0?
No, it isn't! There was no "T" in your question and you cannot just throw one in without defining it! The kernel of A is the set of all x such that ##A(x) = 0##, not ##T(x) = 0## as you had before.

Is this stuff needed to answer the question?
Yes, knowing the definition of "kernel" is needed to answer questions about the kernel of a linear transformation!
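Putting the two definitions together gives the identification the thread has been circling. Writing ##E_0(P)## for the eigenspace of the eigenvalue 0 (notation introduced here for the sketch) and using ##P## for the transformation, as in the original question:
$$\ker P = \{\,x : Px = 0\,\}, \qquad E_0(P) = \{\,x : Px = 0\cdot x\,\} = \{\,x : Px = 0\,\}.$$
Since ##0 \cdot x = 0## for every vector ##x##, the two sets are the same, so ##\ker P = E_0(P)##; in particular ##\ker P## is contained in the eigenspace for the eigenvalue 0, which is what was asked.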
 

1. What is the kernel of a linear transformation?

The kernel of a linear transformation is the set of all vectors that get mapped to the zero vector by the transformation. In matrix terms, it is the set of all solutions to the equation Ax = 0, where A is the matrix of the transformation and x is a vector.

2. How is the kernel related to eigenvectors?

The kernel contains exactly the eigenvectors associated with the eigenvalue 0, together with the zero vector. An eigenvector x with eigenvalue 0 satisfies Ax = 0·x = 0, which is precisely the condition for x to be in the kernel, so the kernel coincides with the eigenspace for 0.

3. Is the kernel always non-empty?

Yes. The kernel always contains the zero vector, so it is never empty. However, it contains only the zero vector precisely when 0 is not an eigenvalue of A, that is, when A is invertible.

4. How can the kernel be calculated?

The kernel can be calculated by finding the null space of the matrix A, which is the set of all solutions to Ax = 0. This can be done by row reduction, or by finding the eigenvectors of A associated with the eigenvalue 0.
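As an illustration of the row-reduction approach just described, here is a small Python sketch using SymPy; the 3×3 matrix is a made-up example with eigenvalues 0, 1 and 2, chosen to mirror the situation in the thread rather than taken from it:

```python
from sympy import Matrix

# Hypothetical 3x3 matrix with eigenvalues 0, 1 and 2 (example data for illustration)
A = Matrix([[1, 1, 0],
            [0, 2, 0],
            [0, 0, 0]])

# The kernel (null space) of A, found by row reduction
kernel_basis = A.nullspace()
print("basis of ker A:", kernel_basis)

# Eigenvectors grouped by eigenvalue; the entry for eigenvalue 0 spans the same space as ker A
for eigenvalue, multiplicity, vectors in A.eigenvects():
    print("eigenvalue", eigenvalue, "-> eigenvectors", vectors)
```

For this example the basis returned for the null space and the eigenvectors listed for the eigenvalue 0 coincide (both are spanned by (0, 0, 1)), which is exactly the relationship discussed in the thread.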

5. What is the dimension of the kernel?

The dimension of the kernel equals the geometric multiplicity of the eigenvalue 0, that is, the dimension of the eigenspace for 0. More generally, the dimension of an eigenspace is the number of linearly independent eigenvectors for that eigenvalue, which is at most the eigenvalue's algebraic multiplicity.
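As a concrete check under the thread's assumptions: if a 3×3 matrix A has the three distinct eigenvalues 0, 1 and 2, each eigenvalue has algebraic, and hence geometric, multiplicity 1, so
$$\dim \ker A = \dim E_0(A) = 1.$$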
