Null Space and Eigenvalues/Eigenvectors

In summary, we discussed a linear operator on an n-dimensional space with a non-trivial null space of dimension k < n. The linearly independent vectors of the null space are eigenvectors with eigenvalue 0, so it is more accurate to say that 0 is an eigenvalue of geometric multiplicity k than to speak of "k eigenvalues". The algebraic multiplicity of an eigenvalue is the number of times it appears as a root of the characteristic equation, while the geometric multiplicity is the dimension of the subspace spanned by its eigenvectors. The algebraic multiplicity is always greater than or equal to the geometric multiplicity.
  • #1
psholtz
Suppose I have a linear operator of dimension n, and suppose that this operator has a non-trivial null space. That is:

[tex]A \cdot x = 0[/tex]

Suppose the dimension of the null space is k, with 0 < k < n; that is, I can find k linearly independent vectors, each of which yields the zero vector when the linear operator A is applied to it.

Is it fair to say that this operator then has k eigenvalues, of value 0? and that the k eigenvectors corresponding to this eigenvalue=0 are linearly independent vectors of the null space?
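The setup in the question can be checked numerically with NumPy. The sketch below uses a hypothetical rank-1 matrix of dimension n = 3 (so its null space has dimension k = 2) and confirms that 0 shows up as an eigenvalue the expected number of times, with eigenvectors lying in the null space:

```python
import numpy as np

# A hypothetical 3x3 operator of rank 1 (the outer product of [1,2,3] with
# itself), so its null space has dimension k = 2.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Count eigenvalues that are numerically zero.
num_zero = int(np.sum(np.isclose(eigenvalues, 0.0)))
print(num_zero)  # 2, matching dim(null space) = 2

# The eigenvectors for eigenvalue 0 satisfy A @ v = 0, i.e. they lie in
# the null space of A.
zero_vecs = eigenvectors[:, np.isclose(eigenvalues, 0.0)]
print(np.allclose(A @ zero_vecs, 0.0))  # True
```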
 
  • #2
hi psholtz! :wink:
psholtz said:
… Is it fair to say that this operator then has k eigenvalues, of value 0? and that the k eigenvectors corresponding to this eigenvalue=0 are linearly independent vectors of the null space?

yes :smile:

(what is worrying you about that? :confused:)
 
  • #3
Nothing worrying me about that..

Just wanted to make sure I had it straight.. :smile:

thanks!
 
  • #4
It would be more standard to say that 0 is an eigenvalue of geometric multiplicity k (which implies that it has algebraic multiplicity greater than or equal to k: the characteristic equation has a factor [itex]x^m[/itex] for some [itex]m\ge k[/itex]) rather than to talk about "k eigenvalues", all of value 0, as if they were different eigenvalues that happened to have the same value. Its value is the only property an eigenvalue has!
 
  • #5
Yes, thanks..

So algebraic multiplicity is the number of times the eigenvalue appears as a root of the characteristic equation.

Geometric multiplicity is the dimension of the subspace spanned by the eigenvectors of that particular eigenvalue..

And the algebraic multiplicity is always going to be greater than or equal to the geometric multiplicity, correct?
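The inequality in the last post can have a strict gap. A minimal sketch with SymPy, using the standard 2x2 Jordan block as a hypothetical example: 0 is a double root of the characteristic polynomial (algebraic multiplicity 2), yet there is only one independent eigenvector (geometric multiplicity 1):

```python
import sympy as sp

# A 2x2 Jordan block: characteristic polynomial is lambda^2, so the
# eigenvalue 0 has algebraic multiplicity 2, but its eigenspace is
# only one-dimensional.
J = sp.Matrix([[0, 1],
               [0, 0]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenvector basis)
(eigval, alg_mult, basis), = J.eigenvects()
geom_mult = len(basis)

print(eigval, alg_mult, geom_mult)  # 0 2 1 -- algebraic > geometric here
```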
 

1. What is null space?

Null space, also known as the kernel, is the set of all vectors that map to the zero vector when multiplied by a given matrix. In other words, it is the set of all solutions to the homogeneous equation Ax = 0, where A is the given matrix.
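The definition above can be made concrete with SciPy's `null_space` helper, shown here on a hypothetical rank-1 matrix:

```python
import numpy as np
from scipy.linalg import null_space

# A hypothetical 2x2 matrix of rank 1, so its kernel is one-dimensional.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

N = null_space(A)               # columns form an orthonormal basis of the kernel
print(N.shape[1])               # dimension of the null space: 1
print(np.allclose(A @ N, 0.0))  # every basis vector solves A x = 0
```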

2. How do you find the null space of a matrix?

To find the null space of a matrix, you can use row reduction to put the matrix into reduced row echelon form. The pivot columns correspond to the basic variables, and the remaining columns correspond to the free variables. Setting each free variable to 1 in turn (with the others set to 0) and solving for the basic variables yields one basis vector per free variable; together these vectors form a basis for the null space.
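The row-reduction recipe above is exactly what SymPy's `rref` and `nullspace` methods implement; a sketch on a hypothetical 2x3 example:

```python
import sympy as sp

# A hypothetical 2x3 matrix: rank 1, so the null space of the map
# from R^3 has dimension 3 - 1 = 2.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 6]])

rref, pivot_cols = A.rref()
print(pivot_cols)                # (0,) -- one pivot column, so rank 1

basis = A.nullspace()            # one basis vector per free variable
print(len(basis))                # 3 - 1 = 2
print(all(A * v == sp.zeros(2, 1) for v in basis))  # each maps to zero
```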

3. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are closely related concepts. An eigenvector is a nonzero vector that, when multiplied by a given matrix, results in a scalar multiple of itself; the corresponding scalar is known as the eigenvalue. In other words, the eigenvector keeps its direction under the matrix transformation, but it may be scaled by the eigenvalue.
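The defining relation A v = λ v can be verified directly with NumPy on a hypothetical diagonal matrix whose eigenpairs are easy to check by hand:

```python
import numpy as np

# A hypothetical diagonal matrix: its eigenvalues are simply the
# diagonal entries 2 and 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column v of `eigenvectors` satisfies A @ v = lam * v for its
# paired eigenvalue lam.
for lam, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(A @ v, lam * v))  # True for every pair
```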

4. How do you find eigenvalues and eigenvectors?

To find eigenvalues and eigenvectors, you can use the characteristic polynomial of a matrix. By solving for the roots of the polynomial, you can find the eigenvalues. Then, by plugging in each eigenvalue into the equation (A-λI)x = 0, you can solve for the corresponding eigenvector.
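The characteristic-polynomial route above can be sketched with NumPy, whose `np.poly` returns the characteristic coefficients of a square matrix; the roots agree with `eigvals` (the matrix here is a hypothetical example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic polynomial coefficients, leading coefficient first.
# For this A: lambda^2 - 4*lambda + 3, with roots 1 and 3.
coeffs = np.poly(A)
char_roots = np.sort(np.roots(coeffs))

print(char_roots)                      # [1. 3.]
print(np.sort(np.linalg.eigvals(A)))  # same eigenvalues, computed directly
```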

5. Why are eigenvalues and eigenvectors important?

Eigenvalues and eigenvectors are important because they provide insight into the behavior of a matrix transformation. They can be used to better understand the scaling and rotational properties of a transformation, and they are essential in applications such as principal component analysis, image compression, and differential equations. They also have applications in quantum mechanics and other areas of physics.
