Proving P_A(A) = 0 for Any Square Matrix Using the Jordan Canonical Form


Homework Help Overview

The discussion revolves around proving that the characteristic polynomial evaluated at a square matrix equals zero, specifically using the Jordan Canonical Form. Participants are exploring the implications of the Cayley-Hamilton theorem and the properties of characteristic polynomials in relation to matrix operations.

Discussion Character

  • Exploratory, Conceptual clarification, Assumption checking

Approaches and Questions Raised

  • The validity of substituting the matrix A for the eigenvalue λ in the characteristic polynomial, and what that substitution would mean
  • Relating the problem to triangular matrices via Schur's theorem
  • Questions about notation and the handling of matrix products in the context of the characteristic polynomial

Discussion Status

There is an ongoing exploration of different approaches to the problem, with some participants suggesting the use of Jordan canonical form and others reflecting on the implications of the Cayley-Hamilton theorem. While some guidance has been offered regarding triangular matrices, no consensus has been reached on the final approach.

Contextual Notes

Participants are working under the constraints of homework rules and are encouraged to explore reasoning without arriving at complete solutions. The discussion includes references to previous problems that may inform their current understanding.

Wildcat

Homework Statement



Let A be any square matrix and let P_A(λ) be its characteristic polynomial. Show that
P_A(A) = 0.



Homework Equations





The Attempt at a Solution



I can show this for a general 2×2 matrix with entries a, b, c, d, and I understand why it should be true for all square matrices, but I'm just not sure how to show it for an arbitrary square matrix. We are studying the Jordan canonical form, so I'm thinking I should somehow use that. Any help would be appreciated.
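Before hunting for the proof, the claim itself can be sanity-checked numerically. The sketch below (my own example, using an arbitrary random matrix, not part of any proof) evaluates the characteristic polynomial of A at A itself:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # an arbitrary 4x4 matrix
n = A.shape[0]

# np.poly(A) returns the coefficients of the characteristic
# polynomial det(lambda*I - A), highest degree first.
coeffs = np.poly(A)

# Evaluate the polynomial at the matrix A itself:
# P_A(A) = sum_k c_k * A^(n-k), with A^0 = I.
P_A = sum(c * np.linalg.matrix_power(A, n - k) for k, c in enumerate(coeffs))

print(np.allclose(P_A, np.zeros((n, n))))  # True, up to floating-point rounding
```

This only confirms the statement for one matrix up to rounding error; the rest of the thread is about why it holds in general.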
 
Isn't det(A − λI) the characteristic polynomial?

If λ is an eigenvalue of A, then the expression above evaluates to zero, i.e., P_A(λ) = 0.
 
I'm sorry Mark, but I can't agree with your proof. The Cayley-Hamilton theorem seems more difficult than that.

The characteristic polynomial of a 2×2 matrix is

$$\det\begin{pmatrix} a-\lambda & b\\ c & d-\lambda \end{pmatrix}$$

You can't just go on substituting A for λ, because then you would have a matrix inside a matrix...
 
Well, clearly a - A and d - A don't make sense, since you can't subtract a matrix from a scalar, but what's wrong with A - AI?
 
Well, λI is a scalar product, while AI is a matrix product. It's not obvious to me that you can suddenly change the meaning of the product.

For example, consider the polynomial det(λI) = 0. This polynomial is actually λ^n = 0 (with n the dimension of I), so a matrix root of this polynomial is a nilpotent matrix.
However, if you immediately substitute A for λ, then you get det(A) = 0, and a root of that polynomial would be a noninvertible matrix.

Since not every noninvertible matrix is nilpotent, we get two different answers. So the two methods are not equivalent.
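The mismatch above can be seen with a concrete example (my own, hypothetical, choice of matrix): diag(1, 0) is noninvertible, so it "solves" det(A) = 0, but no power of it is ever zero, so it is not nilpotent:

```python
import numpy as np

A = np.diag([1., 0.])  # noninvertible: det(A) = 0
print(np.linalg.det(A))               # 0.0
print(np.linalg.matrix_power(A, 5))   # still diag(1, 0): A is not nilpotent
```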
 
I retract what I said before. I remembered most of what Cayley-Hamilton says (roughly, square matrices satisfy their own characteristic equations), and misapplied it here.
 
micromass said:
Well, λI is a scalar product, while AI is a matrix product. It's not obvious to me that you can suddenly change the meaning of the product.

For example, consider the polynomial det(λI) = 0. This polynomial is actually λ^n = 0 (with n the dimension of I), so a matrix root of this polynomial is a nilpotent matrix.
However, if you immediately substitute A for λ, then you get det(A) = 0, and a root of that polynomial would be a noninvertible matrix.

Since not every noninvertible matrix is nilpotent, we get two different answers. So the two methods are not equivalent.


There was a hint to use Schur's theorem to show that A may be assumed to be upper triangular; then the characteristic polynomial would be $(a_{11}-\lambda)(a_{22}-\lambda)\cdots(a_{nn}-\lambda)$, right?
 
Yes, so you only need to prove things for triangular matrices.

Your characteristic polynomial is indeed correct. Now try to fill in A (your triangular matrix) in the polynomial. Do you get 0?
 
micromass said:
Yes, so you only need to prove things for triangular matrices.

Your characteristic polynomial is indeed correct. Now try to fill in A (your triangular matrix) in the polynomial. Do you get 0?

Yes, because that will make the $a_{11}$ entry 0 in the first factor, then the $a_{22}$ entry 0 in the second factor, and so on, which will result in the zero matrix?
 
Well, you still have to multiply all those matrices...
 
micromass said:
Well, you still have to multiply all those matrices...

Are you saying I need to show that, or are you trying to move me in another direction? I know the determinant of each factor $(A - a_{nn}I)$ is 0. Can I use that?
 
No, I'm not trying to push you in another direction. You were going in a great direction!
 
micromass said:
No, I'm not trying to push you in another direction. You were going in a great direction!

That's where I'm having trouble. When I multiply the first two factors together, the first two diagonal entries become zero; then multiplying by the third makes the third diagonal entry zero, and so on. But how do I notate that elegantly? Isn't that what I'm supposed to do?
 
That is exactly what you should do!

However, I do not think that there is a clean notation for this. It's going to get messy no matter what...
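One standard way to organize the "messy" multiplication is column by column: for an upper triangular A, after multiplying the first k factors $(A - a_{11}I)\cdots(A - a_{kk}I)$, the first k columns of the partial product are zero. A small numerical sketch (the matrix entries are my own arbitrary choice for illustration):

```python
import numpy as np

# An arbitrary upper triangular matrix (entries chosen for illustration).
A = np.array([[2., 1., 3.],
              [0., 5., 4.],
              [0., 0., 7.]])
n = A.shape[0]

P = np.eye(n)
for i in range(n):
    P = P @ (A - A[i, i] * np.eye(n))
    # After the first i+1 factors, the first i+1 columns of P are zero.
    assert np.allclose(P[:, : i + 1], 0)

print(np.allclose(P, 0))  # True: the full product is the zero matrix
```

The induction behind the assert: each factor $(A - a_{kk}I)$ is upper triangular with a zero in position (k, k), so its k-th column is supported in the first k−1 rows, which the previous partial product already annihilates.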
 
micromass said:
That is exactly what you should do!

However, I do not think that there is a clean notation for this. It's going to get messy no matter what...

In two previous problems (not shown here) I had to show that if J is any diagonal matrix then P_J(J) = 0, and also that for any Jordan block J, P_J(J) = 0, where P_J(λ) is its characteristic polynomial. With those proven, can I use the fact that A has a Jordan canonical form, so that A is similar to J? Looking at my notes, I think this may be what I need to prove this problem: write A = QJQ^{-1} and replace A with this expression. I'm getting stuck, though. Micromass, do you have any ideas on this?
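The similarity step rests on two facts: similar matrices have the same characteristic polynomial, and for any polynomial p and invertible Q, p(QJQ^{-1}) = Q p(J) Q^{-1}, since (QJQ^{-1})^k = Q J^k Q^{-1}. A numerical sketch of that identity (the Jordan block and Q below are my own arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# A Jordan block J (eigenvalue 3, size 3); its characteristic
# polynomial is p(lambda) = (lambda - 3)^3.
J = np.array([[3., 1., 0.],
              [0., 3., 1.],
              [0., 0., 3.]])
Q = rng.standard_normal((3, 3))   # arbitrary; almost surely invertible
Qinv = np.linalg.inv(Q)
A = Q @ J @ Qinv

def p(M):
    # Evaluate the characteristic polynomial of J at a matrix M.
    N = M - 3 * np.eye(3)
    return N @ N @ N

# (Q J Q^-1)^k = Q J^k Q^-1, so p(A) = Q p(J) Q^-1.
print(np.allclose(p(A), Q @ p(J) @ Qinv))  # True
print(np.allclose(p(J), 0))   # the Jordan block satisfies its own polynomial
print(np.allclose(p(A), 0))   # hence A = Q J Q^-1 does too
```

So once P_J(J) = 0 is known for each Jordan block (or block-diagonal J), conjugating by Q transports the result to A.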
 
