Proof that Strictly Upper Triangular Matrices Are Nilpotent

  • Thread starter: geoffrey159
  • Tags: Matrix
geoffrey159

Homework Statement


Show that strictly upper triangular ##n\times n## matrices are nilpotent.

Homework Equations

The Attempt at a Solution



Let ##f## be the endomorphism represented by the strictly upper triangular matrix ##M## in the basis ##{\cal B} = (e_1,...,e_n)##.
We have ##f(e_k) \in \text{span}(e_1,...,e_{k-1})##, so ##(f\circ f)(e_k)\in f(\text{span}(e_1,...,e_{k-1}))= \text{span}(f(e_1),...,f(e_{k-1})) \subset \text{span}(e_1,...,e_{k-2})##, and so on. Repeating this process, we see that ##f^{(k)}(e_k) = 0##; in particular ##f^{(n)}(e_k) = 0## for every ##k##, so ##\ell\ge n \Rightarrow M^\ell = 0##, right?
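To make the pattern concrete, here is a small illustrative case (##n = 3##, with arbitrary entries ##a, b, c## used only for the example): each multiplication by ##M## pushes the nonzero entries one diagonal further above the main diagonal, so the ##n##-th power vanishes.

$$M = \begin{pmatrix} 0 & a & b \\ 0 & 0 & c \\ 0 & 0 & 0 \end{pmatrix}, \qquad M^2 = \begin{pmatrix} 0 & 0 & ac \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \qquad M^3 = 0.$$

Here ##M e_1 = 0##, ##M^2 e_2 = 0## and ##M^3 e_3 = 0##, matching ##f^{(k)}(e_k) = 0##.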
 
geoffrey159 said:

Show that strictly upper triangular ##n\times n## matrices are nilpotent.

Let ##f## be the endomorphism represented by the strictly upper triangular matrix ##M## in the basis ##{\cal B} = (e_1,...,e_n)##.
We have ##f(e_k) \in \text{span}(e_1,...,e_{k-1})##, so ##(f\circ f)(e_k)\in f(\text{span}(e_1,...,e_{k-1}))= \text{span}(f(e_1),...,f(e_{k-1})) \subset \text{span}(e_1,...,e_{k-2})##, and so on. Repeating this process, we see that ##f^{(k)}(e_k) = 0##; in particular ##f^{(n)}(e_k) = 0## for every ##k##, so ##\ell\ge n \Rightarrow M^\ell = 0##, right?

So you want to show ##M_{ij}^k = 0## for some positive integer ##k##.

If I'm reading your post correctly, you're saying ##f(e_k) = \text{Some strictly upper triangular matrix M}## for any basis vector in ##\cal B##.

I think it looks okay, but the notation is a little confusing. When you write ##f^{(k)}(e_k) = 0##, some people may get confused, and so I think it is better to write it as:

$${(f(e_k))}^k = 0$$

To signify that you want the ##k##-th power of the morphism applied to the basis vector ##e_k##.

If ##{(f(e_k))}^k = M_{ij}^k = 0## for some positive integer ##k##, then you can go as far as to say ##M_{ij}^{\ell} = 0## for all ##\ell \geq k##: once ##M^k = 0##, every higher power factors through it, ##M^{\ell} = M^{\ell - k}M^{k} = 0##, so every power from then on is the zero matrix.
 
There is an alternate way to prove this:

Suppose ##A \in M_{n \times n}( \mathbb{F} )## is a strictly upper triangular matrix. Then ##A## has characteristic polynomial ##x^n## (up to a sign, depending on convention). Using the fact:

$$p( \lambda ) = \text{det}(A - \lambda I)$$

You can deduce ##A^n = 0##.
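
The characteristic polynomial step can be checked directly: ##A - \lambda I## is upper triangular with ##-\lambda## in every diagonal entry, and the determinant of a triangular matrix is the product of its diagonal entries, so

$$p(\lambda) = \det(A - \lambda I) = (-\lambda)^n = (-1)^n \lambda^n,$$

which is ##\lambda^n## up to a sign.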
 
Hello,

Zondrina said:
So you want to show ##M_{ij}^k = 0## for some positive integer ##k##.

Yes
Zondrina said:
If I'm reading your post correctly, you're saying ##f(e_k) = \text{Some strictly upper triangular matrix M}## for any basis vector in ##\cal B##.

##f(e_k)## is the ##k##-th column of the matrix ##M##, which is strictly upper triangular.

Zondrina said:
I think it looks okay, but the notation is a little confusing. When you write ##f^{(k)}(e_k) = 0##, some people may get confused, and so I think it is better to write it as:

$${(f(e_k))}^k = 0$$
To signify that you want the ##k##-th power of the morphism applied to the basis vector ##e_k##.
By ##f^{(k)}##, I meant the ##k##-fold composition of ##f## with itself (##f\circ ... \circ f##, ##k## times), not the ##k##-th power.

Zondrina said:
If ##{(f(e_k))}^k = M_{ij}^k = 0## for some positive integer ##k##, then you can go as far as to say ##M_{ij}^{\ell} = 0## for all ##\ell \geq k##: once ##M^k = 0##, every higher power factors through it, ##M^{\ell} = M^{\ell - k}M^{k} = 0##, so every power from then on is the zero matrix.

##f^{(k)}(e_k) = 0## means the ##k##-th column is zero after ##k## multiplications of ##M## by itself, i.e. the ##k##-th column of ##M^k## is zero, not that the whole matrix is zero, right?
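
(Indeed, ##M^k e_k## is the ##k##-th column of ##M^k##, so the statement is about one column at a time; taking ##k = n## covers every column at once,

$$M^n e_j = f^{(n)}(e_j) = 0 \quad \text{for } j = 1,\dots,n,$$

and a matrix all of whose columns are zero is the zero matrix, which is why ##M^n = 0## as a whole.)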
Zondrina said:
There is an alternate way to prove this:

Suppose ##A \in M_{n \times n}( \mathbb{F} )## is a strictly upper triangular matrix. Then ##A## has characteristic polynomial ##x^n## (up to a sign, depending on convention). Using the fact:

$$p( \lambda ) = \text{det}(A - \lambda I)$$

You can deduce ##A^n = 0##.

I don't understand, could you elaborate please?
 
geoffrey159 said:
I don't understand, could you elaborate please?
Look up the Cayley-Hamilton theorem.
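
For reference, that theorem states that every square matrix satisfies its own characteristic polynomial:

$$p(\lambda) = \det(A - \lambda I) \quad\Longrightarrow\quad p(A) = 0.$$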
 
geoffrey159 said:
By ##f^{(k)}##, I meant the ##k##-fold composition of ##f## with itself (##f\circ ... \circ f##, ##k## times), not the ##k##-th power.


I was quite confused by the notation when I first looked at it.

If you want help with the alternate proof, show some of your thoughts.
 
vela said:
Look up the Cayley-Hamilton theorem.
Zondrina said:
I was quite confused by the notation when I first looked at it.

If you want help with the alternate proof, show some of your thoughts.

Sorry, I was very lazy yesterday.
Your idea gives the simplest math proof ever! My thought on this (well, Wikipedia's thought :biggrin:) is that the characteristic polynomial vanishes at ##A##, so that ##0 = p(A) = (-1)^n A^n \iff A^n = 0##.
What a nice idea!
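
As a purely illustrative numerical sanity check (a minimal sketch using NumPy; the matrix size and seed are arbitrary choices), a random strictly upper triangular matrix does satisfy ##M^n = 0##, while ##M^{n-1}## is generically nonzero:

```python
import numpy as np

# Arbitrary size and seed, chosen only for this illustration.
n = 5
rng = np.random.default_rng(0)

# Strictly upper triangular: random entries above the diagonal, zeros elsewhere.
M = np.triu(rng.standard_normal((n, n)), k=1)

# Nilpotency: M^n is the zero matrix, while M^(n-1) generically is not.
print(np.allclose(np.linalg.matrix_power(M, n), 0))      # expected: True
print(np.allclose(np.linalg.matrix_power(M, n - 1), 0))  # expected: False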
 
