If A is a nilpotent square matrix then I + A is invertible

In summary, the theorem states that if A is a nilpotent square matrix, then (I + A) is an invertible matrix. The proof constructs an inverse B using the product formula B = (1-A)(1+A^(2^1))(1+A^(2^2))...(1+A^[2^(n-1)]), where n is the smallest integer such that 2^n > k. Multiplying B with (1+A) on either side telescopes as a repeated difference of squares, and it can be shown that B(1+A) = (1+A)B = 1 - A^(2^n) = 1, proving that B is indeed a two-sided inverse of (I + A). Additionally, since matrix multiplication is noncommutative in general, both products B(1+A) and (1+A)B must be checked, and it is necessary to show that each of the factors 1 + A^(2^m) is invertible.
  • #1
SiddharthM
Theorem: If A is a nilpotent square matrix (that is for some natural number k>0, A^k =0) then (I + A) is an invertible matrix.

Pf: Let B denote the inverse, which will be constructed directly. Let n be the smallest integer such that 2^n > k.

B=(1-A)(1+A^(2^1))(1+A^(2^2))...(1+A^(2^n))

then, B(1+A) = (1+A)B = 1 - A^(2^n) = 1 - (A^k)(A^(2^n - k)) = 1
QED

I wonder if this is a sufficient proof? I've never taken a linear algebra course so I really don't know!
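Before checking the algebra, the statement itself can be sanity-checked numerically. The sketch below (assuming NumPy is available; it is not part of the original post) takes a strictly upper-triangular 3x3 matrix, which is nilpotent with A^3 = 0, and confirms that I + A is invertible; this is an illustration, not a proof.

```python
import numpy as np

# A strictly upper-triangular matrix is nilpotent: here A^3 = 0.
A = np.array([[0., 1., 2.],
              [0., 0., 3.],
              [0., 0., 0.]])
I = np.eye(3)

assert np.allclose(np.linalg.matrix_power(A, 3), 0)  # A is nilpotent with k = 3

# If I + A were singular, inv would raise LinAlgError.
B = np.linalg.inv(I + A)
assert np.allclose(B @ (I + A), I)
assert np.allclose((I + A) @ B, I)
```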
 
  • #2
You've skipped far too many steps.

B(1+A)=(1+A)B=1-A^(2^n)

These two equalities are not sufficiently obvious to be written without justification.
 
  • #3
If you write it out it's easy to see. I could prove the equation by induction.

the idea is after you multiply (1+A) with (1-A) you get 1-A^2 - this is our base case.
Now for the inductive step: we have B(1+A)/(1+A^(2^n)) = 1 - A^(2^n); solving for B(1+A) we get B(1+A) = 1 - A^[2^(n+1)].

HA! I made a computational error, but this doesn't matter, as the last term in B is then the identity!
 
  • #4
Can someone redo this but with big pictures and a James Earl Jones voiceover, because it's a question I'd like to know how to do. Not sure my pea-sized brain gets your proof, though :P
 
  • #5
B as written above (actually B=(1-A)(1+A^(2^1))(1+A^(2^2))...(1+A^[2^(n-1)])) is an inverse of I + A.

if you multiply it out you get I.
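The corrected product formula from this post can also be multiplied out numerically. The sketch below (assuming NumPy is available; a concrete k = 3 example, not part of the original post) builds B = (1-A)(1+A^(2^1))...(1+A^[2^(n-1)]) with n = 2, the smallest integer with 2^n > 3, and checks both products.

```python
import numpy as np

A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])   # A^3 = 0, so k = 3
I = np.eye(3)
n = 2                          # smallest n with 2^n > k

# B = (I - A)(I + A^(2^1))...(I + A^(2^(n-1)))
B = I - A
for i in range(1, n):
    B = B @ (I + np.linalg.matrix_power(A, 2**i))

# Multiplying out telescopes: (I + A)B = I - A^(2^n) = I since 2^n > k.
assert np.allclose(B @ (I + A), I)
assert np.allclose((I + A) @ B, I)
```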
 
  • #6
SiddharthM said:
if you write it out it's easy 2 see. i could prove the equation by induction.

The point was not necessarily that we couldn't see it was true (or not), but that if you're attempting a proof like this and want people to check your work it is a good idea to justify all of your assertions.

What is (1+x)^-1?
 
  • #7
the inverse of the real (or complex) number 1+x. So we know A cannot be the negative identity, because then A^k = ±Identity for all k>0, never zero. So 1+A is not the zero matrix.

Yeah, it's probably a good idea I start trying to be more meticulous with the algebra, seeing as I'm new to it.

I've recently been using Latex to write out solved problems from rudin's Principles and have become rather 'fast' with my proofs, making the reader fill in what I feel are obvious gaps. I've come to hate prefacing an argument with "because the definition is equivalent to *blank*" if the equivalence is a well known theorem proved in the text BEFORE the problem set or if I've proven it above. But yeah, again, because I'm new to the subject...
 
  • #8
I meant: what is the Taylor series of 1/(1+x)? I should have been clearer.
 
  • #9
SiddharthM said:
If you write it out it's easy to see. I could prove the equation by induction.

the idea is after you multiply (1+A) with (1-A) you get 1-A^2 - this is our base case.
Now for the inductive step: we have B(1+A)/(1+A^(2^n)) = 1 - A^(2^n); solving for B(1+A) we get B(1+A) = 1 - A^[2^(n+1)].

HA! I made a computational error, but this doesn't matter, as the last term in B is then the identity!
The fact that multiplying out (1+A)B gives you an iterative difference-of-squares thing is certainly one thing that should have been stated.

But B(1+A) is a different story; remember that matrix algebra is noncommutative! It's not enough to simply multiply out (1+A)B to get I, you also have to multiply out B(1+A). Again, because matrix algebra is noncommutative, you cannot simply divide by a matrix; you have to multiply (either on the left or on the right) by its inverse. Oh, and have you even shown that each of the 1+A^(2^m) are invertible?

(p.s. you already gave n a specific purpose in your opening post, so you shouldn't use it here for a new purpose)
 
  • #10
But B(1+A) is a different story; remember that matrix algebra is noncommutative

Whoa! I blanked on this one, lol - yes, you are completely right, Hurkyl; I have no way of showing that B(1+A) gives us the same difference-of-squares thing.

Matt Grime:

boooyaa! Let n = k-1.

B = the sum of (-1)^i A^i as i runs through {0, 1, 2, ..., n}

B(1+A) = B + BA = (1 - A + A^2 - ... + (-1)^n A^n) + (A - A^2 + A^3 - ... + (-1)^n A^(n+1))

We see that this telescopes to leave us with

B(1+A) = 1 + (-1)^n A^(n+1) = 1 + (-1)^n A^k = 1 + 0 = 1

for (1+A)B we can argue similarly because BA=AB (this is due to the fact that B is the sum of matrices each of which A commutes with, like the identity and integer powers of A).

Thanks Matt Grime!
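The alternating-sum inverse worked out above can be checked on a concrete example. The sketch below (assuming NumPy is available; not part of the original post) forms B as the sum of (-1)^i A^i for i = 0, ..., k-1 and verifies that it inverts I + A on both sides, as the commutativity argument predicts.

```python
import numpy as np

A = np.array([[0., 2., 5.],
              [0., 0., 7.],
              [0., 0., 0.]])   # A^3 = 0
I = np.eye(3)
k = 3

# B = I - A + A^2 - ... + (-1)^(k-1) A^(k-1), the truncated geometric series.
B = sum((-1)**i * np.linalg.matrix_power(A, i) for i in range(k))

# The product telescopes to I + (-1)^(k-1) A^k = I; B commutes with A,
# so both orders give the identity.
assert np.allclose(B @ (I + A), I)
assert np.allclose((I + A) @ B, I)
```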
 
  • #11
let B = -A for simplicity; then B^k = ±A^k = 0, and just factor I = I^k - B^k = (I - B)(I + B + ... + B^(k-1)).
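This factorization can also be multiplied out numerically. The sketch below (assuming NumPy is available; not part of the original post) sets B = -A for a concrete nilpotent A and checks that I - B = I + A times the finite geometric sum gives the identity.

```python
import numpy as np

A = np.array([[0., 1., 4.],
              [0., 0., 2.],
              [0., 0., 0.]])   # A^3 = 0
I = np.eye(3)
k = 3
B = -A                         # also nilpotent: B^k = 0

# geo = I + B + ... + B^(k-1); then (I - B) * geo = I - B^k = I.
geo = sum(np.linalg.matrix_power(B, i) for i in range(k))

assert np.allclose((I - B) @ geo, I)   # I - B = I + A, so geo inverts I + A
assert np.allclose(geo @ (I + A), I)
```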
 
  • #12
mathwonk,

word
 

1. What is a nilpotent square matrix?

A nilpotent square matrix is a square matrix where a certain power of the matrix equals the zero matrix. In other words, there exists a positive integer k such that A^k = 0, where A is the nilpotent matrix.

2. What does it mean for a matrix to be invertible?

A matrix is invertible if it has an inverse matrix, denoted A^-1, satisfying A*A^-1 = A^-1*A = I, the identity matrix. In other words, multiplying a matrix by its inverse (on either side) results in the identity matrix.

3. How can we prove that A + I is invertible if A is nilpotent?

We can prove this by exhibiting the inverse explicitly. Using the definition of a nilpotent matrix (A^k = 0), the finite geometric series B = I - A + A^2 - ... + (-1)^(k-1)A^(k-1) satisfies B(I+A) = (I+A)B = I + (-1)^(k-1)A^k = I, which makes A + I invertible with inverse B.

4. Can we generalize this statement to matrices other than square matrices?

No, this statement only applies to square matrices. For non-square matrices, the concept of a nilpotent matrix is not well-defined.

5. How is this property useful in matrix operations and applications?

This property is useful in solving systems of linear equations, as well as in simplifying matrix expressions. It can also be used in various applications such as computer graphics and cryptography.
