If ##A^3=0## then ##A-Z## is nonsingular

  • Thread starter: Hall
  • Tags: Matrices
SUMMARY

The discussion centers on the mathematical assertion that if ##A^3 = 0##, then ##A - Z## is nonsingular. Participants explore various matrix forms, particularly ##A = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix}##, and the implications of eigenvalues and Jordan normal forms. Key conclusions include the invertibility of ##A - I## and the necessity of demonstrating that left and right inverses coincide in non-commutative settings. The conversation highlights the complexity of the topic, with some participants suggesting simpler approaches to the proof.

PREREQUISITES
  • Understanding of matrix algebra, specifically properties of nilpotent matrices.
  • Familiarity with Jordan normal form and its implications in linear algebra.
  • Knowledge of eigenvalues and eigenvectors in the context of matrix theory.
  • Concept of matrix invertibility and the relationship between left and right inverses.
NEXT STEPS
  • Study the properties of nilpotent matrices and their implications in linear algebra.
  • Learn about Jordan normal forms and their applications in matrix theory.
  • Research the relationship between left and right inverses in non-commutative algebra.
  • Explore eigenvalues and eigenvectors, focusing on their role in determining matrix properties.
USEFUL FOR

Mathematicians, students of linear algebra, and anyone interested in advanced matrix theory, particularly those exploring nilpotent matrices and their properties.

Hall
Homework Statement
Let ##A## be an ##n \times n## square matrix and ##Z## denote the ##n\times n## identity matrix. If ##A^3=0##, then prove that ##A-Z## is nonsingular (or present a counterexample).
Relevant Equations
A matrix is nonsingular if and only if its corresponding linear transformation is invertible.
I'm really unable to get started, because I cannot think of any matrix (other than ##O##) whose cube is the zero matrix. I tried assuming ##A = \begin{bmatrix} a &c \\b &d \end{bmatrix}##, computed ##A^3##, and set it equal to ##O## to get an idea of what the elements would look like, but the resulting equations are non-linear.
 
The first thing I thought of was to factorise ##A^3 - I##.
 
Transform ##A## into the Jordan normal form. Also, ##\exp(A) = 1+A+(1/2)A^2## is regular, but I'm not sure whether this helps. To me ##A## is an element of the Heisenberg algebra, and ##A+1## an element of the Heisenberg group, but I'm not sure how to formalize that.
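A side note on why ##\exp(A)## is regular here (this unpacking is my addition, not part of the original post): since ##A## commutes with ##-A##, the exponential law applies, and all powers beyond ##A^2## vanish:
$$\exp(A)\exp(-A)=\exp(A-A)=I, \qquad \exp(-A)=I-A+\tfrac12 A^2,$$
so ##\exp(A)=I+A+\tfrac12 A^2## has the explicit inverse ##I-A+\tfrac12 A^2##.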
 
fresh_42 said:
Transform ##A## into the Jordan normal form. Also, ##\exp(A) = 1+A+(1/2)A^2## is regular, but I'm not sure whether this helps. To me ##A## is an element of the Heisenberg algebra, and ##A+1## an element of the Heisenberg group, but I'm not sure how to formalize that.
It's a bit simpler than that!
 
Hall said:
I'm really unable to have a start, because I cannot think of any matrix (other than ##O##) such that its cube is the zero matrix.
Try ##A = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix}##
It shifts the first coordinate to the second, the second to the third, and the third to zero. Three applications to any vector should give the zero vector.
And if ##B## is any invertible matrix, it is easy to show that ##C = BAB^{-1}## also satisfies ##C^3=O##.
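A quick numerical sanity check of both claims (this sketch is my addition; ##B## below is just an arbitrary invertible matrix):

```python
import numpy as np

# The strictly upper-triangular shift matrix suggested in the post above.
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
print(np.linalg.matrix_power(A, 3))                  # prints the zero matrix, so A^3 = 0

# Conjugation preserves nilpotency: (B A B^-1)^3 = B A^3 B^-1 = 0.
B = np.array([[2., 1., 0.],
              [0., 1., 3.],
              [1., 0., 1.]])                         # an arbitrary invertible matrix (det = 5)
C = B @ A @ np.linalg.inv(B)
print(np.allclose(np.linalg.matrix_power(C, 3), 0))  # True
```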
 
In terms of 2x2 matrices, we have:
$$\begin{bmatrix} 1 & b \\ -\frac{1}{b} & -1 \end{bmatrix}^2 = 0$$
 
Or simply consider eigenvectors of ##A.##
 
PeroK said:
The first thing I thought of was to factorise ##A^3 - I##.
As those two matrices commute, we can have
$$
A^3 - I^3= (A+I)(A^2 - A + I)$$
 
Hall said:
As those two matrices commute, we can have
$$
A^3 - I^3= (A+I)(A^2 - A + I)$$
That's not quite right.

Once you've fixed it, what do you know about ##A^3##?
 
  • #10
PeroK said:
In terms of 2x2 matrices, we have:
$$\begin{bmatrix} 1 & b \\ -\frac{1}{b} & -1 \end{bmatrix}^2 = 0$$
There are actually no 2x2 matrices where ##A^3 = 0## but ##A^2 \ne 0##.
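One way to see this (my own gloss, using ideas that come up later in the thread): a nilpotent ##2\times 2## matrix has both eigenvalues equal to ##0##, so its characteristic polynomial is ##\lambda^2##, and Cayley-Hamilton gives
$$A^2=\chi_A(A)=0.$$
More generally, the nilpotency index of an ##n\times n## matrix never exceeds ##n##.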
 
  • #11
Hall said:
As those two matrices commute, we can have
$$
A^3 - I^3= (A+I)(A^2 - A + I)$$
##A^3-I^3=(A-I)(A^2+A+I)##
 
  • #12
PeroK said:
It's a bit simpler than that!
My bad, I confused the forum. Sorry, @Hall!
 
  • #13
PeroK said:
That's not quite right.

Once you've fixed it, what do you know about ##A^3##?
$$
A^3 -I^3 = (A-I) (A^2 +A +I)$$
$$
(A-I)(-A^2-A-I) = I$$
Hence, (A-I) is invertible because it has an inverse ##(-A^2 -A -I)##.
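(Since ##A^3=0##, the first line reads ##-I=(A-I)(A^2+A+I)##; multiplying both sides by ##-1## gives the second line.) A numerical spot check with the ##3\times 3## nilpotent example from earlier in the thread; this sketch is my addition, not part of the original post:

```python
import numpy as np

A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])    # satisfies A^3 = 0
I = np.eye(3)
B = -(A @ A) - A - I            # the claimed inverse of A - I

print(np.allclose((A - I) @ B, I))   # True: right inverse
print(np.allclose(B @ (A - I), I))   # True: it is a two-sided inverse
```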
 
  • #14
Hall said:
$$
A^3 -I^3 = (A-I) (A^2 +A +I)$$
$$
(A-I)(-A^2-A-I) = I$$
Hence, (A-I) is invertible because it has an inverse ##(-A^2 -A -I)##.
This is certainly a good start, and it would be sufficient in a multiplicative group.

But we do not have commutativity and automatic inverses, so strictly speaking you have to show that the left and right inverses exist and are the same. Also, don't we need ##A+I## rather than ##A-I##? And why can't the product of two singular matrices be a regular matrix, ##I## in that case?

I think you can assume a vector ##v## in the kernel of ##A+I## and show that it has to be the zero vector by this equation. Hint: Use ##(-A^2-A-I)(A-I) = I## instead and at the end apply ##A## once more.
 
  • #15
fresh_42 said:
This is certainly a good start, and it would be sufficient in a multiplicative group.

But we do not have commutativity and automatic inverses, so strictly speaking you have to show that the left and right inverses exist and are the same. Also, don't we need ##A+I## rather than ##A-I##? And why can't the product of two singular matrices be a regular matrix, ##I## in that case?

I think you can assume a vector ##v## in the kernel of ##A+I## and show that it has to be the zero vector by this equation.
What the OP has posted is sufficient, given that he already proved in another thread that ##AB = I## is sufficient for matrices ##A## and ##B## to be inverses of each other.

In any case, we could just change the order of the factors:
$$
A^3 -I^3 = (A-I) (A^2 +A +I) = (A^2 + A + I)(A-I)$$
Hence: $$(A-I)B = B(A-I) = I$$ with ##B = -A^2 -A -I##.
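Writing the expansion out (my addition) shows why the order does not matter: both factors are polynomials in ##A##, and polynomials in ##A## commute with one another:
$$(A-I)(A^2+A+I)=A^3+A^2+A-A^2-A-I=A^3-I=(A^2+A+I)(A-I).$$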
 
  • #16
fresh_42 said:
This is certainly a good start, and it would be sufficient in a multiplicative group.

But we do not have commutativity and automatic inverses, so strictly speaking you have to show that the left and right inverses exist and are the same. Also, don't we need ##A+I## rather than ##A-I##? And why can't the product of two singular matrices be a regular matrix, ##I## in that case?

I think you can assume a vector ##v## in the kernel of ##A+I## and show that it has to be the zero vector by this equation.
Can we make this argument: if a matrix has a left inverse, then it also has a right inverse, and that right inverse is the same as the left one?
$$
(A-I)(-A^2-A -I) = I $$
The left inverse of ##(-A^2 -A-I)## is ##(A-I)##, so we have: $$
(-A^2 -A -I) (A-I) =I$$
And therefore, the left inverse of ##A-I## exists.
 
  • #17
Hall said:
Can we make this argument: if a matrix has a left inverse, then it also has a right inverse, and that right inverse is the same as the left one?
$$
(A-I)(-A^2-A -I) = I $$
The left inverse of ##(-A^2 -A-I)## is ##(A-I)##, so we have: $$
(-A^2 -A -I) (A-I) =I$$
And therefore, the left inverse of ##A-I## exists.
See:

https://www.physicsforums.com/threads/prove-b-is-invertible-if-ab-i.1012582/
 
  • #18
Hall said:
Can we make this argument: if a matrix has a left inverse, then it also has a right inverse, and that right inverse is the same as the left one?
$$
(A-I)(-A^2-A -I) = I $$
The left inverse of ##(-A^2 -A-I)## is ##(A-I)##, so we have: $$
(-A^2 -A -I) (A-I) =I$$
And therefore, the left inverse of ##A-I## exists.
Yes, in this case, where the inverse is a polynomial in ##A.## Honestly? I do not have the group-theory proof in mind to say which group axioms besides associativity are actually necessary to prove that a left inverse equals a right inverse. It is not automatically the case.

And there is still the problem with ##A+I## instead of ##A-I.## I guess we may assume characteristic zero.
 
  • #19
fresh_42 said:
And there is still the problem with ##A+I## instead of ##A-I.##
Why do we need to do this one as well? I don't see it in the OP. In any case:

If ##A^3 = 0##, then ##(-A)^3 = 0## and ##-A - I = -(A + I)## is invertible; hence, ##A + I## is invertible. QED
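Making the substitution explicit (my unpacking; it also recovers the inverse written out in post #22 below): apply the identity ##(M-I)(-M^2-M-I)=I## to ##M=-A##, which likewise satisfies ##M^3=0##:
$$(-A-I)(-A^2+A-I)=I \;\Longrightarrow\; (A+I)(A^2-A+I)=I.$$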
 
  • #20
PeroK said:
Why do we need to do this one as well? I don't see it in the OP. In any case:

If ##A^3 = 0##, then ##(-A)^3 = 0## and ##-A - I = -(A + I)## is invertible; hence, ##A + I## is invertible. QED
Where have you shown the invertibility of ##A+I## or ##-A-I##?

\begin{align*}
v\in \operatorname{ker}(A+I) &\Longrightarrow Av=-v\\
&\Longrightarrow (-A^2-A-I)(A-I)v=(-A^2-A-I)(Av-Iv)\\
&\ldots
\end{align*}
 
  • #21
fresh_42 said:
Where have you shown the invertibility of ##A+I##?
It's a corollary of the invertibility of ##A - I##, given that ##A^3 = 0##.
 
  • #22
fresh_42 said:
Where have you shown the invertibility of ##A+I##?
Post #8.

$$A^3 + I^3= (A+I)(A^2 - A +I)$$
$$ I = (A+I)(A^2-A+I)$$
 
  • #23
Hall said:
Post #8.
There was a mistake in your formula in post #8. Look at the ##I\cdot I## term!
PeroK said:
It's a corollary of the invertibility of ##A - I##, given that ##A^3 = 0##.
And where have you or some magician shown it?

Edit: Now it is in post #22. But nowhere before.
 
  • #24
PeroK said:
Why do we need to do this one as well? I don't see it in the OP.
I had ##A+Z## in mind. Bad hair day ...
 
  • #25
This whole thread seems stupidly complicated.

Suppose ##A-Z## is singular. Then there exists ##v\neq 0## such that ##(A-Z)v=0##, i.e. ##Av=v##.

Apply ##A## twice more and be done with it. No factorization required.
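Spelling out the two remaining applications (my addition; the post leaves them as an exercise):
$$Av=v \;\Longrightarrow\; A^2v=A(Av)=Av=v \;\Longrightarrow\; A^3v=v \;\Longrightarrow\; 0=v,$$
contradicting ##v\neq 0##, so ##A-Z## cannot be singular.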
 
  • #26
Office_Shredder said:
This whole thread seems stupidly complicated.
Sorry to be so stupid, but I'm just trying to help the OP and factorisation was the first thing I thought of. I can't see it's particularly more complicated than considering an eigenvector.

Most of the muddle stemmed from the thread being derailed somewhat.
 
  • #27
PeroK said:
Sorry to be so stupid, but I'm just trying to help the OP and factorisation was the first thing I thought of. I can't see it's particularly more complicated than considering an eigenvector.

Most of the muddle stemmed from the thread being derailed somewhat.
I think the question maker had exactly the factorization method in mind, because this question appears in the chapter before the chapters on Determinants and Eigenvectors.
 
  • #28
PeroK said:
Most of the muddle stemmed from the thread being derailed somewhat.
And not just somewhat. Given that ##A^3 = 0## it's easy to see that
##-I = A^3 - I = (A - I)(A^2 + A + I) \Rightarrow I = I - A^3 = (A - I)(-A^2 - A - I)##
Further, one can show that ##(-A^2 - A - I)(A - I) = -A^3 + I = I##.
Clearly ##A - I## is invertible, which was shown in several other posts in this thread, and was the point that @PeroK was making. Any discussion of Jordan normal form, Heisenberg algebra, or eigenvectors was completely unnecessary.
 
  • #29
Office_Shredder said:
This whole thread seems stupidly complicated.

Suppose ##A-Z## is singular. Then there exists ##v\neq 0## such that ##(A-Z)v=0##, i.e. ##Av=v##.

Apply ##A## twice more and be done with it. No factorization required.
In a post following the one I quoted, the OP stated that determinants and eigenvalues/eigenvectors haven't yet been presented (see below). The proof outline shown above is a good one, but with no knowledge yet of eigenvectors, about the only technique remaining was factorization.
Hall said:
this question appears in the chapter before the chapters on Determinants and Eigenvectors.

The thread seems to have run its course, so I'm closing it now.
 
