If ##A^3=0## then ##A-Z## is nonsingular

  • Thread starter Hall
  • Tags
    Matrices
In summary, the conversation is about finding a matrix other than the zero matrix whose cube is the zero matrix. Some potential matrices are suggested and discussed, including the matrix $$A = \begin{bmatrix}0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0\end{bmatrix}$$ and the idea of factoring ##A^3 - I##. It is then suggested to change the order of the factors and use ##B = -A^2 - A - I## to show that ##B## is the inverse of ##A-I##. It is also mentioned that, for matrices that do not commute, it is necessary to show that the left and right inverses coincide.
  • #1
Hall
Homework Statement
Let ##A## be an ##n \times n## square matrix and ##Z## denote the ##n\times n## identity matrix. If ##A^3=0##, then prove (or present a counterexample) that ##A-Z## is nonsingular.
Relevant Equations
A matrix is nonsingular if and only if its corresponding linear transformation is invertible.
I'm really unable to get started, because I cannot think of any matrix (other than ##O##) whose cube is the zero matrix. I tried assuming ##A = \begin{bmatrix} a & c \\ b & d \end{bmatrix}## and computing ##A^3##, setting it to ##O## to get an idea of what the entries would look like, but the resulting equations are nonlinear.
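For what it's worth, the dead end can be seen symbolically. Below is a minimal sketch (assuming SymPy is available) that cubes a general ##2\times 2## matrix; every entry of ##A^3## is a cubic polynomial in ##a, b, c, d##, so setting ##A^3 = O## gives a nonlinear system, just as the post says.

```python
# Minimal SymPy sketch: cube a general 2x2 matrix and inspect the entries.
# The symbols a, b, c, d match the matrix in the post above.
import sympy as sp

a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, c],
               [b, d]])

A3 = (A**3).applyfunc(sp.expand)
print(A3)  # each entry is a cubic polynomial, so A3 = 0 is a nonlinear system
```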
 
  • #2
The first thing I thought of was to factorise ##A^3 - I##.
 
  • #3
Transform ##A## into the Jordan normal form. Also, ##\exp(A) = I + A + \tfrac{1}{2}A^2## is regular, but I'm not sure whether this helps. To me ##A## is an element of the Heisenberg algebra, and ##A+I## an element of the Heisenberg group, but I'm not sure how to formalize that.
 
  • Wow
Likes PeroK
  • #4
fresh_42 said:
Transform ##A## into the Jordan normal form. Also, ##\exp(A) = I + A + \tfrac{1}{2}A^2## is regular, but I'm not sure whether this helps. To me ##A## is an element of the Heisenberg algebra, and ##A+I## an element of the Heisenberg group, but I'm not sure how to formalize that.
It's a bit simpler than that!
 
  • #5
Hall said:
I'm really unable to get started, because I cannot think of any matrix (other than ##O##) whose cube is the zero matrix.
Try ##
A = \begin{bmatrix}
0 &1 &0 \\
0& 0 &1 \\
0 &0 &0
\end{bmatrix}
##
It shifts the coordinates of a vector: ##A(v_1, v_2, v_3)^T = (v_2, v_3, 0)^T##. Applying it three times to any vector therefore gives the zero vector.
And if ##B## is any invertible matrix, it is easy to show that ##C = BAB^{-1}## also satisfies ##C^3=O##.
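A quick numerical sanity check of both claims (a sketch, assuming NumPy; the particular ##B## below is just one convenient invertible choice):

```python
# Verify A^3 = 0 for the shift matrix, and that conjugates stay nilpotent.
import numpy as np

A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

assert not np.any(np.linalg.matrix_power(A, 3))  # A^3 is the zero matrix

# (B A B^-1)^3 = B A^3 B^-1 = 0 for any invertible B.
rng = np.random.default_rng(0)
B = rng.random((3, 3)) + 3 * np.eye(3)  # strictly diagonally dominant, hence invertible
C = B @ A @ np.linalg.inv(B)
assert np.allclose(np.linalg.matrix_power(C, 3), 0)
```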
 
Last edited by a moderator:
  • Like
Likes Hall
  • #6
In terms of 2x2 matrices, we have:
$$\begin{bmatrix} 1 & b \\ -\frac{1}{b} & -1 \end{bmatrix}^2 = 0$$
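For what it's worth, a symbolic check (a sketch, assuming SymPy) confirms that this squares to zero for every nonzero ##b##:

```python
# Check symbolically that the 2x2 matrix above squares to the zero matrix.
import sympy as sp

b = sp.symbols('b', nonzero=True)
M = sp.Matrix([[1, b],
               [-1/b, -1]])
print((M**2).applyfunc(sp.simplify))  # Matrix([[0, 0], [0, 0]])
```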
 
  • #8
PeroK said:
The first thing I thought of was to factorise ##A^3 - I##.
As those two matrices commute, we can have
$$
A^3 - I^3= (A+I)(A^2 - A + I)$$
 
  • #9
Hall said:
As those two matrices commute, we can have
$$
A^3 - I^3= (A+I)(A^2 - A + I)$$
That's not quite right.

Once you've fixed it, what do you know about ##A^3##?
 
  • Like
Likes Hall
  • #10
PeroK said:
In terms of 2x2 matrices, we have:
$$\begin{bmatrix} 1 & b \\ -\frac{1}{b} & -1 \end{bmatrix}^2 = 0$$
There are actually no 2x2 matrices where ##A^3 = 0## but ##A^2 \ne 0##: a nilpotent ##2\times 2## matrix has characteristic polynomial ##\lambda^2##, so ##A^2 = 0## by the Cayley-Hamilton theorem.
 
  • Like
Likes FactChecker
  • #11
Hall said:
As those two matrices commute, we can have
$$
A^3 - I^3= (A+I)(A^2 - A + I)$$
##A^3-I^3=(A-I)(A^2+A+I)##
 
  • Like
Likes Hall and FactChecker
  • #12
PeroK said:
It's a bit simpler than that!
My bad, I confused the forum. Sorry, @Hall!
 
  • #13
PeroK said:
That's not quite right.

Once you've fixed it, what do you know about ##A^3##?
$$A^3 - I^3 = (A-I)(A^2 + A + I)$$
Since ##A^3 = 0##, this gives ##-I = (A-I)(A^2 + A + I)##, i.e.
$$(A-I)(-A^2 - A - I) = I$$
Hence ##(A-I)## is invertible, because it has an inverse, ##(-A^2 - A - I)##.
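A concrete check of this with the nilpotent matrix from post #5 (a sketch, assuming NumPy):

```python
# Verify that -A^2 - A - I is a two-sided inverse of A - I when A^3 = 0.
import numpy as np

A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])  # A^3 = 0
I = np.eye(3)
B = -(A @ A) - A - I  # candidate inverse of A - I

print((A - I) @ B)  # identity matrix
print(B @ (A - I))  # identity matrix: polynomials in A commute
```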
 
  • Like
Likes FactChecker and PeroK
  • #14
Hall said:
$$A^3 - I^3 = (A-I)(A^2 + A + I)$$
Since ##A^3 = 0##, this gives ##-I = (A-I)(A^2 + A + I)##, i.e.
$$(A-I)(-A^2 - A - I) = I$$
Hence ##(A-I)## is invertible, because it has an inverse, ##(-A^2 - A - I)##.
This is certainly a good start, and it would be sufficient in a multiplicative group.

But we do not have commutativity or automatic inverses, so strictly speaking you have to show that the left and right inverses exist and are the same. Also, don't we need ##A+I## rather than ##A-I##? And why couldn't the product of two singular matrices be a regular matrix, ##I## in this case?

I think you can assume a vector ##v## in the kernel of ##A+I## and show that it has to be the zero vector by this equation. Hint: Use ##(-A^2-A-I)(A-I) = I## instead and at the end apply ##A## once more.
 
  • #15
fresh_42 said:
This is certainly a good start, and it would be sufficient in a multiplicative group.

But we do not have commutativity or automatic inverses, so strictly speaking you have to show that the left and right inverses exist and are the same. Also, don't we need ##A+I## rather than ##A-I##? And why couldn't the product of two singular matrices be a regular matrix, ##I## in this case?

I think you can assume a vector ##v## in the kernel of ##A+I## and show that it has to be the zero vector by this equation.
What the OP has posted is sufficient, given that he already proved in another thread that ##AB = I## is sufficient for matrices ##A## and ##B## to be inverses of each other.

In any case, we could just change the order of the factors:
$$A^3 - I^3 = (A-I)(A^2 + A + I) = (A^2 + A + I)(A-I)$$
Hence $$(A-I)B = B(A-I) = I$$ with ##B = -A^2 - A - I##.
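Written out, the check uses only ##A^3 = 0## and the fact that powers of ##A## commute with one another:
$$(A-I)(-A^2 - A - I) = -A^3 - A^2 - A + A^2 + A + I = I - A^3 = I$$
Expanding ##(-A^2 - A - I)(A-I)## the same way again gives ##I - A^3 = I##.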
 
  • Like
Likes Hall
  • #16
fresh_42 said:
This is certainly a good start, and it would be sufficient in a multiplicative group.

But we do not have commutativity or automatic inverses, so strictly speaking you have to show that the left and right inverses exist and are the same. Also, don't we need ##A+I## rather than ##A-I##? And why couldn't the product of two singular matrices be a regular matrix, ##I## in this case?

I think you can assume a vector ##v## in the kernel of ##A+I## and show that it has to be the zero vector by this equation.
Can we make this argument: if a matrix has a left inverse, then it also has a right inverse, and that right inverse is the same as the left one?
$$(A-I)(-A^2 - A - I) = I$$
The left inverse of ##(-A^2 - A - I)## is ##(A-I)##, so we have:
$$(-A^2 - A - I)(A-I) = I$$
And therefore, the left inverse of ##A-I## exists.
 
  • #17
Hall said:
Can we make this argument: if a matrix has a left inverse, then it also has a right inverse, and that right inverse is the same as the left one?
$$(A-I)(-A^2 - A - I) = I$$
The left inverse of ##(-A^2 - A - I)## is ##(A-I)##, so we have:
$$(-A^2 - A - I)(A-I) = I$$
And therefore, the left inverse of ##A-I## exists.
See:

https://www.physicsforums.com/threads/prove-b-is-invertible-if-ab-i.1012582/
 
  • #18
Hall said:
Can we make this argument: if a matrix has a left inverse, then it also has a right inverse, and that right inverse is the same as the left one?
$$(A-I)(-A^2 - A - I) = I$$
The left inverse of ##(-A^2 - A - I)## is ##(A-I)##, so we have:
$$(-A^2 - A - I)(A-I) = I$$
And therefore, the left inverse of ##A-I## exists.
Yes, in this case, where the inverse is a polynomial in ##A##. Honestly? I do not have the group theory proof in mind to say which group axioms besides associativity are actually necessary to prove that a left inverse equals a right inverse. It is not automatically the case.

And there is still the problem with ##A+I## instead of ##A-I##. I guess we may assume characteristic zero.
 
  • #19
fresh_42 said:
And there is still the problem with ##A+I## instead of ##A-I.##
Why do we need to do this one as well? I don't see it in the OP. In any case:

If ##A^3 = 0##, then ##(-A)^3 = 0## and ##-A - I = -(A + I)## is invertible; hence, ##A + I## is invertible. QED
 
  • Wow
Likes fresh_42
  • #20
PeroK said:
Why do we need to do this one as well? I don't see it in the OP. In any case:

If ##A^3 = 0##, then ##(-A)^3 = 0## and ##-A - I = -(A + I)## is invertible; hence, ##A + I## is invertible. QED
Where have you shown the invertibility of ##A+I## or ##-A-I##?

\begin{align*}
v\in \operatorname{ker}(A+I) &\Longrightarrow Av=-v\\
&\Longrightarrow (-A^2-A-I)(A-I)v=(-A^2-A-I)(Av-Iv)\\
&\ldots
\end{align*}
 
  • #21
fresh_42 said:
Where have you shown the invertibility of ##A+I##?
It's a corollary of the invertibility of ##A - I##, given that ##A^3 = 0##.
 
  • #22
fresh_42 said:
Where have you shown the invertibility of ##A+I##?
Post #8.

$$A^3 + I^3 = (A+I)(A^2 - A + I)$$
Since ##A^3 = 0##:
$$I = (A+I)(A^2 - A + I)$$
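The same numerical sanity check as before works here (a sketch, assuming NumPy, reusing the shift matrix from post #5):

```python
# Verify that A^2 - A + I is the inverse of A + I when A^3 = 0.
import numpy as np

A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])  # A^3 = 0
I = np.eye(3)

print((A + I) @ (A @ A - A + I))  # identity: (A+I)(A^2-A+I) = A^3 + I = I
```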
 
  • Like
Likes PeroK and fresh_42
  • #23
Hall said:
Post #8.
There was a mistake in your formula in post #8. Look at the ##I\cdot I## term!
PeroK said:
It's a corollary of the invertibility of ##A - I##, given that ##A^3 = 0##.
And where have you or some magician shown it?

Edit: Now it is in post #22. But nowhere before.
 
  • #24
PeroK said:
Why do we need to do this one as well? I don't see it in the OP.
I had ##A+Z## in mind. Bad hair day ...
 
  • #25
This whole thread seems stupidly complicated.

Suppose ##A-Z## is singular. Then there exists ##v\neq 0## such that ##(A-Z)v=0##, i.e. ##Av=v##.

Apply ##A## twice more and be done with it. No factorization required.
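Spelled out, the two further applications give
$$v = Av = A^2v = A^3v = 0,$$
contradicting ##v \neq 0##; hence ##A - Z## is nonsingular.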
 
  • Sad
  • Like
Likes PeroK and fresh_42
  • #26
Office_Shredder said:
This whole thread seems stupidly complicated.
Sorry to be so stupid, but I'm just trying to help the OP, and factorisation was the first thing I thought of. I can't see that it's particularly more complicated than considering an eigenvector.

Most of the muddle stemmed from the thread being derailed somewhat.
 
  • Like
Likes Hall
  • #27
PeroK said:
Sorry to be so stupid, but I'm just trying to help the OP, and factorisation was the first thing I thought of. I can't see that it's particularly more complicated than considering an eigenvector.

Most of the muddle stemmed from the thread being derailed somewhat.
I think the question setter had exactly the factorization method in mind, because this question appears in the chapter before the chapters on Determinants and Eigenvectors.
 
  • Like
Likes PeroK
  • #28
PeroK said:
Most of the muddle stemmed from the thread being derailed somewhat.
And not just somewhat. Given that ##A^3 = 0## it's easy to see that
##-I = A^3 - I = (A - I)(A^2 + A + I) \Rightarrow I = I - A^3 = (A - I)(-A^2 - A - I)##
Further, one can show that ##(-A^2 - A - I)(A - I) = -A^3 + I = I##
Clearly ##A - I## is invertible, which was shown in several other posts in this thread, and was the point that @PeroK was making. Any discussion of Jordan normal form, Heisenberg algebra, or eigenvectors was completely unnecessary.
 
  • Like
Likes PeroK
  • #29
Office_Shredder said:
This whole thread seems stupidly complicated.

Suppose ##A-Z## is singular. Then there exists ##v\neq 0## such that ##(A-Z)v=0##, i.e. ##Av=v##.

Apply ##A## twice more and be done with it. No factorization required.
In a post following the one I quoted, the OP stated that determinants and eigenvalues/eigenvectors hadn't yet been presented (see below). The proof outline shown above is a good one, but with no knowledge yet of eigenvectors, about the only technique remaining was factorization.
Hall said:
this question appears in the chapter before the chapters on Determinants and Eigenvectors.

The thread seems to have run its course, so I'm closing it now.
 

1. What does the statement "If ##A^3=0## then ##A-Z## is nonsingular" mean?

It means that if the cube of a square matrix ##A## is the zero matrix, then the matrix ##A-Z##, where ##Z## is the identity matrix of the same size, is nonsingular (invertible).

2. What is a nonsingular matrix?

A nonsingular matrix, also known as an invertible or non-degenerate matrix, is a square matrix that has an inverse. This means there is a matrix that can be multiplied by it, on either side, to produce the identity matrix.

3. How can you prove that ##A-Z## is nonsingular if ##A^3=0##?

To prove that ##A-Z## is nonsingular when ##A^3=0##, factor the difference of cubes: ##A^3 - Z^3 = (A-Z)(A^2 + A + Z)##, which is valid here because ##Z## commutes with ##A##. Since ##A^3 = 0## and ##Z^3 = Z##, this gives ##(A-Z)(-A^2 - A - Z) = Z##, and reversing the order of the factors gives ##(-A^2 - A - Z)(A-Z) = Z## as well. So ##-A^2 - A - Z## is a two-sided inverse of ##A-Z##, which means ##A-Z## is nonsingular.
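As a sanity check, here is a minimal sketch (assuming NumPy) comparing the explicit inverse from the factorization against a general-purpose inverse:

```python
# The factorization gives (A - Z)^{-1} = -(A^2 + A + Z) whenever A^3 = 0.
import numpy as np

A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])  # any matrix with A^3 = 0 works
Z = np.eye(3)

explicit = -(A @ A + A + Z)
print(np.allclose((A - Z) @ explicit, Z))           # True
print(np.allclose(explicit, np.linalg.inv(A - Z)))  # True
```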

4. Can this statement be applied to all matrices?

No, this statement only applies to square matrices ##A## satisfying ##A^3 = 0##. A square matrix is a matrix with the same number of rows and columns; in this case, ##A## and ##Z## are both square matrices of the same size.

5. What are some practical applications of this statement?

Nilpotent matrices like ##A## appear in engineering, physics, and computer science, for example as shift operators and as the building blocks of the Jordan normal form. The factorization used in the proof is also practical: it gives the explicit inverse ##(A-Z)^{-1} = -(A^2 + A + Z)## without any elimination, which is useful when solving linear systems that involve such matrices.
