Proving statements about matrices | Linear Algebra

  • Thread starter JD_PM
  • #1
JD_PM
Homework Statement:
Prove or give a counterexample for each of the following statements
Relevant Equations:
N/A
Hi guys! :)

I was solving some linear algebra true/false questions (i.e. prove the statement or provide a counterexample) and got stuck on the following

a) There is no ##A \in \Bbb R^{3 \times 3}## such that ##A^2 = -\Bbb I_3## (typo corrected)

I think this one is true, as the square of a matrix cannot yield a negative result. Is that enough justification?

b) If the matrix elements of a matrix ##A## are all integers and ##\det(A) = \pm 1##, then the matrix elements of ##A^{-1}## are also integers.

This seems to be true: I tried to find a counterexample with several ##2 \times 2## matrices with ##\det(A) = \pm 1##, and all their inverses contained only integer entries. However, I do not really see how to actually prove the statement. Could you please provide a hint?

c) If the matrix elements of a square matrix ##A## are all ##0## or ##1##, then ##\det A = 1, 0## or ##-1##.

As above, I tried to find a counterexample but found none, so I suspect the statement is true. But as with b), I do not see how to actually prove it. Could you please provide a hint for this one as well?


Thanks!

:biggrin:
 

Answers and Replies

  • #2
Gaussian97
Homework Helper
Hi JD_PM,
a) There is no ##A \in \Bbb R^{3 \times 3}## such that ##A^2 = \Bbb I_3##

I think this one is true, as the square of a matrix cannot yield a negative result. Is that enough justification?
No, this is not a valid mathematical proof. If you think this is enough, then you shouldn't have any problem putting it into a valid mathematical form.
b) If the matrix elements of a matrix ##A## are all integers and ##\det(A) = \pm 1##, then the matrix elements of ##A^{-1}## are also integers.

This seems to be true: I tried to find a counterexample with several ##2 \times 2## matrices with ##\det(A) = \pm 1##, and all their inverses contained only integer entries. However, I do not really see how to actually prove the statement. Could you please provide a hint?

c) If the matrix elements of a square matrix ##A## are all ##0## or ##1##, then ##\det A = 1, 0## or ##-1##.

As above, I tried to find a counterexample but found none, so I suspect the statement is true. But as with b), I do not see how to actually prove it. Could you please provide a hint for this one as well?
What have you tried so far? Or have you just tried a few examples and seen that they have those properties?
As ideas: when working with square matrices you can try induction over the dimension, or try to write a closed formula for the inverse/determinant in terms of the elements of the initial matrix.
Also, if you don't know how to prove something, it is a good idea to keep trying examples and see if you can spot any pattern.
 
  • #3
Mark44
a) There is no ##A \in \Bbb R^{3 \times 3}## such that ##A^2 = \Bbb I_3##

I think this one is true, as the square of a matrix cannot yield a negative result. Is that enough justification?
As written, the statement above is false. I can think of a very easy counterexample. Did you mean to write ##A^2 = -\Bbb I_3##?
 
  • #4
JD_PM
Hi @Gaussian97 !

Oops, I made a typo (as pointed out by @Mark44); a) should read ##A^2 = -\Bbb I_3##

No, this is not a valid mathematical proof. If you think this is enough, then you shouldn't have any problem putting it into a valid mathematical form.

But given that we have a square, ##A^2##, I thought it was trivial to see that it cannot yield a negative result, so no proof was required. Are you suggesting that I am wrong and that we should actually prove it? If yes, how? It looks trivial to me, so I might be missing the point.

What have you tried so far? Or have you just tried a few examples and seen that they have those properties?

Indeed, I have only tried to spot a counterexample. However, all of them satisfy the stated property, so I suspect it must be true. Some examples:

b)

$$
A=\begin{pmatrix}
2 & 1 \\
5 & 3 \\
\end{pmatrix}, \qquad A^{-1}=\begin{pmatrix}
3 & -1 \\
-5 & 2 \\
\end{pmatrix}
$$

Another example:

$$
B=\begin{pmatrix}
13 & 4 \\
3 & 1 \\
\end{pmatrix}, \qquad B^{-1}=\begin{pmatrix}
1 & -4 \\
-3 & 13 \\
\end{pmatrix}
$$

So let's try to prove it. ##A## is not given to be a square matrix, but let's assume we only deal with square matrices.

$$
A=\begin{pmatrix}
a_{11} & a_{12} & \dots & a_{1n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \dots & a_{nn}
\end{pmatrix}, \qquad \det A=\pm 1
$$

The next step, I'd say, is to compute the inverse of the generic ##A##, but I guess there has to be a way of exploiting the fact that ##\det A=\pm 1## to simplify the labour. However, I do not see how to do so :(

For c) I also tried many examples. Some of them are

$$
C=\begin{pmatrix}
1 & 1 \\
1 & 1 \\
\end{pmatrix}, \qquad \det C=0
$$

$$
D=\begin{pmatrix}
1 & 1 \\
1 & 0 \\
\end{pmatrix}, \qquad \det D=-1
$$

Hence I suspect c) is true as well. I do not see any pattern in the examples though ...

Also, I am not used to proving statements that involve matrix elements, so I am not sure which methods are best. Should I try induction?
 
  • #5
Mark44
But given that we have a square, ##A^2##, I thought it was trivial to see that it cannot yield a negative result, so no proof was required. Are you suggesting that I am wrong and that we should actually prove it? If yes, how? It looks trivial to me, so I might be missing the point.
No, a proof is required. If the problem seems trivial to you, then a proof should be easy. I would do a proof by contradiction; i.e., assume that there is some 3 x 3 matrix A such that ##A^2 = -\mathbb I_3##, and show that this assumption leads to a contradiction.
 
  • #6
Infrared
Science Advisor
Gold Member
Note that if ##A=\begin{pmatrix}0 & -1\\ 1 & 0\end{pmatrix}## then ##A^2=-I_2,## so it doesn't seem to me that your intuition of negative scalar matrices having no square roots is accurate. Can there be a ##3\times 3## example?
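For anyone who wants to verify this numerically, here is a minimal sketch with numpy:

Python:
import numpy as np

A = np.array([[0, -1],
              [1,  0]])
print(A @ A)   # [[-1  0], [ 0 -1]], i.e. -I_2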
 
  • #7
JD_PM
Note that if ##A=\begin{pmatrix}0 & -1\\ 1 & 0\end{pmatrix}## then ##A^2=-I_2,## so it doesn't seem to me that your intuition of negative scalar matrices having no square roots is accurate. Can there be a ##3\times 3## example?

@Infrared thanks for the comment, but note that the question deals with ##3 \times 3## matrices. I could not find any ##3 \times 3## matrix satisfying ##A^2 = -\Bbb I_3##.
 
  • #8
Gaussian97
Homework Helper
Oops, I made a typo (as pointed out by @Mark44); a) should read ##A^2 = -\Bbb I_3##
Yes, I also thought that. Otherwise, the identity matrix would be a trivial counterexample.
But given that we have a square, ##A^2##, I thought it was trivial to see that it cannot yield a negative result, so no proof was required. Are you suggesting that I am wrong and that we should actually prove it? If yes, how? It looks trivial to me, so I might be missing the point.
You cannot say something is trivial if you don't know how to prove it. Is this property a defining property of matrices? I don't think so, so you must do some reasoning to arrive at that conclusion.

Indeed, I have only tried to spot a counterexample. However, all of them satisfy the stated property, so I suspect it must be true. Some examples:

b)

$$
A=\begin{pmatrix}
2 & 1 \\
5 & 3 \\
\end{pmatrix}, \qquad A^{-1}=\begin{pmatrix}
3 & -1 \\
-5 & 2 \\
\end{pmatrix}
$$

Another example:

$$
B=\begin{pmatrix}
13 & 4 \\
3 & 1 \\
\end{pmatrix}, \qquad B^{-1}=\begin{pmatrix}
1 & -4 \\
-3 & 13 \\
\end{pmatrix}
$$

So let's try to prove it. ##A## is not given to be a square matrix, but let's assume we only deal with square matrices.

$$
A=\begin{pmatrix}
a_{11} & a_{12} & \dots & a_{1n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \dots & a_{nn}
\end{pmatrix}, \qquad \det A=\pm 1
$$

The next step, I'd say, is to compute the inverse of the generic ##A##, but I guess there has to be a way of exploiting the fact that ##\det A=\pm 1## to simplify the labour. However, I do not see how to do so :(
The inverse of a matrix is only defined for square matrices, so it is safe to assume that.
How have you computed the inverse of ##A## and ##B##?
For c) I also tried many examples. Some of them are

$$
C=\begin{pmatrix}
1 & 1 \\
1 & 1 \\
\end{pmatrix}, \qquad \det C=0
$$

$$
D=\begin{pmatrix}
1 & 1 \\
1 & 0 \\
\end{pmatrix}, \qquad \det D=-1
$$

Hence I suspect c) is true as well. I do not see any pattern in the examples though ...

Also, I am not used to proving statements that involve matrix elements, so I am not sure which methods are best. Should I try induction?
I would say that you start by proving the statement for ##2 \times 2## matrices (there is an almost trivial proof) and then try to see whether the same proof allows you to generalize to higher dimensions, or maybe helps you find some counterexample.
 
  • #9
Infrared
Science Advisor
Gold Member
@Infrared thanks for the comment, but note that the question deals with ##3 \times 3## matrices. I could not find any ##3 \times 3## matrix satisfying ##A^2 = -\Bbb I_3##.

I understand that, and the reason why you can't find any is that there are none. I was just trying to show why your argument that "I think this one is true, as the square of a matrix cannot yield a negative result" was not sufficient (since this can happen for ##2\times2## matrices).

Hint: use determinants.
 
  • #10
JD_PM
Thanks @Infrared, I see your point now :)

I was actually trying to prove a) by contradiction via determinants.

Assuming ##A^2 = -\Bbb I_3## holds, we take the determinant of both sides to get

$$ \det(A^2) = (\det A)^2 = \det(-\Bbb I_3) = (-1)^3 = -1 $$

Oh, so ##\det A## would have to satisfy ##(\det A)^2 = -1##, which has no solution over the real numbers; this is a contradiction, as the determinant of a real square matrix is a real number.

Do you agree? :)
 
  • #11
JD_PM
How have you computed the inverse of ##A## and ##B##?
Via the standard Gaussian elimination algorithm:

1) Define the block matrix ##M := (A \mid \Bbb I)##.

2) Row-reduce ##M## to echelon form. If the left block ends up triangular with ##n## nonzero pivots (no zero rows), then ##A## is invertible; otherwise it is not.

3) Row-reduce ##M## further to ##(\Bbb I \mid B)##; then ##B = A^{-1}##.
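As a minimal sketch of this procedure (using sympy for exact arithmetic; `rref` is its row-reduction routine, and the example matrix is the ##A## from #4):

Python:
from sympy import Matrix, eye

A = Matrix([[2, 1], [5, 3]])   # integer entries, det(A) = 1
M = A.row_join(eye(2))         # step 1: the block matrix (A | I)
R, pivots = M.rref()           # steps 2-3: row-reduce all the way to (I | B)
print(R[:, 2:])                # B = A^{-1} = Matrix([[3, -1], [-5, 2]])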

I would say that you start by proving the statement for ##2 \times 2## matrices (there is an almost trivial proof) and then try to see whether the same proof allows you to generalize to higher dimensions, or maybe helps you find some counterexample.

Alright, I'll try to prove it for ##2 \times 2## matrices first.
 
  • #12
JD_PM
Alright, I'll try to prove it for ##2 \times 2## matrices first.

We see that

\begin{equation*}
A=\begin{pmatrix}
a_{11} & a_{12} \\
a_{21} & a_{22}
\end{pmatrix}, \qquad \det A = a_{11}a_{22} - a_{12}a_{21} = 1, 0, -1
\end{equation*}

where the determinant can indeed only take the stated values: each entry of ##A## is ##0## or ##1##, so both products ##a_{11}a_{22}## and ##a_{12}a_{21}## lie in ##\{0, 1\}##, and their difference lies in ##\{-1, 0, 1\}##.
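As a sanity check, a short exhaustive sketch in Python over all sixteen ##2 \times 2## zero-one matrices:

Python:
from itertools import product

# determinants of all 2x2 matrices with entries in {0, 1}
dets = {a*d - b*c for a, b, c, d in product((0, 1), repeat=4)}
print(dets)   # {0, 1, -1}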

@Gaussian97, is this what you had in mind?
 
  • #13
Gaussian97
Homework Helper
Via the standard Gaussian elimination algorithm:

1) Define the block matrix ##M := (A \mid \Bbb I)##.

2) Row-reduce ##M## to echelon form. If the left block ends up triangular with ##n## nonzero pivots (no zero rows), then ##A## is invertible; otherwise it is not.

3) Row-reduce ##M## further to ##(\Bbb I \mid B)##; then ##B = A^{-1}##.
Okay, that's fine, but since step 2 is very "matrix dependent", it is not easy to get a general formula for the inverse this way. Do you know any other method to invert matrices?

We see that

\begin{equation*}
A=\begin{pmatrix}
a_{11} & a_{12} \\
a_{21} & a_{22}
\end{pmatrix}, \qquad \det A = a_{11}a_{22} - a_{12}a_{21} = 1, 0, -1
\end{equation*}

where the determinant can indeed only take the stated values: each entry of ##A## is ##0## or ##1##, so both products ##a_{11}a_{22}## and ##a_{12}a_{21}## lie in ##\{0, 1\}##, and their difference lies in ##\{-1, 0, 1\}##.

@Gaussian97, is this what you had in mind?
Yes, that is an easy proof for the ##2 \times 2## case; now what can you say about the ##3 \times 3## case?
 
  • #14
JD_PM
Okay, that's fine, but since step 2 is very "matrix dependent", it is not easy to get a general formula for the inverse this way. Do you know any other method to invert matrices?

Another way is to use the famous formula

\begin{equation*}
A^{-1} = \frac{1}{\det(A)} \operatorname{adj}(A)
\end{equation*}

where the adjugate ##\operatorname{adj}(A)## is defined to be the transpose of the cofactor matrix.

Yes, that is an easy proof for the ##2 \times 2## case; now what can you say about the ##3 \times 3## case?

Ahhh then the statement breaks down! :biggrin: A counterexample:

\begin{equation*}
B=\begin{pmatrix}
1 & 0 & 1 \\
1 & 1 & 0 \\
0 & 1 &1
\end{pmatrix}, \qquad \det B = 2 \neq 1, 0, -1
\end{equation*}
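Checking this counterexample numerically (a quick sketch with numpy; the result is rounded because `numpy.linalg.det` works in floating point):

Python:
import numpy as np

B = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]])
print(round(np.linalg.det(B)))   # 2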

By the way, do you think what I did at #10 is OK?
 
  • #15
JD_PM
b) If the matrix elements of a matrix ##A## are all integers and ##\det(A) = \pm 1##, then the matrix elements of ##A^{-1}## are also integers.

This seems to be true: I tried to find a counterexample with several ##2 \times 2## matrices with ##\det(A) = \pm 1##, and all their inverses contained only integer entries. However, I do not really see how to actually prove the statement. Could you please provide a hint?

It turns out there is a broader version of b):

Given

\begin{equation*}
\Bbb Z^{n \times n} = \{ A \in \Bbb R^{n \times n} | (A)_{ij} \in \Bbb Z \ \text{for} \ i \in \{ 1, ..., n\} \ \text{and} \ j \in \{ 1, ..., n\} \}
\end{equation*}

For any ##A \in \Bbb Z^{n \times n}## we have

$$\det(A) = \pm 1 \iff \ \text{A is invertible and} \ A^{-1} \in \Bbb Z^{n \times n}$$
 
  • #16
Gaussian97
Homework Helper
Another way is to use the famous formula

\begin{equation*}
A^{-1} = \frac{1}{\det(A)} \operatorname{adj}(A)
\end{equation*}

where the adjugate ##\operatorname{adj}(A)## is defined to be the transpose of the cofactor matrix.
Ok, now if ##A## only has integers, what can you say about ##\operatorname{adj}(A)##?
 
  • #17
JD_PM
Ok, now if ##A## only has integers, what can you say about ##\operatorname{adj}(A)##?
I see what you are driving at now! :)

Given that the cofactor matrix is made of the cofactors ##C_{ij}##, which are related to the minors ##M_{ij}## by

$$C_{ij} = (-1)^{i+j} M_{ij}$$

The minor ##M_{ij}## is the determinant of the submatrix obtained from ##A## by deleting row ##i## and column ##j##. Since ##A## contains only integers, each minor is an integer (a determinant is a sum of products of matrix entries), so each cofactor is an integer too. It follows that ##\operatorname{adj}(A)## contains only integers and, via the famous formula with ##\det(A) = \pm 1##, so does ##A^{-1}##.

Hence b) is true.
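A quick sanity check of this argument, as a sketch with sympy (whose `adjugate` method returns the transpose of the cofactor matrix):

Python:
from sympy import Matrix

A = Matrix([[2, 1], [5, 3]])               # integer entries, det(A) = 1
print(A.adjugate())                        # Matrix([[3, -1], [-5, 2]]): integers only
print(A.inv() == A.adjugate() / A.det())   # True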
 
  • #18
JD_PM
Thanks @Infrared, I see your point now :)

I was actually trying to prove a) by contradiction via determinants.

Assuming ##A^2 = -\Bbb I_3## holds, we take the determinant of both sides to get

$$ \det(A^2) = (\det A)^2 = \det(-\Bbb I_3) = (-1)^3 = -1 $$

Oh, so ##\det A## would have to satisfy ##(\det A)^2 = -1##, which has no solution over the real numbers; this is a contradiction, as the determinant of a real square matrix is a real number.

Do you agree? :)

Hi @Mark44, @Infrared

Is this an acceptable proof (via contradiction) of a)? If not I will think further.

Thanks.
 
  • #19
JD_PM
Well, I think #10 is right.

For ##n \times n## matrices with even ##n## the statement is actually true, as

$$ \det(A^2) = (\det A)^2 = (-1)^n \det \Bbb I_n \Rightarrow \det A = \pm 1 $$

where I used the determinant property ##\det(kA) = k^n \det A## with ##k = -1##. The statement indeed breaks down for odd ##n## :smile:
 
  • #20
WWGD
Science Advisor
Gold Member
Hi @Mark44, @Infrared

Is this an acceptable proof (via contradiction) of a)? If not I will think further.

Thanks.
Yes, but remember that you're assuming your entries are real numbers; clearly this would not be the case if you allowed complex entries.
 
  • #21
JD_PM
Thanks @WWGD

Given

\begin{equation*}
\Bbb Z^{n \times n} = \{ A \in \Bbb R^{n \times n} | (A)_{ij} \in \Bbb Z \ \text{for} \ i \in \{ 1, ..., n\} \ \text{and} \ j \in \{ 1, ..., n\} \}
\end{equation*}

For any ##A \in \Bbb Z^{n \times n}## we have

$$\det(A) = \pm 1 \iff \ \text{A is invertible and} \ A^{-1} \in \Bbb Z^{n \times n}$$

I have been trying to prove the statement from right to left as follows

Given that the inverse of ##A## exists, we can make use of the famous formula

\begin{equation*}
A^{-1} = \frac{1}{\det(A)} \operatorname{adj}(A)
\end{equation*}

The adjugate of ##A## contains only integers if ##A## is made of integers exclusively (which is the case here, by assumption), as explained in #17. ##A^{-1}## contains only integers (by assumption), so the only way the famous formula can yield integer entries on both sides is when ##\det(A) = \pm 1##.

I think I got the idea right, but I am not sure if the above can be called a "proof".
 
  • #22
Mark44
Well, I think #10 is right.

For ##n \times n## matrices with even ##n## the statement is actually true, as

$$ \det(A^2) = (\det A)^2 = (-1)^n \det \Bbb I_n \Rightarrow \det A = \pm 1 $$

where I used the determinant property ##\det(kA) = k^n \det A## with ##k = -1##. The statement indeed breaks down for odd ##n## :smile:
I don't understand what you're trying to say in the equation above, since you haven't included the given information.

##A## is an ##n \times n## matrix, with ##n## an even integer, right? Do we still have ##A^2 = -\mathbb I_n##?
 
  • #23
JD_PM
##A## is an ##n \times n## matrix, with ##n## an even integer, right? Do we still have ##A^2 = -\mathbb I_n##?

Yes. In that case one can find a matrix ##A## satisfying ##A^2 = -\mathbb I_n##, as @Infrared showed in #6 for ##n = 2##.
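For general even ##n##, repeating that ##2 \times 2## block along the diagonal gives an explicit example, since each block squares to ##-\Bbb I_2##:

$$
A = \begin{pmatrix} J & & \\ & \ddots & \\ & & J \end{pmatrix}, \qquad J = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad A^2 = -\Bbb I_n
$$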
 
  • #24
Gaussian97
Homework Helper
Thanks @WWGD



I have been trying to prove the statement from right to left as follows

Given that the inverse of ##A## exists, we can make use of the famous formula

\begin{equation*}
A^{-1} = \frac{1}{\det(A)} \operatorname{adj}(A)
\end{equation*}

The adjugate of ##A## contains only integers if ##A## is made of integers exclusively (which is the case here, by assumption), as explained in #17. ##A^{-1}## contains only integers (by assumption), so the only way the famous formula can yield integer entries on both sides is when ##\det(A) = \pm 1##.

I think I got the idea right, but I am not sure if the above can be called a "proof".
This is wrong: the adjugate of ##\begin{pmatrix}2&0&0\\0&\frac{1}{2}&0\\0&0&0\end{pmatrix}## contains only integers.
 
  • #25
JD_PM
This is wrong: the adjugate of ##\begin{pmatrix}2&0&0\\0&\frac{1}{2}&0\\0&0&0\end{pmatrix}## contains only integers.

Indeed, but note that your example matrix is not invertible, so the famous formula does not apply to it :)

The adjugate of ##A## contains only integers if ##A## is made of integers exclusively (which is the case here, by assumption), as explained in #17.

I think the above statement holds for invertible matrices. At least I could not find a counterexample. Please let me know if you find one.

But what I wrote in #21 does not correctly justify why ##\det(A) = \pm 1##. So let me tackle the right-to-left direction of the theorem in #15 once again.

We notice that ##A^{-1} \in \Bbb Z^{n \times n}## (by assumption) and ##\operatorname{adj}(A) \in \Bbb Z^{n \times n}## (which follows from the given ##A \in \Bbb Z^{n \times n}##, as justified in #17). My mistake was to think that this alone was sufficient to conclude that ##\det(A) = \pm 1##. That is wrong: an integer matrix can equal ##\frac{1}{k}## times an integer matrix with ##k \neq \pm 1##, for example

\begin{equation*}
\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix} = \frac 1 3 \begin{pmatrix}3&0&0\\0&3&0\\0&0&3\end{pmatrix}
\end{equation*}

We instead make the following argument: given that ##A A^{-1} = \Bbb I##, we take the determinant of both sides to get ##\det(A) \det(A^{-1}) = 1##. Since ##\det(A) \in \Bbb Z## and ##\det(A^{-1}) \in \Bbb Z## (both matrices have integer entries), the only way ##\det(A) \det(A^{-1}) = 1## can hold is ##\det(A), \det(A^{-1}) \in \{1, -1\}##. In plain words: if ##\det(A)## were any integer other than ##\pm 1##, then ##\det(A^{-1}) = 1/\det(A)## would not be an integer.

I think mathematicians would rather argue the above in terms of units.
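As a small sanity check of the determinant argument, here is a minimal sketch with sympy: an integer matrix whose determinant is ##2## indeed has a non-integer inverse.

Python:
from sympy import Matrix

A = Matrix([[2, 0], [0, 1]])   # integer entries, det(A) = 2
print(A.det())                 # 2
print(A.inv())                 # Matrix([[1/2, 0], [0, 1]]) -- not all integers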
 
  • #26
Gaussian97
Homework Helper
Indeed, but note that your example matrix is not invertible, so the famous formula does not apply to it :)


I think the above statement holds for invertible matrices. At least I could not find a counterexample. Please let me know if you find one.
Being invertible actually has nothing to do with it; indeed, the counterexample could be
$$\begin{pmatrix}2&0&0\\0&\frac{1}{2}&0\\0&0&2\end{pmatrix}$$
But anyway, this is not relevant to the proof.

This last argument with determinants is much simpler, I think.
 
