(Linear Algebra) Is the set of all 2x2 invertible matrices closed under addition?

Sanglee

Homework Statement



Suppose V is a vector space.
Is the set of all 2x2 invertible matrices closed under addition? If so, please prove it. If not, please provide a counter-example.

Homework Equations





The Attempt at a Solution



Well, I know what it means to be closed under addition. When V is closed under addition, if I suppose vectors u and w are in V, their sum u + w is also in V, right?

The answer to the question is no.
A counter-example my professor provided is I + (-I) = 0.
I and -I are invertible, but their sum 0 is not invertible, and I know why it's not invertible.
But I can't figure out why this shows the set is not closed under addition.
If the sum is not invertible, does that mean the sum is not in V?
 
Sanglee said:
Well, I know what it means to be closed under addition. When V is closed under addition, if I suppose vectors u and w are in V, their sum u + w is also in V, right?

Yes.

Sanglee said:
If the sum is not invertible, does that mean the sum is not in V?

Yes. V consists of only invertible matrices, so 0 is not an element of V. So you have u = I and w = -I, which are both in V, but their sum u + w = 0 is not in V. Therefore V is not closed under addition.
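To make the counter-example completely concrete, here it is written out with the actual 2x2 matrices, using the determinant to show that the sum is not invertible (this is just the professor's example spelled out):

$$
I + (-I) =
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}
+
\begin{pmatrix} -1 & 0 \\ 0 & -1 \end{pmatrix}
=
\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix},
\qquad
\det\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} = 0,
$$

so the zero matrix has no inverse, and the sum of two elements of V lands outside V.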
 
So clear, easy to understand. Thanks!
 