# Homework: Matrix is invertible when rows are linearly independent

1. Apr 7, 2013

### skrat

Hi there,

I have a homework where I have to do this:
Prove that a square matrix is invertible if the columns of the matrix are linearly independent.
There is also a hint: you may use the following statement: a linear transformation L: U->V is a bijection exactly when, for a vector space basis N of U, its image L(N) is a basis for the vector space V. This holds for any basis of U.

I tried several ways but I just can't get to the end. I don't want anybody here to write the whole proof for me - I will hate you for that. However, I would be really happy if somebody could put me on the right path and give me clues obvious enough to make it to the end.
If necessary I can write down how I tried to solve this, but I seriously doubt it would be any help to you!

EDIT: OK, I changed my mind. I would be happy to know where I got it wrong. My procedure:
Let $A:U\rightarrow V$ be a linear transformation with
$A=\begin{bmatrix} a_{11} & a_{12} &\cdots &a_{1n} \\ a_{21}& a_{22} &\cdots &a_{2n} \\ \vdots && &\vdots\\ a_{n1}&a_{n2} & \cdots &a_{nn} \end{bmatrix}$

if $N=\begin{Bmatrix} u_{1} & \cdots & u_{n} \end{Bmatrix}$ is a basis for the vector space U and
$M=\begin{Bmatrix} A(u_{1}) & \cdots & A(u_{n}) \end{Bmatrix}=\begin{Bmatrix} v_{1} & \cdots & v_{n} \end{Bmatrix}$ is a basis for the vector space V.

Then I decided to take one vector x from the space U:
$x=\alpha _{1}u_{1}+\alpha _{2}u_{2}+\cdots +\alpha _{n}u_{n}$
What linear transformation A does with x is:
$Ax=\begin{bmatrix} a_{11} & a_{12} &\cdots &a_{1n} \\ a_{21}& a_{22} &\cdots &a_{2n} \\ \vdots && &\vdots\\ a_{n1}&a_{n2} & \cdots &a_{nn} \end{bmatrix}\cdot \begin{bmatrix} x_{1} \\ x_{2}\\ \vdots \\ x_{n} \end{bmatrix}=\begin{bmatrix} b_{1} \\ b_{2}\\ \vdots \\ b_{n} \end{bmatrix}$
Here I am not sure what exactly $x_{1}\dots x_{n}$ are (yes, components, but with respect to which basis?), and the same goes for the b's.

Hmmm, is this even close to the right way?

Last edited: Apr 7, 2013
2. Apr 7, 2013

### jambaugh

First, treat the square matrix as an endomorphism, i.e. a map from a space to itself, and use the same basis for input and output.

Second, any basis will do so use the standard basis.

Third, try a 3x3 or 2x2 before trying the proof for arbitrary size.

The key is understanding what each column means in this context.
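One quick way to see what the columns mean is to apply the matrix to the standard basis vectors. A minimal sketch in Python; the 3x3 matrix here is made up purely for illustration:

```python
# A hypothetical 3x3 matrix, chosen only for illustration.
A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]

def matvec(M, x):
    """Multiply the matrix M by the column vector x."""
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

# Applying A to the i-th standard basis vector picks out the i-th column.
print(matvec(A, [1, 0, 0]))  # → [1, 4, 7], the first column of A
print(matvec(A, [0, 1, 0]))  # → [2, 5, 8], the second column of A
```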

3. Apr 7, 2013

### Bacle2

You can also use the fact that exchanging two rows changes only the sign of the determinant, while adding a multiple of one row to another row preserves its value. If the rows are linearly dependent, then adding suitable multiples of the other rows to one row produces a zero row, giving you a matrix with determinant zero. You can prove this by using elementary matrices (which generate all invertible matrices).

In a different way: Gaussian elimination changes the determinant only by a nonzero factor, and you can read the determinant off the final matrix at the end of the elimination. If you start with linearly independent rows, your final matrix will have nonzero determinant, etc.
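These determinant facts are easy to check numerically. A minimal sketch, assuming a hard-coded 3x3 cofactor formula; the matrices are made up for illustration:

```python
def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]

# Adding a multiple of one row to another leaves the determinant unchanged.
B = [A[0], [A[1][j] + 2 * A[0][j] for j in range(3)], A[2]]
print(det3(A), det3(B))  # → -3 -3

# Linearly dependent rows (here row2 = 2 * row1) force a zero determinant.
C = [[1, 2, 3],
     [2, 4, 6],
     [7, 8, 10]]
print(det3(C))  # → 0
```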

4. Apr 7, 2013

### Bacle2

Another way, using rank-nullity: your original space has dimension n -- you have a basis with n elements. By the hint, the images {A(ui); i=1,2,..,n} of the basis vectors form a basis for your target space, so the target space is n-dimensional and the rank of L is n. Since the rank of L is n, rank-nullity gives nullity zero. The map is then injective (trivial kernel) and surjective (rank n in an n-dimensional space), so it is invertible.
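The rank argument can be sketched numerically too. Here `rank` is a hypothetical helper that does plain row reduction over exact rationals, and the matrices are made up for illustration:

```python
from fractions import Fraction

def rank(M):
    """Rank of a matrix via row reduction with exact rational arithmetic."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0  # number of pivots found so far
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot in this column
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [M[i][j] - f * M[r][j] for j in range(cols)]
        r += 1
    return r

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
print(rank(A))             # → 3, so nullity = 3 - 3 = 0: the kernel is trivial
print(rank([[1, 2, 3],
            [2, 4, 6],
            [7, 8, 10]]))  # → 2: dependent rows drop the rank
```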

5. Apr 8, 2013

### skrat

We haven't covered determinants yet, but thanks for the idea!

Ok, so $A: U\rightarrow U$ and $A=\begin{bmatrix} a_{11} & a_{12} & a_{13}\\ a_{21}& a_{22} & a_{23}\\ a_{31}& a_{32} & a_{33} \end{bmatrix}$. Basis $N=\begin{Bmatrix} u_{1},u_{2},u_{3} \end{Bmatrix}=\begin{Bmatrix} (1,0,0),(0,1,0),(0,0,1) \end{Bmatrix}$.
Let's take one vector x from U: $x=\alpha _{1}(1,0,0)+\alpha _{2}(0,1,0)+\alpha _{3}(0,0,1)=\begin{bmatrix} \alpha _{1}\\ \alpha _{2}\\ \alpha _{3} \end{bmatrix}$
Again, what A does with x is:
$Ax=\begin{bmatrix} a_{11} & a_{12} & a_{13}\\ a_{21}& a_{22} & a_{23}\\ a_{31}& a_{32} & a_{33} \end{bmatrix}\cdot \begin{bmatrix} \alpha _{1}\\ \alpha _{2}\\ \alpha _{3} \end{bmatrix}=\begin{bmatrix} a_{11}\alpha _{1}+a_{12}\alpha _{2}+a_{13}\alpha _{3}\\ a_{21}\alpha _{1}+a_{22}\alpha _{2}+a_{23}\alpha _{3}\\ a_{31}\alpha _{1}+a_{32}\alpha _{2}+a_{33}\alpha _{3} \end{bmatrix}=\begin{bmatrix} \beta _{1}\\ \beta _{2}\\ \beta _{3} \end{bmatrix}=y$

so
$y= (a_{11}\alpha _{1}+a_{12}\alpha _{2}+a_{13}\alpha _{3})u_{1}+( a_{21}\alpha _{1}+a_{22}\alpha _{2}+a_{23}\alpha _{3})u_{2}+( a_{31}\alpha _{1}+a_{32}\alpha _{2}+a_{33}\alpha _{3})u_{3}$
$y=\beta_{1}u_{1}+\beta_{2}u_{2}+\beta_{3}u_{3}$

If I am on the right track, I should now be able to figure out what each column means in this context, as jambaugh said. Yet I don't see it.

6. Apr 8, 2013

### jambaugh

Go less general: take one specific vector, say the first basis element. What is it mapped to?

7. Apr 8, 2013

### skrat

You mean:
$Au_{1}=\begin{bmatrix} a_{11} &a_{12} &a_{13} \\ a_{21} &a_{22} &a_{23} \\ a_{31} &a_{32} &a_{33} \end{bmatrix}\cdot \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}=\begin{bmatrix} a_{11} \\ a_{21} \\ a_{31} \end{bmatrix}$
which is the first column of the matrix A.

8. Apr 9, 2013

### jambaugh

Right, so the columns are the images of the basis vectors under the (left) mapping. If you look at the right action, the rows are the images of the standard dual basis (for row vectors) under the dual (right) mapping (row × matrix = row). These facts are the starting point for the other techniques suggested; this is the prime fact you should understand.
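The "right action" remark can be illustrated the same way: multiplying a row vector into the matrix picks out rows instead of columns. A minimal sketch, with an arbitrarily chosen matrix:

```python
A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]

def vecmat(x, M):
    """Row vector times matrix: the right action (row x matrix = row)."""
    return [sum(x[i] * M[i][j] for i in range(len(x))) for j in range(len(M[0]))]

# The i-th standard row vector picks out the i-th row of A.
print(vecmat([1, 0, 0], A))  # → [1, 2, 3], the first row of A
print(vecmat([0, 0, 1], A))  # → [7, 8, 10], the third row of A
```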

What is a basis? A minimal spanning set, and as such its elements should be....?

9. Apr 13, 2013

### skrat

linearly independent?

I'm not sure if I understand you completely, but I came up with this idea; I hope it is not too bad.

So:
$Au_{1}=\begin{bmatrix} a_{11} &a_{12} & a_{13}\\ a_{21} & a_{22} & a_{23}\\ a_{31} & a_{32} & a_{33} \end{bmatrix}\cdot \begin{bmatrix} 1\\ 0\\ 0 \end{bmatrix}=\begin{bmatrix} a_{11}\\ a_{21}\\ a_{31} \end{bmatrix}$ in other words: $Au_{1}=v_{1}$ if $A: U\rightarrow V$. Then $v_{1}$ is a basis vector for the vector space V, and since basis vectors are linearly independent, the vectors $Au_{1}=v_{1}$, $Au_{2}=v_{2}$ and $Au_{3}=v_{3}$ are linearly independent, where $v_{1}=\begin{bmatrix} a_{11}\\ a_{21}\\ a_{31} \end{bmatrix}$, $v_{2}=\begin{bmatrix} a_{12}\\ a_{22}\\ a_{32} \end{bmatrix}$ and $v_{3}=\begin{bmatrix} a_{13}\\ a_{23}\\ a_{33} \end{bmatrix}$. So the basis vectors for V are the columns of the matrix A, and they are linearly independent.

Is it OK if I now define another transformation, say $B: V\rightarrow U$? Again, basis vectors of the space V should be mapped to basis vectors of the space U. But we already know that the basis vectors of U are $u_{1}$, $u_{2}$ and $u_{3}$, and the basis vectors of V are $Au_{1}$, $Au_{2}$ and $Au_{3}$.

So this has to be true: $B(Au_{1})=u_{1}$, and likewise for all the basis vectors:
$B(Au_{i})=u_{i}$
$BAu_{i}=Iu_{i}$ where $I$ is the identity,
or in other words: $BA=I$, so $B=A^{-1}$,

and finally A is invertible if the columns are linearly independent. The columns have to be linearly independent, because if they weren't, then they could not be a basis for the vector space V.
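The conclusion $BA=I$ can be checked numerically: given a matrix with independent columns, row reduction of the augmented matrix [A | I] produces the inverse, and multiplying back gives the identity. A minimal sketch; the 3x3 matrix is just an example, and `inverse` and `matmul` are hypothetical helpers:

```python
from fractions import Fraction

def inverse(A):
    """Invert a square matrix by Gauss-Jordan elimination on [A | I]."""
    n = len(A)
    # Augment A with the identity, using exact rational arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for col in range(n):
        # Find a pivot; no pivot here would mean dependent columns.
        pivot = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]  # normalize the pivot row
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [M[r][j] - f * M[col][j] for j in range(2 * n)]
    return [row[n:] for row in M]  # the right half is now A^{-1}

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
B = inverse(A)
print(matmul(B, A))  # B·A equals the 3x3 identity matrix (as Fractions)
```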

Last edited: Apr 13, 2013
10. Apr 13, 2013

### jambaugh

You got the essential point I was making. For the specific proof, I would use the fact that any vector can be expanded in the "new" basis, and the same coefficients in the "old" basis give you the vector it is mapped from via A (with A mapping the "old" u's to the "new" v's). You thereby construct the inverse map. As proofs go, directly constructing the asserted inverse mapping is the most unequivocal form.

I also think this way of proving the theorem gives you better intuition about what the matrix is: in essence, a list of the images of the current basis under the corresponding linear transformation. It also keeps apparent the basis dependence of the matrix representation of a linear operator, and the distinction between that representation and the operator itself. It helps set the stage for change of basis, diagonalization, and so on.

11. Apr 14, 2013

### skrat

Thank you very much jambaugh!