
Homework: Matrix is invertible when rows are linearly independent

  1. Apr 7, 2013 #1
    Hi there,

    I have a homework problem where I have to do the following:
    Prove that a square matrix is invertible if its columns are linearly independent.
    There is also a hint: you may use the following statement: a linear transformation L: U->V is a bijection if, for a vector space basis N of U, its image L(N) is a basis for the vector space V. This holds for any basis of U.

    I tried several ways but I just can't get to the end. I don't want anybody here to write out the whole proof for me - I will hate you for that. However, I would be really happy if somebody could put me on the right path and give me clues obvious enough to make it to the end.
    If necessary I can write down how I tried to solve this, but I seriously doubt it would be any help to you!

    Thanks in advance!

    EDIT: OK, I changed my mind. I would be happy to know where I got it wrong. My procedure:
    Let [itex]A:U\rightarrow V[/itex] be a linear transformation with matrix
    [itex]A=\begin{bmatrix}
    a_{11} & a_{12} & \cdots & a_{1n} \\
    a_{21} & a_{22} & \cdots & a_{2n} \\
    \vdots & & \ddots & \vdots \\
    a_{n1} & a_{n2} & \cdots & a_{nn}
    \end{bmatrix}[/itex]

    if [itex]N=\begin{Bmatrix}
    u_{1} & \cdots & u_{n}
    \end{Bmatrix}[/itex] is a basis for the vector space U and
    [itex]M=\begin{Bmatrix}
    A(u_{1}) & \cdots & A(u_{n})
    \end{Bmatrix}=\begin{Bmatrix}
    v_{1} & \cdots & v_{n}
    \end{Bmatrix}[/itex] is a basis for the vector space V.

    Then I decided to take a vector x from the space U:
    [itex]x=\alpha _{1}u_{1}+\alpha _{2}u_{2}+\cdots +\alpha _{n}u_{n}[/itex]
    What linear transformation A does with x is:
    [itex]Ax=\begin{bmatrix}
    a_{11} & a_{12} & \cdots & a_{1n} \\
    a_{21} & a_{22} & \cdots & a_{2n} \\
    \vdots & & \ddots & \vdots \\
    a_{n1} & a_{n2} & \cdots & a_{nn}
    \end{bmatrix}\cdot \begin{bmatrix}
    x_{1} \\
    x_{2} \\
    \vdots \\
    x_{n}
    \end{bmatrix}=\begin{bmatrix}
    b_{1} \\
    b_{2} \\
    \vdots \\
    b_{n}
    \end{bmatrix}[/itex]
    Here I am not sure what exactly [itex]x_{1}, \ldots, x_{n}[/itex] are (yes, components, but of a combination of what?), and the same goes for the b's.

    Hmmm, is this even close to the right way?
     
    Last edited: Apr 7, 2013
  3. Apr 7, 2013 #2

    jambaugh

    Science Advisor
    Gold Member

    First, treat the square matrix as an endomorphism, i.e. as a map from a space to itself. So use the same basis for input and output.

    Second, any basis will do, so use the standard basis.

    Third, try a 2x2 or 3x3 case before attempting the proof for arbitrary size.

    The key is understanding what each column means in this context.
     
  4. Apr 7, 2013 #3

    Bacle2

    Science Advisor

    You can also use the fact that exchanging rows and adding a multiple of one row to another row
    preserve the determinant up to sign, and in particular preserve whether it is zero. If the rows are linearly dependent, then adding suitable multiples of the other rows to one row will generate a zero row, giving you a matrix with determinant zero. You can prove this by using elementary matrices (which generate all invertible matrices).

    In a different way: Gaussian elimination preserves whether the determinant is zero, and the determinant is easy to read off at the end of elimination. If you start with linearly dependent rows, your final matrix will have a zero row and hence determinant zero; with linearly independent rows, the determinant will be non-zero.
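
    A quick numerical illustration of this (a minimal sketch in Python/NumPy; the entries are made up, and the third row is deliberately the sum of the first two):
[code]
import numpy as np

# Rows are linearly dependent by construction: row 3 = row 1 + row 2.
A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [5., 7., 9.]])

B = A.copy()
B[2] -= B[0] + B[1]          # adding multiples of rows: determinant unchanged
print(B[2])                  # -> [0. 0. 0.], a zero row
print(np.linalg.det(A))      # ~0, up to floating-point noise
[/code]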
     
  5. Apr 7, 2013 #4

    Bacle2

    Science Advisor

    Another way, using rank-nullity: your original space has dimension n -- you have a basis with n elements. The images {A(ui); i=1,2,...,n} of the basis vectors form a basis for your target space, so the target space is n-dimensional and the rank of L is n. By rank-nullity, the nullity is then zero, i.e. the kernel is trivial. The map is then injective (trivial kernel) and surjective (rank = n in an n-dim space), so it is invertible.
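
    As a numerical sanity check (a minimal NumPy sketch; the matrix is arbitrary, chosen with independent columns):
[code]
import numpy as np

A = np.array([[2., 0., 1.],
              [1., 3., 0.],
              [0., 1., 1.]])     # columns chosen to be linearly independent

n = A.shape[0]
rank = np.linalg.matrix_rank(A)
print(rank, n - rank)            # -> 3 0 : rank n, trivial nullity, so invertible
[/code]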
     
  6. Apr 8, 2013 #5
    We haven't covered determinants yet, but thanks for the idea!


    Ok, so [itex]A: U\rightarrow U[/itex] and [itex]A=\begin{bmatrix}
    a_{11} & a_{12} & a_{13}\\
    a_{21}& a_{22} & a_{23}\\
    a_{31}& a_{32} & a_{33}
    \end{bmatrix}[/itex]. The basis is [itex]N=\begin{Bmatrix}
    u_{1},u_{2},u_{3}
    \end{Bmatrix}=\begin{Bmatrix}
    (1,0,0),(0,1,0),(0,0,1)
    \end{Bmatrix}[/itex].
    Let's take a vector x from U. [itex]x=\alpha _{1}(1,0,0)+\alpha _{2}(0,1,0)+\alpha _{3}(0,0,1)=\begin{bmatrix}
    \alpha _{1}\\
    \alpha _{2}\\
    \alpha _{3}
    \end{bmatrix}[/itex]
    Again, what A does with x is:
    [itex]Ax=\begin{bmatrix}
    a_{11} & a_{12} & a_{13}\\
    a_{21}& a_{22} & a_{23}\\
    a_{31}& a_{32} & a_{33}
    \end{bmatrix}\cdot \begin{bmatrix}
    \alpha _{1}\\
    \alpha _{2}\\
    \alpha _{3}
    \end{bmatrix}=\begin{bmatrix}
    a_{11}\alpha _{1}+a_{12}\alpha _{2}+a_{13}\alpha _{3}\\
    a_{21}\alpha _{1}+a_{22}\alpha _{2}+a_{23}\alpha _{3}\\
    a_{31}\alpha _{1}+a_{32}\alpha _{2}+a_{33}\alpha _{3}
    \end{bmatrix}=\begin{bmatrix}
    \beta _{1}\\
    \beta _{2}\\
    \beta _{3}
    \end{bmatrix}=y[/itex]

    so
    [itex]y=(a_{11}\alpha _{1}+a_{12}\alpha _{2}+a_{13}\alpha _{3})u_{1}+(a_{21}\alpha _{1}+a_{22}\alpha _{2}+a_{23}\alpha _{3})u_{2}+(a_{31}\alpha _{1}+a_{32}\alpha _{2}+a_{33}\alpha _{3})u_{3}[/itex]
    [itex]y=\beta_{1}u_{1}+\beta_{2}u_{2}+\beta_{3}u_{3}[/itex]

    If I am on the right track, I should now be able to see what each column means in this context, as jambaugh said. Yet I don't.
     
  7. Apr 8, 2013 #6

    jambaugh

    Science Advisor
    Gold Member

    Go less general: take one specific vector, say the first basis element. What is it mapped to?
     
  8. Apr 8, 2013 #7
    You mean:
    [itex]Au_{1}=\begin{bmatrix}
    a_{11} &a_{12} &a_{13} \\
    a_{21} &a_{22} &a_{23} \\
    a_{31} &a_{32} &a_{33}
    \end{bmatrix}\cdot \begin{bmatrix}
    1\\
    0\\
    0
    \end{bmatrix}=\begin{bmatrix}
    a_{11} \\
    a_{21} \\
    a_{31}
    \end{bmatrix}[/itex]
    Which is the first column of the matrix A.
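
    A quick NumPy check of this (just a sanity check with made-up entries):
[code]
import numpy as np

# Made-up 3x3 matrix, only to verify the claim numerically.
A = np.array([[1., 2., 3.],
              [0., 1., 4.],
              [5., 6., 0.]])

u1 = np.array([1., 0., 0.])          # first standard basis vector
print(A @ u1)                        # -> [1. 0. 5.]
print(np.allclose(A @ u1, A[:, 0]))  # True: A u1 is the first column
[/code]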
     
  9. Apr 9, 2013 #8

    jambaugh

    Science Advisor
    Gold Member

    Right, so the columns are the images of the basis under the (left) mapping. If you look at the right action, the rows are the images of the standard dual basis (for row vectors) under the dual (right) mapping (row x matrix = row). These facts are the starting point for the other techniques suggested; this is the prime fact you should understand.
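
    If it helps, here is a small sketch of both actions (NumPy, arbitrary entries):
[code]
import numpy as np

A = np.array([[1., 2., 3.],
              [0., 1., 4.],
              [5., 6., 0.]])

e1 = np.array([1., 0., 0.])
print(np.allclose(A @ e1, A[:, 0]))  # left action: image of e1 is the first column
print(np.allclose(e1 @ A, A[0, :]))  # right action: the dual e1 picks out the first row
[/code]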

    What is a basis? A minimal spanning set, and as such its elements should be....?
     
  10. Apr 13, 2013 #9
    linearly independent?

    I'm not sure I understand you completely, but I came up with this idea; I hope it is not too bad.

    So:
    [itex]Au_{1}=\begin{bmatrix}
    a_{11} &a_{12} & a_{13}\\
    a_{21} & a_{22} & a_{23}\\
    a_{31} & a_{32} & a_{33}
    \end{bmatrix}\cdot \begin{bmatrix}
    1\\
    0\\
    0
    \end{bmatrix}=\begin{bmatrix}
    a_{11}\\
    a_{21}\\
    a_{31}
    \end{bmatrix}[/itex] In other words: [itex]Au_{1}=v_{1}[/itex] if [itex]A: U\rightarrow V[/itex]. [itex]v_{1}[/itex] is then a basis vector for the vector space V, and since basis vectors are linearly independent, the vectors [itex]Au_{1}=v_{1}[/itex], [itex]Au_{2}=v_{2}[/itex] and [itex]Au_{3}=v_{3}[/itex] are linearly independent, with [itex]v_{1}=\begin{bmatrix}
    a_{11}\\
    a_{21}\\
    a_{31}
    \end{bmatrix}[/itex], [itex]v_{2}=\begin{bmatrix}
    a_{12}\\
    a_{22}\\
    a_{32}
    \end{bmatrix}[/itex] and [itex]v_{3}=\begin{bmatrix}
    a_{13}\\
    a_{23}\\
    a_{33}
    \end{bmatrix}[/itex]. The basis vectors for V are the columns of the matrix A, and they are linearly independent.

    Is it OK if I now define another transformation, say [itex]B: V\rightarrow U[/itex]? Again, the basis vectors of V should be mapped to the basis vectors of U. But we already know that the basis vectors of U are [itex]u_{1}[/itex], [itex]u_{2}[/itex] and [itex]u_{3}[/itex], and the basis vectors of V are [itex]Au_{1}[/itex], [itex]Au_{2}[/itex] and [itex]Au_{3}[/itex].

    So this has to be true: [itex]B(Au_{1})=u_{1}[/itex], and likewise for all the basis vectors:
    [itex]B(Au_{i})=u_{i}[/itex]
    [itex]BAu_{i}=Iu_{i}[/itex] where [itex]I[/itex] is the identity,
    or in other words: [itex]BA=I[/itex], so [itex]B=A^{-1}[/itex]

    and finally, A is invertible if the columns are linearly independent. The columns have to be linearly independent, because if they weren't, they could not form a basis for the vector space V.
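
    To check the [itex]BA=I[/itex] idea numerically (a minimal NumPy sketch; A is an arbitrary matrix with independent columns, and B is obtained by solving AB = I):
[code]
import numpy as np

A = np.array([[2., 0., 1.],
              [1., 3., 0.],
              [0., 1., 1.]])           # columns linearly independent

n = A.shape[0]
B = np.linalg.solve(A, np.eye(n))      # B with A B = I; for square A, B A = I as well

for i in range(n):
    u = np.eye(n)[i]                   # basis vector u_i
    print(np.allclose(B @ (A @ u), u)) # B(A u_i) = u_i for each basis vector
print(np.allclose(B @ A, np.eye(n)))   # B A = I
[/code]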
     
    Last edited: Apr 13, 2013
  11. Apr 13, 2013 #10

    jambaugh

    Science Advisor
    Gold Member

    You got the essential point I was making. For the specific proof, I would use the fact that any vector can be expanded in the "new" basis, and that tells you which vector it is mapped from via A: take the same coefficients, but with the "old" basis (A maps the "old" u's to the "new" v's). You thereby construct the inverse map. As proofs go, directly constructing the asserted inverse mapping is the most unequivocal form.
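
    A small sketch of that construction (NumPy, with an arbitrary invertible A): expanding y in the "new" basis, i.e. in the columns of A, means solving Ac = y, and those coefficients, read in the "old" basis, are exactly the preimage:
[code]
import numpy as np

A = np.array([[2., 0., 1.],
              [1., 3., 0.],
              [0., 1., 1.]])     # columns v_i form the "new" basis

y = np.array([1., 2., 3.])       # any target vector

c = np.linalg.solve(A, y)        # coefficients of y in the basis {v_1, v_2, v_3}
x = c                            # same coefficients in the "old" (standard) basis
print(np.allclose(A @ x, y))     # A maps the constructed x to y
[/code]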

    I also think this way of proving the theorem gives you better intuition about exactly what the matrix is: in essence, a list of the images of the current basis under the corresponding linear transformation. It also keeps apparent both the basis dependence of the matrix representation of a linear operator and the distinctness of the representation from the operator itself. That sets the stage for change of basis, diagonalization, and so on.
     
  12. Apr 14, 2013 #11
    Thank you very much jambaugh!
     