
Invertible matrix implies linearly independent columns

  1. Oct 16, 2013 #1
    Is the title statement true?

    I was doing some studying today and this caught my eye. I haven't looked at linear algebra in quite a while, so I'm not sure why it is true :/

    The internet couldn't provide any decisive conclusion either.

    Many thanks
     
  3. Oct 16, 2013 #2

    HallsofIvy

    Staff Emeritus
    Science Advisor

    As a very simple example, note that
    [tex]\begin{pmatrix}a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}\begin{pmatrix}1 \\ 0 \\ 0 \end{pmatrix}= \begin{pmatrix}a_{11} \\ a_{21} \\ a_{31}\end{pmatrix}[/tex]
    and similarly for [itex]\begin{pmatrix}0 \\ 1 \\ 0 \end{pmatrix}[/itex] and [itex]\begin{pmatrix}0 \\ 0 \\ 1 \end{pmatrix}[/itex]

    That is, applying the linear transformation to the standard basis vectors gives the three columns. The linear transformation is invertible if and only if it maps [itex]\mathbb{R}^3[/itex] onto all of [itex]\mathbb{R}^3[/itex]. That is true if and only if those three vectors, the three columns, are a basis for [itex]\mathbb{R}^3[/itex], which is, in turn, true if and only if the three vectors are linearly independent.

    Generalize that to [itex]\mathbb{R}^n[/itex].
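
    To sketch the "only if" direction of that equivalence (writing [itex]a_1, a_2, a_3[/itex] for the columns, which are my own labels, not notation from the post above): if the columns satisfy a nontrivial linear relation, then
    [tex]c_1 a_1 + c_2 a_2 + c_3 a_3 = 0 \text{ with } (c_1, c_2, c_3) \neq (0,0,0) \;\Longrightarrow\; A\begin{pmatrix}c_1 \\ c_2 \\ c_3\end{pmatrix} = \begin{pmatrix}0 \\ 0 \\ 0\end{pmatrix} = A\begin{pmatrix}0 \\ 0 \\ 0\end{pmatrix},[/tex]
    so [itex]A[/itex] sends two different vectors to the same output and cannot have an inverse. Conversely, if the columns are independent they form a basis, so every vector of [itex]\mathbb{R}^3[/itex] is reached exactly once and the inverse map is well defined.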
     
  4. Oct 16, 2013 #3
    But that means you CAN'T have linearly dependent columns and an invertible linear transformation at the same time... no?
     
    Last edited: Oct 16, 2013
  5. Oct 16, 2013 #4

    HallsofIvy

    Staff Emeritus
    Science Advisor

    Yes. If those n vectors, the columns of the n by n matrix, are linearly dependent, they span only a proper subspace of [itex]\mathbb{R}^n[/itex], and so the linear transformation is NOT invertible.
     