Invertible matrix implies linearly independent columns

jamesb1
Is the title statement true?

I was doing some studying today and this caught my eye. I haven't looked at linear algebra in quite a while, so I'm not sure whether it's true :/

The internet couldn't provide any decisive conclusions either.

Many thanks
 
As a very simple example, note that
$$\begin{pmatrix}a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33}\end{pmatrix}\begin{pmatrix}1 \\ 0 \\ 0\end{pmatrix} = \begin{pmatrix}a_{11} \\ a_{21} \\ a_{31}\end{pmatrix}$$
and similarly for $\begin{pmatrix}0 \\ 1 \\ 0\end{pmatrix}$ and $\begin{pmatrix}0 \\ 0 \\ 1\end{pmatrix}$.

That is, applying the linear transformation to the standard basis vectors gives the three columns, so the image of the transformation is exactly the span of the columns. The linear transformation is invertible if and only if it maps $\mathbb{R}^3$ onto all of $\mathbb{R}^3$. That is true if and only if those three vectors, the three columns, form a basis for $\mathbb{R}^3$, which is in turn true if and only if the three vectors are linearly independent.

Generalize that to $\mathbb{R}^n$.
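
If you want to see this concretely, here is a quick numerical sketch in Python with numpy (the matrix is just a made-up example): applying a matrix to a standard basis vector picks out the corresponding column, and full rank means the columns form a basis, so the matrix is invertible.

```python
import numpy as np

# A made-up 3x3 matrix whose columns happen to be linearly independent
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])

# Applying A to a standard basis vector picks out the matching column
e1 = np.array([1.0, 0.0, 0.0])
print(A @ e1)                          # [2. 0. 1.] -- the first column of A

# Rank 3 means the three columns are a basis of R^3, so A is invertible
print(np.linalg.matrix_rank(A))        # 3
print(np.round(np.linalg.inv(A) @ A))  # the 3x3 identity
```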
 
But that means you CAN'T have an invertible linear transformation whose columns are linearly dependent... no?
 
Yes. If those n vectors, the columns of the n by n matrix, are linearly dependent, they span only a proper subspace of $\mathbb{R}^n$, and so the linear transformation is NOT invertible.
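
To illustrate with numbers (again just a sketch with a made-up matrix): if one column is a combination of the others, the rank drops below n, the determinant is zero, and numpy refuses to invert the matrix.

```python
import numpy as np

# Made-up matrix: the third column is the sum of the first two,
# so the columns are linearly dependent and span only a plane in R^3
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [2.0, 3.0, 5.0]])

print(np.linalg.matrix_rank(B))   # 2, not 3
print(np.linalg.det(B))           # 0 (up to floating-point round-off)

try:
    np.linalg.inv(B)
except np.linalg.LinAlgError as err:
    print("Not invertible:", err)  # numpy reports a singular matrix
```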
 