Invertible matrix implies linearly independent columns


Discussion Overview

The discussion centers on the relationship between invertible matrices and the linear independence of their columns, exploring whether the statement "an invertible matrix implies linearly independent columns" is true. The scope includes theoretical aspects of linear algebra and the implications for the associated linear transformations.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant questions the truth of the statement regarding invertible matrices and linear independence, indicating uncertainty based on their recent study.
  • Another participant provides a mathematical example involving a 3x3 matrix and standard basis vectors, arguing that the invertibility of the transformation is equivalent to the columns being a basis for R3, which requires linear independence.
  • A third participant suggests that if the columns are linearly dependent, then the corresponding linear transformation cannot be invertible.
  • A fourth participant agrees with the previous point, stating that linearly dependent columns would span only a proper subspace of Rn, confirming that the transformation is not invertible.

Areas of Agreement / Disagreement

Participants generally agree that if the columns of a matrix are linearly dependent, the matrix cannot be invertible. However, the initial question about the truth of the statement remains open for further exploration.

Contextual Notes

The discussion does not resolve the initial uncertainty expressed by the first participant regarding the statement's truth, nor does it clarify all assumptions related to linear independence and invertibility.

jamesb1
Is the title statement true?

Was doing some studying today and this caught my eye. I haven't looked at linear algebra in quite a while, so I'm not sure why it's true :/

The internet couldn't provide any decisive conclusion either.

Many thanks
 
As a very simple example, note that
$$\begin{pmatrix}a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33}\end{pmatrix}\begin{pmatrix}1 \\ 0 \\ 0\end{pmatrix} = \begin{pmatrix}a_{11} \\ a_{21} \\ a_{31}\end{pmatrix}$$
and similarly for $\begin{pmatrix}0 \\ 1 \\ 0\end{pmatrix}$ and $\begin{pmatrix}0 \\ 0 \\ 1\end{pmatrix}$.

That is, applying the linear transformation to the standard basis vectors gives the three columns. The linear transformation is invertible if and only if it maps R3 onto all of R3. That is true if and only if those three vectors, the three columns, are a basis for R3, which is, in turn, true if and only if the three vectors are linearly independent.

Generalize that to Rn.
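If it helps to see this numerically, here is a quick sketch with NumPy (the matrix A is just an arbitrary example I made up, not anything from the thread):

```python
import numpy as np

# An arbitrary invertible 3x3 matrix, for illustration only
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

# Applying A to the standard basis vectors returns the columns of A
e1, e2, e3 = np.eye(3)
assert np.allclose(A @ e1, A[:, 0])
assert np.allclose(A @ e2, A[:, 1])
assert np.allclose(A @ e3, A[:, 2])

# Invertible <=> full rank <=> columns linearly independent
print(np.linalg.matrix_rank(A))  # 3, so the columns are independent
print(np.linalg.det(A))          # nonzero (5, up to rounding)
```

The same check works for any n by n matrix: full rank n is equivalent to independent columns, which is equivalent to invertibility.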
 
But that means you CAN'T have a linear transformation that is both invertible and has linearly dependent columns ... no?
 
Yes. If those n vectors, the columns of the n by n matrix, are linearly dependent, they span only a proper subspace of Rn, so the linear transformation is NOT invertible.
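To sketch the dependent case numerically as well (again with NumPy; B is a made-up example whose third column is the sum of the first two):

```python
import numpy as np

# Third column = first column + second column, so the columns are dependent
B = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

print(np.linalg.matrix_rank(B))  # 2 < 3: the columns span only a plane in R^3
print(np.linalg.det(B))          # 0, up to rounding

# Attempting to invert a singular matrix raises LinAlgError
try:
    np.linalg.inv(B)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)
```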
 
