Matrix concept questions (invertibility, det, linear dependence, span)

  • #1

Homework Statement:

For an n×n matrix A, I would like to know why A being invertible means that det(A) is not 0, that A is linearly independent, and that the columns of A span the matrix.

Relevant Equations:

NA
I have trouble writing proofs for matrix problems. I would like to know how the chain of implications

A is invertible → det(A) ≠ 0 → A is linearly independent → the columns of A span the matrix

holds for a square matrix A. It would be great if you could show how one leads to another, with examples! :)

Thanks for helping out.
 

Answers and Replies

  • #2
Math_QED
Science Advisor
Homework Helper
2019 Award
Really, any book on linear algebra has proofs of these facts. Have you consulted one of those? I like Axler's Linear Algebra Done Right. :)
 
  • #3
The answer to these questions could be given easily at some higher level. E.g., the first implication follows from the fact that the determinant is a group homomorphism. I assume that you didn't want to hear this as an explanation, hence I have to ask what you already know.

What is a matrix to you?
What is the determinant to you?
What does "A is linearly independent" mean, if not that the column vectors are linearly independent?

Edit: Do you know the formulas ##\det(A\cdot B)=\det(A)\cdot \det (B)## and ##n= \dim \operatorname{im}(A) + \dim \operatorname{ker}(A)##?
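As a quick numerical sanity check of both formulas (a sketch only, not a proof; the matrices below are arbitrary examples, and NumPy is assumed):

Code:
import numpy as np

# Check det(A*B) = det(A)*det(B) on two arbitrary 2x2 matrices.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 4.0],
              [2.0, 5.0]])
print(np.linalg.det(A @ B))                 # -18.0
print(np.linalg.det(A) * np.linalg.det(B))  # -18.0

# Check n = dim im(C) + dim ker(C) on a singular matrix.
C = np.array([[1.0, 2.0],
              [2.0, 4.0]])                  # second column = 2 * first
n = C.shape[1]
rank = np.linalg.matrix_rank(C)             # dim im(C) = 1
sv = np.linalg.svd(C, compute_uv=False)
nullity = int(np.sum(sv < 1e-10))           # dim ker(C) = 1
print(rank + nullity == n)                  # True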
 
  • #4
23
1
(Quoting #3:) "Do you know the formulas ##\det(A\cdot B)=\det(A)\cdot \det (B)## and ##n= \dim \operatorname{im}(A) + \dim \operatorname{ker}(A)##?"
I am aware of the first formula, but not the second one.

The trouble I am having with linear algebra is that when I am given a question, I know how to compute determinants and inverses, or to decide whether a matrix is invertible or not. But I do not know how one concept is related to the other.
 
  • #5
You can imagine the determinant (up to sign) as the volume of the object spanned by the column vectors. If they are linearly independent, then they span a parallelepiped with positive volume. If not, then they all lie in a hyperplane, or an even lower-dimensional subspace. But a lower-dimensional object in a higher-dimensional space has no volume.
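In two dimensions this is easy to check numerically; here is a minimal sketch (the matrices are arbitrary examples, NumPy assumed):

Code:
import numpy as np

# |det| is the area of the parallelogram spanned by the columns.
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])      # independent columns
print(abs(np.linalg.det(A)))    # 6.0: a genuine parallelogram

B = np.array([[1.0, 2.0],
              [2.0, 4.0]])      # dependent columns lie on one line
print(abs(np.linalg.det(B)))    # 0.0: a line segment has no area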

The algebraic method is faster, but delivers less insight. If ##A## is invertible, then there is a matrix ##B## such that ##A\cdot B = 1##. As the determinant respects multiplication, we get ##\det(AB)=\det(A)\det(B)=\det(1)=1##, so ##\det(A)## is a divisor of ##1##, and in particular not equal to ##0##.
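The same computation, illustrated numerically (a sketch with an arbitrary invertible matrix, NumPy assumed):

Code:
import numpy as np

# If A is invertible, det(A) * det(A^{-1}) = det(1) = 1, so det(A) != 0.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
B = np.linalg.inv(A)
print(np.linalg.det(A @ B))                 # 1.0 (up to rounding)
print(np.linalg.det(A) * np.linalg.det(B))  # 1.0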

If you only have a formula for the determinant, then you should prove this homomorphism property first, e.g. by induction.

This is why I asked what the determinant means to you.

"A is linearly independent" is not enough. What does that mean? If I take what you wrote, then I see a vector ##A## in the vector space of linear functions on ##\mathbb{R}^n##. As a single vector is always linear independent as soon as it is different from the zero vector, this statement is trivially true and has absolutely nothing to do with the determinant of ##A##. Hence I assumed, that you meant something else. However, the
closest possibility to interpret what you might have meant, is to translate it by "the column vectors of ##A## are linear independent in ##\mathbb{R}^n##. But the I have ##n## linear independent vectors, and of course they span an ##n-##dimensional space. So where is the problem?

This is why I asked you about the meaning of "A is linearly independent". Linear independence always requires a reference: Where do the vectors live? When did you switch from the matrix to vectors? Matrices can only be linearly independent if considered as vectors in some vector space.
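To see concretely that ##n## independent columns reach all of ##\mathbb{R}^n##, one can solve ##Ax=b## for an arbitrary target ##b## (a sketch with hypothetical example values, NumPy assumed):

Code:
import numpy as np

# Independent columns span R^n: A x = b is solvable for every b,
# i.e. b is a linear combination of the columns of A.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])      # independent columns
b = np.array([5.0, -4.0])       # arbitrary target vector
x = np.linalg.solve(A, b)       # coefficients of b in the columns of A
print(np.allclose(A @ x, b))    # True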

The rank formula is the best way to see the implication from ##A## to its column vectors: invertibility forces ##\dim \operatorname{ker}(A) = 0##, so ##\dim \operatorname{im}(A) = n##.

The remaining implication ##\det(A)\neq 0 \Longrightarrow ## "column vectors of ##A## are linearly independent" is best seen the other way around: assume a nontrivial linear dependence of the column vectors and see what it does to the determinant. One can handle this with the properties of the determinant, or with the properties of ##A## as a linear mapping.
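A numerical illustration of the contrapositive (the matrix is a hypothetical example, NumPy assumed):

Code:
import numpy as np

# If one column is a combination of the others, the determinant is 0.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [1.0, 1.0, 5.0]])   # third column = 2*first + 3*second
print(np.linalg.det(A))           # 0.0 (up to rounding)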

This is why I asked you what ##A## means to you: an array of numbers, or a linear function?

Here is a good read about the geometry of such things:
https://arxiv.org/pdf/1205.5935.pdf
I don't know whether it will answer your questions above, but it will definitely help you visualize the objects you are dealing with.
 
