Matrix Concept Questions (invertibility, determinant, linear dependence, span)

In summary, a matrix is a rectangular array of numbers that can be used to model linear relationships in a space. The determinant of a square matrix measures how much of the space the column vectors occupy: it is the (signed) volume of the parallelepiped they span. If an n×n matrix has full rank, then its column vectors are linearly independent and span the space.
  • #1
Sunwoo Bae
Homework Statement
For an n×n matrix A, I would like to know why A being invertible means that det(A) is not 0, that A is linearly independent, and that the columns of A span the space.
Relevant Equations
NA
I have trouble writing proofs for matrix problems. I would like to know how

A is invertible -> det(A) is not 0 -> A is linearly independent -> the columns of A span the space

holds for a square matrix A. It would be great if you could show how each step leads to the next, with examples! :)

Thanks for helping out.
 
  • #2
Really, any book on linear algebra has proofs of these facts. Have you consulted one of those? I like Axler's Linear Algebra Done Right. :)
 
  • #3
The answer to these questions could be given easily at some higher level. E.g., the first implication follows from the fact that the determinant is a group homomorphism. I assume that you didn't want to hear this as an explanation. Hence I have to ask what you already know.

What is a matrix to you?
What is the determinant to you?
What does "A is linearly independent" mean, if not that the column vectors are linearly independent?

Edit: Do you know the formulas ##\det(A\cdot B)=\det(A)\cdot \det (B)## and ##n= \dim \operatorname{im}(A) + \dim \operatorname{ker}(A)##?
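If it helps, both formulas can be checked numerically, e.g. with numpy (a minimal sketch; the matrices below are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 1.0]])

# First formula: det(A*B) = det(A)*det(B).
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))   # True

# Second formula (rank-nullity): n = dim im(C) + dim ker(C).
C = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # twice the first row, so the rank drops to 2
              [0.0, 0.0, 1.0]])
n = C.shape[1]
rank = np.linalg.matrix_rank(C)                      # dim im(C) = 2
nullity = int(np.sum(np.linalg.svd(C)[1] < 1e-10))   # dim ker(C) = 1
print(n == rank + nullity)                           # True: 3 == 2 + 1
```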
 
Last edited:
  • #4
fresh_42 said:
Edit: Do you know the formulas ##\det(A\cdot B)=\det(A)\cdot \det (B)## and ##n= \dim \operatorname{im}(A) + \dim \operatorname{ker}(A)##?

I am aware of the first formula, but not the second one.

The trouble I am having with linear algebra is that, given a question, I know how to compute determinants and inverses, or to decide whether a matrix is invertible. But I do not know how one concept is related to the others.
 
  • #5
You can imagine the determinant as the volume of the object spanned by the column vectors. If they are linearly independent, then they span a parallelepiped with nonzero volume. If not, then they span a lower-dimensional object, at most a hyperplane. But a lower-dimensional object in a higher-dimensional space has no volume.
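To see this with numbers (a minimal numpy sketch; the matrices are hand-picked examples):

```python
import numpy as np

# Independent columns: the spanned parallelepiped has nonzero volume.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])
print(abs(np.linalg.det(A)))    # 24.0, the volume of the 2 x 3 x 4 box

# Dependent columns: the third column is the sum of the first two,
# so the columns only span a plane, and the volume collapses to 0.
B = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0]])
print(abs(np.linalg.det(B)))    # 0.0
```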

The algebraic method is faster, but delivers fewer insights. If ##A## is invertible, then there is a matrix ##B## such that ##A\cdot B = 1##. As the determinant respects multiplication, we get ##\det(AB)=\det(A)\det(B)=\det(1)=1##, so ##\det(A)## is a divisor of ##1##, and in particular not equal to ##0##.
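Numerically (a small sketch; here ##B## is just the computed inverse):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 5.0]])                  # det(A) = 1*5 - 2*3 = -1, so A is invertible
B = np.linalg.inv(A)                        # B plays the role of the inverse above
print(np.linalg.det(A) * np.linalg.det(B))  # ~1.0, since det(A)det(B) = det(AB) = det(1) = 1
```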

If you only have a formula for the determinant, then you should prove this homomorphism property first, e.g. by induction.

This is why I asked what the determinant means to you.

"A is linearly independent" is not enough. What does that mean? If I take what you wrote, then I see a vector ##A## in the vector space of linear functions on ##\mathbb{R}^n##. As a single vector is always linear independent as soon as it is different from the zero vector, this statement is trivially true and has absolutely nothing to do with the determinant of ##A##. Hence I assumed, that you meant something else. However, the
closest possibility to interpret what you might have meant, is to translate it by "the column vectors of ##A## are linear independent in ##\mathbb{R}^n##. But the I have ##n## linear independent vectors, and of course they span an ##n-##dimensional space. So where is the problem?

This is why I asked you about the meaning of "A is linearly independent". Linear independence always requires a reference: where do the vectors live? When did you switch from the matrix to its columns? Matrices themselves can only be linearly independent if considered as vectors in some vector space.

The rank formula ##n= \dim \operatorname{im}(A) + \dim \operatorname{ker}(A)## is the best way to see the implication from ##A## to its column vectors: if ##A## is invertible, then ##\ker(A)=\{0\}##, so ##\dim \operatorname{im}(A)=n## and the ##n## columns must be linearly independent and span ##\mathbb{R}^n##.

The remaining implication ##\det(A)\neq 0 \Longrightarrow ## "column vectors of ##A## are linearly independent" is best seen the other way around: assume a nontrivial linear dependence of the column vectors and see what it does to the determinant. One can handle this with the properties of the determinant, or with the properties of ##A## as a linear mapping.
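Concretely (a minimal sketch; the dependence below is forced by hand):

```python
import numpy as np

# Force a nontrivial dependence: third column = first + 2 * second.
A = np.eye(3)
A[:, 2] = A[:, 0] + 2 * A[:, 1]
print(np.linalg.det(A))           # 0.0: the dependence kills the determinant
print(np.linalg.matrix_rank(A))   # 2: the rank drops below n = 3
```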

This is why I asked you what ##A## means to you: an array of numbers, or a linear function?

Here is a good read about the geometry of such things:
https://arxiv.org/pdf/1205.5935.pdf
I don't know whether it will answer your questions above, but it will definitely help you visualize the objects you are dealing with.
 

1. What does it mean for a matrix to be invertible?

For a matrix to be invertible, the equation Ax = b must have a unique solution for every b in ##\mathbb{R}^n##, not merely for b in its column space. Equivalently, the matrix must have a non-zero determinant and its columns must be linearly independent.
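For example (a numpy sketch; the system is an arbitrary illustration):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])         # det(A) = 5, so A is invertible
b = np.array([3.0, 5.0])
x = np.linalg.solve(A, b)          # the unique solution of Ax = b
print(x)                           # [0.8 1.4]
print(np.allclose(A @ x, b))       # True
```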

2. How do you find the determinant of a matrix?

To find the determinant of a matrix, you use a method suited to the size of the matrix. For a 2x2 matrix with rows (a, b) and (c, d), the determinant is ad - bc. For larger matrices, you can use row reduction or cofactor expansion.
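A sketch of cofactor expansion along the first row in plain Python (recursive; fine for small matrices, but very slow for large ones):

```python
def det(M):
    """Determinant by cofactor expansion along the first row."""
    n = len(M)
    if n == 1:
        return M[0][0]
    if n == 2:
        return M[0][0] * M[1][1] - M[0][1] * M[1][0]      # the ad - bc formula
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]  # delete row 0, column j
        total += (-1) ** j * M[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))                    # -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))   # 24
```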

3. What does it mean for a set of vectors to be linearly dependent?

A set of vectors is linearly dependent if at least one of the vectors in the set can be written as a linear combination of the others. In other words, that vector is redundant: removing it does not change the span of the set.
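For example (a numpy sketch with hand-picked vectors):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 - v2                    # v3 is a linear combination of v1 and v2
M = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(M))     # 2, so {v1, v2, v3} is linearly dependent
```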

4. How do you determine if a set of vectors spans a particular space?

To determine if a set of vectors spans a particular space, row reduce the matrix formed from the vectors and check whether the rank equals the dimension of the space. Linear independence alone is not enough in general: n linearly independent vectors span an n-dimensional subspace, so they span the whole space exactly when there are as many of them as the dimension of the space.
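For example, to test whether three given vectors span ##\mathbb{R}^3## (a numpy sketch):

```python
import numpy as np

vectors = np.array([[1.0, 0.0, 2.0],    # one vector per row
                    [0.0, 1.0, 1.0],
                    [1.0, 1.0, 0.0]])
# The vectors span R^3 exactly when the rank equals the dimension 3.
print(np.linalg.matrix_rank(vectors) == 3)   # True for this choice
```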

5. Can a matrix be invertible if it has a zero determinant?

No, a matrix with a zero determinant cannot be invertible. A zero determinant means the columns are linearly dependent, so the equation Ax = b does not have a unique solution for every b in ##\mathbb{R}^n##, which is a necessary condition for invertibility.
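For example, numpy refuses to invert a singular matrix (a minimal sketch):

```python
import numpy as np

S = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # det = 1*4 - 2*2 = 0: the rows are dependent
print(np.linalg.det(S))             # 0.0 (up to rounding)
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError as err:
    print("Not invertible:", err)   # "Singular matrix"
```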
