Can Vectors be Linearly Independent if Det Not Equal 0?

In summary: if the determinant of the matrix formed from the vectors is not zero, the matrix is invertible, the homogeneous system has only the trivial solution, and the vectors are linearly independent. It is also true that n linearly independent vectors span a vector space of dimension n.
  • #1
UrbanXrisis
I don't know if this is a rule, but can a set of vectors be linearly independent if their determinant is not equal to zero?

Say 4 vectors are given in R^4. If I took the determinant of the 4 vectors such that det{v1, v2, v3, v4} is not equal to zero, could I say that these vectors are linearly independent?

If they are linearly independent, does that mean these 4 vectors span R^4?
 
  • #2
UrbanXrisis said:
I don't know if this is a rule, but can a set of vectors be linearly independent if their determinant is not equal to zero?

Say 4 vectors are given in R^4. If I took the determinant of the 4 vectors such that det{v1, v2, v3, v4} is not equal to zero, could I say that these vectors are linearly independent?

If they are linearly independent, does that mean these 4 vectors span R^4?

If the determinant is 0, then the matrix is not invertible. If a matrix is not invertible, then the only solution of the homogeneous system is the trivial solution. And it is also true that if you have n linearly independent vectors, then they span a vector space of dim = n.

I'm taking linear algebra too, so my knowledge isn't that great, but I'm pretty sure everything I stated is correct.
 
  • #3
If I'm understanding you correctly, you are asking: if you took four vectors, say column vectors, formed them into the columns of a matrix, and found that the determinant of that matrix was not zero, could you say that the vectors are linearly independent? If that's what you mean, then yes, because you know from the properties of the determinant that if one of the columns were a linear combination of the others, the determinant would be zero. Since it is not zero, no column can be a linear combination of the others. Therefore the vectors are linearly independent.
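As a minimal numerical sketch of that test (the four vectors below are made-up examples, not from the thread), one can stack the vectors as the columns of a matrix in NumPy and check the determinant:

```python
import numpy as np

# Four hypothetical vectors in R^4 (example values only)
v1 = np.array([1.0, 0.0, 2.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 3.0])
v3 = np.array([2.0, 1.0, 0.0, 1.0])
v4 = np.array([1.0, 1.0, 1.0, 0.0])

# Stack the vectors as the columns of a 4x4 matrix
A = np.column_stack([v1, v2, v3, v4])

det = np.linalg.det(A)
print("det =", det)  # about 12 for these example vectors

# A nonzero determinant means no column is a linear combination of the others,
# so the four vectors are linearly independent (and therefore span R^4).
if not np.isclose(det, 0.0):
    print("The vectors are linearly independent.")
else:
    print("The vectors are linearly dependent.")
```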
 
  • #4
Geekster said:
If the determinant is 0, then the matrix is not invertible. If a matrix is not invertible, then the only solution of the homogeneous system is the trivial solution. And it is also true that if you have n linearly independent vectors, then they span a vector space of dim = n.

I'm taking linear algebra too, so my knowledge isn't that great, but I'm pretty sure everything I stated is correct.
You have it exactly backwards: a determinant that is *not* zero means the matrix is invertible, so the homogeneous system has a unique solution, namely the trivial one. (Consider the zero matrix, if need be, to see where you've gone wrong.)
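A quick numerical illustration of that corrected statement, using made-up 2x2 matrices: when the determinant is nonzero, solving the homogeneous system returns only the zero vector, while a singular matrix has no unique solution at all.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])        # det = 5, so A is invertible
zero = np.zeros(2)

# With a nonzero determinant, A x = 0 has a unique solution: the trivial one.
print(np.linalg.solve(A, zero))   # [0. 0.]

B = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # det = 0: the columns are dependent
try:
    np.linalg.solve(B, zero)
except np.linalg.LinAlgError:
    # A singular matrix has no unique solution; B x = 0 then has
    # infinitely many solutions besides the trivial one.
    print("singular matrix: no unique solution")
```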
 
  • #5
If a set of vectors spans a matrix, don't the vectors have to be linearly independent? So what is the need for a basis? A basis is a set of vectors that spans a vector space and is LI, but isn't a set of vectors that spans V already LI?
 
Last edited:
  • #6
UrbanXrisis said:
if a vector spans a matrix, doesn't it have to be linearly independent? So what is the need for a basis? a basis is a set of vectors that span a vector space and is LI, but isn't a vector that spans V already LI?
Huh? What do you mean "if a vector spans a matrix"? As far as a single vector being linearly independent, that is true as long as it is not the zero vector. And why are you asking what the need is for a basis? And what do you mean by "a vector that spans V"? If a single vector spans V, then V is at most one-dimensional. Also, if V consisted only of the zero vector, then the zero vector would span it, but it would not be linearly independent.
 
  • #7
Yeah, I caught that and changed the post.
 

1. Can vectors be linearly independent if the determinant is not equal to 0?

Yes. In fact, a nonzero determinant guarantees it: if the determinant of the matrix formed from the vectors is not equal to 0, the matrix is non-singular (invertible) and its columns are linearly independent.

2. What is the significance of the determinant in determining linear independence?

The absolute value of the determinant equals the volume of the parallelepiped formed by the vectors (equivalently, the factor by which the matrix scales volumes). If the determinant is equal to 0, that volume is 0, which means the vectors lie in a lower-dimensional subspace and are therefore linearly dependent.
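A small check of this geometric interpretation, with three made-up vectors in R^3: the absolute value of the determinant should agree with the scalar triple product |a · (b × c)|, which is the volume of the parallelepiped spanned by a, b, and c.

```python
import numpy as np

# Three hypothetical edge vectors of a parallelepiped in R^3 (example values)
a = np.array([2.0, 0.0, 1.0])
b = np.array([0.0, 3.0, 0.0])
c = np.array([1.0, 1.0, 2.0])

# Determinant of the matrix whose columns are a, b, c
det = np.linalg.det(np.column_stack([a, b, c]))

# Volume of the parallelepiped via the scalar triple product
volume = abs(np.dot(a, np.cross(b, c)))

print(abs(det), volume)   # both are 9 for these example vectors
# If the three vectors were coplanar (linearly dependent), both would be 0.
```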

3. How can we determine if vectors are linearly independent using the determinant?

To determine if n vectors in R^n are linearly independent using the determinant, calculate the determinant of the n×n matrix whose columns are the vectors. If the determinant is not equal to 0, the vectors are linearly independent; if it is equal to 0, they are linearly dependent. (This test only applies to square matrices, i.e., to n vectors in R^n.)
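As a sketch of the dependent case (made-up vectors, with the third deliberately chosen as a combination of the first two), the determinant comes out as zero only up to floating-point rounding, so in practice it is safer to compare it against a small tolerance rather than test for exact equality with 0.

```python
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2.0 * v2           # deliberately a linear combination of v1 and v2

A = np.column_stack([v1, v2, v3])
det = np.linalg.det(A)

print(det)                   # 0 up to floating-point rounding error
print(np.isclose(det, 0.0))  # True: the columns are linearly dependent
```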

4. Can a matrix have linearly independent columns but a determinant of 0?

No, not for a square matrix: the columns of a square matrix are linearly independent if and only if its determinant is nonzero. A determinant of 0 means at least one column is a linear combination of the others. (A non-square matrix can have linearly independent columns, but it has no determinant at all; in that case you check the rank instead.)

5. What other methods can be used to determine linear independence besides the determinant?

Other methods include computing the rank of the matrix (the columns are independent exactly when the rank equals the number of columns), checking that the null space of the matrix contains only the zero vector, or running the Gram-Schmidt process on the vectors: if any vector reduces to the zero vector during the process, it lies in the span of the previous ones and the set is linearly dependent.
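Here is a sketch of two of these alternatives in NumPy, reusing the same hypothetical dependent set as above: comparing the rank of the matrix with the number of vectors, and a classical Gram-Schmidt pass that flags dependence when a residual becomes numerically zero. (A nontrivial null space would give the same verdict.)

```python
import numpy as np

vectors = [np.array([1.0, 2.0, 0.0]),
           np.array([0.0, 1.0, 1.0]),
           np.array([1.0, 4.0, 2.0])]   # third = first + 2 * second

A = np.column_stack(vectors)

# Method 1: rank. The columns are independent iff rank equals the number of columns.
rank = np.linalg.matrix_rank(A)
print("independent (rank test):", rank == A.shape[1])

# Method 2: classical Gram-Schmidt. If a vector's residual is (numerically) zero,
# it lies in the span of the previous ones, so the set is dependent.
def is_independent(vecs, tol=1e-10):
    basis = []
    for v in vecs:
        w = v.astype(float).copy()
        for b in basis:
            w -= np.dot(w, b) * b    # remove the component along each basis vector
        norm = np.linalg.norm(w)
        if norm < tol:
            return False             # v is a linear combination of earlier vectors
        basis.append(w / norm)
    return True

print("independent (Gram-Schmidt):", is_independent(vectors))
```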
