How do I show that the vectors of an invertible matrix are independent?

  • Thread starter Minhtran1092
  • Start date
  • Tags
    Vectors
In summary: if the column vectors were dependent, there would be scalars ##c_i##, not all zero, with ##\sum c_iv_i = 0##. Collecting the coefficients into a vector ##x##, this says ##Ax = 0## for some ##x \neq 0##. But an invertible matrix has a nonzero determinant, and ##Ax = 0## then forces ##x = 0##, a contradiction. So the column vectors are linearly independent.
  • #1
Minhtran1092
Suppose we have an nxn matrix A with column vectors v1,...,vn, where A is invertible with rank(A)=n. How do I prove that v1,...,vn are linearly independent?

I think I can prove this by using the fact that rank(A)=n, which tells me that there is a pivot in each of the n columns of rref(A) (because the rref of an invertible matrix is the identity matrix). I'm not sure how to interpret this result to show that the column vectors are linearly independent, though.

Should I look at the linear combination of an identity matrix to establish independence?
 
  • #2
I don't know how to go about the proof. However, if the v's are linearly dependent, then det(A) = 0 and the matrix is not invertible.
 
  • #3
Minhtran1092 said:
Suppose we have an nxn matrix A with column vectors v1,...,vn, where A is invertible with rank(A)=n. How do I prove that v1,...,vn are linearly independent?

I think I can prove this by using the fact that rank(A)=n, which tells me that there is a pivot in each of the n columns of rref(A) (because the rref of an invertible matrix is the identity matrix). I'm not sure how to interpret this result to show that the column vectors are linearly independent, though.

Should I look at the linear combination of an identity matrix to establish independence?


Not sure what you mean by pivot and rref so I can't help you directly.

But if the vectors were linearly dependent, then by definition some linear combination of them, with coefficients not all zero, would equal zero. The coefficients of this linear combination form another vector. What is the matrix multiplied by this vector?
 
  • #4
lavinia said:
Not sure what you mean by pivot and rref so I can't help you directly.

But if the vectors were linearly dependent, then by definition some linear combination of them, with coefficients not all zero, would equal zero. The coefficients of this linear combination form another vector. What is the matrix multiplied by this vector?

rref stands for Reduced Row Echelon Form: rref(A) denotes the reduced row echelon form of A (it is also the name of the calculator operation that computes it).

A pivot of a row refers to the leading 1 in that row of the rref of a matrix.

I don't follow where you mentioned that the linear comb. of some dependent vectors would be zero.
 
  • #5
Minhtran1092 said:
I don't follow where you mentioned that the linear comb. of some dependent vectors would be zero.

That is just the definition of "linearly dependent".

If the ##v_i## are linearly dependent, then ##\sum c_iv_i = 0## where the scalars ##c_i## are not all zero.

The ##v_i## are the column vectors of A. So think about how to write ##\sum c_iv_i = 0## as ##Ax = 0## for a non-zero vector ##x##.
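The hint above can be checked numerically (a sketch using NumPy; the matrix is an illustrative example, not from the thread): for an invertible ##A##, the homogeneous system ##Ax = 0## has only the trivial solution, so no non-zero coefficient vector can combine the columns to zero.

```python
import numpy as np

# An invertible 3x3 matrix; its columns play the role of v1, v2, v3.
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0],
              [0.0, 1.0, 1.0]])

# Invertible <=> nonzero determinant <=> rank n.
assert abs(np.linalg.det(A)) > 1e-12
assert np.linalg.matrix_rank(A) == 3

# Solving A x = 0 for an invertible A gives only the trivial solution,
# i.e. the only way to write sum c_i v_i = 0 is with all c_i = 0.
x = np.linalg.solve(A, np.zeros(3))
assert np.allclose(x, 0)
```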
 

1. How do I determine if the vectors of an invertible matrix are independent?

To show that the column vectors of an invertible matrix are independent, you can use the determinant test: if the determinant is non-zero, then the vectors are independent. An invertible matrix always has a non-zero (though not necessarily positive) determinant, and the absolute value of the determinant is the volume of the parallelepiped spanned by the column vectors.
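The determinant test looks like this in practice (a sketch using NumPy with example matrices of my choosing). Note the determinant of the independent case is negative, which is fine; only zero versus non-zero matters:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # columns (1,3) and (2,4)
d = np.linalg.det(A)         # 1*4 - 2*3 = -2: nonzero, so independent
assert abs(d) > 1e-12

B = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # second column = 2 * first column
assert abs(np.linalg.det(B)) < 1e-12  # dependent columns give det 0
```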

2. Can I use the row echelon form instead of the determinant test to show independence of vectors in an invertible matrix?

Yes, you can also use the row echelon form to show independence of vectors in an invertible matrix. Row-reduce the matrix to its reduced row echelon form; if there is a pivot (leading 1) in every column, then every column is a pivot column and the columns are linearly independent. For an invertible matrix, the rref is the identity matrix, so this always holds.
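The pivot check can be sketched with SymPy, whose `Matrix.rref()` returns both the reduced form and the tuple of pivot-column indices (the matrix here is an illustrative example):

```python
from sympy import Matrix

A = Matrix([[2, 0, 1],
            [1, 3, 0],
            [0, 1, 1]])   # invertible: det(A) = 7

rref_form, pivot_cols = A.rref()
# A pivot in every column <=> columns linearly independent.
assert pivot_cols == (0, 1, 2)
# For an invertible matrix, the rref is the identity.
assert rref_form == Matrix.eye(3)
```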

3. Is it possible for two vectors in an invertible matrix to be independent but not orthogonal?

Yes, it is possible for two vectors in an invertible matrix to be independent but not orthogonal. Independence and orthogonality are different concepts. Two vectors are independent if neither is a scalar multiple of the other, while two vectors are orthogonal if their dot product is zero (they are perpendicular). For example, (1,0) and (1,1) are independent but not orthogonal.
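A quick numerical illustration of that distinction (a NumPy sketch using the example pair (1,0) and (1,1)):

```python
import numpy as np

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])

# Independent: the 2x2 matrix with columns u, v has nonzero determinant,
# so neither vector is a scalar multiple of the other.
assert abs(np.linalg.det(np.column_stack([u, v]))) > 1e-12

# Not orthogonal: the dot product is nonzero.
assert np.dot(u, v) != 0
```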

4. Can I use the Gram-Schmidt process to show independence of vectors in an invertible matrix?

Yes, the Gram-Schmidt process can be used to show independence of vectors in an invertible matrix. This process takes a set of vectors and produces a new set of orthogonal (or even orthonormal) vectors that span the same space. If the original set of vectors was independent, the new set will also be independent.
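A minimal Gram-Schmidt sketch in NumPy (the helper `gram_schmidt` and the example vectors are my own, for illustration): it succeeds exactly when the input vectors are independent, and raises an error when a vector falls in the span of the previous ones.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: return an orthonormal list spanning
    the same space as the (linearly independent) input vectors."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        for q in basis:
            w = w - np.dot(q, v) * q   # remove the component along q
        norm = np.linalg.norm(w)
        if norm < 1e-12:               # v was in the span of the others
            raise ValueError("input vectors are linearly dependent")
        basis.append(w / norm)
    return basis

cols = [np.array([2.0, 1.0, 0.0]),
        np.array([0.0, 3.0, 1.0]),
        np.array([1.0, 0.0, 1.0])]     # columns of an invertible matrix
Q = gram_schmidt(cols)

# The result is pairwise orthonormal.
for i, qi in enumerate(Q):
    for j, qj in enumerate(Q):
        expected = 1.0 if i == j else 0.0
        assert abs(np.dot(qi, qj) - expected) < 1e-9
```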

5. Do I need to show that all vectors in an invertible matrix are independent, or just a subset?

You need to show that all n column vectors are independent; that is exactly what invertibility gives you. Note that the implication between subsets and the whole set only goes one way: any subset of an independent set is automatically independent, but independence of a proper subset does not imply independence of the full set (a dependent vector could still be a combination of the others).
