Linear Independence: Matrix/Equations Analysis

TheColorCute
How do you know when a matrix (or equivalently a system of equations) is linearly independent? How do you know that it's linearly dependent?

For example, given this matrix,

[ 1 1 2 1]
[-2 1 4 0]
[ 0 3 2 2]

How do we know if this matrix is linearly independent or dependent?

Thanks! :)
 
TheColorCute said:
How do you know when a matrix (or equivalently a system of equations) is linearly independent? How do you know that it's linearly dependent?

It isn't a matrix that is linearly dependent or independent; you can ask whether its rows or columns are. In this case the columns must be dependent, because there are 4 of them and each column has only 3 components (you can never have more than 3 linearly independent vectors in a 3-dimensional space). To check whether the rows are dependent, row reduce the matrix: if a row of zeros appears, the rows are dependent; otherwise they are independent.
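For a concrete check, here is a minimal sketch (assuming NumPy is available) that computes the rank of the example matrix and compares it with the number of rows and columns:

import numpy as np

# The matrix from the question: 3 rows, 4 columns.
A = np.array([[ 1, 1, 2, 1],
              [-2, 1, 4, 0],
              [ 0, 3, 2, 2]])

rank = np.linalg.matrix_rank(A)

# Rows are independent exactly when rank equals the number of rows;
# columns are independent exactly when rank equals the number of columns.
print("rank =", rank)
print("rows independent:   ", rank == A.shape[0])
print("columns independent:", rank == A.shape[1])  # 4 columns in R^3 can never be independent

For this particular matrix the rank works out to 3, so the rows are independent, while the 4 columns are necessarily dependent.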
 
What do you mean by "3 components"?
 
TheColorCute said:
What do you mean by "3 components"?

Each column is a 3d column vector. It has, count 'em, three components.
 
LCKurtz said:
Each column is a 3d column vector. It has, count 'em, three components.

Ahhh. And by "components" you mean rows?
 