Can Linear Independence Be Determined Without Row Reduction?

charlies1902
Suppose that S = {v1, v2, v3} is a basis for a
vector space V.
a. Determine whether the set T = {v1, v1 +
v2, v1 + v2 + v3} is a basis for V.
b. Determine whether the set
W = {−v2 + v3, 3v1 + 2v2 + v3, v1 −
v2 + 2v3} is a basis for V.


I must check if they're linearly independent.

For a:
$$c_1v_1+c_2v_1+c_2v_2+c_3v_1+c_3v_2+c_3v_3=0$$ where the ##c_i## are constants,
$$(c_1+c_2)v_1+(c_2+c_3)v_2+c_3v_3=0$$
Forming the matrix gives
$$\begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix}$$
The rref of this matrix is the identity matrix, so the set is linearly independent.


For b:
I did the same thing, except the rref of the matrix was not the identity matrix, so W is linearly dependent and therefore not a basis.
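As a quick sanity check, the ranks can also be computed numerically. This is just a sketch assuming NumPy is available; the second matrix is my own construction, with columns taken as the coordinates of the vectors of W in the basis S:

```python
import numpy as np

# Part (a): the coefficient matrix formed above.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [0, 0, 1]])

# Part (b): columns are the coordinates of the vectors of W
# in the basis S = {v1, v2, v3}:
#   -v2 + v3         -> (0, -1, 1)
#   3v1 + 2v2 + v3   -> (3,  2, 1)
#   v1 - v2 + 2v3    -> (1, -1, 2)
B = np.array([[ 0,  3,  1],
              [-1,  2, -1],
              [ 1,  1,  2]])

print(np.linalg.matrix_rank(A))  # 3 -> independent, T is a basis
print(np.linalg.matrix_rank(B))  # 2 -> dependent, W is not a basis
```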


My question is: is there an easier way to do this problem? It seems I made it harder/longer than necessary.
 
charlies1902 said:
Suppose that S = {v1, v2, v3} is a basis for a
vector space V.
a. Determine whether the set T = {v1, v1 +
v2, v1 + v2 + v3} is a basis for V.
b. Determine whether the set
W = {−v2 + v3, 3v1 + 2v2 + v3, v1 −
v2 + 2v3} is a basis for V.


I must check if they're linearly independent.

For a:
$$c_1v_1+c_2v_1+c_2v_2+c_3v_1+c_3v_2+c_3v_3=0$$ where the ##c_i## are constants
It would be clearer if you start with $$c_1v_1+c_2(v_1+v_2)+c_3(v_1+v_2+v_3)=0$$
$$(c_1+c_2)v_1+(c_2+c_3)v_2+c_3v_3=0$$
and maybe you wouldn't have made that mistake; it should be
$$(c_1+c_2+c_3)v_1+(c_2+c_3)v_2 +c_3v_3=0$$
Forming the matrix gives
$$\begin{pmatrix} 1 & 1 & 0 \\ 0 & 1 & 1 \\ 0 & 0 & 1 \end{pmatrix}$$
The rref of this matrix is the identity matrix, so the set is linearly independent.

My question is: is there an easier way to do this problem? It seems I made it harder/longer than necessary.

Yes, you can do it in your head. Looking at my last equation you see ##c_3=0##. Then since ##(c_2+c_3)=0## you know ##c_2=0## so...

Sometimes the equations are so simple it isn't worth the time to row reduce.
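A determinant check gives the same answers with almost no work: the new set is a basis exactly when the coefficient matrix is invertible, i.e. has nonzero determinant. For (a), the (corrected) coefficient matrix is triangular, so its determinant is just the product of the diagonal entries:
$$\det\begin{pmatrix}1&1&1\\0&1&1\\0&0&1\end{pmatrix}=1\cdot 1\cdot 1=1\neq 0.$$
For (b), writing the coordinates of the vectors of W (in the basis S) as columns and expanding along the first row,
$$\det\begin{pmatrix}0&3&1\\-1&2&-1\\1&1&2\end{pmatrix}
=0-3\bigl((-1)(2)-(-1)(1)\bigr)+1\bigl((-1)(1)-(2)(1)\bigr)=3-3=0,$$
so W is linearly dependent and not a basis.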
 