Prove Linear Independence of x1 and x2

s_stylie0728

Homework Statement


x1, x2, and x3 are linearly dependent. Show that x1 and x2 are linearly independent.


Homework Equations


After row reduction using Gaussian elimination, x1, x2, and x3 are shown to be linearly dependent because the coefficients on x1 and x2 are determined by the coefficient on x3 (the free variable) as:

x1-x2-6x3 = 0
x2-2x3 = 0


The Attempt at a Solution


I set the two equations above equal to each other and built a linear combination of x1 and x2 that produces x3. This came out to be:

(1/4)x1 - (1/2)x2 = x3

I did this because, according to the theory, I should write x3 as a linear combination of the two vectors I'm trying to prove linearly independent, since that eliminates x3. But my vector space is R^3, so I'm confused about how to "eliminate" x3. To prove linear independence, the combination of the vectors must equal 0, but in this case it's equal to x3. I can't just say (1/4)x1 - (1/2)x2 = 0, can I? Then I tried solving (1/4)x1 - (1/2)x2 - x3 = 0, and I just got definitions in terms of free variables again. My book gives no examples, so all I have to work from is my elementary grasp of a dense theorem. Any ideas? I would appreciate it!
 
I'm a bit confused. Were you given three specific vectors to work with, or are you working with three arbitrary vectors?
 
Ha, yeah. Sorry...

x1 = {2,1,3}
x2 = {3,-1,4}
x3 = {2,6,4}
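
With these specific vectors, both claims can be checked numerically: the full set should be dependent while the pair x1, x2 is independent. A minimal sketch, not part of the original thread, assuming NumPy is available:

```python
import numpy as np

# The three vectors from the thread, as 1-D arrays.
x1 = np.array([2, 1, 3])
x2 = np.array([3, -1, 4])
x3 = np.array([2, 6, 4])

A = np.column_stack([x1, x2, x3])   # 3x3 matrix [x1 | x2 | x3]
B = np.column_stack([x1, x2])       # 3x2 matrix [x1 | x2]

# A set of column vectors is linearly independent exactly when the
# rank of the matrix equals the number of columns.
print(np.linalg.matrix_rank(A))  # 2 < 3 columns -> {x1, x2, x3} dependent
print(np.linalg.matrix_rank(B))  # 2 = 2 columns -> {x1, x2} independent
```

The rank test is equivalent to the row-reduction argument: a free variable shows up precisely when the rank falls short of the number of columns.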
 
A set of vectors {v1, v2, ..., vn} is linearly dependent iff the equation c1*v1 + c2*v2 + ... + cn*vn=0 has a solution where at least one of the constants ci is not zero.
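
For the posted vectors, one way to exhibit such constants is to try writing x3 as a combination of x1 and x2 with a least-squares solve and check that the residual vanishes. A rough sketch (my own illustration, assuming NumPy; nothing here is from the thread):

```python
import numpy as np

x1 = np.array([2.0, 1.0, 3.0])
x2 = np.array([3.0, -1.0, 4.0])
x3 = np.array([2.0, 6.0, 4.0])

# Solve [x1 | x2] @ [a, b]^T = x3 in the least-squares sense.
coeffs, residual, rank, _ = np.linalg.lstsq(np.column_stack([x1, x2]), x3,
                                            rcond=None)
a, b = coeffs

# If the fit is exact, then a*x1 + b*x2 - x3 = 0 is a dependence
# relation with a nonzero coefficient (namely -1) on x3.
print(a, b)                              # a ≈ 4, b ≈ -2
print(np.allclose(a * x1 + b * x2, x3))  # True -> dependence relation found
```

So c1 = a, c2 = b, c3 = -1 satisfies the definition above with not all constants zero, confirming the set {x1, x2, x3} is linearly dependent.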

If you're checking two vectors, the definition above can be used, but you can do something simpler (that is equivalent to this definition): one vector will be a constant multiple of the other.
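
That two-vector shortcut is easy to mechanize: two vectors are scalar multiples of each other exactly when every 2x2 "cross" determinant u[i]*v[j] - u[j]*v[i] vanishes, which avoids dividing by components that might be zero. A small pure-Python sketch (the helper name is made up for illustration):

```python
from itertools import combinations

def is_scalar_multiple(u, v):
    """Return True iff one of u, v is a constant multiple of the other,
    i.e. every 2x2 minor u[i]*v[j] - u[j]*v[i] is zero. No division,
    so zero components cause no trouble."""
    return all(u[i] * v[j] - u[j] * v[i] == 0
               for i, j in combinations(range(len(u)), 2))

x1 = (2, 1, 3)
x2 = (3, -1, 4)

print(is_scalar_multiple(x1, x2))         # False -> x1, x2 independent
print(is_scalar_multiple(x1, (4, 2, 6)))  # True  -> (4, 2, 6) = 2*x1
```

For x1 and x2 above, the first minor is 2*(-1) - 1*3 = -5 ≠ 0, so neither is a multiple of the other and the pair is linearly independent.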

Have you given us the right vectors for the problem you posted? x1 and x2 are linearly independent, not linearly dependent.
 