Is S Linearly Dependent in a Vector Space?

  • Thread starter: newtomath
  • Tags: Linear, Proof
newtomath
If S = {v1, v2, v3, ..., vn} lies in a vector space, then S is linearly dependent if one vector in S is a linear combination of the other vectors in S.


So I set up the following system:

c1 v1A + c2 v2A + c3 v3A = 0
c1 v1B + c2 v2B + c3 v3B = 0
c1 v1C + c2 v2C + c3 v3C = 0

Since S lies in the vector space, we know there are infinitely many solutions for (c1, c2, c3).

What am I missing here?
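For reference, the equivalence the definition rests on, written out: S is linearly dependent exactly when the homogeneous system c1 v1 + c2 v2 + ... + cn vn = 0 has a nontrivial solution, because any nonzero coefficient lets you solve for the corresponding vector:

$$c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0 \text{ with some } c_k \neq 0 \;\Longrightarrow\; v_k = -\frac{1}{c_k}\sum_{j \neq k} c_j v_j.$$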
 
If the only solution to your set of equations is c1 = c2 = c3 = 0, then the vectors v1, v2, and v3 are linearly independent. I'm not sure what exactly you're asking.
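For instance (an illustrative example): in ##\mathbb{R}^3## the set ##\{(1,0,0),\,(0,1,0),\,(1,1,0)\}## is linearly dependent, because

$$1\cdot(1,0,0) + 1\cdot(0,1,0) - 1\cdot(1,1,0) = (0,0,0),$$

so the nontrivial solution ##(c_1, c_2, c_3) = (1, 1, -1)## exists and the only-trivial-solution test fails.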
 
I'm assuming that, for example, by v1A you mean the first component of the vector v1? Your setup is correct, but a homogeneous system like that is not guaranteed to have infinitely many solutions: there is either exactly one solution (in which case it must be c1 = c2 = c3 = 0) or there are infinitely many. If there is only the trivial solution, then the set of vectors is linearly independent.
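To make that check concrete, here is a minimal sketch (assuming NumPy; the vectors and the function name are illustrative) that tests whether only the trivial solution exists by comparing the rank of the matrix whose columns are the given vectors to the number of vectors:

import numpy as np

def is_linearly_independent(vectors):
    """Return True if c1*v1 + ... + cn*vn = 0 has only the trivial solution."""
    # Stack the vectors as columns of A, so A @ c = 0 is the homogeneous system above.
    A = np.column_stack(vectors)
    # Only the trivial solution exists exactly when rank(A) equals the number of vectors.
    return np.linalg.matrix_rank(A) == len(vectors)

# Illustrative vectors in R^3
v1, v2, v3 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([1.0, 1.0, 0.0])
print(is_linearly_independent([v1, v2, v3]))  # False: v3 = v1 + v2
print(is_linearly_independent([v1, v2]))      # True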
 