How to Prove that the Span of k-1 Vectors Is Also V?

oxlade15

Homework Statement


1. If ##V## is spanned by ##\{v_1, v_2, \dots, v_k\}## and one of these vectors can be written as a linear combination of the other ##k-1## vectors, prove that the span of those ##k-1## vectors is also ##V##.


Homework Equations


A set ##S = \{v_1, v_2, \dots, v_k\}##, ##k \ge 2##, is linearly dependent if and only if at least one of the vectors ##v_j## can be written as a linear combination of the other vectors in ##S##.


The Attempt at a Solution


Since one of the vectors can be written as a linear combination of the other ##k-1## vectors, the set is linearly dependent. Also, since ##V## is spanned by ##S = \{v_1, v_2, \dots, v_k\}##, we have ##\operatorname{span}(S) = \{c_1 v_1 + c_2 v_2 + \dots + c_k v_k : c_1, c_2, \dots, c_k \in \mathbb{R}\}## by the definition of the span of a set. In addition, by the definition of linear dependence, there exists a nontrivial solution to ##c_1 v_1 + c_2 v_2 + \dots + c_k v_k = 0##. These are all the pieces of information I have deduced from the problem statement. From here, I am unsure how to proceed with the proof.
 
Here's what I suggest. Say ##V_1## is the span of ##\{v_1, \dots, v_k\}## and ##V_2## is the span of ##\{v_1, \dots, v_{k-1}\}##; without loss of generality, let ##v_k## be the vector that is a linear combination of the others. To show ##V_1 = V_2##, show that every ##x \in V_2## lies in ##V_1## and every ##y \in V_1## lies in ##V_2##.

Take ##x \in V_2##, so ##x = c_1 v_1 + c_2 v_2 + \dots + c_{k-1} v_{k-1}##. Then ##x = c_1 v_1 + c_2 v_2 + \dots + c_{k-1} v_{k-1} + 0 \cdot v_k##, which expresses ##x## in terms of the spanning set of ##V_1##, so ##x \in V_1##.

Now take ##y \in V_1##. It is given that ##v_k \in V_2##, so write ##v_k = b_1 v_1 + \dots + b_{k-1} v_{k-1}##. Then
##y = a_1 v_1 + \dots + a_{k-1} v_{k-1} + a_k v_k = a_1 v_1 + \dots + a_{k-1} v_{k-1} + a_k (b_1 v_1 + \dots + b_{k-1} v_{k-1}) = (a_1 + a_k b_1) v_1 + \dots + (a_{k-1} + a_k b_{k-1}) v_{k-1},##
so ##y \in V_2##.

Thus ##V_2 \subseteq V_1## and ##V_1 \subseteq V_2##, so ##V_1 = V_2##.
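As a numeric sanity check (not a substitute for the proof), here is a small sketch using NumPy with made-up example vectors: if ##v_3## is a linear combination of ##v_1## and ##v_2##, dropping it leaves the rank of the spanning matrix, and hence the dimension of the span, unchanged.

```python
import numpy as np

# Hypothetical example: v3 = 2*v1 - 3*v2 is a linear combination
# of v1 and v2, so dropping it should not shrink the span.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2 * v1 - 3 * v2  # the dependent vector

full = np.column_stack([v1, v2, v3])      # spans with all k vectors
reduced = np.column_stack([v1, v2])       # spans with k-1 vectors

# Equal ranks mean equal span dimension; since span(v1, v2) is
# contained in span(v1, v2, v3), equal dimension forces equality.
print(np.linalg.matrix_rank(full))        # 2
print(np.linalg.matrix_rank(reduced))     # 2
```

This only checks one concrete instance, of course; the containment argument above is what handles the general case.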
 