# Prove a set is a basis for V3 if it spans i, j, and k

1. Sep 6, 2009

### Subdot

1. The problem statement, all variables and given/known data
"Prove that a set S of three vectors in V3 is a basis for V3 if and only if its linear span L(S) contains the three unit coordinate vectors i, j, k."

2. Relevant equations
I have the definitions of bases, linear independence, and linear spans. I have the theorems which state that a set of n linearly independent vectors in Vn is a basis, that every basis in Vn contains n vectors, that a set of linearly independent vectors is a subset of some basis, and that any orthogonal set is linearly independent.

I also have the theorem that states if a set of k vectors in Vn is linearly independent, then any set of k + 1 vectors in its linear span is linearly dependent. And of course, I have the theorem which states that a set spans every vector in its linear span uniquely if and only if the set spans the zero vector uniquely, and I know that a set is linearly independent if it spans the zero vector uniquely.

3. The attempt at a solution
I can prove the first part: "A set S of three vectors in V3 contains the three unit coordinate vectors i, j, k in L(S) if S is a basis for V3." I'm having trouble with the second part. I can prove it, but I don't like my proof because it only requires i and j to be in the set's linear span, it isn't *too* easy to generalize to a set of n vectors--which I have to do after this proof--and I think there may be some unnecessary parts to it.

So here is what I've got so far. My plan was to prove S is linearly independent by contradiction. There are three vectors in S, so S would then be a basis for V3 by one of the theorems I listed. To do this, I would first need to prove that none of the vectors in S is the zero vector (which I'll denote as 0).

(1)If i, j, and k are elements of L(S), then d1A1 + d2A2 + d3A3 = i, e1A1 + e2A2 + e3A3 = j, and f1A1 + f2A2 + f3A3 = k for some scalar constants di, ei, and fi where i = 1, 2, 3 and S = {A1, A2, A3}.

Assume S is dependent; then some of the Ai might equal 0. Assume all three are zero. This cannot be true because d1A1 + d2A2 + d3A3 = i =/= 0. So not all the vectors in S are 0. Assume exactly two of the vectors in S are 0. Let all but A1 be 0, renumbering the vectors in S if necessary. Then d1A1 = i, so A1 is a scalar multiple of i. But e1A1 = j--a contradiction, since j is not a scalar multiple of i. So no two vectors in S can be 0.

Assume one of the vectors in S = 0. Let A3 = 0, which can be achieved by renumbering the vectors in S if necessary. Then, d1A1 + d2A2 = i = (1, 0, 0).

For this to be true, the last two components of A1 must be scalar multiples of the corresponding components of A2, while the first component of A1 must not be. But e1A1 + e2A2 = j = (0, 1, 0) forces the first component of A1 to be a scalar multiple of the first component of A2 as well--a contradiction.

So no vector in S is 0. Now consider c1A1 + c2A2 + c3A3 = 0, where c1, c2, and c3 are scalar constants. Since by assumption S is dependent, and since no vector in S is 0, this equation has a solution in which not all of the constants are 0.

(2)Assume that c3 =/= 0. This means that c1A1 + c2A2 = -c3A3 --> -(c1/c3)A1 - (c2/c3)A2 = A3 which means that the equation d1A1 + d2A2 + d3A3 = i can be rewritten as d1A1 + d2A2 - ((d3c1)/c3)A1 - ((d3c2)/c3)A2 = (d1 - ((d3c1)/c3))A1 + (d2 - ((d3c2)/c3))A2 = i.

This can be rewritten as m1A1 + m2A2 = i for scalar constants m1 and m2. Similarly, p1A1 + p2A2 = j for scalar constants p1 and p2. But this is the same as the case when A3 = 0, and so by a similar line of reasoning, -(c1/c3)A1 - (c2/c3)A2 =/= A3.

But this contradicts the assumption that S is linearly dependent, so the only solution is c1 = c2 = c3 = 0, and S is linearly independent. Since S consists of three vectors in V3, by one of the theorems I mentioned above, S is a basis for V3.
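(Side note, separate from the proof itself: as a quick numeric sanity check, three vectors in V3 are linearly independent exactly when the determinant of the 3x3 matrix with those vectors as rows is nonzero. The set below is a made-up example, not from the problem.)

```python
# Sanity check: three vectors in R^3 are linearly independent
# iff the determinant of the 3x3 matrix with those rows is nonzero.

def det3(a, b, c):
    """Determinant of the 3x3 matrix with rows a, b, c (cofactor expansion)."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
            - a[1] * (b[0] * c[2] - b[2] * c[0])
            + a[2] * (b[0] * c[1] - b[1] * c[0]))

# Made-up example set S = {A1, A2, A3}:
A1, A2, A3 = (1, 1, 0), (0, 1, 1), (1, 0, 1)
print(det3(A1, A2, A3))   # nonzero, so this S is independent (a basis)

# A dependent set gives determinant 0:
print(det3((1, 0, 0), (0, 1, 0), (1, 1, 0)))  # 0
```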

So as you can see, this proof only depends on two unit vectors being in L(S), which I find odd since the hypothesis has all of i, j, and k being members of L(S). In addition, I'm not sure all of it is necessary. Was I right in needing to prove that none of the vectors in S is 0? Could I have skipped to the part labeled (2) of my proof and included the proof that no two vectors in S span i, j, and k there?

Also, I have an alternative proof, which is simpler and quicker, but I wasn't sure if it was "legal" (and there may be some unnecessary parts still). I also haven't thought about it as much as the above one. Here it is:

If i, j, and k are elements of L(S), then d1A1 + d2A2 + d3A3 = i, e1A1 + e2A2 + e3A3 = j, and f1A1 + f2A2 + f3A3 = k for some scalar constants di, ei, and fi where i = 1, 2, 3 and S = {A1, A2, A3}.

Multiply the first equation by a nonzero scalar constant b1, the second by a nonzero scalar constant b2, and the third by a nonzero scalar constant b3. Then, add all the equations together to get (b1d1A1 + b1d2A2 + b1d3A3) + (b2e1A1 + b2e2A2 + b2e3A3) + (b3f1A1 + b3f2A2 + b3f3A3) = (b1d1 + b2e1 + b3f1)A1 + (b1d2 + b2e2 + b3f2)A2 + (b1d3 + b2e3 + b3f3)A3 = b1i + b2j + b3k.

In other words, for some scalar constants q1, q2, and q3, q1A1 + q2A2 + q3A3 = b1i + b2j + b3k. A linear combination of S produces an arbitrary linear combination of D = {i, j, k}. Since D is a basis for V3, linear combinations of i, j, k span every vector in V3 uniquely. Since a linear combination of A1, A2, and A3 can produce any linear combination of i, j, k, it can reach every vector in V3 too. Thus, S is a basis too.
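(To illustrate the "combination of combinations" step numerically: the set S and the coefficient rows d, e, f below are made up for the example, chosen so that they express i, j, and k. Any target x*i + y*j + z*k is then reached with the composed coefficients x*d + y*e + z*f.)

```python
# If i, j, k are combinations of S with coefficient rows d, e, f, then any
# target v = x*i + y*j + z*k is the combination of S with coefficients
# x*d + y*e + z*f -- so every vector of V3 is reachable from S.

def combo(coeffs, vecs):
    """Linear combination sum(c * v) over 3-vectors."""
    return tuple(sum(c * v[t] for c, v in zip(coeffs, vecs)) for t in range(3))

S = [(1, 1, 0), (0, 1, 1), (1, 0, 1)]   # made-up A1, A2, A3
d = (0.5, -0.5, 0.5)    # combo(d, S) == i
e = (0.5, 0.5, -0.5)    # combo(e, S) == j
f = (-0.5, 0.5, 0.5)    # combo(f, S) == k

x, y, z = 2.0, -3.0, 7.0
composed = tuple(x * dt + y * et + z * ft for dt, et, ft in zip(d, e, f))
print(combo(composed, S))   # (2.0, -3.0, 7.0)
```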

Last edited: Sep 7, 2009
2. Sep 6, 2009

### lanedance

That's a long post--I couldn't read it all, but I think you might have gotten into it a bit much. Here's a sketch to try and help.

A basis S for a vector space V is a linearly independent subset of V that spans V. This means any vector in V can be written as a linear combination of elements of S.

->
say you have 3 vectors {u,v,w} that is a basis for V3
then span{u,v,w} = V3
This means any element of V3 can be written as a linear combination of u,v & w
Clearly i is in V3 so i is in span {u,v,w} and so on for j,k
<-
clearly span{i,j,k} = V3 as {i,j,k} is a basis for V3
now say you have 3 vectors {u,v,w} in V3 such that i, j, k are all contained in span{u,v,w}
as each i,j,k can be written as a linear combination of u,v,w it should be clear that span{i,j,k} is a subset of span{u,v,w}
in fact, as {i,j,k} is linearly independent and has 3 elements, it is clear span{i,j,k} = span{u,v,w} = V3
(or compare dimensions & elements here)
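to put that in symbols (same argument, just compact):

```latex
(\Rightarrow)\ \{u,v,w\}\ \text{a basis}\ \Longrightarrow\
\operatorname{span}\{u,v,w\} = V_3 \ni i,\, j,\, k.

(\Leftarrow)\ i,j,k \in \operatorname{span}\{u,v,w\} \Longrightarrow
\operatorname{span}\{i,j,k\} \subseteq \operatorname{span}\{u,v,w\},\ \text{so}\
V_3 = \operatorname{span}\{i,j,k\} \subseteq \operatorname{span}\{u,v,w\}
\subseteq V_3.
```

equality of the spans gives span{u,v,w} = V3; independence of {u,v,w} then comes from counting (three vectors that span V3 form a basis).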

Last edited: Sep 6, 2009
3. Sep 6, 2009

### Subdot

Okay, let's see if I understood the first part at least. So span{i,j,k} is a subset of span{u,v,w} because linear combinations of {u,v,w} can create span{i,j,k} (which means the last proof I wrote in the OP was "legal"?). Thus, every element in span{i,j,k} is in span{u,v,w} and so span{i,j,k} is a subset of span{u,v,w}?

I don't quite get the last part yet though.

Edit: Since that last sentence was vague, I'll try and be more precise. Can't a subset of a linearly dependent set be independent? So span{u,v,w} could be linearly dependent while containing V3 and so not be a basis. I know that if {a,b,c} is a subset of {e,f,g}, where a, b, c, e, f, and g are vectors, they will be equal (and the vectors contained having the same dimension follows from this, I think?), but I didn't think that worked with linear spans.

Edit 2: Perhaps though because {i,j,k} is a basis for V3, they can form span{u,v,w} through linear combinations. That means span{u,v,w} is a subset of span{i,j,k}. Since span{i,j,k} is also a subset of span{u,v,w}, the two sets are equal. Thus, span{i,j,k} = span{u,v,w} = V3. Is that correct? Or can {u,v,w} still be linearly dependent through that chain of logic?

Edit 3: I guess {u,v,w} can still be linearly dependent since a vector could be repeated more than once in span{u,v,w} from two different linear combinations. The sets would still be equal though since their elements are the same.

Last edited: Sep 6, 2009
4. Sep 6, 2009

### lanedance

Each of i, j, k is an element of span{u,v,w},
so each of i, j, k can be written as a linear combination of u, v, w.

As span{i,j,k} is just the set of all linear combinations of i, j, k,
clearly span{i,j,k} will be contained in the set of all linear combinations of u, v, w, which is span{u,v,w}. This is because each of i, j, k can be written as a combination of u, v, w.

A subset of a linearly dependent set can be linearly independent, however I'm not sure where you went with that

bit of clarification:
span{u,v,w} is the space created by all linear combinations of u,v,w. This is an infinite set of vectors, linearly dependent by definition. In this case span{u,v,w} = V3

A basis for a vector space V is a set of linearly independent vectors whose span = V

The dimension of a vector space V is the maximum number of linearly independent vectors in any subset of V. This is exactly the number of vectors in any basis for V

I'm not really sure what your question is?

5. Sep 6, 2009

### Subdot

Yes, that's what I thought you were saying. My only problem was that only i, j, and k were in span{u,v,w} by hypothesis--not all linear combinations of i, j, and k. However, since linear combinations of i, j, and k would be linear combinations of linear combinations of {u,v,w}, each linear combination of i, j, and k is indeed in span{u,v,w}.

So I was correct on that part of the proof I wrote in the OP at least.

I was trying to figure out how you went from span{i,j,k} being a subset of span{u,v,w} to the sets being equal. I knew that two finite sets with the same number of elements, one a subset of the other, are equal, but I wasn't sure that worked with linear spans since they do not necessarily have the same number of elements.

Then my parenthetical phrase referred to that if two sets that are subsets of Vn are equal, then that implies their vectors have the same dimension. After thinking about it some more though, I don't think this is completely true since one vector could be in the 2nd dimension while the other two have three dimensions (I haven't reached a definition of dimension, btw, though I can guess what it will be...). The best that can be said is that the vectors that are equal in the two sets have the same dimension--so I think.

Part of this was trying to figure out how you went from span{i,j,k} being a subset of span{u,v,w} to the two sets being equal. I figured that span{i,j,k} = V3 and so contains span{u,v,w} by definition. Thus, span{u,v,w} and span{i,j,k} are subsets of each other. Therefore, they are equal, and I was able to understand your statement ("in fact as {i,j,k} are linearly independent and there are 3 elements, it is clear span{i,j,k} = span{u,v,w} = V3") in that manner.

The other part of that edit was my problem: how can you tell {u,v,w} is linearly independent? If it is not linearly independent, it does not form a basis for V3, and so the proof is not complete. That part of my post was asking: if span{u,v,w} = span{i,j,k} = V3, does it necessarily follow that {u,v,w} is linearly independent? In my third edit, I figured that if {u,v,w} is dependent, then span{u,v,w} can contain some redundant members resulting from the linear dependency of {u,v,w}. If so, then although span{u,v,w} = V3, {u,v,w} could not be a basis for V3. Thus, span{u,v,w} = span{i,j,k} does not necessarily mean {u,v,w} is linearly independent.

Could there be a theorem out there which states that a subset of Vn which spans Vn and contains n vectors is linearly independent and therefore a basis for Vn? If so, I don't know it, so there is probably a different way to prove this.

Edit: I'm still stuck though, since I don't know how to prove {u,v,w} is linearly independent without resorting to the tedious proof that I don't like in the OP--not to mention that the proof in the OP does not rely on i, j, and k being in span{u,v,w}, unlike the one you outlined for me.

Edit 2: Aha! "Every set {v1,...,vn} of n vectors in V that span V is automatically a basis for V." http://www.math.uiuc.edu/~jvanha2/225/notes/lecture7.pdf [Broken]

So now I see where you were going with your proof. {u,v,w} would be a basis for V3 because it contains 3 elements and spans V3. Now if I can prove that this is true from the information I already have, I can use it.
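A sketch of how that statement can be derived from the theorems already listed in the OP (in particular the k + 1 theorem):

```latex
Suppose $\{v_1,\dots,v_n\}$ spans $V_n$ but is linearly dependent. Then some
$v_i$ is a linear combination of the others, so the remaining $n-1$ vectors
still span $V_n$; discarding further redundant vectors leaves an independent
set of $k \le n-1$ vectors with the same span. The $n$ unit coordinate
vectors lie in that span, so by the theorem ``any $k+1$ vectors in the span
of $k$ linearly independent vectors are dependent,'' some $k+1 \le n$ of the
coordinate vectors would be dependent---contradicting their independence.
Hence the spanning set is linearly independent, and as $n$ independent
vectors in $V_n$ it is a basis.
```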

Edit 3: Bleh. I just noticed this very important line in my text, right after the definition of the linear span of a set S, which is denoted by L(S): "Note that linear combinations of vectors in L(S) are again in L(S)." With that, it should have clicked immediately that if i, j, and k are elements of span{u,v,w}, then span{u,v,w} = span{i,j,k} = V3--without my having to prove that statement like I did in the final proof in the OP. Very sorry about that. I'm not usually that unobservant....

Last edited by a moderator: May 4, 2017
6. Sep 7, 2009

### Subdot

Okay, I think I've got it now. Here's my current proof. It generalizes to n-vector space better too, but still not as nicely as I would have liked. Am I right that this proof works for 3-space and can be generalized to n-space? Thanks for the help so far.

If i, j, and k are in L(S) and S = {u,v,w}, then i, j, and k can be written as linear combinations of u, v, w. Because linear combinations of vectors in span{u,v,w} are again in span{u,v,w}, span{i,j,k} is a subset of span{u,v,w}. Since span{i,j,k} = V3, span{u,v,w} is a subset of span{i,j,k}. Therefore, span{u,v,w} = span{i,j,k} = V3.

Assume {u,v,w} is linearly dependent. Then c1u + c2v + c3w = 0 for some scalar constants c1, c2, and c3, not all zero. Assume c3 =/= 0, renumbering if necessary. Then c1u + c2v = -c3w --> (-c1/c3)u - (c2/c3)v = w. So w is in span{u,v}.

Now, b1u + b2v + b3w = a1i + a2j + a3k for scalar constants a1, a2, a3 (not all zero) and b1, b2, b3. Because w is in span{u,v}, this can be rewritten in terms of u and v alone: b1u + b2v + b3(-c1/c3)u - b3(c2/c3)v = d1u + d2v = a1i + a2j + a3k for appropriately chosen scalar constants d1 and d2. This means span{u,v} = span{i,j,k} = V3. But {u,v} cannot be linearly independent, since then it would be a basis for V3, which is impossible because every basis of V3 must contain 3 vectors.

So assume {u,v} is dependent. Then, in a similar manner to the above, we find v is in span{u} and span{u} = V3. This is false, because then u would have to be a scalar multiple of each of i, j, and k, and thus parallel to all three of them. Thus, {u,v} is not linearly dependent. But a set must be either linearly independent or linearly dependent--a contradiction. Thus, span{u,v} =/= V3, and {u,v,w} must be linearly independent. Because {u,v,w} is linearly independent and contains 3 vectors, it is a basis for V3, which was to be proven.
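(A numeric illustration of the contradiction the proof leans on: a dependent set of three vectors spans at most a plane, so at least one of i, j, k falls outside its span. The vectors below are made up, and the solver is a small Gaussian elimination written for this example, not anything from the text.)

```python
# A linearly dependent set of three vectors in R^3 spans at most a plane,
# so i, j, k cannot all lie in its span.

def in_span(target, vecs, tol=1e-9):
    """Is target a linear combination of vecs? Gauss-Jordan elimination on
    the augmented 3x4 system whose columns are the vectors."""
    rows = [[v[r] for v in vecs] + [target[r]] for r in range(3)]
    rank = 0
    for col in range(3):
        pivot = max(range(rank, 3), key=lambda r: abs(rows[r][col]))
        if abs(rows[pivot][col]) < tol:
            continue                      # no pivot in this column
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(3):
            if r != rank and abs(rows[r][col]) > tol:
                factor = rows[r][col] / rows[rank][col]
                rows[r] = [a - factor * b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    # inconsistent iff some row reads 0 = nonzero
    return all(any(abs(a) > tol for a in row[:3]) or abs(row[3]) < tol
               for row in rows)

dep = [(1, 0, 0), (0, 1, 0), (1, 1, 0)]   # made-up dependent set
print([in_span(t, dep) for t in [(1, 0, 0), (0, 1, 0), (0, 0, 1)]])
# [True, True, False] -- k is not in the span
```

For an independent set, all three checks come back True.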

Last edited: Sep 7, 2009