Vector Spaces, Subspaces, Bases etc

rad0786
Vector Spaces, Subspaces, Bases etc... :(

Hello. I was doing some homework questions out of the textbook and I came across a question that is difficult to understand. Could somebody please help me out with it?

-- if U and W are subspaces of V, define their intersection U ∩ W as follows:
U ∩ W = {v | v is in both U and W}

a) show that U ∩ W is a subspace contained in U and W

b) If B and D are bases of U and W, and if U ∩ W = {0}, show that
B ∪ D = {v | v is in B or D} is linearly independent.
--

I was able to do part a)! That wasn't so tricky. You just show that closure under addition and scalar multiplication holds (and show that the vectors each belong in U and W, etc...).

So I understand part a), but part b) is where I am lost :confused:

To begin, I am not sure if "B is a basis of U" and "D is a basis of W"... or if "B and D" is a basis of U and "B and D" is a basis of W? I think it's the first one.

Next... U ∩ W = {0} means that the zero vector lies in U and lies in W.
Furthermore... B ∪ D has a vector that lies in only B or in only D, and not B ∩ D.

So now I don't know how to show that it is linearly independent.

All I know so far is that U ∩ W = {0} has a basis of 1, and that is all I have to work with :( Can somebody please help me further?

Thanks
 
"B ∪ D has a vector that lies in only B or in only D, and not B ∩ D." What are you basing this statement on?

What does it mean for a vector space (B ∪ D) to be linearly independent? (Indep. of what?)
 
Well... B ∪ D = {v | v is in B or D}, which means that v can only be in B or in D and not in both, B ∩ D.

For a vector (B ∪ D) to be linearly independent, the coefficients of its linear combination have to equal zero.

The problem is, I don't know how to even get a linear combination of B ∪ D.

I know that the basis of B ∩ D is just { (1) } since U ∩ W = {0}; however, B ∪ D I just cannot see...
 
EnumaElish said:
"B ∪ D has a vector that lies in only B or in only D, and not B ∩ D." What are you basing this statement on?
What does it mean for a vector space (B ∪ D) to be linearly independent? (Indep. of what?)

B and D are not vector spaces! They are sets of vectors. "Linearly independent" means the same thing it always does for sets of vectors in vector spaces.

rad0786 said:
The problem is, I don't know how to even get a linear combination of B ∪ D.
The same way you always do! Let the vectors in B be
{b1, b2, ..., bn} and the vectors in D be {d1, d2, ..., dm}

Then a linear combination is something of the form:
α1b1 + α2b2 + ... + αnbn + β1d1 + ... + βmdm.

Suppose such a thing were equal to 0. If all of the αs were equal to 0, the βs must be also, because D is linearly independent, and vice versa. So the only question is whether it is possible for some of the αs and βs to be nonzero and to cancel. Suppose that were true. Move all of the terms involving βs (i.e. multiplying vectors in D) over to the right side of the equation, leaving only terms involving αs (i.e. multiplying vectors in B) on the left. What can you say about the vector on the left side of the equation? What about the vector on the right?
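(Not a proof, of course, but you can sanity-check the claim numerically. Here is a minimal Python/NumPy sketch with made-up bases of two subspaces of R^4 whose intersection is {0}; the specific vectors are just an illustration. A set of vectors is linearly independent exactly when the matrix with those vectors as rows has full row rank.)

```python
import numpy as np

# Hypothetical example: B is a basis of U = span{b1, b2} and D is a basis
# of W = span{d1, d2} in R^4, chosen so that U ∩ W = {0}.
B = np.array([[1.0, 0.0, 0.0, 0.0],   # b1
              [0.0, 1.0, 0.0, 0.0]])  # b2
D = np.array([[0.0, 0.0, 1.0, 0.0],   # d1
              [1.0, 1.0, 0.0, 1.0]])  # d2

# Stack the vectors of B ∪ D as rows; the set is linearly independent
# exactly when the stacked matrix has rank n + m = 4.
M = np.vstack([B, D])
print(np.linalg.matrix_rank(M))  # 4, so B ∪ D is linearly independent
```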
 
HallsofIvy. Thank you for that!

I wrote out what you were saying, and moved all the a's and b's to the left side and right side respectively. I just don't see how they would cancel.

However, I thought of something slightly different...

Let the vectors in B be {b1, b2, ..., bn} and the vectors in D be {d1, d2, ..., dm}

then: s1b1 + ... + snbn + t1d1 + ... + tndn = 0 (s and t are scalars)

Now because they are a basis, then it has to n linearly independent vectors...
and so s = t = 0

Does that work?
 
B U D = {v / v is in B or D} which means that v can only be in B or in D and not in both, B ∩ D.
Wrong. Ordinarily, "or" is inclusive.
 
EnumaElish said:
Wrong. Ordinarily, "or" is inclusive.
Yes, but in this problem, we were already told that U ∩ W = {0}. Of course, the 0 vector cannot be in either basis so no vector is in both B and D.
 
Okay, I had missed that.
 
rad0786 said:
HallsofIvy. Thank you for that!

I wrote out what you were saying, and moved all the a's and b's to the left side and right side respectively. I just don't see how they would cancel.

However, I thought of something slightly different...

Let the vectors in B be {b1, b2, ..., bn} and the vectors in D be {d1, d2, ..., dm}

then: s1b1 + ... + snbn + t1d1 + ... + tndn = 0 (s and t are scalars)

Now because they are a basis, then it has to n linearly independent vectors...
and so s = t = 0

Does that work?

I'm sorry, what has "n linearly independent vectors"? Certainly B does, only because we were told that B is a basis and defined B to have n vectors in it!
Did you notice that I said, and you did also, "Let the vectors in B be {b1, b2, ..., bn} and the vectors in D be {d1, d2, ..., dm}", but then you have
s1b1 + ... + snbn + t1d1 + ... + tndn = 0
where you seem to be assuming that m= n.
Oh, and you can't say "s and t are scalars"- you don't have any "s and t"; you have s1, s2, ..., sn and t1, t2, ..., tm: two sets of scalars.

My point was that if you write s1b1 + ... + snbn + t1d1 + ... + tmdm = 0
(I have changed that last "n" to "m".)
as s1b1 + ... + snbn = -(t1d1 + ... + tmdm ), on the left side you have a linear combination of vectors in B- so it must be a vector in U- and on the right you have a linear combination of vectors in D- so it must be a vector in W. But they are equal, and since the only vector in both U and W is 0 (that's an important part of the hypotheses you didn't use!), each side must be equal to 0. NOW use the fact that each of B and D is linearly independent to show that all of the scalars must be 0.
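(For reference, here is the argument above written out in full, in the same notation.)

```latex
Suppose
\[
s_1 b_1 + \cdots + s_n b_n + t_1 d_1 + \cdots + t_m d_m = 0 .
\]
Rewrite this as
\[
s_1 b_1 + \cdots + s_n b_n = -(t_1 d_1 + \cdots + t_m d_m).
\]
The left side is a linear combination of vectors in $B$, hence a vector in $U$;
the right side is a linear combination of vectors in $D$, hence a vector in $W$.
Since the two sides are equal, this common vector lies in $U \cap W = \{0\}$, so
\[
s_1 b_1 + \cdots + s_n b_n = 0 \quad\text{and}\quad t_1 d_1 + \cdots + t_m d_m = 0 .
\]
Because $B$ is linearly independent, $s_1 = \cdots = s_n = 0$; because $D$ is
linearly independent, $t_1 = \cdots = t_m = 0$. Hence $B \cup D$ is linearly
independent.
```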
 
  • #10
Hey I am so sorry... I made a typo... yes I did mean that m=n.

I was trying to arrive at this conclusion:

Let U = a1u1 + a2u2 + ... + anun which is in B
Let V = a1v1 + a2v2 + ...+ anvn which is in D
Vectors in both B and D = 0

a1u1 + a2u2 + ... + anun + a1v1 + a2v2 + ...+ anvn = 0
a1u1 + a1v1 + a2u2 + a2v2...+ anun + anvn = 0
a1(u1 + v1) + a2(u2 + v2) + ... + an(un + vn) = 0

and then show that a1 = a2 =... an = 0?

Yes I am assuming that B and D have n vectors... but now that I think of it... I can't make that assumption, since we don't know the number of vectors in the bases B and D

So speaking of what you had... since B is a basis with n vectors and D is a basis with m vectors... then the n vectors are a spanning set and linearly independent in B and the m vectors are a spanning set and linearly independent in D.

I'm starting to hate math :( ...
 
  • #11
Okay... I have a VERY related question that has to do with this stuff...

--If U and W are subspaces of V and dimU =2, show that
either U⊆W or dim(U ∩ W) ≤ 1

This is how I went about it.

Let dimU = m and dimW = k
Then any basis of U = {u1, u2} is a set of 2 independent vectors! Thus, m = 2.
Also, any basis of W = {w1, ..., wk} is a set of k independent vectors.
Also, the dimV ≥ 2

But... U ∩ W is all the vectors that are in U and all the vectors that are in W. so the dim(U ∩ W) HAS to be ≤ 2, unless dimW =1, then dim(U ∩ W) HAS to be ≤ 1. If this were the case, then dimU > dimW and it's possible for W⊆U

But... if U⊆W, then dimW > dimU ... which just contradicts what I said above?

What am I doing wrong? Why do the two contradict?
 
  • #12
rad0786 said:
Hey I am so sorry... I made a typo... yes I did mean that m=n.
This is getting weird! You don't seem to be understanding my comments at all. All you know about B and D is that they are sets of independent vectors in U and W respectively and that U and W have only the 0 vector in common. You have absolutely no reason to believe that m = n. That was my point.

I was trying to arrive at this conclusion:

Let U = a1u1 + a2u2 + ... + anun which is in B
Let V = a1v1 + a2v2 + ...+ anvn which is in D
Vectors in both B and D = 0
This makes no sense at all. U is not a vector, it is a vector space! Did you mean, say, u= a1u1 + a2u2 + ... + anun where u is a vector in U? But even then "which is in B" is wrong. The individual vectors u1, u2,... etc. are in B. A linear combination of them is in U but not necessarily in B.
Same comments apply to the next line.

"Vectors in both B and D = 0". Actually, you know that 0 is not in B or D, since any set of vectors containing the 0 vector can't be independent!

a1u1 + a2u2 + ... + anun + a1v1 + a2v2 + ...+ anvn = 0
a1u1 + a1v1 + a2u2 + a2v2...+ anun + anvn = 0
a1(u1 + v1) + a2(u2 + v2) + ... + an(un + vn) = 0

and then show that a1 = a2 =... an = 0?

Yes I am assuming that B and D have n vectors... but now that I think of it... I can't make that assumption, since we don't know the number of vectors in the bases B and D
Now, you seem to be assuming that the coefficients of the u's and the v's are the same- there is no reason to assume that.

So speaking of what you had... since B is a basis with n vectors and D is a basis with m vectors... then the n vectors are a spanning set and linearly independent in B and the m vectors are a spanning set and linearly independent in D.
No, nothing is said about "spanning". The only information that you are given is that B and D are linearly independent sets of vectors in U and W respectively and that the only vector in both U and W is the 0 vector.

I'm starting to hate math :( ...
I don't think it is the math that is the problem! I think you need to read more carefully! Go back and read: (1) the definition of "independent" in your book, (2) the precise statement of the problem, (3) my responses to your questions. I've pretty much given you the complete solution in my 2nd response.
 
  • #13
rad0786 said:
But... U ∩ W is all the vectors that are in U and all the vectors that are in W.
No, it's all the vectors in U that are also in W. You've described the union, not the intersection.

so the dim(U ∩ W) HAS to be ≤ 2, unless dimW =1, then dim(U ∩ W) HAS to be ≤ 1.
Both clauses are true (though unproved), but the qualifier "unless" is incorrect. x ≤ 1 does not violate x ≤ 2.
 
  • #14
--If U and W are subspaces of V and dimU =2, show that
either U⊆W or dim(U ∩ W) ≤ 1
If dim(U ∩ W) = 2 then U ∩ W must be U!
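(A sketch of how that observation finishes the problem, assuming the standard fact that a subspace of a finite-dimensional space with the same dimension equals the whole space.)

```latex
Since $U \cap W$ is a subspace of $U$ and $\dim U = 2$, we have
$\dim(U \cap W) \in \{0, 1, 2\}$. If $\dim(U \cap W) \le 1$, we are done.
Otherwise $\dim(U \cap W) = 2 = \dim U$, and a subspace of $U$ with the same
(finite) dimension as $U$ must equal $U$, so $U \cap W = U$. But
$U \cap W \subseteq W$, hence $U \subseteq W$.
```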
 