
Vector Spaces, Subspaces, Bases etc

  1. Oct 13, 2005 #1
    Vector Spaces, Subspaces, Bases etc... :(

    Hello. I was doing some homework questions out of the textbook and I came across a question which is difficult to understand. Could somebody please help me out with it?

    -- If U and W are subspaces of V, define their intersection U ∩ W as follows:
    U ∩ W = {v | v is in both U and W}

    a) show that U ∩ W is a subspace contained in U and W

    b) If B and D are bases of U and W, and if U ∩ W = {0}, show that
    B ∪ D = {v | v is in B or D} is linearly independent.
    --

    I was able to do part a)! That wasn't so tricky. You just show that closure under addition and scalar multiplication holds (and show that the vectors each belong to both U and W, etc.).

    So I understand part a), but part b) is where I am lost :confused:

    To begin, I am not sure if it means "B is a basis of U" and "D is a basis of W"... or that "B and D" together are a basis of U and also a basis of W? I think it's the first one.

    Next... U ∩ W = {0} means that the zero vector lies in U and lies in W.
    Furthermore... B ∪ D has a vector that lies in only B or in only D, and not B ∩ D.

    So now is where I don't know how to show that it is linearly independent.

    All I know so far is that U ∩ W = {0} has a basis of 1, and that is all I have to work with :( Can somebody please help me further?

    Thanks
     
  3. Oct 14, 2005 #2

    EnumaElish

    Science Advisor
    Homework Helper

    "B ∪ D has a vector that lies in only B or in only D, and not B ∩ D." What are you basing this statement on?

    What does it mean for a vector space (B ∪ D) to be linearly independent? (Indep. of what?)
     
    Last edited by a moderator: Oct 14, 2005
  4. Oct 14, 2005 #3
    Well... B ∪ D = {v | v is in B or D}, which means that v can only be in B or in D and not in both, B ∩ D.

    For a vector (B ∪ D) to be linearly independent, the coefficients of its linear combination have to equal zero.

    The problem is, I don't know how to even get the linear combination of B ∪ D.

    I know that the basis of B ∩ D is just { (1) } since U ∩ W = {0}; however, B ∪ D I just cannot see...
     
  5. Oct 14, 2005 #4

    HallsofIvy

    User Avatar
    Staff Emeritus
    Science Advisor

    B and D are not vector spaces! They are sets of vectors. "Linearly independent" means the same thing it always does for sets of vectors in vector spaces.

    The same way you always do! Let the vectors in B be
    {b1, b2, ..., bn} and the vectors in D be {d1, d2, ..., dm}

    Then a linear combination is something of the form:
    [tex]\alpha_1b_1+ \alpha_2b_2+ ...+ \alpha_nb_n+ \beta_1d_1+ ...+\beta_md_m[/tex].

    Suppose such a thing were equal to 0. If all of the [tex]\alpha[/tex]s were equal to 0, the [tex]\beta[/tex]s must be also, because D is linearly independent, and vice versa. So the only question is whether it is possible for some of the [tex]\alpha[/tex]s and [tex]\beta[/tex]s to be nonzero and to cancel.

    Suppose that were true. Move all of the terms involving [tex]\beta[/tex]s (i.e. multiplying vectors in D) over to the right side of the equation, leaving only terms involving [tex]\alpha[/tex]s (i.e. multiplying vectors in B) on the left. What can you say about the vector on the left side of the equation? What about the vector on the right?
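    The cancellation question above can be sanity-checked numerically. A minimal sketch, with made-up vectors (not from the thread, which works abstractly): take V = R^3, let B = {b1, b2} be a basis of U and D = {d1} a basis of W, so that U ∩ W = {0}.

    ```python
    import numpy as np

    # Made-up example (assumption, not from the thread): V = R^3,
    # U = span{b1, b2}, W = span{d1}, so U ∩ W = {0}.
    b1 = np.array([1.0, 0.0, 0.0])
    b2 = np.array([0.0, 1.0, 0.0])
    d1 = np.array([0.0, 0.0, 1.0])

    # B ∪ D is linearly independent exactly when the matrix with these
    # vectors as columns has full column rank.
    M = np.column_stack([b1, b2, d1])
    rank = np.linalg.matrix_rank(M)
    print(rank)  # 3 columns, rank 3: only the trivial combination gives 0
    ```

    Full column rank means no choice of scalars, some nonzero, can make the combination cancel to 0, which is the claim being proved.
    
    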
     
    Last edited: Oct 14, 2005
  6. Oct 14, 2005 #5
    HallsofIvy. Thank you for that!

    I wrote out what you were saying, and moved all the a's and b's to the left and right sides respectively. I just don't see how they would cancel.

    However, I thought of something slightly different...

    Let the vectors in B be {b1, b2, ..., bn} and the vectors in D be {d1, d2, ..., dm}

    then: s1b1 + ..... + snbn + t1d1 + ..... + tndn = 0 (s and t are scalars)

    Now because they are a basis, then it has to have n linearly independent vectors...
    and so s = t = 0

    Does that work?
     
  7. Oct 14, 2005 #6

    EnumaElish


    Wrong. Ordinarily, "or" is inclusive.
     
  8. Oct 14, 2005 #7

    HallsofIvy


    Yes, but in this problem, we were already told that U ∩ W = {0}. Of course, the 0 vector cannot be in either basis so no vector is in both B and D.
     
  9. Oct 14, 2005 #8

    EnumaElish


    Okay, I had missed that.
     
  10. Oct 14, 2005 #9

    HallsofIvy


    I'm sorry, what has "n linearly independent vectors"? Certainly B does, but only because we were told that B is a basis and defined B to have n vectors in it!
    Did you notice that I said, and you did also, "Let the vectors in B be {b1, b2, ..., bn} and the vectors in D be {d1, d2, ..., dm}", but then you have
    s1b1 + ..... + snbn + t1d1 + ..... + tndn = 0
    where you seem to be assuming that m = n.
    Oh, and you can't say "s and t are scalars"- you don't have any "s and t"; you have s1, s2, ..., sn and t1, t2, ..., tm: two sets of scalars.

    My point was that if you write s1b1 + ..... + snbn + t1d1 + ..... + tmdm = 0
    (I have changed that last "n" to "m".)
    as s1b1 + ..... + snbn = -(t1d1 + ..... + tmdm), on the left side you have a linear combination of vectors in B- so it must be a vector in U- and on the right you have a linear combination of vectors in D- so it must be a vector in W. But they are equal, and since the only vector in both U and W is 0 (that's an important part of the hypotheses you didn't use!), each side must be equal to 0. NOW use the fact that each of B and D is linearly independent to show that all of the scalars must be 0.
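    To see why the U ∩ W = {0} hypothesis is essential, here is a numeric contrast (the subspaces below are invented for illustration, not taken from the thread): if a vector of D already lies in U, a cancellation with nonzero scalars exists and B ∪ D fails to be independent.

    ```python
    import numpy as np

    # Invented counter-case (assumption for illustration): drop U ∩ W = {0}.
    # In R^2 take U = span{b1, b2} = R^2 and W = span{d1} with d1 = b1 + b2,
    # so W ⊆ U and U ∩ W = W ≠ {0}.
    b1 = np.array([1.0, 0.0])
    b2 = np.array([0.0, 1.0])
    d1 = b1 + b2

    # Three columns but rank only 2: B ∪ D is dependent. Concretely,
    # b1 + b2 - d1 = 0, i.e. s1*b1 + s2*b2 = -(t1*d1) with nonzero scalars,
    # exactly the cancellation discussed above.
    M = np.column_stack([b1, b2, d1])
    rank = np.linalg.matrix_rank(M)
    print(rank)  # 2
    ```

    The common vector on both sides of the rearranged equation is a nonzero element of U ∩ W, which is precisely what the hypothesis rules out.
    
    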
     
  11. Oct 14, 2005 #10
    Hey, I'm so sorry... I made a typo... yes, I did mean that m = n.

    I was trying to arrive at this conclusion:

    Let U = a1u1 + a2u2 + .... + anun which is in B
    Let V = a1v1 + a2v2 + .....+ anvn which is in D
    Vectors in both B and D = 0

    a1u1 + a2u2 + .... + anun + a1v1 + a2v2 + .....+ anvn = 0
    a1u1 + a1v1 + a2u2 + a2v2.....+ anun + anvn = 0
    a1(u1 + v1) + a2(u2 + v2) + .... + an(un + vn) = 0

    and then show that a1 = a2 =... an = 0?

    Yes, I am assuming that B and D have n vectors... but now that I think of it... I can't make that assumption since we don't know the number of vectors in the bases B and D.

    So, speaking of what you had... since B is a basis with n vectors and D is a basis with m vectors... then the n vectors are a spanning set and linearly independent in B, and the m vectors are a spanning set and linearly independent in D.

    I'm starting to hate math :( ...
     
  12. Oct 15, 2005 #11
    Okay... I have a VERY related question that has to do with this stuff...

    --If U and W are subspaces of V and dim U = 2, show that
    either U⊆W or dim(U ∩ W) ≤ 1

    This is how I went about it.

    Let dimU = m and dimW = k.
    Then any basis of U = {u1, u2} and is a set of 2 independent vectors! Thus, m = 2.
    Also, any basis of W = {w1, ..., wk} and is a set of k independent vectors.
    Also, dimV ≥ 2.

    But... U ∩ W is all the vectors that are in U and all the vectors that are in W. So dim(U ∩ W) HAS to be ≤ 2, unless dimW = 1; then dim(U ∩ W) HAS to be ≤ 1. If this were the case, then dimU > dimW and it's possible that W⊆U.

    But... if U⊆W, then dimW > dimU... which just contradicts what I said above?

    What am I doing wrong? Why are the two contradicting each other?
     
  13. Oct 15, 2005 #12

    HallsofIvy


    This is getting weird! You don't seem to be understanding my comments at all. All you know about B and D is that they are sets of independent vectors in U and W respectively and that U and W have only the 0 vector in common. You have absolutely no reason to believe that m = n. That was my point.

    This makes no sense at all. U is not a vector, it is a vector space! Did you mean, say, u= a1u1 + a2u2 + .... + anun where u is a vector in U? But even then "which is in B" is wrong. The individual vectors u1, u2,... etc. are in B. A linear combination of them is in U but not necessarily in B.
    Same comments apply to the next line.

    "Vectors in B and D= 0". Actually, you know that 0 is not in B or D since any set of vectors containing the 0 vector can't be independent!

    Now, you seem to be assuming that the coefficients of the u's and the v's are the same- there is no reason to assume that.

    No, nothing is said about "spanning". The only information you are given is that B and D are linearly independent sets of vectors in U and W respectively, and that the only vector in both U and W is the 0 vector.

    I don't think it is the math that is the problem! I think you need to read more carefully! Go back and read: (1) the definition of "independent" in your book, (2) the precise statement of the problem, (3) my responses to your questions. I've pretty much given you the complete solution in my 2nd response.
     
  14. Oct 15, 2005 #13

    Gokul43201

    Staff Emeritus
    Science Advisor
    Gold Member

    No, it's all the vectors in U that are also in W. You've described the union, not the intersection.

    Both clauses are true (though unproved) but the qualifier "unless" is incorrect. x<1 does not violate x<2.
     
  15. Oct 15, 2005 #14

    HallsofIvy


    If dim(U∩W) = 2 then, since U∩W is a subspace of U and dimU = 2, U∩W must be all of U!
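    This can also be checked with the standard dimension identity dim(U ∩ W) = dim U + dim W − dim(U + W). (The identity and the example subspaces below are assumptions added for illustration; the thread itself does not use them.)

    ```python
    import numpy as np

    # Illustrative subspaces of R^3 (made up for this check):
    # dim U = 2 and U is not contained in W.
    U = np.column_stack([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])  # columns span U
    W = np.column_stack([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])  # columns span W

    dim_U = np.linalg.matrix_rank(U)                     # 2
    dim_W = np.linalg.matrix_rank(W)                     # 2
    dim_sum = np.linalg.matrix_rank(np.hstack([U, W]))   # dim(U + W) = 3
    dim_int = dim_U + dim_W - dim_sum                    # dim(U ∩ W)
    print(dim_int)  # 1, consistent with dim(U ∩ W) ≤ 1 when U ⊄ W
    ```

    When U ⊆ W instead, U + W = W, so the same identity gives dim(U ∩ W) = dim U = 2, matching the point made above.
    
    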
     