
Homework Help: Linear algebra, subspace.

  1. Mar 3, 2008 #1
    The question I am looking at asks,

    Let a be the vector (-1, 2) in R^2. Is the set S = { x in R^2 | x dot a = 0 }
    a subspace?

    --> x and a are vectors...


    Can anyone explain how to show this?

    I was thinking that since the zero vector is in R^2, this must also be a subspace.

    Thank you.
     
  2. Mar 3, 2008 #2

    quasar987

    Science Advisor
    Homework Helper
    Gold Member

    Since 0 is in S, you can't conclude that S is not a subspace; but that alone doesn't show that it is one, either.

    Is it a subspace? To answer this, you must show that S is closed under addition and scalar multiplication. It is pretty direct; try it.

    If you run into problems, I can help you.
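    (As a quick sanity check, not a proof, here is a small numeric illustration of the two closure properties for a = (-1, 2). The sample vectors and the scalar k below are arbitrary choices; numpy is used only to compute the dot products.)

    [code]
    import numpy as np

    # Numeric spot check (not a proof) that S = {x in R^2 : x dot a = 0}
    # is closed under addition and scalar multiplication, for a = (-1, 2).
    a = np.array([-1.0, 2.0])

    x = np.array([2.0, 1.0])    # x dot a = -2 + 2 = 0, so x is in S
    y = np.array([-4.0, -2.0])  # y dot a =  4 - 4 = 0, so y is in S
    k = 3.7                     # an arbitrary scalar

    print(np.dot(x + y, a))  # 0.0: x + y is still orthogonal to a
    print(np.dot(k * x, a))  # 0.0: k*x is still orthogonal to a
    [/code]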
     
  3. Mar 3, 2008 #3
    Oh yes.
    Ok, but how would I proceed?

    Do I just look at 0, as in k(0) = (k0)?

    I am not sure how to apply the axioms; can you/someone demonstrate?

    Thank you.
     
  4. Mar 3, 2008 #4

    quasar987

    Science Advisor
    Homework Helper
    Gold Member

    I'll do closure under addition for you.

    Let x and y be in S. We want to show that x+y is in S also.

    Well, (x+y) dot a = (x dot a) + (y dot a) = 0 + 0 = 0,

    so x+y is in S!

    (I used the distributive property of the scalar product.)
     
  5. Mar 3, 2008 #5
    For the constant,

    k(x dot a) = (kx dot a)
    k(0) = 0
    0 = 0

    Is this the proper way?
    Thanks for the help.
     
  6. Mar 3, 2008 #6

    quasar987

    Science Advisor
    Homework Helper
    Gold Member

    The exposition is not very logical.

    Try again using my template.

    "Let x be in S and k be a constant. We want to show...."
     
  7. Mar 4, 2008 #7

    HallsofIvy

    Science Advisor

    Why do you keep looking at 0? Not only does proving that 0 is in the set fail to prove it is a subspace; you don't even need to prove that 0 is in the set!

    You need to prove two things: the set is closed under addition and the set is closed under scalar multiplication.

    Suppose u and v are two vectors such that [itex]u\cdot a= 0[/itex] and [itex]v\cdot a= 0[/itex], where a is a fixed vector. Can you prove that u + v is also in that set, that is, that [itex](u+v)\cdot a= 0[/itex]?

    Suppose u is a vector such that [itex]u\cdot a= 0[/itex] and [itex]\alpha[/itex] is any real number. Can you prove that [itex]\alpha u[/itex] is also in that set, that is, that [itex](\alpha u)\cdot a= 0[/itex]?

    After you have proved the second of those, you don't need to prove that 0 is in the set: it follows from the fact that [itex]0v = 0[/itex] for any v.
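    Filling in that outline, both checks come down to properties of the dot product. In the [tex]...[/tex] display notation of this forum:

    [tex](u + v)\cdot a = u\cdot a + v\cdot a = 0 + 0 = 0[/tex]

    [tex](\alpha u)\cdot a = \alpha(u\cdot a) = \alpha\cdot 0 = 0[/tex]

    The first line uses distributivity of the dot product over addition; the second uses its compatibility with scalar multiplication. Taking [itex]\alpha = 0[/itex] in the second line is exactly why 0 is then automatically in the set.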
     
  8. Mar 4, 2008 #8
    so, au dot a = 0

    u dot a = 0
    so, u(0) = 0.

    Is that enough?

    thank you for the patience everyone.
     
  9. Mar 4, 2008 #9

    HallsofIvy

    Science Advisor

    I have no idea what you mean by this. You started with a as a vector, so I don't know what you mean by "au dot a = 0". Did you accidentally also use a to mean a number?

    What do you mean by u(0)? u is a vector, not a function.

    If I try to translate, I get "If ku dot a= 0, then u dot a= 0".
    That's exactly the opposite of what I suggested: Show that if u dot a= 0, then ku dot a= 0 for any real number k.

    And, of course, you must prove that if u dot a = 0 and v dot a = 0, then (u + v) dot a = 0.
     
  10. Mar 4, 2008 #10
    Like the poster above mentioned,

    (u+v) dot a = 0
    then

    (u dot a) + (v dot a) = 0
    0 + 0 = 0, so that is closed under addition.

    For scalar multiplication, yes, I meant k:

    k(u dot a) = k(0) = 0

    ku dot a = k(0) = 0

    Is this the right way to prove it?
     
  11. Mar 4, 2008 #11

    quasar987

    Science Advisor
    Homework Helper
    Gold Member

    Maybe you understand it, but I can't tell, because your exposition lacks the words, or at least the "<==>" symbols, needed to make sense of it.

    In your proof of closure under scalar multiplication, the critical step is the fact that ku dot a = k(u dot a), and I don't see you make that connection explicit anywhere.
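    Spelled out with that connection made explicit, the scalar multiplication check reads:

    [tex](ku)\cdot a = k(u\cdot a) = k\cdot 0 = 0[/tex]

    where the first equality is the property of the dot product that lets the scalar k be pulled out.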
     