
Are vector spaces spans?

  1. Mar 17, 2009 #1
    Hello, I'm very new to vector spaces, and it seems like they take some
    getting used to. Anyway, since a span is the set of all linear combinations
    of the vectors contained in a subspace, I wonder whether a vector space
    whose elements (or vectors) satisfy the ten axioms can itself be
    considered a span.
     
  3. Mar 17, 2009 #2
    You seem to be mixed up in your terminology.

    "Span" by itself doesn't really mean anything. You always talk about the span of a set of vectors.

    The span of a set of vectors is the set of all vectors that can be made from linear combinations of those vectors. This is the definition. It is easy to prove that a span of a set of vectors is a linear space itself.
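    To make that concrete, here is a minimal Python sketch (assuming NumPy is available and the vectors live in R^n; the helper name in_span is just my own, not standard): a vector b lies in the span of v1, ..., vk exactly when appending b as a column to the matrix whose columns are the v_i does not increase its rank.

    import numpy as np

    def in_span(vectors, b):
        # Illustrative helper: True if b is a linear combination of the given vectors.
        A = np.column_stack(vectors)
        # b lies in the column space (the span) iff adding b as a column keeps the rank unchanged.
        return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

    v1 = np.array([1.0, 0.0, 1.0])
    v2 = np.array([0.0, 1.0, 1.0])
    print(in_span([v1, v2], np.array([2.0, 3.0, 5.0])))  # True: equals 2*v1 + 3*v2
    print(in_span([v1, v2], np.array([0.0, 0.0, 1.0])))  # False: no combination works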

    I'm not sure what "ten axioms" you're talking about.
     
  4. Mar 17, 2009 #3
    Oh, so it doesn't just apply to subspaces. Yeah, those ten axioms are the
    rules that determine whether or not a set of vectors forms a vector space,
    such as whether or not it is closed under addition. Thank you for the
    response, very informative! :smile:
     
  5. Mar 17, 2009 #4
    I believe that you are referring to these ten:

    Addition:

    1. u + v is in V. (Closure under addition)
    2. u + v = v + u (Commutative property)
    3. u + (v + w) = (u + v) + w (Associative property)
    4. V has a zero vector 0 such that for every u in V, u + 0 = u (Additive identity)
    5. For every u in V, there is a vector in V denoted by -u such that u + (-u) = 0 (Additive inverse)

    Scalar Multiplication
    6. cu is in V. (Closure under scalar multiplication)
    7. c(u + v) = cu + cv (Distributive Property)
    8. (c + d)u = cu + du (Distributive Property)
    9. c(du) = (cd) u (Associative Property)
    10. 1(u) = u (Scalar Identity)



    ^From Elementary Linear Algebra by Larson, Edwards, and Falvo 5th edition.
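    Just to illustrate a couple of those numerically, here is a quick Python sketch (my own example, not from the textbook; it assumes NumPy and takes the plane x + y + z = 0 in R^3 as the candidate set): it spot-checks axioms 1 and 6, and then shows a shifted plane failing closure under addition.

    import numpy as np

    # Candidate subset of R^3: the plane x + y + z = 0 (this membership test is the example's assumption).
    def in_plane(v):
        return np.isclose(v.sum(), 0.0)

    u = np.array([1.0, 2.0, -3.0])
    v = np.array([4.0, -1.0, -3.0])
    c = 2.5

    print(in_plane(u + v))  # axiom 1 (closure under addition): True
    print(in_plane(c * u))  # axiom 6 (closure under scalar multiplication): True

    # The shifted plane x + y + z = 1 is NOT closed under addition:
    w1 = np.array([1.0, 0.0, 0.0])
    w2 = np.array([0.0, 1.0, 0.0])
    print(np.isclose((w1 + w2).sum(), 1.0))  # False: w1 + w2 sums to 2 and leaves the set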
     
  6. Mar 17, 2009 #5
    Yes, every vector space has a basis, i.e. a set of linearly independent vectors such that every element of the vector space is a linear combination of the set of basis vectors. This is true also for infinite dimensional vector spaces. There always exists a so-called Hamel basis which is a set of vectors such that every element of the vector space is a finite linear combination of the basis vectors.
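    For the finite-dimensional case, here is a small sketch of that point (using SymPy purely as an illustration; any row-reduction routine would do): the pivot columns of a matrix give a basis for the span of its columns, so every vector in the span is a finite combination of just those basis vectors.

    from sympy import Matrix

    v1 = [1, 0, 1]
    v2 = [0, 1, 1]
    v3 = [1, 1, 2]   # v3 = v1 + v2, so it adds nothing new to the span

    A = Matrix.hstack(Matrix(v1), Matrix(v2), Matrix(v3))
    basis = A.columnspace()   # linearly independent columns spanning the same space
    print(len(basis))         # 2 -- a basis for the span of v1, v2, v3
    print(basis)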
     