A few questions about proving vector spaces

  1. Jul 20, 2013 #1
    I have a few questions about proving that a set is a vector space.

    1.) My book lists 8 defining properties of a vector space. I won't list them because I'm under the impression that these are built into the definition of a vector space and thus are common knowledge.

    My book also says that all vector spaces have the property that any linear combination of elements is also an element.

    My question is this: In showing something is a vector space, do I need to show that the former 8 properties hold, that the linear combo idea holds, or both?

    I've seen discussions online where people show only that linear combinations of members are still members of the set and conclude that the set is a vector space. Does this guarantee that those 8 properties hold, or are they just speaking of some specific example (like a subspace of a known vector space, where many of those 8 properties are "grandfathered in")?


    2.) If the elements of the set are said to be "real valued" can I just use the field axioms of the reals to show that these 8 properties will hold?

    For example, I was asked to show that the set of all differentiable real-valued functions is a vector space.

    Is it sufficient to show that any linear combination of differentiable functions is differentiable (using properties such as constants pulling out of the differentiation operator and the derivative of a sum being the sum of the derivatives) and real-valued, and then use the field axioms of the reals to show that the other 8 properties hold?

    Thank you
     
  2. Jul 20, 2013 #2
    1.)
    You have to show that the 8 properties hold. The linear combo idea, as you call it, is a consequence of those properties. Simply showing that a linear combination of elements is still an element does not prove that the set is a vector space. Take, for example, the real 1-dimensional Euclidean space [itex]\mathbb{R}[/itex] and consider only the subset consisting of the positive reals. Linear combinations of elements are still elements, but there are no additive inverses (e.g. ##1## is in the set but ##-1## is not), so it is not a vector space.

    2.)
    Yes, usually that's what is done: the 8 properties hold simply by inheritance from the field properties and from how vector addition and scalar multiplication are defined.

    In the example you gave, the 8 properties hold by inheriting the properties of the reals and of the derivative.
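    For instance, commutativity of addition is inherited pointwise from ##\mathbb{R}## (here I'm assuming the usual pointwise operations on functions): for real-valued ##f, g## and any ##x##,

    $$(f+g)(x) = f(x) + g(x) = g(x) + f(x) = (g+f)(x),$$

    so ##f+g = g+f## as functions. The other properties follow in the same way, while closure uses the linearity of the derivative.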
     
  3. Jul 20, 2013 #3
    Thank you, that is clear.
     
  4. Jul 20, 2013 #4

    micromass
    In that example, linear combinations of elements are not always elements. For example, if ##\mathbf{v} = 1##, then ##(-1)\mathbf{v}## is a linear combination of elements of the space, and is not in the space.

    What the OP said is completely right.

    So, given an arbitrary vector space that you know nothing about, then you need to check the 8 properties.
    However, if you know that your set is a subset of a known vector space and the operations coincide, then it suffices to show that a linear combination of members remains in the set (and the set must also be nonempty). So in that sense, it suffices to check whether it's a subspace. The other axioms are indeed grandfathered in.
     
  5. Jul 20, 2013 #5
    OK, I haven't reached the idea of subspaces quite yet in my reading, but it's not a huge leap to get an intuitive feel for them after doing vector spaces.

    So, if a set is a subset of a known vector space, then the properties concerning operations on elements (commutativity, distributivity, etc.) immediately apply, and showing that a linear combination of members is always a member shows that the remaining properties also hold (like the zero vector must be present if linear combinations are always members, because ##0\mathbf{x}## can be shown to be the zero vector for any element ##\mathbf{x}## of the subset)?
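    Roughly, the argument I have in mind for the zero vector is

    $$0\mathbf{x} = (0+0)\mathbf{x} = 0\mathbf{x} + 0\mathbf{x} \quad\Longrightarrow\quad 0\mathbf{x} = \mathbf{0},$$

    cancelling ##0\mathbf{x}## in the ambient space, and similarly ##\mathbf{x} + (-1)\mathbf{x} = (1 + (-1))\mathbf{x} = 0\mathbf{x} = \mathbf{0}##, so additive inverses would also be members.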
     
  6. Jul 20, 2013 #6

    micromass

    Yes. Of course you want your set to be nonempty. The empty set satisfies the condition that linear combinations of its elements remain in the set, but it is not a vector space.

    So, if ##V## is a vector space and if ##S\subseteq V##, then ##S## is a vector space under the operations of ##V## if
    1) ##S## is nonempty
    2) For each ##\mathbf{v},\mathbf{w}\in S## and ##\lambda,\mu## scalars, we have ##\lambda\mathbf{v} + \mu\mathbf{w}\in S##.
    We call ##S## a subspace of ##V##.
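    For example, in the OP's problem one can take ##V## to be the space of all real-valued functions on ##\mathbb{R}## with pointwise operations and ##S## the set of differentiable functions. Then ##S## is nonempty (the zero function is differentiable), and for differentiable ##f,g## and scalars ##\lambda,\mu##,

    $$(\lambda f + \mu g)' = \lambda f' + \mu g',$$

    so ##\lambda f + \mu g## is again differentiable. Hence ##S## is a subspace of ##V##, and in particular a vector space.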
     
  7. Jul 20, 2013 #7
    OK, thanks again to both of you.

    Though, if the set is empty, what does a linear combination even mean? Does a linear combination of no elements always reduce to some kind of additive/multiplicative identity depending on what those identities are in that space?
     
  8. Jul 20, 2013 #8

    micromass

    It's a vacuous concept. It has no meaning.

    We have the following property: For each ##\mathbf{v},\mathbf{w}\in S## and ##\lambda,\mu## scalars, we have ##\lambda\mathbf{v} +\mu \mathbf{w}\in S##.

    I claim this property is true for ##S = \emptyset##. Indeed, if the property weren't true, then there would exist ##\mathbf{v},\mathbf{w}\in S## and ##\lambda,\mu## scalars such that ##\lambda\mathbf{v} +\mu \mathbf{w}\in S##. But in particular, there would exist some element ##\mathbf{v}\in S##. But ##S## is empty. Contradiction.
     
  9. Jul 20, 2013 #9
    Of course.

    Any property concerning elements of a set seems vacuously true for the empty set. Though I think you meant a "not an element of" symbol after the linear combo.
     
    Last edited: Jul 20, 2013
  10. Jul 20, 2013 #10

    HallsofIvy

    No, he meant "is an element of" because he was doing a "proof by contradiction".


    By the way, if you are given that U is a subset of a vector space V and want to prove that U is a subspace, many of the properties are "inherited" from V. For example, if u and v are in U, then they are in V, so u + v = v + u is true because it is true in V. You really only need to prove
    1) that U is closed under scalar multiplication.
    2) that U is closed under vector addition.
    3) that U is non-empty (given 1, this is equivalent to 0 being in U, since 0u = 0 for any u in U).
     
  11. Jul 20, 2013 #11
    If the property weren't true, then the negation of the property would be true. The negation of "For each ##\mathbf{v},\mathbf{w}\in S## and ##\lambda,\mu## scalars, we have ##\lambda\mathbf{v} +\mu \mathbf{w}\in S##" is not "there exist ##\mathbf{v},\mathbf{w}\in S## and ##\lambda,\mu## scalars such that ##\lambda\mathbf{v} +\mu \mathbf{w}\in S##."

    The negation is: "there exist ##\mathbf{v},\mathbf{w}\in S## and ##\lambda,\mu## scalars such that ##\lambda\mathbf{v} +\mu \mathbf{w}\notin S##." So he meant "not an element of."
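    In quantifier notation, roughly:

    $$\neg\Big(\forall \mathbf{v},\mathbf{w}\in S,\ \forall \lambda,\mu:\ \lambda\mathbf{v}+\mu\mathbf{w}\in S\Big) \iff \exists \mathbf{v},\mathbf{w}\in S,\ \exists \lambda,\mu:\ \lambda\mathbf{v}+\mu\mathbf{w}\notin S.$$

    Either way, the conclusion about the empty set stands: the existential statement already asserts that some ##\mathbf{v}\in S## exists, which is impossible when ##S=\emptyset##.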
     
    Last edited: Jul 20, 2013