Span of a list of vectors

  1. Feb 10, 2012 #1
    In order to improve my knowledge of Linear Algebra I am reading Linear Algebra Done Right by Sheldon Axler.

    In Chapter 2 under the heading Span and Linear Independence we find the following text:

    "If [itex] ( v_1, v_2, ... ... v_m ) [/itex] is a list off vectors in a vector space V, then each [itex] v_j [/itex] is a linear combination of [itex] ( v_1, v_2, ... ... v_m ) [/itex].

    Thus span[itex] ( v_1, v_2, ... ... v_m ) [/itex] contains each [itex] v_j [/itex].

    Conversely, because subspaces are closed under scalar multiplication and addition, every subspace of V containing each [itex] v_j [/itex] must contain span[itex] ( v_1, v_2, ... ... v_m ) [/itex].

    Thus the span of a list of vectors in V is the smallest subspace of V containing all the vectors in the list."

    Intuitively, given that new vectors are formed by scalar multiplication and addition of other vectors, it seems to me that span[itex] ( v_1, v_2, ... ... v_m ) [/itex] is also the largest subspace of V containing all the vectors in the list.

    Is this intuition correct? Can someone please confirm or otherwise?

    Peter
     
    Last edited: Feb 10, 2012
  3. Feb 10, 2012 #2

    HallsofIvy

    Staff Emeritus
    Science Advisor

    No, that is not true. Let V be the vector space of ordered quadruples (a, b, c, d) and let [itex]v_1= (1, 0, 0, 0)[/itex], [itex]v_2= (0, 1, 0, 0)[/itex]. The subspace of V given by {(a, b, c, 0)} contains both vectors in the list but is not equal to their span.
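
    For instance, here is a quick numerical check of that counterexample (a sketch using numpy; the rank test for span membership is my own illustration, not part of the original post):

[code]
import numpy as np

# v1, v2 from the post, plus w = (0, 0, 1, 0), which lies in the
# subspace {(a, b, c, 0)} but is NOT in span(v1, v2)
v1 = np.array([1, 0, 0, 0])
v2 = np.array([0, 1, 0, 0])
w  = np.array([0, 0, 1, 0])

A = np.column_stack([v1, v2])

# w is in span(v1, v2) exactly when appending it does not raise the rank
in_span = np.linalg.matrix_rank(np.column_stack([A, w])) == np.linalg.matrix_rank(A)
in_subspace = (w[3] == 0)   # membership in {(a, b, c, 0)}: the last entry is 0

print(in_span)      # False
print(in_subspace)  # True
[/code]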
     
  4. Feb 10, 2012 #3
    Oh! Just had another quick thought about this that possibly solves my confusion.

    A subspace spanned by a longer list of vectors - one still containing [itex] v_1, v_2, ... ... v_m [/itex] - would be a larger subspace than span [itex] ( v_1, v_2, ... ... v_m ) [/itex] but would still contain [itex] v_1, v_2, ... ... v_m [/itex] - so it would appear (no real surprise!) that Axler's statement is correct in this sense.

    Sorry - if this is the case, I was just too quick to post the issue - apologies!

    Peter
     
  5. Feb 10, 2012 #4
    Sorry HallsofIvy I put up my second post before seeing your post.

    Just considering your post now - Thanks!

    Peter
     
  6. Feb 10, 2012 #5
    Thanks HallsofIvy - see the point you are making.

    Appreciate your help!

    Peter
     
  7. Feb 12, 2012 #6

    Deveno

    Science Advisor

    the defining characteristic of a finite-dimensional vector space is the size of any basis, called the dimension of the space.

    a basis has two defining properties:

    a) it is a linearly independent set
    b) it spans the space it is a basis OF.

    property (a) serves to ensure our basis is "as small as possible" by removing extraneous elements (ones we can form from linear combinations of other (basis) elements).

    property (b) serves to ensure our basis is "as large as necessary", that it is a generating set of our vector space.
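
    as a concrete check of both properties (a small numpy sketch of my own, not from any particular text): a list of vectors in R^n is a basis exactly when the matrix formed from them has rank equal to both the length of the list (independence) and n (spanning).

[code]
import numpy as np

def is_basis(vectors, n):
    """check properties (a) and (b) for a list of vectors in R^n."""
    rank = np.linalg.matrix_rank(np.column_stack(vectors))
    independent = rank == len(vectors)   # (a): no extraneous elements
    spanning = rank == n                 # (b): generates the whole space
    return independent and spanning

e1, e2, e3 = np.eye(3)                        # rows of the identity: the standard basis of R^3
print(is_basis([e1, e2, e3], 3))              # True
print(is_basis([e1, e2], 3))                  # False: independent, but does not span
print(is_basis([e1, e2, e3, e1 + e2], 3))     # False: spans, but not independent
[/code]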

    as one might expect, linear independence corresponds to injectivity of a linear mapping:

    a linear mapping (vector space morphism) is injective if and only if it maps a linearly independent set to a linearly independent set.

    spanning corresponds to surjectivity of a linear mapping:

    a linear mapping is surjective if and only if it maps a spanning set (and therefore some basis) to a spanning set of the "target space".

    note that a spanning set need not be linearly independent, nor does a linearly independent set need to be a spanning set.
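
    numerically, these correspondences show up as rank conditions when the map is given by a matrix acting on column vectors (an illustrative sketch, not from the post):

[code]
import numpy as np

# a linear map R^3 -> R^2 given by a matrix: full row rank, so surjective,
# but not full column rank, so not injective
T = np.array([[1, 0, 0],
              [0, 1, 0]])

injective  = np.linalg.matrix_rank(T) == T.shape[1]   # full column rank
surjective = np.linalg.matrix_rank(T) == T.shape[0]   # full row rank
print(injective, surjective)                          # False True

# the image of the standard basis of R^3 (a spanning set) still spans R^2
images = T @ np.eye(3)
print(np.linalg.matrix_rank(images) == 2)             # True
[/code]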

    an important consequence of the above is that an isomorphism (of vector spaces) takes a basis to a basis. so two vector spaces are "essentially the same" if we can match up the bases element-by-element in a 1-1 correspondence. this is why bases are so important for vector spaces...they tell us "everything we need to know" (often, in the literature, one sees things defined for a basis only, and then "extended by linearity").
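
    "extended by linearity" just means the whole map is pinned down by where it sends a basis (a small sketch, with a basis of R^2 chosen purely for illustration):

[code]
import numpy as np

# define a linear map only on a basis (b1, b2) of R^2 ...
b1, b2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])
f_b1, f_b2 = np.array([2.0, 0.0, 0.0]), np.array([0.0, 3.0, 1.0])

def f(v):
    # ... and extend by linearity: write v in coordinates relative to (b1, b2)
    c1, c2 = np.linalg.solve(np.column_stack([b1, b2]), v)
    return c1 * f_b1 + c2 * f_b2

print(f(b1))            # recovers f_b1
print(f(2*b1 + 5*b2))   # equals 2*f(b1) + 5*f(b2), forced by linearity
[/code]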

    a decent linear algebra text will have everything i have stated above as theorems, the proofs of which are straightforward, but instructive.
     
  8. Feb 12, 2012 #7
    Thanks Deveno - most helpful!

    Just read through that and plan to do some basic work on Linear Algebra - particularly linear transformations. Will check over all the points you mention and work some illustrative examples.

    Basically I am interested in the structures of abstract algebra - particularly groups - but lately I keep tripping over the basics of linear algebra - must remedy that!

    Thanks again!

    Peter
     
  9. Feb 12, 2012 #8

    Deveno

    Science Advisor

    linear algebra can be "modelled" by n-tuples of field elements (vectors in n-space), and usually the field in question is either Q, R or C (so chosen because people are familiar with these fields). then "linear transformations" between two spaces can be modelled by matrices. and matrices have an arithmetic, that is, we can compute with them using operations on their entries that we already know very well. so for a first exposure, this is the route chosen (start with the concrete, then generalize).
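
    the concrete model looks like this in practice (a minimal numpy sketch; the particular matrices are just examples of my own):

[code]
import numpy as np

# a linear transformation R^2 -> R^3, modelled by a 3x2 matrix over R
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 0.0]])

v = np.array([1.0, 4.0])   # a vector in R^2: a 2-tuple of field elements
print(A @ v)               # its image in R^3: [9. 4. 3.]

# "matrices have an arithmetic": composing linear maps is multiplying matrices
B = np.array([[1.0, 0.0, 1.0]])   # a map R^3 -> R^1
print(B @ A)                      # the matrix of the composite map R^2 -> R^1: [[4. 2.]]
[/code]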

    if, however, one already knows about groups, one can view vector spaces another way:

    a field acting on an abelian group.

    now, the "action axioms" tell us that for every r,s in F* (we have to use F*, since F is NOT a group under multiplication), and every v in V (the abelian group):

    r.(s.v) = (rs).v (usually the "dot" is omitted),
    1.v = v (since 1 is the identity of F*).

    since we are acting upon an abelian group, it is "natural" to require that v→r.v be MORE than just a bijection, we want it to be an element of Aut(V):

    r.(u+v) = r.u + r.v

    and v→r.v has the inverse map v→(1/r).v

    now F has more than just a group structure on the underlying set, it also has an additive group structure, and we want this to be "compatible" with the group structure on (V,+). that is, if we call the map v→r.v [itex]\sigma_r[/itex], we want the map r→[itex]\sigma_r[/itex] to be an abelian group homomorphism from (F,+) to (End(V),+) (End(V), the additive endomorphisms of V, has a natural addition operation, called "point-wise addition": (σ+τ)(v) = σ(v) + τ(v)).

    in other words:

    [itex]\sigma_{r+s} = \sigma_r + \sigma_s[/itex], or:

    (r+s).v = r.v + s.v

    a careful comparison will show you these are just the defining axioms of a vector space (the first few axioms usually just state (V,+) is an abelian group).

    it turns out that requiring F to be a field, although essential for solving linear equations, is somewhat "over-specialized", that in many cases, we might just want a ring, R. so the concept of vector space generalizes rather easily to the concept of an R-module (and then, an F-module is precisely a vector space). and if R = Z, the ring of integers, it turns out a Z-module is just another name for abelian group. that is, in an abelian group, a product like:

    [itex]a^k b^m c^n[/itex] can be written (additively) as:

    [itex]ka + mb + nc[/itex], that is: a linear combination (over Z) of a, b and c.
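
    in code, the Z-action on an abelian group is nothing more than repeated addition (a sketch of my own, with the group written additively):

[code]
def z_scalar(n, a, add, zero, neg):
    """n.a for an integer n and an element a of an abelian group (add, zero, neg)."""
    if n < 0:
        return z_scalar(-n, neg(a), add, zero, neg)
    result = zero
    for _ in range(n):
        result = add(result, a)   # n.a = a + a + ... + a  (n times)
    return result

# the integers themselves as an abelian group: the Z-linear combination 3.5 + 2.7
add, zero, neg = (lambda x, y: x + y), 0, (lambda x: -x)
print(add(z_scalar(3, 5, add, zero, neg), z_scalar(2, 7, add, zero, neg)))   # 29
[/code]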

    the whole point of linear algebra, is to reduce "complicated questions" to simpler ones, involving just basis elements (working "one dimension at a time"). this is analogous with reducing questions about a group, to questions about its set of generators. the neat thing about vector spaces is that they are "free objects", any vector space is equivalent (isomorphic) to the "free module generated by a basis set". this makes vector spaces "nicer" than groups, because most groups are merely quotients of a free object (we have relations, as well as generators), so the same set of generators can determine different groups (depending on how the generators interact). another way of expressing this is to say:

    vector spaces are a "universal" construction: seen one n-dimensional vector space, you've "seen 'em all".
     