Can Vector Spaces be Considered Spans?

evilpostingmong
Hello, very new to vector spaces — it seems like they take some getting used
to. Anyway, since spans are the sets of all linear combinations of vectors
contained within subspaces, I wonder whether or not vector spaces
whose elements (or vectors) follow the ten axioms can be
considered spans.
 
You seem to be mixed up in your terminology.

"Span" by itself doesn't really mean anything. You always talk about the span of a set of vectors.

The span of a set of vectors is the set of all vectors that can be made from linear combinations of those vectors. This is the definition. It is easy to prove that a span of a set of vectors is a linear space itself.
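To make that definition concrete, here is a small numerical sketch (assuming NumPy; the helper name `in_span` is just for illustration). A vector b lies in the span of a1, ..., ak exactly when the linear system with those vectors as columns has a solution, which we can detect by comparing matrix ranks:

```python
import numpy as np

def in_span(vectors, b):
    """Return True if b is a linear combination of the given vectors."""
    A = np.column_stack(vectors)
    augmented = np.column_stack([A, b])
    # A x = b is solvable iff appending b does not raise the rank
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)

a1 = np.array([1.0, 0.0, 1.0])
a2 = np.array([0.0, 1.0, 1.0])

print(in_span([a1, a2], np.array([2.0, 3.0, 5.0])))  # 2*a1 + 3*a2 -> True
print(in_span([a1, a2], np.array([0.0, 0.0, 1.0])))  # -> False
```

Here span{a1, a2} is a plane in R^3 — a subspace in its own right, which illustrates the point that the span of a set of vectors is itself a linear space.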

I'm not sure what "ten axioms" you're talking about.
 
Oh so it doesn't just apply to subspaces. Yeah, those ten axioms are
rules that determine whether or not a set of vectors is contained
within a vector space, such as whether or not they are closed under
addition. Thank you for the response, very informative!:smile:
 
evilpostingmong said:
Oh so it doesn't just apply to subspaces. Yeah, those ten axioms are
rules that determine whether or not a set of vectors is contained
within a vector space, such as whether or not they are closed under
addition. Thank you for the response, very informative!:smile:

I believe that you are referring to these ten:

Addition:

1. u + v is in V. (Closure under addition)
2. u + v = v + u (Commutative property)
3. u + (v + w) = (u + v) + w (Associative property)
4. V has a zero vector 0 such that for every u in V, u + 0 = u (Additive identity)
5. For every u in V, there is a vector in V denoted by -u such that u + (-u) = 0 (Additive inverse)

Scalar Multiplication:

6. cu is in V. (Closure under scalar multiplication)
7. c(u + v) = cu + cv (Distributive Property)
8. (c + d)u = cu + du (Distributive Property)
9. c(du) = (cd) u (Associative Property)
10. 1(u) = u (Scalar Identity)



^From Elementary Linear Algebra by Larson, Edwards, and Falvo 5th edition.
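As a spot check (not a proof — the axioms hold for all vectors and scalars, while this only samples a few), the algebraic identities among the ten can be verified numerically for R^3 with ordinary componentwise operations:

```python
import numpy as np

# Randomly chosen vectors u, v, w and scalars c, d in R^3 / R
rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 3))
c, d = rng.standard_normal(2)
zero = np.zeros(3)

# Closure (axioms 1 and 6) is automatic: sums and scalar multiples
# of R^3 vectors are again R^3 vectors.
assert np.allclose(u + v, v + u)                # 2. commutativity
assert np.allclose(u + (v + w), (u + v) + w)    # 3. associativity
assert np.allclose(u + zero, u)                 # 4. additive identity
assert np.allclose(u + (-u), zero)              # 5. additive inverse
assert np.allclose(c * (u + v), c * u + c * v)  # 7. distributivity (vectors)
assert np.allclose((c + d) * u, c * u + d * u)  # 8. distributivity (scalars)
assert np.allclose(c * (d * u), (c * d) * u)    # 9. scalar associativity
assert np.allclose(1 * u, u)                    # 10. scalar identity
print("all checked axioms hold for these samples")
```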
 
Yes, every vector space has a basis, i.e. a set of linearly independent vectors such that every element of the vector space is a linear combination of the set of basis vectors. This is true also for infinite dimensional vector spaces. There always exists a so-called Hamel basis which is a set of vectors such that every element of the vector space is a finite linear combination of the basis vectors.
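In the finite-dimensional case this is easy to see computationally. A sketch (assuming NumPy; the particular basis below is just an example): three vectors form a basis of R^3 iff the matrix B having them as columns is invertible, and then every v has unique coordinates x solving B x = v:

```python
import numpy as np

# Columns of B are the candidate basis vectors
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Full rank <=> columns are linearly independent <=> B is a basis of R^3
assert np.linalg.matrix_rank(B) == 3

v = np.array([2.0, 3.0, 1.0])
x = np.linalg.solve(B, v)   # unique coordinates of v in this basis
assert np.allclose(B @ x, v)
print(x)                    # -> [0. 2. 1.]
```

So v = 0*b1 + 2*b2 + 1*b3: every vector is a (finite) linear combination of the basis, which is exactly the Hamel-basis statement restricted to finite dimension.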
 
