Chump said:
Why is it that a vector can be described in terms of a simple linear combination, like v = xi + yj + zk, where v is a vector, and i, j, and k are all unit vectors? It just seems a bit convenient that it's as simple as just adding the components this way. It seems like there should be a bit more to it. I guess I'm just looking to get as fundamental with the concept as possible. Is it just a matter of this being the definition of a vector by convention?
##i##, ##j##, and ##k## form what is called a basis for the vectors in the space. This means that any vector can be written uniquely as a linear combination of them, and further, one cannot remove any of them and still be able to express every vector. For instance, if you remove ##i## then you are left with only ##j## and ##k##, and these can never combine to give a vector with a non-zero ##i## coordinate.
But there are many other bases for the vectors in 3-space. For instance, the three vectors ##v = i + j##, ##w = i - j##, and ##q = i + j + k## also form a basis. Any vector in 3-space can be written as a unique linear combination of them. To prove this, note that ##i = \frac{1}{2}(v+w)##, ##j = \frac{1}{2}(v - w)##, and ##k = q - v##. And again, if you remove any one of them, then you cannot write every vector as a linear combination of the remaining two. For instance, if you remove ##v##, then you cannot express ##v## itself as a linear combination of ##w## and ##q##.
If you write a vector as a linear combination ##av + bw + cq##, then ##a##, ##b##, and ##c## are the coordinates of the vector with respect to the basis ##v##, ##w##, ##q##. If you choose a different basis, then the coordinates will be different. For instance, in the ##i##, ##j##, ##k## basis of orthogonal unit vectors, the coordinates of ##i## are ##a = 1##, ##b = 0##, ##c = 0##. In the ##v##, ##w##, ##q## basis, the coordinates of ##i## are ##a = 1/2##, ##b = 1/2##, ##c = 0##.
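If it helps to see this numerically, here is a minimal sketch of the change of coordinates, assuming Python with NumPy (neither is part of the discussion above): it solves for the coordinates of ##i## in the ##v##, ##w##, ##q## basis and checks the inverse relations quoted above.

```python
import numpy as np

# Standard basis i, j, k as the rows of the 3x3 identity matrix.
i, j, k = np.eye(3)

# The alternative basis from the post: v = i + j, w = i - j, q = i + j + k.
v, w, q = i + j, i - j, i + j + k
B = np.column_stack([v, w, q])   # columns are the new basis vectors

# Coordinates (a, b, c) of a vector x with respect to (v, w, q) solve B @ coords = x.
coords_of_i = np.linalg.solve(B, i)
print(coords_of_i)               # [0.5 0.5 0. ]  ->  i = (1/2)v + (1/2)w + 0q

# Sanity check of the inverse relations quoted above.
print(np.allclose(0.5 * (v + w), i))   # True
print(np.allclose(0.5 * (v - w), j))   # True
print(np.allclose(q - v, k))           # True
```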
Any basis for 3-space must contain three vectors, and no one of them can lie in the plane spanned by the other two. They must be "linearly independent," and the dimension of 3-space is three because any basis requires three linearly independent vectors: you do not need more than three, and you cannot do with fewer.
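A quick way to check linear independence numerically (again just a sketch, assuming Python with NumPy) is to put the vectors into the columns of a matrix and look at its rank:

```python
import numpy as np

v = np.array([1.0, 1.0, 0.0])   # i + j
w = np.array([1.0, -1.0, 0.0])  # i - j
q = np.array([1.0, 1.0, 1.0])   # i + j + k

# Three linearly independent vectors: the matrix with them as columns has rank 3.
print(np.linalg.matrix_rank(np.column_stack([v, w, q])))        # 3

# Replace q by v + w: the third vector lies in the span of the first two,
# so the rank drops to 2 and the set no longer spans 3-space.
print(np.linalg.matrix_rank(np.column_stack([v, w, v + w])))    # 2

# Any four vectors in 3-space are dependent: the rank can never exceed 3.
print(np.linalg.matrix_rank(np.column_stack([v, w, q, v + q]))) # 3
```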
So ##i##, ##j##, and ##k## are just a convenient example of a basis for a three dimensional vector space. But in many applications, other bases are used. For instance, if one considers the linear transformation ##i \rightarrow j##, ##j \rightarrow i##, ##k \rightarrow k##, then its eigenvectors form the basis ##v = i + j##, ##w = i - j##, and ##q = k##.
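One can verify this eigenvector claim directly; here is a small sketch, again assuming Python with NumPy, writing the transformation as a matrix in the ##i##, ##j##, ##k## basis.

```python
import numpy as np

# Matrix of the transformation i -> j, j -> i, k -> k in the (i, j, k) basis.
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])

v = np.array([1.0, 1.0, 0.0])   # i + j
w = np.array([1.0, -1.0, 0.0])  # i - j
q = np.array([0.0, 0.0, 1.0])   # k

# Each of v, w, q is an eigenvector: A v = v, A w = -w, A q = q.
print(np.allclose(A @ v, v), np.allclose(A @ w, -w), np.allclose(A @ q, q))  # True True True

# The same eigenvalues appear in a general eigendecomposition.
print(np.linalg.eigvals(A))   # 1, -1, 1 (order may vary)
```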
As one can see, 3-space exists as a vector space without reference to any particular basis. A basis is just a linearly independent set that spans the vector space through linear combinations. For instance, one can think of the vectors in 3-space as arrows pointing from the origin, and vector addition as parallel translating the second arrow so that its base touches the end point of the first. One multiplies an arrow by a number by scaling its length. This definition is basis free. It does not require ##i##, ##j##, and ##k## or any other basis. In fact, one needs to prove that this vector space even has a basis and that its dimension (the number of vectors required for a basis) is three.
If a vector space has a basis with a finite number of vectors in it, then it is said to be finite dimensional, and its dimension is the number of vectors in the basis. (This only makes sense after you prove that any two bases must have the same number of vectors in them.) So one defines a finite dimensional space as one spanned by finitely many vectors, and then one must prove as a theorem that there is a basis and that any two bases have the same number of vectors in them.
But there are some vector spaces that are not finite dimensional. This means that there is no finite linearly independent set of vectors that spans the whole space. For example, the vector space ##R^{∞}## of vectors ##(x_1, x_2, ..., x_n, ...)## with infinitely many coordinates is infinite dimensional. (Just as in 3-space, vector addition is defined coordinatewise, and scalar multiplication is "multiply each coordinate by the scalar.")
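To make the coordinatewise operations concrete, here is a minimal sketch, assuming Python and assuming (purely for illustration) that a vector in ##R^{∞}## is represented as a function sending the index ##n## to the coordinate ##x_n##:

```python
# Vector addition: the n-th coordinate of x + y is x_n + y_n.
def add(x, y):
    return lambda n: x(n) + y(n)

# Scalar multiplication: multiply each coordinate by the scalar r.
def scale(r, x):
    return lambda n: r * x(n)

# Example: x = (1, 1/2, 1/3, ...) and y = (1, 1, 1, ...).
x = lambda n: 1.0 / n
y = lambda n: 1.0

z = add(scale(2.0, x), y)                 # z_n = 2/n + 1
print([z(n) for n in range(1, 6)])        # [3.0, 2.0, 1.666..., 1.5, 1.4]
```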
For an infinite dimensional vector space one must again ask whether it has a basis, and even ask what a basis means in infinite dimensions. The definition of a basis is a set of vectors that spans the whole space and is linearly independent. Span here means that any vector can be expressed as a linear combination of finitely many vectors in the basis. Linearly independent means that no vector in the basis can be expressed as a finite linear combination of the others.
For infinite dimensional vector spaces, there is generally no procedure for finding a basis. Try to find a basis for ##R^{∞}##. (Note that the linear combinations of finitely many of the vectors ##(1,0,0,...)##, ##(0,1,0,...)##, ..., each with a 1 in one coordinate and zeros everywhere else, do not span all of ##R^{∞}##.) Generally, mathematicians assume that every vector space has a basis, but this assumption is logically equivalent to the Axiom of Choice.
An abstract vector space is defined by forgetting the geometry of Euclidean space and keeping only the algebraic rules of vector addition and scalar multiplication. The algebraic definition is that vector addition forms an abelian group (addition is commutative and associative, there is a zero vector, and every vector has an additive inverse) and vectors can be multiplied by scalars from a field so that the multiplication distributes over vector addition and over scalar addition, ##r(v + w) = rv + rw## and ##(r + s)v = rv + sv##, is compatible with the field's multiplication, ##(rs)v = r(sv)##, and satisfies ##1 \cdot v = v##. These algebraic rules allow for many possibilities. For instance, as has already been mentioned, the polynomials in one variable form a vector space. Another example is the real numbers considered as a vector space with only the rational numbers as the field of scalars; vector addition in this case is just addition of numbers, and multiplication by a scalar is just multiplication of numbers. This vector space is infinite dimensional. A vector space can also be finite, for instance a finite dimensional vector space whose field of scalars is the field ##Z_2##.
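Here is a minimal sketch of such a finite vector space, assuming Python with NumPy and (purely for illustration) vectors of length 3 over ##Z_2 = \{0, 1\}##, with addition and scalar multiplication taken mod 2:

```python
import numpy as np

# Vector addition and scalar multiplication over the two-element field Z_2.
def add_mod2(u, v):
    return (u + v) % 2

def scale_mod2(r, u):
    return (r * u) % 2

u = np.array([1, 0, 1])
v = np.array([1, 1, 0])

print(add_mod2(u, v))                      # [0 1 1]
print(add_mod2(u, u))                      # [0 0 0] -> every vector is its own additive inverse
print(scale_mod2(0, v), scale_mod2(1, v))  # the only two scalars are 0 and 1

# The whole space (Z_2)^3 has just 2**3 = 8 vectors, so it is a finite vector space.
```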
The idea of a vector space does not include any notion of the length of vectors or the angle between them. These ideas are additional structure; the vector space part is just the algebraic rules. So, from the point of view of the algebraic structure of 3-space as a vector space, it is irrelevant that ##i##, ##j##, and ##k## are orthogonal unit vectors. In general, if one has a notion of length and angle (an inner product) and the space is complete, the vector space is called a Hilbert space. If there is a notion of length (a norm) but not necessarily of angle, and the space is complete, it is called a Banach space. 3-space with Euclidean geometry is a Hilbert space.
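For 3-space, that extra structure is the Euclidean dot product, which supplies both length and angle; a small sketch, again assuming Python with NumPy:

```python
import numpy as np

# Length and angle come from the Euclidean inner product, not from the vector space axioms.
a = np.array([1.0, 1.0, 0.0])
b = np.array([1.0, 0.0, 0.0])

length_a = np.sqrt(np.dot(a, a))                                  # ||a|| = sqrt(<a, a>)
cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(length_a)                            # 1.414... = sqrt(2)
print(np.degrees(np.arccos(cos_angle)))    # 45.0 degrees between a and b
```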