I'm going to make things a little more concrete (but still abstract) by outlining the construction of a tensor product space.
First, a little about vector spaces.
A (Hamel) basis for a vector space V is a subset B of V that is linearly independent, and that spans V. Even if V is infinite-dimensional, the concepts of linear independence and span involve only linear combinations of a finite number of vectors. In fact, without extra structure on V, it doesn't even make sense to talk about the sum of an infinite number of vectors. An infinite sum is a limit of the sequence of partial sums, and, without extra structure, the concept of limit can't be defined.
If S is a set (of, e.g., distinguishable oranges), then the free vector space F \left( S \right) is the vector space that has S as a basis, i.e., the set of all (formal) finite linear combinations of elements of S.
What is a linear combination of oranges?
A concrete, rigorous realization of the above follows.
Let S be a set. Define F \left( S \right) to be the set of functions from S into a field, say \mathbb{R}, such that each function is non-zero for only finitely many elements of S (in general, different elements for different functions). This finiteness condition reflects the fact that only sums of finitely many vectors are allowed. The definitions
\left( f + g \right) \left( s \right) := f \left( s \right) + g \left( s \right)
\left( \alpha f \right) \left( s \right) := \alpha f \left( s \right)
for f and g in F \left( S \right) and \alpha \in \mathbb{R} give F \left( S \right) the structure of a vector space.
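As a concrete illustration (my own, not part of the construction), a finitely-supported function from S into \mathbb{R} can be modeled in Python as a dictionary whose keys are the finitely many elements of S where the function is non-zero; the pointwise operations above then look like this:

```python
# A finitely-supported function S -> R is modeled as a dict mapping the
# (finitely many) elements of S with non-zero value to that value.

def add(f, g):
    """(f + g)(s) := f(s) + g(s), dropping elements whose coefficient becomes zero."""
    h = dict(f)
    for s, c in g.items():
        h[s] = h.get(s, 0.0) + c
        if h[s] == 0.0:
            del h[s]  # keep the support finite and free of explicit zeros
    return h

def scale(alpha, f):
    """(alpha f)(s) := alpha * f(s)."""
    if alpha == 0.0:
        return {}  # the zero function has empty support
    return {s: alpha * c for s, c in f.items()}
```

For example, add({"apple": 2.0}, {"apple": -2.0, "orange": 1.0}) gives {"orange": 1.0}: the two functions cancel on "apple", and the result is again supported on finitely many elements.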
F \left( S \right) is the free vector space on set S. To see how F \left( S \right) captures the idea of linear combinations of set S, consider the following functions.
For each s \in S, define an element e_s \in F \left( S \right) by
e_s \left( s' \right) = \begin{cases} 1 & s = s' \\ 0 & s \neq s' \end{cases}
Clearly, there is a bijection from S to the set of all such functions. But each of these functions does live in vector space F \left( S \right), so when we talk about linear combinations of elements of S, we really mean linear combinations of the appropriate functions e_s. It is fairly easy to show that the set of all such e_s is a basis for F \left( S \right).
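In the dictionary model sketched earlier, each e_s is simply the dictionary {s: 1.0}, and a "linear combination of oranges" is an honest object. A minimal sketch (the function names are mine):

```python
def e(s):
    """The basis function e_s: takes the value 1 at s and 0 elsewhere."""
    return {s: 1.0}

def lin_comb(terms):
    """A formal linear combination sum_i alpha_i e_{s_i}, given as (alpha_i, s_i) pairs."""
    f = {}
    for alpha, s in terms:
        f[s] = f.get(s, 0.0) + alpha  # repeated elements accumulate coefficients
    return f

# "2 oranges + 3 apples" is now a well-defined vector in F(S):
combo = lin_comb([(2.0, "orange"), (3.0, "apple")])
```

Here combo is {"orange": 2.0, "apple": 3.0}: the coefficients of the basis elements e_orange and e_apple.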
If V and W are vector spaces, then applying the above to the set V \times W produces the vector space F \left( V \times W \right). The tensor product space V \otimes W is obtained by forming the quotient of the free vector space F \left( V \times W \right) by an appropriate subspace. The subspace acts as the zero vector of the quotient space.
Since \left( v , w \right), \left( v' , w \right), and \left( v + \alpha v' , w \right) are (in general) distinct elements of V \times W, the functions e_{\left( v , w \right)}, e_{\left( v' , w \right)}, and e_{\left( v + \alpha v' , w \right)} are distinct basis elements of F \left( V \times W \right), and hence are linearly independent. Consequently,

e_{\left( v , w \right)} + \alpha e_{\left( v' , w \right)} - e_{\left( v + \alpha v' , w \right)}

and, similarly,

e_{\left( v , w \right)} + \alpha e_{\left( v , w' \right)} - e_{\left( v , w + \alpha w' \right)}

are non-zero in F \left( V \times W \right). But, we want

v \otimes w + \alpha \left( v' \otimes w \right) - \left( v + \alpha v' \right) \otimes w = 0

v \otimes w + \alpha \left( v \otimes w' \right) - v \otimes \left( w + \alpha w' \right) = 0.

Consequently, use the vectors

e_{\left( v , w \right)} + \alpha e_{\left( v' , w \right)} - e_{\left( v + \alpha v' , w \right)}

e_{\left( v , w \right)} + \alpha e_{\left( v , w' \right)} - e_{\left( v , w + \alpha w' \right)}

(over all v, v' \in V, w, w' \in W, and \alpha \in \mathbb{R}) to generate a subspace U of F \left( V \times W \right).
Then V \otimes W is F \left( V \times W \right) / U.
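For finite-dimensional V and W with chosen bases, the quotient has a familiar coordinate model: v \otimes w becomes the outer-product matrix \left( v_i w_j \right), and the relations that U kills off hold as honest matrix identities. A quick sketch of this check (my own illustration, in plain Python):

```python
def outer(v, w):
    """Coordinate model of v ⊗ w: the matrix with entries v_i * w_j."""
    return [[vi * wj for wj in w] for vi in v]

def mat_add(A, B):
    """Entrywise sum of two matrices of the same shape."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_scale(alpha, A):
    """Entrywise scalar multiple of a matrix."""
    return [[alpha * a for a in row] for row in A]

v, vp, w = [1.0, 2.0], [3.0, -1.0], [4.0, 5.0]
alpha = 2.0

# (v + alpha v') ⊗ w equals v ⊗ w + alpha (v' ⊗ w) in this model,
# so the generators of U really do map to zero in the quotient.
lhs = outer([a + alpha * b for a, b in zip(v, vp)], w)
rhs = mat_add(outer(v, w), mat_scale(alpha, outer(vp, w)))
assert lhs == rhs
```

The same computation with the roles of v and w exchanged verifies the second family of generators.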
Another way to think of quotient vector spaces is in terms of groups. Any vector space is an abelian group, with vector addition as the group product and the zero vector as the group identity. Any subspace is a normal subgroup, and thus can be used to form a quotient group, with the subspace acting as the identity of the quotient group.
If, as in relativity, V and W are both finite-dimensional spaces, then V \otimes W is (naturally) isomorphic to the vector space of bilinear maps from V^* \times W^* to \mathbb{R}. For infinite-dimensional spaces, V \otimes W is isomorphic to a proper subspace of the space of bilinear maps from V^* \times W^* to \mathbb{R}. Consequently, in the finite-dimensional case, this space of bilinear maps is often taken as the definition of the tensor product space.
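The isomorphism sends v \otimes w to the bilinear map \left( f , g \right) \mapsto f \left( v \right) g \left( w \right) on pairs of covectors. This can be sketched directly (a minimal illustration; the names are mine, with covectors modeled as Python functions):

```python
def tensor_as_bilinear(v, w):
    """The bilinear map (f, g) |-> f(v) * g(w) induced by v ⊗ w,
    where f and g are linear functionals given as Python callables."""
    return lambda f, g: f(v) * g(w)

# Two covectors on R^2, written out in coordinates.
f = lambda v: 1.0 * v[0] + 2.0 * v[1]   # the covector (1, 2)
g = lambda w: 3.0 * w[0] - 1.0 * w[1]   # the covector (3, -1)

# The tensor e_1 ⊗ e_2 acting on the pair (f, g):
B = tensor_as_bilinear([1.0, 0.0], [0.0, 1.0])
# B(f, g) = f(e_1) * g(e_2) = 1 * (-1) = -1
```

Linearity in each of f and g separately is immediate from the formula, which is exactly the bilinearity referred to above.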