Let's say that $W$ is a vector space (ANY vector space, over the same field as $V$). Furthermore, suppose that $f:\mathcal{B} \to W$ is *any* function.
To say that $V$ is universal among all such pairs $(f,W)$ (to be precise, we really mean the pair $(1_{\mathcal{B}},V)$, where $1_{\mathcal{B}}: \mathcal{B} \to V$ is the inclusion function:
$1_{\mathcal{B}}(v_j) = v_j$, for each $v_j \in \mathcal{B}$)
means that there exists a unique linear transformation: $T:V \to W$ such that:
$f = T \circ 1_{\mathcal{B}}$.
Explicitly:
$T\left(\sum\limits_j a_jv_j\right) = \sum\limits_j a_jf(v_j)$
This process is called "extending by linearity", or "defining a linear map by its action on a basis".
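If it helps to see this recipe computationally, here is a minimal Python sketch (the helper name `extend_by_linearity` is mine, and vectors are just coordinate tuples with respect to the basis):

```python
# A minimal sketch of "extending by linearity": given the images f(v_j)
# of the basis vectors, the unique linear extension T sends the
# coordinate tuple (a_1, ..., a_n) to sum_j a_j * f(v_j).
def extend_by_linearity(images):
    """images[j] is f(v_j), given as a tuple of numbers in W."""
    def T(coords):
        dim_W = len(images[0])
        return tuple(sum(a * w[i] for a, w in zip(coords, images))
                     for i in range(dim_W))
    return T
```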
So, here is how it works in practice:
Let's use a simple example: we will take $F = \Bbb R$, $V = \Bbb R^3$, and $\mathcal{B} = \{(1,0,0),(0,1,0),(0,0,1)\}$.
For $W$, we will use $\Bbb R^2$ (we could use other examples; $\Bbb R^5$ would work too, if we desired).
Now to determine $T$, we need a PAIR, $(f,W)$, so I need to say what $f$ is. Let's just pick $3$ elements of $W$ more or less at random as the images of $(1,0,0)$, $(0,1,0)$, and $(0,0,1)$:
$f((1,0,0)) = (2,4)$
$f((0,1,0)) = (-1,5)$
$f((0,0,1)) = (0,3)$.
My claim is now that there is only ONE linear map $T:\Bbb R^3 \to \Bbb R^2$ that will satisfy:
$f = T \circ 1_{\mathcal{B}}$.
To know what $T$ might be, it suffices to define it for an arbitrary $(x,y,z) \in \Bbb R^3$; that is, we must specify what $T((x,y,z))$ is equal to.
Now $T((x,y,z)) = T((x,0,0) + (0,y,0) + (0,0,z)) = T(x(1,0,0) + y(0,1,0) + z(0,0,1))$.
If $T$ is to be linear, we must have:
$T(x(1,0,0) + y(0,1,0) + z(0,0,1)) = xT((1,0,0)) + yT((0,1,0)) + zT((0,0,1))$
$= x(2,4) + y(-1,5) + z(0,3) = (2x,4x) + (-y,5y) + (0,3z) = (2x-y,4x+5y+3z)$
So if such a linear $T$ exists, we have to have:
$T((x,y,z)) = (2x-y,4x+5y+3z)$
This shows that if $T$ exists, it must be unique. So all that remains (which I leave to you) is to show that the $T$ defined above *is indeed linear*, which establishes existence.
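Using the `extend_by_linearity` sketch from above, we can sanity-check this numerically (a check, not a proof):

```python
images = [(2, 4), (-1, 5), (0, 3)]   # f((1,0,0)), f((0,1,0)), f((0,0,1))
T = extend_by_linearity(images)

assert T((1, 0, 0)) == (2, 4)        # T agrees with f on the basis
assert T((3, -2, 5)) == (2*3 - (-2), 4*3 + 5*(-2) + 3*5)   # = (8, 17)

# Additivity on a sample pair: T(u + v) == T(u) + T(v)
u, v = (1, 2, 3), (-4, 0, 7)
u_plus_v = tuple(a + b for a, b in zip(u, v))
assert T(u_plus_v) == tuple(a + b for a, b in zip(T(u), T(v)))
```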
There is a similarity here with the universal mapping property of a free group: the mapping $1_{\mathcal{B}}$ here corresponds to the *inclusion of generators* of a set $X$ into the free group generated by $X$. This is no accident:
Vector spaces are free $F$-modules, and any two bases of a vector space are in bijection as sets (their common cardinality is enshrined as the definition of "dimension").
Now this is a rather abstract approach; the usual way such vector spaces are introduced is via LINEAR COMBINATIONS of basis elements. "Formal" linear combinations are what you have to have in order to have CLOSURE of a set under addition and scalar multiplication.
There is one thing to be careful of here: "formal linear combinations" assume beforehand that the set elements are linearly independent (often just by declaring them so); that is, no algebraic relations are assumed to hold between the elements we take as basis elements. But if we are dealing with elements that already satisfy some algebraic relationship (such as points in the plane, or $\Bbb R \oplus \Bbb R$), we cannot assume this. That is why it is *key* in this theorem of Cooperstein's that $\mathcal{B}$ be a *basis*, and not just some set of vectors.
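One way to make "formal" concrete (my own illustration, not Cooperstein's): store a combination as a dictionary from set elements to coefficients, so distinct elements are mere labels and no relations between them can sneak in:

```python
# Formal linear combinations over a set X, as coefficient dictionaries.
def add(c1, c2):
    return {k: c1.get(k, 0) + c2.get(k, 0) for k in set(c1) | set(c2)}

def scale(a, c):
    return {k: a * v for k, v in c.items()}

# 2*p + 3*q, then subtract p: the elements p, q never "interact".
combo = add(scale(2, {"p": 1}), scale(3, {"q": 1}))   # {'p': 2, 'q': 3}
print(add(combo, scale(-1, {"p": 1})))                # {'p': 1, 'q': 3}
```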
The long and short of this is, if you have a basis, like say, $\{1,x,x^2,x^3,\dots\}$ for $\Bbb R[x]$, then any linear map $T: \Bbb R[x] \to W$ can be defined *just by looking at $T(1),T(x),T(x^2),T(x^3),\dots$ etc.*
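For a concrete instance (differentiation is my choice of example here, not taken from the passage above): the derivative $D:\Bbb R[x] \to \Bbb R[x]$ is pinned down entirely by $D(x^k) = kx^{k-1}$ on this basis:

```python
# A polynomial sum_k c_k x^k is stored as its coefficient list [c_0, c_1, ...].
# D is determined by its values on the basis: D(x^k) = k * x^(k-1).
def D(coeffs):
    return [k * c for k, c in enumerate(coeffs)][1:]

# p(x) = 1 + 2x + 5x^3  ->  p'(x) = 2 + 15x^2
print(D([1, 2, 0, 5]))   # [2, 0, 15]
```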
The usual way this is presented is by considering *matrix representations*. If the basis $\mathcal{B}$ is given, we can represent the linear combination:
$v = a_1v_1 +\cdots + a_nv_n$ by the $n \times 1$ array (column vector):
$\begin{bmatrix}a_1\\ \vdots\\a_n\end{bmatrix}$ (note how this suppresses the basis as "understood"). Sometimes this is written $[v]_{\mathcal{B}}$.
In this basis, if we know $T(v_j)$, then that is the same as knowing:
$[T]_{\mathcal{B}'}^{\mathcal{B}}\begin{bmatrix}0\\ \vdots\\1\\ \vdots\\0\end{bmatrix}$ (with the $1$ in the $j$-th position), once we fix a basis $\mathcal{B}'$ for $W$ in which to represent the $T(v_j)$.
Now what this column vector "does" to the matrix form of $T$ is pick out the $j$-th column, that is:
$[T]_{\mathcal{B}'}^{\mathcal{B}} = \begin{bmatrix}|&|&\cdots&|\\ [T(v_1)]_{\mathcal{B}'}&[T(v_2)]_{\mathcal{B}'}&\cdots&[T(v_n)]_{\mathcal{B}'}\\ |&|&\cdots&|\end{bmatrix}$
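In code, this column-by-column description is just stacking the coordinate vectors side by side; a NumPy sketch (assuming all coordinates are taken with respect to standard bases):

```python
import numpy as np

# The j-th column of [T] is the coordinate vector [T(v_j)]_{B'}.
def matrix_of(images_in_B_prime):
    """images_in_B_prime[j] is [T(v_j)]_{B'} as a list of numbers."""
    return np.column_stack(images_in_B_prime)

M = matrix_of([[2, 4], [-1, 5], [0, 3]])
print(M)
# [[ 2 -1  0]
#  [ 4  5  3]]
```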
For example, in my illustration above, the matrix for $T$ is seen to be:
$[T]_{\mathcal{B}'}^{\mathcal{B}} = \begin{bmatrix}2&-1&0\\4&5&3\end{bmatrix}$
if we use the basis $\mathcal{B}' = \{(1,0),(0,1)\}$ for $\Bbb R^2$.
Hopefully, it should be clear that this is the ONLY $2 \times 3$ matrix that maps:
$\begin{bmatrix}1\\0\\0\end{bmatrix} \mapsto \begin{bmatrix}2\\4\end{bmatrix}$
$\begin{bmatrix}0\\1\\0\end{bmatrix} \mapsto \begin{bmatrix}-1\\5\end{bmatrix}$
$\begin{bmatrix}0\\0\\1\end{bmatrix} \mapsto \begin{bmatrix}0\\3\end{bmatrix}$
via matrix multiplication from the left.
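Continuing the NumPy sketch above, we can confirm both claims: each $e_j$ picks out the $j$-th column, and $M$ reproduces the formula for $T$:

```python
e1, e2, e3 = np.eye(3)                 # standard basis columns of R^3
assert (M @ e1 == [2, 4]).all()        # column 1 = f((1,0,0))
assert (M @ e2 == [-1, 5]).all()       # column 2 = f((0,1,0))
assert (M @ e3 == [0, 3]).all()        # column 3 = f((0,0,1))

x, y, z = 3.0, -2.0, 5.0
assert (M @ [x, y, z] == [2*x - y, 4*x + 5*y + 3*z]).all()
```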