This works as follows: Let ##\vec{b}_k## (##k \in \{1,\ldots,d \}##, where ##d## is the dimension of the (real) vector space) be a basis. Then you can define a matrix
$$g_{jk} = \vec{b}_j \cdot \vec{b}_k.$$
By assumption this is a symmetric non-degenerate matrix with only positive eigenvalues, because the scalar product is by definition a positive definite bilinear form.
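If you like to see this with concrete numbers, here is a minimal NumPy sketch (the basis vectors and all array names are made up for illustration, not part of the derivation):
```python
import numpy as np

# Two hand-picked, non-orthogonal basis vectors of R^2, stored as rows of B.
B = np.array([[1.0, 0.0],
              [1.0, 2.0]])    # b_1, b_2

# g_jk = b_j . b_k is the Gram matrix of the basis.
g = B @ B.T
print(g)                      # symmetric by construction
print(np.linalg.eigvalsh(g))  # all eigenvalues positive: positive definite
```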
In this context the basis vectors with lower indices are called covariant basis vectors. If you have another basis ##\vec{b}_k'##, then there is an invertible matrix ##{T^j}_k## such that
$$\vec{b}_k' = {T^j}_k \vec{b}_j,$$
where the Einstein summation convention is used. According to this convention, any index that occurs twice in a formula is summed over, and of the two occurrences one must always be a lower and the other an upper index (two equal indices at the same vertical position are strictly forbidden; if such a thing occurs in the formulas of this so-called Ricci calculus, you have made a mistake!).
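As a numerical illustration of such a basis change (with a made-up invertible ##T##; `np.einsum` makes the summation convention explicit):
```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 2.0]])  # old basis b_j as rows
T = np.array([[2.0, 1.0],
              [0.0, 1.0]])  # T[j, k] = T^j_k, any invertible matrix

# b'_k = T^j_k b_j: the repeated index j is summed over.
B_new = np.einsum('jk,ji->ki', T, B)
print(B_new)                # new basis b'_k as rows
```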
Now vectors are invariant objects, i.e., you can decompose any vector uniquely with respect to any basis, with the corresponding vector components,
$$\vec{V}=V^j \vec{b}_j = \vec{b}_k' V^{\prime k} = {T^{j}}_{k} V^{\prime k} \vec{b}_j.$$
I.e., the components transform contravariantly,
$$V^j={T^j}_k V^{\prime k},$$
or defining
$${\tilde{T}^k}_j={(T^{-1})^k}_j \; \Rightarrow \; {\tilde{T}^k}_j {T^j}_l=\delta_l^k = \begin{cases} 1 & \text{if} \quad k=l, \\ 0 & \text{if} \quad k \neq l. \end{cases}$$
Then you have
$${\tilde{T}^l}_j V^j ={\tilde{T}^l}_j {T^j}_k V^{\prime k}=\delta_k^l V^{\prime k}=V^{\prime l}.$$
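A quick numerical check of this transformation law (same made-up ##B## and ##T## as above; the point is that the vector itself does not change):
```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 2.0]])            # old basis b_j as rows
T = np.array([[2.0, 1.0],
              [0.0, 1.0]])            # T[j, k] = T^j_k
B_new = np.einsum('jk,ji->ki', T, B)  # b'_k = T^j_k b_j

V_old = np.array([3.0, -1.0])         # components V^j
V_new = np.linalg.inv(T) @ V_old      # V'^l = tilde{T}^l_j V^j

# The invariant vector V = V^j b_j = V'^k b'_k:
print(V_old @ B, V_new @ B_new)       # same Cartesian components
```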
For the scalar products of vectors you find
$$\vec{V} \cdot \vec{W} =(V^j \vec{b}_j) \cdot (W^k \vec{b}_k) = V^j W^k \vec{b}_j \cdot \vec{b}_k = g_{jk} V^j W^k.$$
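Again, this is easy to verify numerically (made-up components; the "direct" dot product uses the Cartesian components of the vectors):
```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 2.0]])            # basis b_j as rows
g = B @ B.T                           # g_jk = b_j . b_k

V = np.array([3.0, -1.0])             # components V^j
W = np.array([0.5, 2.0])              # components W^k
lhs = (V @ B) @ (W @ B)               # V . W computed directly
rhs = np.einsum('jk,j,k->', g, V, W)  # g_jk V^j W^k
print(np.isclose(lhs, rhs))           # True
```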
The same holds for the components with respect to the other basis,
$$\vec{V} \cdot \vec{W}=g_{jk}' V^{\prime j} W^{\prime k},$$
where the transformation law for the metric components reads
$$g_{jk}' = \vec{b}_j' \cdot \vec{b}_{k}' = (\vec{b}_l {T^l}_j) \cdot (\vec{b}_m {T^m}_k) = {T^l}_j {T^m}_k \vec{b}_l \cdot \vec{b}_m = {T^l}_j {T^m}_k g_{lm},$$
i.e., you have to apply the rule for covariant transformations to each lower index of ##g_{lm}##. The inverse of this formula is of course
$$g_{lm} = {\tilde{T}^j}_l {\tilde{T}^k}_m g_{jk}'.$$
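In matrix language the transformation law is ##g' = T^{\mathrm{T}} g T##, which you can check with the same made-up numbers as above:
```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 2.0]])
T = np.array([[2.0, 1.0],
              [0.0, 1.0]])
g = B @ B.T

B_new = np.einsum('jk,ji->ki', T, B)            # b'_k = T^j_k b_j
g_new_direct = B_new @ B_new.T                  # g'_jk = b'_j . b'_k
g_new_law = np.einsum('lj,mk,lm->jk', T, T, g)  # T^l_j T^m_k g_lm
print(np.allclose(g_new_direct, g_new_law))     # True
```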
Now you can also introduce a contravariant basis ##\vec{b}^j## for each given covariant basis by demanding that
$$\vec{b}^j \cdot \vec{b}_k=\delta_{k}^j.$$
To find them we need the inverse of the matrix ##g_{jk}##, which we denote as ##g^{lm}##, i.e., we have
$$g_{jk} g^{kl}=\delta_j^l.$$
The matrix ##g_{jk}## is invertible because the scalar product is non-degenerate: it is even positive definite, i.e., the matrix has only positive eigenvalues and thus its determinant is non-zero. Indeed, defining
$$\vec{b}^{j}=g^{jk} \vec{b}_k$$
does the job, because
$$(g^{jk} \vec{b}_k) \cdot \vec{b}_l = g^{jk} \vec{b}_k \cdot \vec{b}_l = g^{jk} g_{kl}=\delta^j_l.$$
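Numerically the co-basis is just the inverse Gram matrix applied to the basis (same made-up ##B## as before):
```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 2.0]])                   # covariant basis b_k as rows
g = B @ B.T                                  # g_jk
B_dual = np.linalg.inv(g) @ B                # rows are the co-basis vectors b^j

print(np.allclose(B_dual @ B.T, np.eye(2)))  # b^j . b_k = delta^j_k: True
```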
Then you have
$$\vec{V} \cdot \vec{b}^j=(V^k \vec{b}_k) \cdot \vec{b}^j = V^k \vec{b}_k \cdot \vec{b}^j = V^k \delta_k^j = V^j.$$
So you get the contravariant vector components by taking the scalar product with the co-basis vectors. On the other hand you have
$$\vec{V} = V^j \vec{b}_j = V^j g_{jk} \vec{b}^k=V_k \vec{b}^k.$$
So you get covariant components of ##\vec{V}## as
$$V_k = g_{jk} V^j.$$
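Here is a small sketch of raising and lowering in action (made-up numbers again):
```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 2.0]])               # basis b_j as rows
g = B @ B.T
B_dual = np.linalg.inv(g) @ B            # co-basis b^j as rows

V_up = np.array([3.0, -1.0])             # contravariant components V^j
V_vec = V_up @ B                         # the vector V itself
V_down = g @ V_up                        # covariant components V_k = g_jk V^j

print(np.allclose(B_dual @ V_vec, V_up))   # V^j = V . b^j
print(np.allclose(B @ V_vec, V_down))      # V_k = V . b_k
```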
The ##\vec{b}^j## are contravariant, i.e., they transform analogously to the contravariant vector components. To see this, start from the defining relation for the primed co-basis:
$$\vec{b}^{\prime l} \cdot \vec{b}_m'=\delta_m^l.$$
From this you get
$$\vec{b}^{\prime l} \cdot ({T^j}_m \vec{b}_j)={T^j}_m \, \vec{b}^{\prime l} \cdot \vec{b}_j =\delta_m^l.$$
This implies
$$\vec{b}^{\prime l} \cdot \vec{b}_j={\tilde{T}^l}_j,$$
because the inverse matrix of ##{T^{j}}_k## is uniquely given by the matrix ##{\tilde{T}^l}_m##. So we have
$$\vec{b}^{\prime l} = (\vec{b}^{\prime l} \cdot \vec{b}_j) \vec{b}^j={\tilde{T}^l}_j \vec{b}^j,$$
i.e., they transform contravariantly, as claimed above.
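As a final numerical consistency check (same made-up ##B## and ##T##): constructing the co-basis of the primed basis directly gives the same result as transforming the old co-basis with ##\tilde{T}##:
```python
import numpy as np

B = np.array([[1.0, 0.0],
              [1.0, 2.0]])
T = np.array([[2.0, 1.0],
              [0.0, 1.0]])

B_new = np.einsum('jk,ji->ki', T, B)                 # b'_k = T^j_k b_j
B_dual = np.linalg.inv(B @ B.T) @ B                  # b^j
B_dual_new = np.linalg.inv(B_new @ B_new.T) @ B_new  # b'^l, built directly

T_tilde = np.linalg.inv(T)
print(np.allclose(B_dual_new, T_tilde @ B_dual))     # b'^l = tilde{T}^l_j b^j
```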