First of all, you are talking about tensor components, not tensors; the latter are invariant objects under basis transformations. Your question is best answered for a plain vector space without additional structure like a scalar product (or an indefinite fundamental bilinear form, as in the case of Minkowski space).
Then you have vectors, which you can add and multiply with scalars. A basis ##\vec{b}_j##, ##j \in \{1,2,\ldots,d \}##, where ##d## is the dimension of the vector space, is a set of vectors for which any given vector ##\vec{x}## can be expressed uniquely as a linear combination of these basis vectors:
$$\vec{x}=\vec{b}_j x^j,$$
where the summation over equal pairs of indices (one subscript and one superscript) is understood (Einstein summation convention).
The next simplest objects you can define on the vector space are linear forms, i.e., linear mappings from the vector space to the scalars. Obviously such a linear form ##L## is already determined if you know its values on the basis:
$$L_j=L(\vec{b}_j).$$
Then you have
$$L(\vec{x})=L(\vec{b}_j x^j)=x^j L(\vec{b}_j)=L_j x^j.$$
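As a small numerical illustration of this (a hypothetical example, not from the question itself): if we collect the basis vectors as columns of a matrix, the components ##x^j## are obtained by solving a linear system, and ##L(\vec{x})=L_j x^j## is just a dot product of the values ##L_j## with the components.

```python
import numpy as np

# Hypothetical 2D example: basis vectors b_1, b_2 as columns of B
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])       # b_1 = (1,0), b_2 = (1,2) in ambient coordinates

x = np.array([3.0, 4.0])         # a vector in ambient coordinates

# Components x^j with respect to the basis: solve B @ comps = x
comps = np.linalg.solve(B, x)    # here comps = (1, 2)

# A linear form L is fixed by its values L_j = L(b_j); these are assumed numbers
L_vals = np.array([2.0, -1.0])

# L(x) = L_j x^j (Einstein summation is just a dot product here)
print(L_vals @ comps)
```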
Now the linear forms form a vector space in a natural way, too. You add two linear forms or multiply a linear form with a scalar pointwise, i.e., for each vector ##\vec{x}## one defines
$$(L+M)(\vec{x})=L(\vec{x})+M(\vec{x}), \quad (\lambda L)(\vec{x})=\lambda L(\vec{x}).$$
A basis of the space of linear forms is then given by the dual basis of the vector-space basis ##\vec{b}_j##, which we denote as ##\vec{b}^k##. It is defined by its values on the basis vectors:
$$\vec{b}^k(\vec{b}_j)=\delta_j^k=\begin{cases} 1 & \text{for} \quad j=k, \\ 0 & \text{for} \quad j \neq k. \end{cases}$$
Then each linear form ##L## can be written as
$$L=L_j \vec{b}^j,$$
because then indeed
$$L(\vec{b}_k)=L_j \vec{b}^j(\vec{b}_k)=L_j \delta_k^j=L_k.$$
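Numerically, the dual basis is easy to get hold of (again a hypothetical example): if the basis vectors are the columns of a matrix ##B##, then the rows of ##B^{-1}## are exactly the dual-basis covectors, since ##B^{-1}B=\mathbb{1}## is the statement ##\vec{b}^k(\vec{b}_j)=\delta^k_j##.

```python
import numpy as np

B = np.array([[1.0, 1.0],
              [0.0, 2.0]])          # columns are the basis vectors b_j (assumed example)

# Dual basis: the k-th row of B^{-1} is the covector b^k
B_dual = np.linalg.inv(B)

# b^k(b_j) = delta^k_j, i.e., B_dual @ B is the identity matrix
print(np.allclose(B_dual @ B, np.eye(2)))   # True
```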
Now we choose another basis for the vector space. Then there is an invertible matrix ##{T^j}_k## with inverse ##{U^j}_k## such that
$$\vec{b}_j={T^k}_j \vec{b}_k', \quad \vec{b}_k'={U^j}_k \vec{b}_j.$$
For the vector components we get
$$\vec{x}=x^j \vec{b}_j=x'^k \vec{b}_k'=x^j {T^k}_j \vec{b}_k'.$$
Since the decomposition of ##\vec{x}## in terms of the basis ##\vec{b}_k'## is unique, we have
$$x'^k={T^k}_j x^j.$$
One says the vector components transform contragrediently to the basis vectors; equivalently, the basis vectors transform covariantly and the vector components contravariantly.
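This transformation rule can be checked numerically (a sketch with randomly chosen, hence almost surely invertible, matrices; all names are illustrative): the same vector ##\vec{x}## is decomposed in the old and the new basis, and the new components indeed come out as ##x'^k={T^k}_j x^j##.

```python
import numpy as np

rng = np.random.default_rng(0)

B = rng.standard_normal((3, 3))     # columns: old basis vectors b_j
T = rng.standard_normal((3, 3))     # transformation matrix {T^k}_j
U = np.linalg.inv(T)                # its inverse {U^j}_k

B_new = B @ U                       # new basis: b'_k = {U^j}_k b_j (columns of B @ U)

x = rng.standard_normal(3)          # a vector in ambient coordinates
comps = np.linalg.solve(B, x)       # old components x^j
comps_new = np.linalg.solve(B_new, x)  # new components x'^k

# components transform contragrediently: x'^k = {T^k}_j x^j
print(np.allclose(comps_new, T @ comps))   # True
```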
Now we can also figure out the transformation properties of the corresponding dual bases of the vector space of linear forms (dual space). By definition we have
$$L_k'=L(\vec{b}_k')=L({U^j}_k \vec{b}_j)={U^j}_k L(\vec{b}_j)=L_j {U^j}_k.$$
This implies
$$L=L_k' \vec{b}'^k=L_j {U^j}_k \vec{b}'^k=L_j \vec{b}^j \; \Rightarrow \; \vec{b}^j={U^j}_k \vec{b}'^k \; \Rightarrow \; \vec{b}'^k={T^k}_j \vec{b}^j.$$
The latter follows from ##\hat{U}^{-1}=\hat{T}##. The dual-basis vectors thus transform contragrediently to the basis vectors, i.e., in the same way as the vector components, while the components ##L_k## of a linear form transform covariantly, like the basis vectors.
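The covariant transformation of the form components ##L_k'=L_j {U^j}_k## can be verified in the same way (again a hypothetical numerical sketch): represent ##L## by its ambient row vector, evaluate it on the old and new basis vectors, and compare.

```python
import numpy as np

rng = np.random.default_rng(1)

B = rng.standard_normal((3, 3))     # columns: old basis vectors b_j
T = rng.standard_normal((3, 3))     # transformation matrix {T^k}_j
U = np.linalg.inv(T)                # its inverse {U^j}_k
B_new = B @ U                       # new basis: b'_k = {U^j}_k b_j

L_std = rng.standard_normal(3)      # an arbitrary linear form, as a row vector

L_old = L_std @ B                   # L_j  = L(b_j)
L_new = L_std @ B_new               # L'_k = L(b'_k)

# covariant components transform like the basis: L'_k = L_j {U^j}_k
print(np.allclose(L_new, L_old @ U))   # True
```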
There is no basis-independent isomorphism between the vector space and its dual space, but they are isomorphic, because you can define a one-to-one linear map between the two spaces using a basis. A coordinate-independent map can be defined for a vector space equipped with a non-degenerate bilinear form (like a scalar product for a Euclidean vector space or the Minkowski pseudo-scalar product in Minkowski space). This "fundamental bilinear form" we denote by ##\vec{x} \cdot \vec{y}## for any two vectors ##\vec{x}## and ##\vec{y}##. Again it is completely determined for a given basis by knowing the values
$$\eta_{jk}=\vec{b}_j \cdot \vec{b}_k.$$
That the fundamental form is non-degenerate means that the matrix ##\hat{\eta}=(\eta_{jk})## has an inverse, which we call ##\hat{\eta}^{-1}=(\eta^{jk})##.
Now given a linear form ##L## we have
$$L_j=L(\vec{b}_j).$$
Then we can write for any ##\vec{x}##
$$L(\vec{x})=L_j x^j.$$
Now we define the vector
$$\vec{L}=L^j \vec{b}_j=\eta^{jk} L_k \vec{b}_j.$$
Then we have
$$\vec{L} \cdot \vec{x}=\eta_{jk} L^j x^k=L_k x^k=L(\vec{x}),$$
and this defines a coordinate-independent one-to-one mapping from the dual space to the vector space. It thus makes sense to simply call ##L_j## the covariant and ##L^k## the contravariant components of the vector ##\vec{L}##. You can convert one into the other with the help of the (pseudo-)metric components:
$$L_j=\eta_{jk} L^k, \quad L^k=\eta^{kj} L_j.$$
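As a final numerical sketch (using the Minkowski metric in the ##(+,-,-,-)## convention, an assumed choice; all component values are made up for illustration), raising and lowering indices is just matrix multiplication with ##\hat{\eta}## and ##\hat{\eta}^{-1}##, and the two operations undo each other.

```python
import numpy as np

# Minkowski (pseudo-)metric, assumed (+,-,-,-) convention
eta = np.diag([1.0, -1.0, -1.0, -1.0])
eta_inv = np.linalg.inv(eta)        # here numerically equal to eta itself

L_up = np.array([2.0, 1.0, 0.0, -3.0])   # contravariant components L^k

L_down = eta @ L_up                 # lower the index: L_j = eta_{jk} L^k
L_up_again = eta_inv @ L_down       # raise it back:   L^k = eta^{kj} L_j

print(np.allclose(L_up_again, L_up))   # True
```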