Calculating reciprocal base vectors

I have just started diving into tensor analysis. To be honest, I didn't know whether to post this question in the vector analysis forum or this one. I have looked at a few books on the subject and scoured the internet, but I can't seem to find anything that answers this question. Or, maybe I just didn't understand what was being said.

Say you are given a set of basis column vectors in n dimensions $\{\vec{g}_{i}\}$, for $i=1,\dots,n$, and let $G=[\vec{g}_{i}]$ be the matrix whose columns are these vectors. Can the reciprocal base vectors be calculated simply by taking the inverse of this matrix, where $G^{-1}=[\vec{g}^{i}]$ is the matrix whose rows represent the reciprocal basis $\{\vec{g}^{i}\}$? This would make sense, because the matrix product yields the identity matrix, and, when written in this form, each entry of the product is the inner product of the vectors that construct the respective matrices, as so:

$G^{-1}G=[\vec{g}^{i}][\vec{g}_{j}]=[\vec{g}^{i}\cdot\vec{g}_{j}]=[\delta^{i}_{j}]=I$

Since the inner product of two roof (or two cellar) base vectors isn't necessarily the Kronecker delta, this is the only way that makes sense to me. So, in other words, the rows of the inverse of $G$ would be the reciprocal basis. Is this correct?
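As a quick numerical sanity check of this idea (not from any particular textbook), here is a short sketch using NumPy. The matrix $G$ below is just a hypothetical non-orthogonal basis of $\mathbb{R}^3$ chosen for illustration; the rows of $G^{-1}$ are then taken as the candidate reciprocal basis and tested against the duality relation $\vec{g}^{i}\cdot\vec{g}_{j}=\delta^{i}_{j}$:

```python
import numpy as np

# Hypothetical example: a non-orthogonal basis of R^3,
# stored as the COLUMNS of G.
G = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Candidate reciprocal basis: the ROWS of G^{-1}.
G_inv = np.linalg.inv(G)

# Duality check: g^i . g_j should be the Kronecker delta,
# i.e. G^{-1} G = I.
delta = G_inv @ G
print(np.allclose(delta, np.eye(3)))  # True

# The same check, vector by vector: row i of G_inv dotted
# with column j of G is 1 when i == j and 0 otherwise.
for i in range(3):
    for j in range(3):
        dot = G_inv[i] @ G[:, j]
        assert np.isclose(dot, 1.0 if i == j else 0.0)
```

Note that when the basis is not orthonormal, the rows of $G^{-1}$ differ from the columns of $G$, which is exactly why the reciprocal basis is needed in the first place.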

Also, I have gotten through most of Simmonds' book A Brief Introduction to Tensor Analysis. For much of the book, he uses the less formal terms roof and cellar instead of contravariant and covariant. Are these latter terms applied only to the components of a vector? Or can they be applied to the vectors themselves?