lonewolf219 said:
I keep wanting to perform matrix multiplication, but ##g_{\alpha\gamma} g_{\beta\delta}## would just be the unit matrix if we did this, right?
##g_{\alpha\gamma} g_{\beta \delta}## is not matrix multiplication - it is an object with four free indices, none of which are summed over. In tensorial language, matrix multiplication corresponds to a contraction of two rank-two tensors. For example, ##A_{i j} B^{j k}## can be viewed as a matrix multiplication - try to compare this with the usual component-wise definition of matrix multiplication before reading on!
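If it helps to see this concretely, here is a minimal NumPy sketch (the arrays ##A## and ##B## are just made-up stand-ins) checking that the contraction ##A_{ij} B^{jk}## reproduces ordinary matrix multiplication:

```python
import numpy as np

# Two arbitrary 4x4 arrays standing in for rank-two tensors
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))

# Contraction over the shared index j: C_ik = sum_j A_ij B_jk
C_contraction = np.einsum('ij,jk->ik', A, B)

# Ordinary matrix multiplication gives the same result
print(np.allclose(C_contraction, A @ B))  # True
```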
As for how to evaluate the RHS of the equation, we have to perform the summations over the dummy indices. As an example, for ##\alpha = \beta = 0##, we have
$$F_{00} = \sum_{\gamma = 0}^{3}\sum_{\delta = 0}^{3} g_{0\gamma}\, g_{0\delta}\, F^{\gamma \delta}$$
where I have put the summations in explicitly just to show things clearly.
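For instance, if we take the Minkowski metric ##g = \operatorname{diag}(-1, 1, 1, 1)## (just one common sign convention, used here purely for illustration), the metric is diagonal, so the only term that survives the double sum is ##\gamma = \delta = 0##:

$$F_{00} = g_{00}\, g_{00}\, F^{00} = (-1)(-1)\, F^{00} = F^{00}.$$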
Of course there is an easier way to do it since these are rank-two tensors - recall that I mentioned earlier that matrix multiplication involves contracting one index in a product of two such tensors. So ##C_{i k} = A_{i j} B^{j k}## can be viewed as taking the i-th row of ##A##, multiplying it element-wise with the k-th column of ##B##, and summing - which is exactly what we do when we multiply matrices in the usual sense. If we rewrite the given equation as
$$F_{\alpha\beta} = g_{\alpha \gamma} F^{\gamma \delta} (g^{T})_{\delta \beta}$$
(note that I've taken the transpose of the second ##g## to reverse the index order so that we can interpret the expression as matrix multiplication - of course the metric is symmetric, so ##g^{T} = g## as a matrix, and the transpose is purely bookkeeping)
then we can write
$$\mathbf{F}' = \mathbf{g}\,\mathbf{F}\,\mathbf{g}^{T}$$
where ##\mathbf{F}'## denotes the matrix of lowered components ##F_{\alpha\beta}## and ##\mathbf{F}## the matrix of ##F^{\gamma\delta}##.
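As a quick sanity check - a sketch assuming the Minkowski metric ##\mathrm{diag}(-1,1,1,1)## and an arbitrary antisymmetric array standing in for ##F^{\gamma\delta}## - the explicit double sum agrees with the matrix product ##\mathbf{g}\,\mathbf{F}\,\mathbf{g}^{T}##:

```python
import numpy as np

# Minkowski metric (one common sign convention); symmetric, so g.T == g
g = np.diag([-1.0, 1.0, 1.0, 1.0])

# An arbitrary antisymmetric array standing in for F^{gamma delta}
rng = np.random.default_rng(1)
M = rng.normal(size=(4, 4))
F_upper = M - M.T

# Double contraction: F_{alpha beta} = g_{alpha gamma} g_{beta delta} F^{gamma delta}
F_lower = np.einsum('ag,bd,gd->ab', g, g, F_upper)

# Same thing as a matrix product: g F g^T
print(np.allclose(F_lower, g @ F_upper @ g.T))  # True
```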