# Tensor transformations

1. Apr 19, 2010

### peterjaybee

Hi,

In component form, the transformation of the Faraday tensor can be written as
$$F'^{\mu\nu}=\Lambda^{\mu}{}_{\alpha}\Lambda^{\nu}{}_{\beta}F^{\alpha\beta}$$

or in matrix notation, apparently, as
$$F'=LFL^{T}$$
Here L is the Lorentz transformation matrix.

I'm happy with the component form, but I don't understand where the transpose matrix comes from in the matrix equation, or why it sits to the right of the F tensor.

2. Apr 19, 2010

### Fredrik

Staff Emeritus
It follows immediately from the definition of the product of two matrices.

$$(AB)_{ij}=A_{ik}B_{kj}$$

(What does this definition say is on row i, column j of $$LFL^T$$?)
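The definition can also be checked numerically; here is a minimal sketch with numpy, spelling out the sum explicitly (the matrices are arbitrary examples):

```python
import numpy as np

# The definition (AB)_ij = sum_k A_ik B_kj, written out as an
# explicit triple loop and compared against numpy's matrix product.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))

AB = np.zeros((4, 4))
for i in range(4):
    for j in range(4):
        for k in range(4):
            AB[i, j] += A[i, k] * B[k, j]

assert np.allclose(AB, A @ B)
```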

3. Apr 20, 2010

### peterjaybee

I'm sorry, I still can't see it.

4. Apr 20, 2010

### Fredrik

Staff Emeritus
No need to apologize. I know a lot of people have difficulty with this. I'm genuinely interested in why that is, so when you do see it, I'd appreciate it if you could tell me what it was that confused you.

If we write the component on row $\mu$, column $\nu$, of an arbitrary matrix X as $X_{\mu\nu}$, then

$$(LFL^T)_{\mu\nu}=(LF)_{\mu\rho}(L^T)_{\rho\nu}=L_{\mu\sigma}F_{\sigma\rho}L_{\nu\rho}=L_{\mu\sigma}L_{\nu\rho}F_{\sigma\rho}$$
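The chain of equalities above can be verified numerically. A quick numpy check (the matrices here are arbitrary stand-ins, not actual Lorentz transformations):

```python
import numpy as np

# Check that (L F L^T)_{mu nu} = L_{mu sigma} L_{nu rho} F_{sigma rho}
# by comparing the matrix product against the explicit index sum.
rng = np.random.default_rng(1)
L = rng.normal(size=(4, 4))
F = rng.normal(size=(4, 4))

lhs = L @ F @ L.T                          # matrix form
rhs = np.einsum('ms,nr,sr->mn', L, L, F)   # component form

assert np.allclose(lhs, rhs)
```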

5. Apr 20, 2010

### peterjaybee

I struggle with this concept (and a lot of other index manipulations) because I find the index notation unfamiliar and a little alien. Because I don't understand it, my logic is flawed. For example, when I initially saw
$$F'^{\mu\nu}=\Lambda^{\mu}{}_{\alpha}\Lambda^{\nu}{}_{\beta}F^{\alpha\beta},$$
I thought that to get the transformed Faraday components you just multiply the two Lorentz matrices together and then right-multiply by the untransformed Faraday tensor. Even though I know this is wrong (having tried it), I do not understand why it is wrong. It is very difficult to describe.

Thanks to you, I now understand how to do the manipulation, which is a relief. The manipulation itself makes sense; I just don't understand where my logic above fails, if you see what I mean.

I'll try expressing it another way. If someone asked me,
"Can you get the transformed Faraday components by just multiplying two Lorentz matrices together and then right-multiplying by the untransformed Faraday tensor?"
I would say no, but if they then asked me why not, I would be stuck.

6. Apr 20, 2010

### dx

Maybe it will help a little if we say it in coordinate-free language. In spacetime the distinction between a vector and a covector is only conceptual, because computationally we can convert a vector into a covector, or a covector into a vector, using the metric tensor $g(\_,\_)$. So if we have a vector $v$, the covector corresponding to it is defined as $g(v,\_)$, i.e. the vector $v^\mu$ corresponds to the covector $v_\alpha$ through $v_\alpha = g_{\alpha\mu}v^\mu$.

Similarly, if we have a contravariant tensor $F^{\alpha\beta}$, we use it by contracting it with covectors $v$ and $w$ thus: $F^{\alpha\beta}v_\alpha w_\beta$. But, as we have seen above, $v_\alpha = g_{\alpha\mu}v^\mu$ and $w_\beta = g_{\beta\nu}w^\nu$. So $F^{\alpha\beta}v_\alpha w_\beta = F^{\alpha\beta}g_{\alpha\mu}v^\mu g_{\beta\nu}w^\nu = g_{\alpha\mu}g_{\beta\nu}F^{\alpha\beta}v^\mu w^\nu$.

So the contravariant tensor, which acts on pairs of covectors, can be made to act on the corresponding vectors by replacing it with the covariant tensor $g_{\alpha\mu}g_{\beta\nu}F^{\alpha\beta} = F_{\mu\nu}$.
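Index lowering with the metric can also be sketched numerically; here is a minimal numpy example, assuming the Minkowski metric with signature (+,−,−,−) (the signature choice is a convention, and the tensor is an arbitrary example):

```python
import numpy as np

# Lower both indices of a contravariant tensor with the Minkowski
# metric:  F_{mu nu} = g_{alpha mu} g_{beta nu} F^{alpha beta}
g = np.diag([1.0, -1.0, -1.0, -1.0])   # signature (+,-,-,-)
rng = np.random.default_rng(3)
F_upper = rng.normal(size=(4, 4))

F_lower = np.einsum('am,bn,ab->mn', g, g, F_upper)

# Because this g is symmetric, the contraction is the same as g F g.
assert np.allclose(F_lower, g @ F_upper @ g)
```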

7. Apr 20, 2010

### dx

Just noticed the question was not about raising and lowering indices. Ignore my previous post.

8. Apr 20, 2010

### Fredrik

Staff Emeritus
Look at the definition of matrix multiplication again, in #2. Note that the sum is always over an index that's a column index for the matrix on the left and a row index for the matrix on the right. Since $\Lambda^\mu{}_\alpha$ is row $\mu$, column $\alpha$ of a $\Lambda$, and $F^{\alpha\beta}$ is row $\alpha$, column $\beta$ of an $F$, the result

$$\Lambda^\mu{}_\alpha F^{\alpha\beta}=(\Lambda F)^{\mu\beta}$$

follows immediately from the definition of matrix multiplication. But now look at

$$\Lambda^\nu{}_\beta F^{\alpha\beta}$$

Note that the sum is over the column index of F. If you have another look at the definition of matrix multiplication, you'll see that this means that if the above is a component of the product of two matrices, one of which is F, then F must be the matrix on the left, and the other factor must be $\Lambda^T$. When you understand that, the rest should be easy.
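The point about which side F sits on can be checked numerically. A small numpy sketch (the matrices are arbitrary stand-ins, not actual Lorentz transformations):

```python
import numpy as np

# Summing over the *column* index of F puts F on the left, with the
# other matrix transposed: Lambda^nu{}_beta F^{alpha beta} = (F Lambda^T)^{alpha nu}
rng = np.random.default_rng(2)
L = rng.normal(size=(4, 4))
F = rng.normal(size=(4, 4))

assert np.allclose(np.einsum('nb,ab->an', L, F), F @ L.T)

# ...whereas summing over the *row* index of F puts F on the right:
assert np.allclose(np.einsum('ma,ab->mb', L, F), L @ F)
```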

Also note that you should LaTeX $\Lambda^\mu{}_\nu$ as \Lambda^\mu{}_\nu, so that the column index appears diagonally to the right below the row index. And check out the comment about the inverse here to see why the horizontal position of the indices matters.

9. Apr 20, 2010

### peterjaybee

I finally get it! It's a miracle.