The Lorentz transformation matrix properties

Mesmerized
Hello,

As known, any Lorentz transformation matrix ##\Lambda## must obey the relation ##\Lambda^\mu{}_\nu \Lambda^\rho{}_\sigma g_{\mu\rho} = g_{\nu\sigma}##. The same holds for the inverse metric tensor ##g^{\nu\sigma}##, which has the same components as the metric tensor itself (don't really understand why every tex formula starts from a new line), i.e. ##\Lambda^\mu{}_\nu \Lambda^\rho{}_\sigma g^{\nu\sigma} = g^{\mu\rho}##. Putting this all in matrix form, these two formulas are $$\Lambda^T g \Lambda = g, \qquad \Lambda g \Lambda^T = g \qquad (1)$$ where ##g## is the metric tensor (and also the inverse metric tensor, as they are both the same). From here one can deduce that ##\Lambda^T = \pm\Lambda##, so the Lorentz transformation matrix should be either symmetric or antisymmetric. And everything was great until today, when in Weinberg's book on quantum field theory (vol. 1, formula 2.5.26, http://www.scribd.com/doc/3082871/Steven-Weinberg-The-Quantum-Theory-of-Fields-Vol-1-Foundations , page 70, though it isn't much important) I met a Lorentz transformation matrix which is "almost" antisymmetric (it is antisymmetric, except that the entries on the main diagonal aren't zero).

So I guess I'm wrong somewhere. Isn't the Lorentz transformation matrix restricted to be either symmetric or antisymmetric? Or do the equations (1) have other solutions too?
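As a quick numerical sanity check of relation (1), here is a sketch in Python with numpy. The index ordering (time-like coordinate last, matching the ##g = \mathrm{diag}(1,1,1,-1)## used later in this thread) and the rapidity value are illustrative choices, not taken from any textbook:

```python
import numpy as np

# Flat-spacetime metric, time-like coordinate last (a convention choice)
g = np.diag([1.0, 1.0, 1.0, -1.0])

# A pure boost mixing x (index 0) and t (index 3), with arbitrary rapidity
phi = 0.7
L = np.eye(4)
L[0, 0] = L[3, 3] = np.cosh(phi)
L[0, 3] = L[3, 0] = np.sinh(phi)

# The defining condition (1): Lambda^T g Lambda = g
assert np.allclose(L.T @ g @ L, g)

# This particular Lambda happens to be symmetric, but as the replies below
# discuss, the condition itself does not force Lambda^T = +/- Lambda
assert np.allclose(L, L.T)
```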
 
Mesmerized said:
From here one can deduce that ##\Lambda^T=\pm\Lambda##

How?
 
Lorentz transformation operators are analogues of rotation operators, which aren't symmetric or antisymmetric. They are orthogonal, however, so the inverse is equal to the adjoint.
 
George Jones said:
How?

##\Lambda^T g \Lambda = \Lambda g \Lambda^T##

I thought this can be satisfied only if ##\Lambda^T = \pm\Lambda##.
 
Mesmerized said:
don't really understand why every tex formula starts from a new line...

use "itex" tags instead of "tex" to get inline tex.
 
Nugatory said:
use "itex" tags instead of "tex" to get inline tex.

thanks.

By the way, at least the most general boost should be symmetric, according to Wikipedia, http://en.wikipedia.org/wiki/Lorentz_transformation (the section 'Boost in any direction').
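That symmetry claim is easy to confirm numerically with the standard general-boost formula from that Wikipedia section (a sketch assuming units with ##c = 1## and a time-first index ordering; the `boost` helper and the sample velocity are my own illustrative choices):

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # time first, mostly-plus convention

def boost(v):
    """General boost matrix for 3-velocity v, with c = 1."""
    v = np.asarray(v, dtype=float)
    v2 = v @ v                       # |v|^2, must be < 1
    gamma = 1.0 / np.sqrt(1.0 - v2)
    L = np.eye(4)
    L[0, 0] = gamma
    L[0, 1:] = gamma * v             # time-space block
    L[1:, 0] = gamma * v
    L[1:, 1:] += (gamma - 1.0) * np.outer(v, v) / v2
    return L

B = boost([0.3, -0.2, 0.5])
assert np.allclose(B, B.T)              # a pure boost is symmetric
assert np.allclose(B.T @ eta @ B, eta)  # and is a Lorentz transformation
assert not np.allclose(B, -B.T)         # but certainly not antisymmetric
```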
 
##\Lambda^T\eta\Lambda=\eta## implies that ##\Lambda^{-1}=\eta\Lambda^T\eta##, so if ##\Lambda^T=\Lambda##, then you can invert a Lorentz transformation simply by flipping the sign of the 0th row and the 0th column (leaving the 00 element unchanged). This works in 1+1 dimensions. I don't have time to fully think through the 3+1-dimensional case right now, but I looked at a couple of my old posts (about boosts) to refresh my memory, and it seems that this works for boosts even in the 3+1-dimensional case. I don't think it will work in general.
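This inversion recipe can be verified numerically (a sketch; the boost direction and rapidity are arbitrary choices):

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # time first

# A boost along x with arbitrary rapidity
phi = 0.4
B = np.eye(4)
B[0, 0] = B[1, 1] = np.cosh(phi)
B[0, 1] = B[1, 0] = np.sinh(phi)

# Lambda^{-1} = eta Lambda^T eta holds for any Lorentz matrix
assert np.allclose(eta @ B.T @ eta, np.linalg.inv(B))

# For a symmetric Lambda this amounts to flipping the signs of the 0th row
# and 0th column (the 00 element gets flipped twice, so it is unchanged)
F = B.copy()
F[0, :] *= -1
F[:, 0] *= -1
assert np.allclose(F, np.linalg.inv(B))
```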
 
In general, the metric tensor, ##g_{\mu\nu}##, and its inverse, ##g^{\mu\nu}##, do NOT have the same components. Where did you get the idea that they did?
 
HallsofIvy said:
In general, the metric tensor, ##g_{\mu\nu}##, and its inverse, ##g^{\mu\nu}##, do NOT have the same components. Where did you get the idea that they did?
I'm talking about flat spacetime.

Fredrik said:
##\Lambda^T\eta\Lambda=\eta## implies that ##\Lambda^{-1}=\eta\Lambda^T\eta##, so if ##\Lambda^T=\Lambda##, then you can invert a Lorentz transformation simply by flipping the sign of the 0th row and the 0th column (leaving the 00 element unchanged). This works in 1+1 dimensions. I don't have time to fully think through the 3+1-dimensional case right now, but I looked at a couple of my old posts (about boosts) to refresh my memory, and it seems that this works for boosts even in the 3+1-dimensional case. I don't think it will work in general.
Yeah, right, but it seems like it always works for pure boosts (without rotations).
 
Mesmerized said:
##\Lambda^T g \Lambda = \Lambda g \Lambda^T##

I thought this can be satisfied only if ##\Lambda^T = \pm\Lambda##.

I don't think this is right. As a counterexample, rotations do satisfy the condition ##\Lambda^T g \Lambda = \Lambda g \Lambda^T##, but they don't satisfy ##\Lambda^T=\pm\Lambda##.
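This counterexample is easy to confirm numerically (a sketch; the rotation angle is arbitrary):

```python
import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])  # time first

# A rotation in the x-y plane by an arbitrary angle
th = 0.3
R = np.eye(4)
R[1, 1] = R[2, 2] = np.cos(th)
R[1, 2] = -np.sin(th)
R[2, 1] = np.sin(th)

# R satisfies both forms of the defining relation...
assert np.allclose(R.T @ eta @ R, eta)
assert np.allclose(R @ eta @ R.T, eta)
# ...yet it is neither symmetric nor antisymmetric
assert not np.allclose(R, R.T)
assert not np.allclose(R, -R.T)
```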
 
I don't have time to type in the proof, but a Lorentz transformation ##\Lambda## that is a rotation has the form

$$\Lambda = \begin{bmatrix} 1 & 0 \\ 0 & R \end{bmatrix},$$

where ##R## is a 3×3 rotation matrix that is orthogonal with respect to the spatial part of the metric and thus satisfies ##R^T = R^{-1}##. Consequently,

$$\Lambda^T = \Lambda^{-1}.$$

I haven't proved this, but it should be plausible.
 
bcrowell said:
I don't think this is right. As a counterexample, rotations do satisfy the condition ##\Lambda^T g \Lambda = \Lambda g \Lambda^T##, but they don't satisfy ##\Lambda^T=\pm\Lambda##.

Yeah, a rotation matrix is a good counterexample. So the conclusion is that one can't tell just by looking at a matrix whether it can be a Lorentz transformation or not; it has to be checked carefully against the relation ##\Lambda^T g \Lambda = g##.
 
Rotation matrices are orthogonal, but boosts are not. ##\Lambda^{-1}=\Lambda^T## is wrong.
 
Mesmerized said:
rotation matrices are orthogonal

It depends on what is meant by "orthogonal". Often "orthogonal" means orthogonal with respect to a particular bilinear form ##g##, i.e.,

$$A^T g A = g$$

is the definition of ##A## being orthogonal with respect to ##g##.

3×3 rotations with ##g## the 3×3 identity are a special case; 4×4 "rotations" with ##g## the standard "metric" of special relativity are a special case.

In the first case, the group of rotations is denoted O(3), orthogonal with respect to a bilinear form that has signature (3,0); in the second case, the group of Lorentz transformations is denoted O(3,1), orthogonal with respect to a bilinear form that has signature (3,1).
 
And talking about checking carefully against the consistency condition: the matrix in equation 2.5.26 in Weinberg's QFT book that I was talking about,

$$S^\mu{}_\nu = \begin{bmatrix} 1 & 0 & -\alpha & \alpha \\ 0 & 1 & -\beta & \beta \\ \alpha & \beta & 1-\zeta & \zeta \\ -\alpha & -\beta & -\zeta & 1+\zeta \end{bmatrix}, \qquad \zeta=(\alpha^2+\beta^2)/2,$$

seems like it doesn't satisfy the ##S^T g S = g## relation. The first row of the matrix ##S^T g## is going to be ##(1, 0, \alpha, \alpha)## (with ##g = \mathrm{diag}(1,1,1,-1)##), and when multiplying this by the third column of ##S## to get one of the off-diagonal elements of ##g##, the result isn't zero. Either I'm slowly starting to mess everything up after several hours of working with matrix equations, or there's a problem in Weinberg's book.
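For what it's worth, the post's arithmetic checks out numerically (a sketch using ##g = \mathrm{diag}(1,1,1,-1)## as stated, with arbitrary sample values for ##\alpha## and ##\beta##):

```python
import numpy as np

a, b = 0.3, 0.5              # arbitrary sample values for alpha, beta
z = (a**2 + b**2) / 2        # zeta
g = np.diag([1.0, 1.0, 1.0, -1.0])  # metric as stated in the post

# The matrix as printed (the one under suspicion)
S = np.array([
    [ 1,  0, -a,     a    ],
    [ 0,  1, -b,     b    ],
    [ a,  b,  1 - z, z    ],
    [-a, -b, -z,     1 + z],
])

M = S.T @ g @ S
# The (0, 2) element should be 0 but comes out as -2*alpha*zeta...
assert np.isclose(M[0, 2], -2 * a * z)
# ...so this S, as printed, is not a Lorentz transformation
assert not np.allclose(M, g)
```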
 
Mesmerized said:
rotation matrices are orthogonal, but boosts are not. \Lambda^{-1}=\Lambda^T is wrong

$$\underline L(e_0)= e_0 \cosh \phi+ e_1 \sinh \phi$$
$$\underline L(e_1)= e_1 \cosh \phi+ e_0 \sinh \phi$$

This operator represents a pure boost. It is, crucially, not symmetric, because one of the basis vectors must dot with itself to negative one to represent a hyperbolic geometry. The adjoint is

$$\overline L(e_0)= e_0 \cosh \phi - e_1 \sinh \phi$$
$$\overline L(e_1)= e_1 \cosh \phi - e_0 \sinh \phi$$

It's clear that ##\overline L \, \underline L(a)=a## for any vector ##a##. This makes the operator orthogonal.

Edit: this is one reason to avoid the word "transpose". Finding the adjoint of an operator in a mixed-signature space is no longer as simple as taking a transpose.
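The distinction being drawn here can be made concrete in 1+1 dimensions. In matrix form the metric adjoint is ##\overline L = \eta L^T \eta## (a sketch; the rapidity value is arbitrary):

```python
import numpy as np

eta = np.diag([-1.0, 1.0])  # 1+1-dimensional metric, e_0 . e_0 = -1
phi = 0.6
L = np.array([[np.cosh(phi), np.sinh(phi)],
              [np.sinh(phi), np.cosh(phi)]])

adjL = eta @ L.T @ eta      # the metric adjoint, in matrix form

assert np.allclose(L, L.T)              # the array of components is symmetric...
assert not np.allclose(adjL, L)         # ...but L is not self-adjoint w.r.t. eta
assert np.allclose(adjL @ L, np.eye(2)) # the adjoint undoes the boost
```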
 
Muphrid said:
$$\underline L(e_0)= e_0 \cosh \phi+ e_1 \sinh \phi$$
$$\underline L(e_1)= e_1 \cosh \phi+ e_0 \sinh \phi$$

This operator represents a pure boost. It is, crucially, not symmetric

How? It is symmetric. Just put it in matrix form.

Muphrid said:
The adjoint is
$$\overline L(e_0)= e_0 \cosh \phi - e_1 \sinh \phi$$
$$\overline L(e_1)= e_1 \cosh \phi - e_0 \sinh \phi$$

It's clear that ##\overline L \, \underline L(a)=a## for any vector ##a##. This makes the operator orthogonal.

Edit: this is one reason to avoid the word "transpose". Finding the adjoint of an operator in a mixed-signature space is no longer as simple as taking a transpose.
I'm trying to understand what you have written here, and it seems like you've found one particular example of a Lorentz transformation which is orthogonal, but I think in general it isn't.
 
The matrix's components may make a nice "symmetric" pattern, but the operator is not equal to its adjoint, therefore it is not symmetric in the strictest sense of linear algebra.
 
Thanks, Muphrid, but I guess you're a little too clever for me for now. We'll talk again in half a year :)

And about that matrix from Weinberg's book I was talking about: it turned out it really was written wrong, but only in the Russian translation, which I'm reading. In the original it's perfectly right.
 