
## Summary:

- I want to prove certain properties related to Lorentz Transformations.

## Main Question or Discussion Point

This exercise was proposed by samalkhaiat here.

Given the defining property of Lorentz transformation [itex]\eta_{\mu\nu}\Lambda^{\mu}{}_{\rho}\Lambda^{\nu}{}_{\sigma} = \eta_{\rho \sigma}[/itex], prove the following identities

(i) [itex]\ (\Lambda k) \cdot (\Lambda x) = k \cdot x[/itex]

(ii) [itex]\ p \cdot (\Lambda x) = (\Lambda^{-1}p) \cdot x[/itex]

(iii) [itex]\ \left(\Lambda^{-1}k \right)^{2} = k^{2}[/itex]

Where ##\Lambda## is a linear transformation
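For concreteness, the defining property can be checked numerically for a specific ##\Lambda##. In matrix form it reads ##\Lambda^T \eta \Lambda = \eta##, which is *not* orthogonality unless ##\eta## is the identity. A minimal sketch in Python/NumPy, using a boost along ##x## (the velocity value is an arbitrary illustrative choice):

```python
import numpy as np

# Minkowski metric with signature (+, -, -, -)
eta = np.diag([1.0, -1.0, -1.0, -1.0])

# A boost along the x-axis with velocity v (units with c = 1)
v = 0.6
gamma = 1.0 / np.sqrt(1.0 - v**2)
Lam = np.array([
    [gamma,      -gamma * v, 0.0, 0.0],
    [-gamma * v,  gamma,     0.0, 0.0],
    [0.0,         0.0,       1.0, 0.0],
    [0.0,         0.0,       0.0, 1.0],
])

# eta_{mu nu} Lam^mu_rho Lam^nu_sigma = eta_{rho sigma}  <=>  Lam^T eta Lam = eta
print(np.allclose(Lam.T @ eta @ Lam, eta))  # True
```

Note that ##\Lambda^T \Lambda \neq I## here, so a Lorentz matrix is generally not orthogonal in the Euclidean sense.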

My attempt

Alright, let us work in the inner-product vector space ##(\Re, \Re^n, +, \langle \cdot , \cdot \rangle )##. Addition is defined as usual, and the inner product is the standard inner product over ##\Re^n##:

$$\langle \cdot , \cdot \rangle : \Re^n \times \Re^n \rightarrow \Re : \left( X = \begin{pmatrix} x^1 \\ \vdots \\ x^n \end{pmatrix},\ Y = \begin{pmatrix} y^1 \\ \vdots \\ y^n \end{pmatrix} \right) \mapsto \sum_{i=1}^n x_i y_i = X^T \cdot Y$$

(i) Applying the definition of ##\langle \cdot, \cdot \rangle##:

$$X^T \cdot Y=(\Lambda k)^T \cdot (\Lambda x)=(k^T \Lambda^T) \cdot \Lambda x $$

Well, I get the answer iff I assume that ##\Lambda## is an orthogonal matrix (i.e. ##\Lambda^{-1}=\Lambda^{T}##) and ##k## is a symmetric matrix (i.e. ##k^{T}=k##):

$$(k^T \Lambda^T) \cdot \Lambda x = (k^T \Lambda^{-1}) \cdot \Lambda x = k \cdot x$$

Mmm, it doesn't *smell* good; too many restrictions, I think.

(ii) Let's apply the definition of the given inner product to the RHS:

$$X^T \cdot Y= (\Lambda^{-1}p)^{T} \cdot x$$

Assuming ##\Lambda## is an orthogonal matrix and ##p## is symmetric I indeed get the answer, but I've got the same issue as in (i).

(iii) I get it if I also assume that ##\Lambda## is involutory (i.e. raising ##\Lambda## to an even power yields the identity matrix, and raising it to an odd power yields ##\Lambda##).

Is my reasoning correct? If yes, I think there may be a better way of proving them all (without that many assumptions).

Side note: ##\Re^4## is the most common space-time vector space; it is the four-dimensional real vector space consisting of all 4-tuples (i.e. ##x=(x^0, x^1, x^2, x^3)##), and the inner product here satisfies ##X^T \cdot Y = x^0y^0 - \vec x \cdot \vec y##
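Using this Minkowski product, all three identities can be sanity-checked numerically without assuming ##\Lambda## is orthogonal or that the vectors are "symmetric". A minimal sketch (the boost velocity and the sample four-vectors are arbitrary illustrative choices):

```python
import numpy as np

eta = np.diag([1.0, -1.0, -1.0, -1.0])  # signature (+, -, -, -)

def minkowski_dot(a, b):
    # a . b = a^0 b^0 - vec(a) . vec(b)
    return a @ eta @ b

# A boost along x with v = 0.6 (c = 1); any Lorentz matrix would do
v = 0.6
gamma = 1.0 / np.sqrt(1.0 - v**2)
Lam = np.eye(4)
Lam[:2, :2] = [[gamma, -gamma * v], [-gamma * v, gamma]]
Lam_inv = np.linalg.inv(Lam)

# Arbitrary sample four-vectors
k = np.array([2.0, 1.0, 0.5, -0.3])
x = np.array([1.0, -1.0, 2.0, 0.7])
p = np.array([3.0, 0.2, -0.1, 1.1])

print(np.allclose(minkowski_dot(Lam @ k, Lam @ x),
                  minkowski_dot(k, x)))                   # (i)   True
print(np.allclose(minkowski_dot(p, Lam @ x),
                  minkowski_dot(Lam_inv @ p, x)))         # (ii)  True
print(np.allclose(minkowski_dot(Lam_inv @ k, Lam_inv @ k),
                  minkowski_dot(k, k)))                   # (iii) True
```

All three checks pass because the defining property gives ##\Lambda^T \eta \Lambda = \eta##, not ##\Lambda^T \Lambda = I##; the transpose that appears in the matrix manipulations is always accompanied by ##\eta##.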

Any help is appreciated.

Thank you
