
Writing the Einstein action in terms of tetrads

  1. Sep 5, 2012 #1



    In the paper http://arxiv.org/pdf/hep-th/9301028.pdf pages 8-9 Pullin shows how to write the Einstein action in terms of tetrads [itex]e^a_I[/itex]. Part of the proof is: "...the last term yields [itex]e_M^{[a} e_N^{b]} \delta^M_{[I} \delta^K_{J]} C_{bK}^{\;\;\; N}[/itex]. It is easy to check that the prefactor in this expression is nondegenerate..."

    Can somebody explain how you do this check?
  3. Sep 6, 2012 #2
    I think you want to check that the antisymmetrizations do not cause the prefactor to be zero for any indices a,I,J; that is all.
  4. Sep 6, 2012 #3



    Could you elaborate a bit please?

    I know that when you have [itex]M_{ab} v_b = 0[/itex], if [itex]M_{ab}[/itex] is non-degenerate (has no zero eigenvalues, or equivalently a non-zero determinant), it implies [itex]v_b = 0[/itex].

    Is Pullin's case the same, even though we are dealing with the direct product of three matrices? Do we need to show that the determinant of the direct-product matrix is non-zero?

    Are the antisymmetrizations you mentioned related to the calculation of this determinant?

    Or is it something simpler?
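    To make the linear-algebra fact above concrete, here is a minimal sketch in plain Python (the helper name `det2` and the example matrices are mine, not from the thread): a degenerate matrix annihilates a nonzero vector, while a non-degenerate one only annihilates zero.

    ```python
    # Sketch of the nondegeneracy criterion: a matrix with zero determinant
    # has a nonzero null vector; one with nonzero determinant does not.

    def det2(M):
        """Determinant of a 2x2 matrix."""
        return M[0][0] * M[1][1] - M[0][1] * M[1][0]

    M_degenerate = [[1, 2], [2, 4]]      # det = 0, rows are proportional
    M_nondegenerate = [[1, 2], [3, 4]]   # det = -2 != 0

    # The degenerate matrix kills the nonzero vector v = (2, -1):
    v = (2, -1)
    Mv = tuple(M_degenerate[r][0] * v[0] + M_degenerate[r][1] * v[1]
               for r in range(2))
    # Mv == (0, 0) even though v != 0; for the non-degenerate matrix,
    # M v = 0 forces v = 0.
    ```
    
    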
  5. Sep 6, 2012 #4
    Yeah, it's exactly like that, except now you have some Euclidean indices from the tetrads cluttering the notation. But your equation basically reads [itex] M^{ab} V_b = 0 [/itex], where now your [itex]M[/itex] is a tensor product of two vectors. Remember also that the tetrad vectors are orthogonal to each other. Perhaps that helps? I'm not entirely sure, I didn't actually bother to do it :D Whenever something reads "it is easy to check that..." you're probably going to have a bad time.
  6. Sep 13, 2012 #5



    If [itex]A_{ij}[/itex] is an [itex]m \times m [/itex] matrix and [itex]B_{kl}[/itex] is an [itex]n \times n[/itex] matrix, the direct product is

    [itex]C = A \otimes B[/itex]

    where [itex]C[/itex] is an [itex]mn \times mn[/itex] matrix with elements

    [itex]C_{\alpha \beta} = A_{ij} B_{kl}[/itex]

    where

    [itex]\alpha = n (i-1) + k , \;\;\;\; \beta = n(j-1) + l[/itex].
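    The index bookkeeping above can be checked with a short plain-Python sketch (the function name `kron_product` and the sample matrices are mine); it builds [itex]C[/itex] entry by entry from the 1-based formula [itex]\alpha = n(i-1)+k[/itex], [itex]\beta = n(j-1)+l[/itex].

    ```python
    # Sketch: build the direct (Kronecker) product C from the index formula
    # above, using 1-based indices i, j (for A) and k, l (for B) as in the post.

    def kron_product(A, B):
        """C[alpha][beta] = A[i][j] * B[k][l], alpha = n(i-1)+k, beta = n(j-1)+l."""
        m, n = len(A), len(B)
        C = [[0] * (m * n) for _ in range(m * n)]
        for i in range(1, m + 1):
            for j in range(1, m + 1):
                for k in range(1, n + 1):
                    for l in range(1, n + 1):
                        alpha = n * (i - 1) + k
                        beta = n * (j - 1) + l
                        C[alpha - 1][beta - 1] = A[i - 1][j - 1] * B[k - 1][l - 1]
        return C

    A = [[1, 2], [3, 4]]
    B = [[0, 5], [6, 7]]
    C = kron_product(A, B)
    # C is the 4x4 block matrix [[1*B, 2*B], [3*B, 4*B]]
    ```
    
    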

    The determinant of [itex]C_{\alpha \beta}[/itex] is given by the usual formula

    [tex]\det C = {1 \over (mn)!} \sum_{\alpha_1, \beta_1} \cdots \sum_{\alpha_{mn}, \beta_{mn}} \epsilon_{\alpha_1 \dots \alpha_{mn}} \epsilon_{\beta_1 \dots \beta_{mn}} C_{\alpha_1 \beta_1} \dots C_{\alpha_{mn} \beta_{mn}}[/tex]

    In terms of [itex]A_{ij}[/itex] and [itex]B_{kl}[/itex] this becomes

    [tex]\det C = {1 \over (mn)!} \sum_{i_1, j_1, k_1, l_1} \cdots \sum_{i_{mn}, j_{mn}, k_{mn}, l_{mn}} \epsilon_{n(i_1 - 1) + k_1 \,\cdots\, n(i_{mn} - 1) + k_{mn}} \, \epsilon_{n(j_1 - 1) + l_1 \,\cdots\, n(j_{mn} - 1) + l_{mn}} \, A_{i_1 j_1} B_{k_1 l_1} \dots A_{i_{mn} j_{mn}} B_{k_{mn} l_{mn}}[/tex]

    Looks a bit daunting.
    Last edited: Sep 13, 2012
  7. Sep 14, 2012 #6



    Actually, looking into it, there is the simple formula

    [itex]\det C = (\det A)^n (\det B)^m[/itex].

    This is easy to see when [itex]A[/itex] and [itex]B[/itex] are diagonal, which suggests a way to prove the result whenever [itex]A[/itex] and [itex]B[/itex] can be diagonalised.
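    The determinant identity can be spot-checked numerically. Below is a minimal plain-Python sketch (helper names `det` and `kron` are mine, not from the thread) that forms the direct product of two [itex]2 \times 2[/itex] matrices and compares both sides of [itex]\det C = (\det A)^n (\det B)^m[/itex].

    ```python
    # Sketch: numerically verify det(A (x) B) = (det A)^n (det B)^m for small
    # matrices, using cofactor expansion (fine for these tiny sizes).

    def det(M):
        """Determinant by Laplace expansion along the first row."""
        size = len(M)
        if size == 1:
            return M[0][0]
        total = 0
        for col in range(size):
            minor = [row[:col] + row[col + 1:] for row in M[1:]]
            total += (-1) ** col * M[0][col] * det(minor)
        return total

    def kron(A, B):
        """Direct product with alpha = n(i-1)+k, beta = n(j-1)+l (0-based here)."""
        m, n = len(A), len(B)
        return [[A[a // n][b // n] * B[a % n][b % n]
                 for b in range(m * n)] for a in range(m * n)]

    A = [[1, 2], [3, 4]]   # m = 2, det A = -2
    B = [[2, 1], [0, 3]]   # n = 2, det B = 6
    C = kron(A, B)

    lhs = det(C)
    rhs = det(A) ** len(B) * det(B) ** len(A)   # (det A)^n (det B)^m
    # lhs == rhs == (-2)**2 * 6**2 = 144
    ```
    
    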
    Last edited: Sep 14, 2012
  8. Apr 5, 2013 #7



    Proof of original statement...

    http://dl.dropbox.com/u/81787406/nondegen.pdf [Broken]
    Last edited by a moderator: May 6, 2017