- #1

julian

Gold Member


Can somebody explain how you do this check?



- #2


- #3

julian

Gold Member


I know that when you have [itex]M_{ab} v_b = 0[/itex], if [itex]M_{ab}[/itex] is non-degenerate (has no zero eigenvalues, or equivalently a non-zero determinant), it implies [itex]v_b = 0[/itex].

Is Pullin's case the same, even though we are dealing with the direct product of three matrices? Do we need to show that the determinant of the direct product matrix is non-zero?

Are the antisymmetrizations you mentioned related to the calculation of this determinant?

Or is it something simpler?
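To make the non-degeneracy statement concrete, here is a small numerical sketch (using NumPy; the matrices are made up for illustration, not taken from Pullin): an invertible [itex]M[/itex] forces the only solution of [itex]M v = 0[/itex] to be [itex]v = 0[/itex], while a degenerate [itex]M[/itex] admits a non-trivial null vector.

```python
import numpy as np

# Non-degenerate matrix: det M != 0, so M v = 0 has only the trivial solution.
M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
assert abs(np.linalg.det(M)) > 1e-12  # non-zero determinant

# Solving M v = 0 returns v = 0 (the unique solution).
v = np.linalg.solve(M, np.zeros(2))
print(v)  # [0. 0.]

# Contrast: a degenerate matrix (det = 0) has a non-trivial null vector.
M_deg = np.array([[1.0, 2.0],
                  [2.0, 4.0]])
v_deg = np.array([2.0, -1.0])  # M_deg @ v_deg = 0 although v_deg != 0
print(M_deg @ v_deg)  # [0. 0.]
```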

- #4


- #5

julian

Gold Member


If [itex]A_{ij}[/itex] is an [itex]m \times m[/itex] matrix and [itex]B_{kl}[/itex] is an [itex]n \times n[/itex] matrix, the direct product is

[itex]C = A \otimes B[/itex]

where [itex]C[/itex] is an [itex]mn \times mn[/itex] matrix with elements

[itex]C_{\alpha \beta} = A_{ij} B_{kl}[/itex]

with

[itex]\alpha = n (i-1) + k , \;\;\;\; \beta = n(j-1) + l[/itex].

The determinant of [itex]C_{\alpha \beta}[/itex] is given by the usual formula

[itex]\det C = {1 \over (mn)!} \sum_{\alpha_1, \beta_1} \cdots \sum_{\alpha_{mn}, \beta_{mn}} \epsilon_{\alpha_1 \dots \alpha_{mn}} \epsilon_{\beta_1 \dots \beta_{mn}} C_{\alpha_1 \beta_1} \cdots C_{\alpha_{mn} \beta_{mn}}[/itex].

In terms of [itex]A_{ij}[/itex] and [itex]B_{kl}[/itex] this becomes

[itex]\det C = {1 \over (mn)!} \sum_{i_1, j_1, k_1, l_1} \cdots \sum_{i_{mn}, j_{mn}, k_{mn}, l_{mn}} \epsilon_{n(i_1 - 1) + k_1, \, \dots, \, n(i_{mn} - 1) + k_{mn}} \, \epsilon_{n(j_1 - 1) + l_1, \, \dots, \, n(j_{mn} - 1) + l_{mn}} \, A_{i_1 j_1} B_{k_1 l_1} \cdots A_{i_{mn} j_{mn}} B_{k_{mn} l_{mn}}[/itex].

Looks a bit daunting.
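The index map [itex]\alpha = n(i-1) + k[/itex], [itex]\beta = n(j-1) + l[/itex] can be checked numerically: it is exactly how NumPy's `np.kron` lays out the Kronecker product (a sketch with random matrices, shifted to 0-based indices, so the 1-based formula becomes [itex]\alpha = n i + k[/itex]):

```python
import numpy as np

m, n = 2, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((m, m))  # m x m
B = rng.standard_normal((n, n))  # n x n

C = np.kron(A, B)  # the mn x mn direct (Kronecker) product
assert C.shape == (m * n, m * n)

# Verify C_{alpha beta} = A_{ij} B_{kl} with alpha = n(i-1)+k, beta = n(j-1)+l.
# NumPy is 0-based, so the formula becomes alpha = n*i + k, beta = n*j + l.
for i in range(m):
    for j in range(m):
        for k in range(n):
            for l in range(n):
                alpha = n * i + k
                beta = n * j + l
                assert np.isclose(C[alpha, beta], A[i, j] * B[k, l])

print("index map verified")
```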



- #6

julian

Gold Member


Actually, looking into it, there is the simple formula

[itex]\det C = (\det A)^n (\det B)^m[/itex].

This is easy to see if [itex]A[/itex] and [itex]B[/itex] are diagonal, which suggests a way to prove the result when [itex]A[/itex] and [itex]B[/itex] can be diagonalised.
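A quick numerical sanity check of this determinant identity (a sketch with random matrices, not from the thread):

```python
import numpy as np

m, n = 3, 2
rng = np.random.default_rng(1)
A = rng.standard_normal((m, m))  # m x m
B = rng.standard_normal((n, n))  # n x n

# det(A kron B) should equal (det A)^n (det B)^m.
lhs = np.linalg.det(np.kron(A, B))
rhs = np.linalg.det(A) ** n * np.linalg.det(B) ** m
print(np.isclose(lhs, rhs))  # True
```

In particular, [itex]\det C \neq 0[/itex] exactly when both [itex]\det A \neq 0[/itex] and [itex]\det B \neq 0[/itex], which is the non-degeneracy check in question.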



- #7

julian

Gold Member


Proof of original statement...

http://dl.dropbox.com/u/81787406/nondegen.pdf [Broken]

