Tensors Notation - Summation Convention - meaning of (a_ij)*(a_ij)

Discussion Overview

The discussion centers on the summation convention in tensor notation, specifically examining the implications of multiplying elements of a matrix and the resulting summations. Participants explore the properties of orthogonal matrices and the interpretation of the Kronecker delta in this context.

Discussion Character

  • Technical explanation
  • Debate/contested

Main Points Raised

  • One participant explains the summation convention, stating that when indices appear twice, summation over that index is implied.
  • The same participant asserts that for an orthogonal matrix, contracting products of its elements over a repeated index yields the Kronecker delta: a_iα a_iβ = δ_αβ.
  • Another participant points out that the fully contracted delta, δ^α_α, is the trace of the identity matrix, which sums to the dimension of the space (3 in this case), not 1.
  • There is a clarification that the expression being questioned involves the sum of the squares of all elements of the tensor, rather than a simpler product.
  • A later reply indicates that the original poster has resolved their confusion after considering a subtle point from their reference material.
  • Participants emphasize the importance of correctly writing upper and lower indices in tensor notation.

Areas of Agreement / Disagreement

Participants initially differ over how the summation convention applies to the contracted Kronecker delta. The disagreement is settled once the original poster revisits a subtle point in their reference material and reports that the confusion is resolved.

Contextual Notes

There are nuances in the interpretation of tensor notation, particularly the distinction between a single diagonal element δ_αβ (with α = β held fixed) and the contracted sum δ^α_α. The discussion highlights how easily the summation convention can be misread when an index is repeated.

metalrose
The summation convention in tensor notation says that we can omit the summation signs and simply understand a summation over any index that appears twice.

So consider a 3×3 matrix A whose elements are denoted by a_ij, where i and j are indices running from 1 to 3.

Now consider the product a_iα a_iβ.

Using the summation convention described above, the summation here would be over the index i since it occurs twice.

Now if the matrix A is orthogonal, then the elements of any row or column can be thought of as the components of a unit vector, and these vectors are mutually orthogonal.
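[Editor's aside, not part of the original post: this orthonormality, and the relation a_iα a_iβ = δ_αβ discussed below, can be checked numerically. The rotation matrix here is just an illustrative choice of orthogonal A; NumPy's einsum mirrors the index notation directly.]

```python
import numpy as np

# Illustrative orthogonal matrix (a rotation about the z-axis);
# any orthogonal 3x3 matrix would do.
theta = 0.3
A = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# Sum over the repeated index i: a_ia a_ib -> delta_ab,
# i.e. the columns of A are orthonormal.
delta = np.einsum("ia,ib->ab", A, A)
print(np.allclose(delta, np.eye(3)))  # True: the Kronecker delta
```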

So, a_iα a_iβ = δ_αβ,

where δ is the Kronecker delta.

Now what if α=β?

According to the above equation, a_iα a_iα should equal 1, since δ_αβ = 1 for α = β.

But if we write it as a_iα a_iα, then by the summation convention this means a summation over both i and α.

First summing over α: this means multiplying each element of the i-th row by itself and adding the results.
This will equal 1 for each i, as a result of A being orthogonal.

Now summing over i, we'll get 1 + 1 + 1 = 3.

Also, if we had summed over i first and then over α, we would again have got 3·1 = 3.

Where am I going wrong??
 
If you say "δ_αβ = 1 for α = β", you are not summing; you are picking out a single diagonal element, regarding the delta as a matrix!

However, you are summing. So:

δ^α_α = 1 + 1 + 1 = 3

It's just the trace of the identity matrix, and I think you'll agree that that's equal to the dimension of the space you're working in (3) ;)

Btw, it's a good habit to write upper and lower indices, even though in flat space and Euclidean coordinates these are equivalent.
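[Editor's aside, not part of the reply: the point above is a one-liner to verify numerically, since the contracted delta is just the trace of the identity matrix.]

```python
import numpy as np

# delta^a_a with the summation convention: the trace of the 3x3
# identity matrix, which equals the dimension of the space.
delta = np.eye(3)
print(np.trace(delta))  # 3.0
```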
 
haushofer said:
If you say "δ_αβ = 1 for α = β", you are not summing; you are picking out a single diagonal element, regarding the delta as a matrix!

However, you are summing. So:

δ^α_α = 1 + 1 + 1 = 3

It's just the trace of the identity matrix, and I think you'll agree that that's equal to the dimension of the space you're working in (3) ;)

Btw, it's a good habit to write upper and lower indices, even though in flat space and Euclidean coordinates these are equivalent.

What you seem to have done is this :

Σ_i a_ii

However, my question is regarding

Σ_{i,j} (a_ij)(a_ij)
 
metalrose said:
What you seem to have done is this :

Σ_i a_ii

However, my question is regarding

Σ_{i,j} (a_ij)(a_ij)

That would be the sum of the squares of all elements of the tensor.
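[Editor's aside, not part of the reply: for an orthogonal matrix this sum of squares also comes out to 3, since Σ_{i,j} (a_ij)² = tr(AᵀA) = tr(I) = 3. The rotation matrix below is an illustrative choice of orthogonal A, as before.]

```python
import numpy as np

# Illustrative 3x3 orthogonal matrix (a rotation about the z-axis).
theta = 0.3
A = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

# Both i and j are repeated, so both are summed:
# sum_{i,j} a_ij * a_ij = tr(A^T A) = tr(I) = 3.
total = np.einsum("ij,ij->", A, A)
print(round(total, 10))  # 3.0
```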
 
I missed out on a subtle point in my book. Hence the confusion.

I think I am clear now. Thanks for the replies anyway.
 
metalrose said:
What you seem to have done is this :

Σ_i a_ii

However, my question is regarding

Σ_{i,j} (a_ij)(a_ij)

No, I didn't. Check your expression for A·Aᵀ.
 
