Tensor Notation - Summation Convention - meaning of (a_ij)*(a_ij)

  • Thread starter metalrose
  • #1
metalrose
The summation convention for Tensor Notation says, that we can omit the summation signs and simply understand a summation over any index that appears twice.
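For example, for two vectors with components a_i and b_i (the b_i here is just an illustrative second vector), the convention means

[tex]
a_i b_i \equiv \sum_{i=1}^{3} a_i b_i = a_1 b_1 + a_2 b_2 + a_3 b_3 .
[/tex]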

So consider a 3×3 matrix A whose elements are denoted by a_{ij}, where i and j are indices running from 1 to 3.

Now consider the multiplication a_{iα} a_{iβ}.

Using the summation convention described above, the summation here would be over the index i since it occurs twice.

Now if the matrix A is orthogonal, then its rows (and likewise its columns) can be thought of as vectors of magnitude 1 that are mutually orthogonal.

So,

[tex]
a_{i\alpha} a_{i\beta} = \delta_{\alpha\beta} ,
[/tex]

where δ_{αβ} is the Kronecker delta.
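For concreteness, taking α = 1 and β = 2, this says that the first and second columns of A are orthogonal:

[tex]
a_{i1} a_{i2} = a_{11} a_{12} + a_{21} a_{22} + a_{31} a_{32} = 0 .
[/tex]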

Now what if α=β?

According to the above equation, a_{iα} a_{iα} should equal 1, since δ_{αβ} = 1 for α = β.

But if we write it as a_{iα} a_{iα}, then by the summation convention this means a summation over both i and α.

First summing over α, for a fixed i, amounts to summing the squares of the elements of the i-th row.
This equals 1, as a result of A being orthogonal.

Now summing over i, we get 3·1 = 3.

Likewise, if we had summed over i first and then over α, we would again have got 3·1 = 3.

So the double sum gives 3, but the equation above says it should be 1. Where am I going wrong??
 

Answers and Replies

  • #2
haushofer
Science Advisor
Insights Author
If you say "δ_{αβ} = 1 for α = β" you are not summing; you are just looking at a single diagonal element, regarding the delta as a matrix!

However, you are summing. So:

[tex]
\delta^{\alpha}_{\alpha} = 1 + 1 + 1 = 3
[/tex]

It's just the trace of the identity matrix, and I think you'll agree that that's equal to the dimension of the space you're working in (3) ;)
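Written out explicitly, the double sum you were doing is

[tex]
a_{i\alpha} a_{i\alpha} = \sum_{\alpha=1}^{3} \left( \sum_{i=1}^{3} a_{i\alpha} a_{i\alpha} \right) = \sum_{\alpha=1}^{3} 1 = 3 ,
[/tex]

so getting 3 is correct; the value 1 only applies to a single, un-summed diagonal entry.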

Btw, it's a good habit to write upper and lower indices, even though in flat space and Euclidean coordinates these are equivalent.
 
  • #3
metalrose
haushofer said:
If you say "δ_{αβ} = 1 for α = β" you are not summing; you are just looking at a single diagonal element, regarding the delta as a matrix!

However, you are summing. So:

[tex]
\delta^{\alpha}_{\alpha} = 1 + 1 + 1 = 3
[/tex]

It's just the trace of the identity matrix, and I think you'll agree that that's equal to the dimension of the space you're working in (3) ;)

Btw, it's a good habit to write upper and lower indices, even though in flat space and Euclidean coordinates these are equivalent.
What you seem to have done is this:

[tex]
\sum_{i} a_{ii}
[/tex]

However, my question is regarding

[tex]
\sum_{i,j} a_{ij} \, a_{ij}
[/tex]
 
  • #4
HallsofIvy
Science Advisor
Homework Helper
metalrose said:
What you seem to have done is this:

[tex]
\sum_{i} a_{ii}
[/tex]

However, my question is regarding

[tex]
\sum_{i,j} a_{ij} \, a_{ij}
[/tex]
That would be the sum of the squares of all elements of the tensor.
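A quick numerical sanity check (just a sketch, assuming NumPy is available; the rotation matrix below is an arbitrary choice of orthogonal A):

[code]
import numpy as np

# An arbitrary orthogonal 3x3 matrix: rotation by 0.7 rad about the z-axis.
c, s = np.cos(0.7), np.sin(0.7)
A = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])

# Contract both repeated indices: a_ij * a_ij = sum of squares of all entries.
print(np.sum(A * A))        # ~3.0, the dimension of the space

# The same number via the trace, since A A^T = I for an orthogonal matrix.
print(np.trace(A @ A.T))    # also ~3.0
[/code]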
 
  • #5
metalrose
I missed a subtle point in my book; hence the confusion.

I think I am clear now. Thanks for the replies anyway.
 
  • #6
haushofer
Science Advisor
Insights Author
metalrose said:
What you seem to have done is this:

[tex]
\sum_{i} a_{ii}
[/tex]

However, my question is regarding

[tex]
\sum_{i,j} a_{ij} \, a_{ij}
[/tex]
No, I didn't. Check your expression for A A^T.
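In index notation, that product and its trace read

[tex]
(A A^{T})_{\alpha\beta} = a_{\alpha i}\, a_{\beta i} = \delta_{\alpha\beta} ,
\qquad
\sum_{i,j} a_{ij}\, a_{ij} = \mathrm{tr}(A A^{T}) = \mathrm{tr}(I) = 3 .
[/tex]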
 
