# Tensors Notation - Summation Convention - meaning of (a_ij)*(a_ij)

• metalrose
In summary, the summation convention for tensor notation allows us to omit summation signs: a summation is understood over any index that appears twice. For a 3x3 matrix $A$ with elements $a_{ij}$, the product $a_{i\alpha}a_{i\beta}$ implies a summation over the repeated index $i$. If $A$ is an orthogonal matrix, then $a_{i\alpha}a_{i\beta}=\delta_{\alpha\beta}$, where $\delta$ is the Kronecker delta. However, when $\alpha=\beta$, the result is not necessarily 1, because the summation then runs over both $i$ and $\alpha$ (or $\beta$); the expression equals the trace of the identity matrix, which is the dimension of the space being worked in.
metalrose
The summation convention for tensor notation says that we can omit the summation signs and simply understand a summation over any index that appears twice.

So consider a 3x3 matrix $A$ whose elements are denoted by $a_{ij}$, where $i$ and $j$ are indices running from 1 to 3.

Now consider the multiplication

$$a_{i\alpha}a_{i\beta}$$

Using the summation convention described above, the summation here would be over the index i since it occurs twice.

Now if the matrix $A$ is an orthogonal matrix, then it has the property that the elements of any row or column can be thought of as the components of a unit vector, and that these vectors are mutually orthogonal.

So,

$$a_{i\alpha}a_{i\beta} = \delta_{\alpha\beta}$$

Where $\delta$ is the Kronecker delta.
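This orthogonality relation can be checked numerically. A minimal Python sketch, using a rotation matrix as the orthogonal matrix (the 30° angle is just an illustrative choice; any orthogonal matrix works):

```python
import math

# A sample 3x3 orthogonal matrix: rotation by 30 degrees about the z-axis
# (an illustrative choice; any orthogonal matrix works).
t = math.radians(30)
a = [[math.cos(t), -math.sin(t), 0.0],
     [math.sin(t),  math.cos(t), 0.0],
     [0.0,          0.0,         1.0]]

# Write the implicit sum over the repeated index i as an explicit loop:
# gram[alpha][beta] = a_{i alpha} a_{i beta}, which should equal
# delta_{alpha beta} (1 on the diagonal, 0 elsewhere).
gram = [[sum(a[i][al] * a[i][be] for i in range(3)) for be in range(3)]
        for al in range(3)]

for al in range(3):
    for be in range(3):
        expected = 1.0 if al == be else 0.0
        assert abs(gram[al][be] - expected) < 1e-12

print("a_{i alpha} a_{i beta} = delta_{alpha beta}: columns are orthonormal")
```

Note that for each fixed pair (α, β) the sum runs only over the repeated index i, which is exactly what the summation convention prescribes.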

Now what if α=β?

According to the above equation, $a_{i\alpha}a_{i\beta}$ should equal 1, since $\delta_{\alpha\beta}=1$ for $\alpha=\beta$.

But if we write it as $a_{i\alpha}a_{i\alpha}$, by the summation convention this means a summation over both $i$ and $\alpha$ (or $\beta$).

First summing over $\alpha$: this means multiplying each element of the $i$-th row by itself and adding the results.
This will equal 1, as a result of $A$ being orthogonal.

Now summing over i, we'll get i*1=i.

Also, if we had summed over i first and then α, we would have got α*1=α.

Where am I going wrong??

If you say "$\delta_{\alpha\beta}=1$ for $\alpha=\beta$" you don't sum; you are considering just one element on the diagonal, regarding the delta as a matrix!

However, you are summing. So:

$$\delta^{\alpha}_{\alpha} = 1 + 1 + 1 = 3$$

It's just the trace of the identity matrix, and I think you'll agree that that's equal to the dimension of the space you're working in (3) ;)

Btw, it's a good habit to write upper and lower indices, even though in flat space and Euclidean coordinates these are equivalent.
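The trace claim above can be written out explicitly; a minimal Python sketch:

```python
# delta^alpha_alpha with the summation convention is an explicit sum over
# alpha, i.e. the trace of the 3x3 identity matrix.
delta = [[1 if a == b else 0 for b in range(3)] for a in range(3)]
trace = sum(delta[alpha][alpha] for alpha in range(3))
print(trace)  # 3, the dimension of the space
```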

haushofer said:
If you say "$\delta_{\alpha\beta}=1$ for $\alpha=\beta$" you don't sum; you are considering just one element on the diagonal, regarding the delta as a matrix!

However, you are summing. So:

$$\delta^{\alpha}_{\alpha} = 1 + 1 + 1 = 3$$

What you seem to have done is this:

$$\sum_i a_{ii}$$

However, my question is regarding

$$\sum_{i,j} a_{ij}\, a_{ij}$$

metalrose said:
What you seem to have done is this:

$$\sum_i a_{ii}$$

However, my question is regarding

$$\sum_{i,j} a_{ij}\, a_{ij}$$

That would be the sum of the squares of all the elements of the tensor. For an orthogonal 3x3 matrix it equals $\delta_{\alpha\alpha} = 3$.
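For an orthogonal matrix, that double sum works out to the dimension of the space. A quick Python check (the rotation angle below is an arbitrary illustrative choice):

```python
import math

# For an orthogonal 3x3 matrix A, summing a_{ij} a_{ij} over BOTH indices
# gives delta_{alpha alpha} = 3: each of the three rows is a unit vector,
# contributing 1 to the total. Sample orthogonal matrix: a z-axis rotation.
t = math.radians(55)
a = [[math.cos(t), -math.sin(t), 0.0],
     [math.sin(t),  math.cos(t), 0.0],
     [0.0,          0.0,         1.0]]

total = sum(a[i][j] * a[i][j] for i in range(3) for j in range(3))
print(round(total, 10))  # 3.0
```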

I missed out on a subtle point in my book. Hence the confusion.

I think I am clear now. Thanks for the replies anyway.

metalrose said:
What you seem to have done is this:

$$\sum_i a_{ii}$$

However, my question is regarding

$$\sum_{i,j} a_{ij}\, a_{ij}$$

No, I didn't. Check your expression for $AA^T$.

## 1. What is tensor notation?

Tensor notation is a way of representing mathematical objects called tensors, which are multidimensional arrays of numbers. It uses indices to label the different dimensions of a tensor and allows for concise and elegant expressions of mathematical equations.

## 2. What is the summation convention in tensor notation?

The summation convention is a shorthand used in tensor notation to simplify expressions involving summations over repeated indices. It states that whenever an index appears twice in a product, it is implicitly summed over all of its possible values.
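As an illustration, the implicit sum in $x_i y_i$ (repeated index $i$) corresponds to an explicit loop. A minimal sketch with made-up vectors:

```python
# Summation convention: x_i y_i means a sum over the repeated index i,
# i.e. the ordinary dot product. The vectors are arbitrary sample values.
x = [1.0, 2.0, 3.0]
y = [4.0, 5.0, 6.0]
dot = sum(x[i] * y[i] for i in range(3))
print(dot)  # 32.0
```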

## 3. What is the meaning of (a_ij)*(a_ij) in tensor notation?

In tensor notation, $(a_{ij})(a_{ij})$ represents the squared magnitude of the tensor $a$ (the square of its Frobenius norm). It is calculated by multiplying each element of the tensor by itself and then summing the results over both indices.

## 4. What is the significance of having repeated indices in tensor notation?

Repeated indices in tensor notation indicate that a summation is taking place over those indices. This allows for a more compact and efficient representation of mathematical equations involving tensors.

## 5. How is tensor notation used in scientific research?

Tensor notation is used in various fields of science, such as physics, engineering, and mathematics, to represent and manipulate equations involving tensors. It allows researchers to write complex equations in a concise and elegant manner, making problems easier to state and solve.
