Is my use of Einstein notation correct in this example?

The discussion centers on the correct application of Einstein notation for a diagonal matrix ##R##, specifically ##R = \mathrm{diag}[1, -1, 1]##. It is clarified that equal indices in upper and lower positions imply summation, which makes the initial notation incorrect. The proper way to express the relationship ##R^T R = \mathbb{1}## in Einstein notation involves the Kronecker delta ##\delta_{\alpha \beta}## rather than identity-matrix notation. Additionally, it is suggested to treat the diagonal entries as distinct numbers rather than as components of a tensor. Overall, the conversation emphasizes the importance of adhering to standard conventions in tensor notation.
redtree
TL;DR: Am I using Einstein notation correctly?

I am wondering if I am using Einstein notation correctly in the following example.

For a diagonal matrix ##R## whose entries are ##1## except for one entry ##-1##, such as ##R = \mathrm{diag}[1,-1,1]##, is it proper to write the following in Einstein notation:
##R_{\alpha} R_{\beta} = \mathbb{1}_{\alpha \beta}##, such that ##\Gamma_{\alpha} \Gamma_{\beta} = \Gamma_{\alpha} \Gamma_{\beta} \mathbb{1}_{\alpha \beta} = \Big(R_{\alpha}\Gamma_{\alpha}\Big) \Big(R_{\beta}\Gamma_{\beta}\Big)##
 
It is wrong.
First of all, an equal index in up and down positions implies summation. So

##R_a R_b = 1_{ab}## is wrong, and to be honest, I don't even understand what you were trying to say.

The second equation is also wrong: the indices on the left-hand side of the equation differ from those on the right-hand side.
 
LCSphysicist said:
I don't even understand what you were trying to say.
First off, I am trying to write the following in Einstein notation: for ##R=\mathrm{diag}[1,-1,1]##, ##R^T R = \mathbb{1}_{\dim R}##.
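
As a quick numerical sanity check of ##R^T R = \mathbb{1}## (a minimal sketch using numpy; the variable names are just for illustration):

```python
import numpy as np

# R = diag[1, -1, 1]: a diagonal matrix of 1s with a single -1 entry
R = np.diag([1, -1, 1])

# R is symmetric and each diagonal entry squares to 1,
# so R^T R should be the 3x3 identity matrix
print(R.T @ R)
assert np.array_equal(R.T @ R, np.eye(3))
```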
 
##R## is a matrix and so it has two indices, not one. Is the notation ##R = \text{diag} [1,-1,1]## confusing you? Those are the entries of the diagonal of the matrix rather than the components of a vector in some basis; they are the diagonal components of a second-rank tensor in some basis. You would use ##\delta_{\alpha \beta}## instead of ##\mathbb{1}_{\alpha \beta}##. Anyway, using Einstein's summation convention you would write

$$
R^T R= \mathbb{1}
$$

as

$$
(R^T)^\alpha_{\;\; \gamma} R^\gamma_{\;\; \beta} = \delta^\alpha _{\;\; \beta} .
$$

or

$$
R_\gamma^{\;\; \alpha} R^\gamma_{\;\; \beta} = \delta^\alpha_{\;\; \beta} .
$$

or as ##R## is symmetric

$$
R^\alpha_{\;\; \gamma} R^\gamma_{\;\; \beta} = \delta^\alpha _{\;\; \beta} .
$$

I suppose, though it's not standard, you could introduce the numbers ##r_{(1)}= 1, r_{(2)} = -1, r_{(3)} = 1##, where I have used brackets around the indices to indicate that I'm treating them simply as numbers rather than as components of a tensor in some basis (which is what they actually are), and then write

$$
R^\alpha_{\;\; \beta} = r_{(\beta)} \delta^\alpha_{\;\; \beta}
$$

where no summation over ##\beta## is implied. Then you could write

\begin{align*}
R^\alpha_{\;\; \gamma} R^\gamma_{\;\; \beta} & = r_{(\alpha)} r_{(\beta)} \delta^\alpha_{\;\; \gamma} \delta^\gamma_{\;\; \beta}
\nonumber \\
& = r_{(\alpha)} r_{(\beta)} \delta^\alpha_{\;\; \beta}
\nonumber \\
& = r_{(\alpha)}^2 \delta^\alpha_{\;\; \beta}
\nonumber \\
& = \delta^\alpha_{\;\; \beta}
\end{align*}

But this is an abuse of the usual convention.
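
As an illustrative check of this component construction (a minimal sketch using numpy; the names `r` and `delta` are just for this example):

```python
import numpy as np

# the plain numbers r_(1) = 1, r_(2) = -1, r_(3) = 1 from above
r = np.array([1, -1, 1])
delta = np.eye(3, dtype=int)  # Kronecker delta as a 3x3 identity array

# R^alpha_beta = r_(beta) delta^alpha_beta (no summation over beta);
# broadcasting multiplies column beta of delta by r_(beta)
R = r * delta

# contract over the repeated index gamma: R^alpha_gamma R^gamma_beta
product = np.einsum('ag,gb->ab', R, R)
assert np.array_equal(product, delta)  # recovers delta^alpha_beta since r^2 = 1
```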
 