Index notation of vector rotation


Discussion Overview

The discussion revolves around the index notation and matrix notation of vector rotations, particularly focusing on the representation of vectors and the implications of orthogonal transformations. Participants explore the consistency between these two notational systems and the challenges associated with index placement in summation.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants express confusion regarding the relationship between matrix notation and index notation for vector rotations, particularly in terms of index placement and summation rules.
  • There is a discussion about the representation of vectors as products of basis vectors and their components, with some participants suggesting that the notation used is unconventional.
  • One participant proposes that the use of an identity matrix in the context of orthogonal transformations does not alter the fundamental representation of the vector.
  • Another participant points out that an index cannot appear more than twice in a summation, suggesting the need for different index labels to avoid confusion.
  • Some participants agree that the notation is common in physics but express uncertainty about its mathematical justification.
  • There is a mention of the Einstein summation notation as a preferred method in some physics texts, contrasting with the matrix representation discussed.
  • A participant attempts to rewrite the notation to clarify the relationship between different bases and the corresponding transformations, seeking validation from others.

Areas of Agreement / Disagreement

Participants generally express confusion and uncertainty regarding the notational conventions and their implications. There is no clear consensus on the correctness of the proposed notations or the best approach to reconcile the differences between matrix and index notation.

Contextual Notes

Participants highlight limitations related to index placement and the potential for misunderstanding when using unconventional matrix representations. The discussion also reflects varying levels of familiarity with the notation and its applications in physics.

shinobi20
TL;DR
I'm a bit confused by the matrix notation and the index notation of vector rotations. Specifically, I don't understand how the index placement in the index notation is consistent with the matrix notation.
Given a vector ##\mathbf{r} = r^i e_i## where ##r^i## are the components, ##e_i## are the basis vectors, and ##i = 1, \ldots, n##. In matrix notation,

\begin{equation*}
\mathbf{r} = \begin{bmatrix} e_1 & e_2 & \ldots & e_n \end{bmatrix} \begin{bmatrix} r^1 \\ r^2 \\ \vdots \\ r^n \end{bmatrix}
\end{equation*}
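To make the matrix product concrete, here is a minimal numerical sketch (the basis vectors and components below are arbitrary, made-up numbers): stacking the basis vectors as the columns of an ordinary matrix reproduces the expansion ##r^i e_i##.

```python
import numpy as np

# Hypothetical basis vectors e_1, e_2 in R^2 (any linearly independent pair works)
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])

# The row "matrix" [e_1 e_2] becomes an ordinary matrix with the e_i as columns
E = np.column_stack([e1, e2])

# Components r^i of the vector in this basis
r = np.array([2.0, 3.0])

# The matrix product E @ r reproduces the expansion r^i e_i
assert np.allclose(E @ r, 2.0 * e1 + 3.0 * e2)
```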

We can perform an orthogonal transformation, e.g. a rotation, on both the components and the basis vectors by inserting an identity matrix ##I = R^T R## in the middle, without changing the vector. In index notation,

\begin{equation*}
\mathbf{r} = e_i (R^T)^i_{\; j} R^k_{\; i} r^i
\end{equation*}

Notice that the indices are placed in such a way that the summation with respect to the components and basis vectors makes sense. However, I'm a bit confused with the two notations,

\begin{align*}
& \mathbf{r} = e R^T R r \quad \text{matrix notation}\\
& \mathbf{r} = e_i (R^T)^i_{\; j} R^k_{\; i} r^i \quad \text{index notation}
\end{align*}

For the first equation, I believe ##R^T R## is a simple matrix multiplication, so that, setting the current context aside, it should follow the row-column rule for matrix multiplication, ##(R^T)^i_{\; j} R^j_{\; k}##. Notice that the column of ##R^T## labeled by ##j## is summed over the rows of ##R##, also labeled by ##j##. So how do I reconcile this with the second equation?

I know that we could just switch the order of ##(R^T)^i_{\; j} R^k_{\; i}## in the index notation, since the factors are just numbers, so that ##(R^T)^i_{\; j} R^k_{\; i} = R^k_{\; i} (R^T)^i_{\; j}##; in this case the summed-over index is ##i##, and the expression now follows the row-column rule of summing over a shared index. I have a partial understanding of the situation, but I need more clarification on this.
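For reference, the matrix form of the insertion trick is easy to check numerically. A minimal sketch, assuming an arbitrary 2D rotation angle and made-up components:

```python
import numpy as np

theta = 0.3  # arbitrary rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# R is orthogonal, so R^T R = I
assert np.allclose(R.T @ R, np.eye(2))

E = np.eye(2)             # standard basis vectors as columns
r = np.array([2.0, 3.0])  # components in that basis

# Inserting I = R^T R between E and r leaves the vector unchanged;
# the regrouped factors (E R^T) and (R r) are the rotated basis and components
assert np.allclose((E @ R.T) @ (R @ r), E @ r)
```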
 
shinobi20 said:
TL;DR Summary: I'm a bit confused by the matrix notation and the index notation of vector rotations. Specifically, I don't understand how the index placement in the index notation is consistent with the matrix notation.

Given a vector ##\mathbf{r} = r^i e_i## where ##r^i## are the components, ##e_i## are the basis vectors, and ##i = 1, \ldots, n##. In matrix notation,

\begin{equation*}
\mathbf{r} = \begin{bmatrix} e_1 & e_2 & \ldots & e_n \end{bmatrix} \begin{bmatrix} r^1 \\ r^2 \\ \vdots \\ r^n \end{bmatrix}
\end{equation*}
I don't think I've seen this idea before. I guess we are putting the basis vectors in a row matrix and calling that ##e##, and putting the components of the vector ##\mathbf r## (in that basis) into a column matrix and calling that ##r##. Then, the usual decomposition of the vector in that basis can be represented as a matrix multiplication: ##er##.
shinobi20 said:
We can perform an orthogonal transformation, e.g. a rotation, on both the components and the basis vectors by inserting an identity matrix ##I = R^T R## in the middle, without changing the vector. In index notation,

\begin{equation*}
\mathbf{r} = e_i (R^T)^i_{\; j} R^k_{\; i} r^i
\end{equation*}
You can't have more than two indices the same in any summation. You need to use something else instead of one pair of the ##i##'s.
shinobi20 said:
Notice that the indices are placed in such a way that the summation with respect to the components and basis vectors makes sense. However, I'm a bit confused with the two notations,

\begin{align*}
& \mathbf{r} = e R^T R r \quad \text{matrix notation}\\
& \mathbf{r} = e_i (R^T)^i_{\; j} R^k_{\; i} r^i \quad \text{index notation}
\end{align*}

For the first equation, I believe ##R^T R## is a simple matrix multiplication, so that, setting the current context aside, it should follow the row-column rule for matrix multiplication, ##(R^T)^i_{\; j} R^j_{\; k}##. Notice that the column of ##R^T## labeled by ##j## is summed over the rows of ##R##, also labeled by ##j##. So how do I reconcile this with the second equation?

I know that we could just switch the order of ##(R^T)^i_{\; j} R^k_{\; i}## in the index notation, since the factors are just numbers, so that ##(R^T)^i_{\; j} R^k_{\; i} = R^k_{\; i} (R^T)^i_{\; j}##; in this case the summed-over index is ##i##, and the expression now follows the row-column rule of summing over a shared index. I have a partial understanding of the situation, but I need more clarification on this.
Putting in the extra matrices doesn't change the basic idea, as far as I understand it. Again, you can't have more than two indices the same.
 
PeroK said:
I don't think I've seen this idea before. I guess we are putting the basis vectors in a row matrix and calling that ##e##, and putting the components of the vector ##\mathbf r## (in that basis) into a column matrix and calling that ##r##. Then, the usual decomposition of the vector in that basis can be represented as a matrix multiplication: ##er##.

You can't have more than two indices the same in any summation. You need to use something else instead of one pair of the ##i##'s.

Putting in the extra matrices doesn't change the basic idea, as far as I understand it. Again, you can't have more than two indices the same.
Oh! I see what's wrong. I should have written,

\begin{equation*}
\mathbf{r} = r'^i e'_i = e'_i r'^i = e_j (G^T)^j_{\; i} G^i_{\; k} r^k
\end{equation*}

Now, the matrix notation and the index notation match, is this correct?

Also, with regard to the idea that you have not seen, please take a peek at this post.
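As a numerical sanity check of this rewrite, here is a minimal sketch, assuming an arbitrary basis and an arbitrary rotation angle:

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle
G = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

E = np.column_stack([[1.0, 0.0], [1.0, 1.0]])  # arbitrary basis vectors e_j as columns
r = np.array([2.0, 3.0])                       # components r^k

E_prime = E @ G.T  # e'_i = e_j (G^T)^j_i : primed basis vectors as columns
r_prime = G @ r    # r'^i = G^i_k r^k     : primed components

# e'_i r'^i reproduces the same vector as e_k r^k, since G^T G = I
assert np.allclose(E_prime @ r_prime, E @ r)
```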
 
shinobi20 said:
Oh! I see what's wrong. I should have written,

\begin{equation*}
\mathbf{r} = r'^i e'_i = e'_i r'^i = e_j (G^T)^j_{\; i} G^i_{\; k} r^k
\end{equation*}

Now, the matrix notation and the index notation match, is this correct?

Also, with regard to the idea that you have not seen, please take a peek at this post.
It's the matrix notation that is new to me. I can see how it works, but it's stretching the concept of a matrix somewhat: your row matrix of basis vectors is not a row matrix of numbers, which is what you would normally expect. That said, I'm not sure what your question is.
 
PeroK said:
It's the matrix notation that is new to me. I can see how it works, but it's stretching the concept of a matrix somewhat: your row matrix of basis vectors is not a row matrix of numbers, which is what you would normally expect. That said, I'm not sure what your question is.
I agree it is a valid concern, but I think it is convenient to write it that way, since we can think of a vector as the inner product of its components and basis vectors. As for the precise mathematical justification, though, I'm not sure. It is quite a common notation in physics circles, I guess.
 
shinobi20 said:
I agree it is a valid concern, but I think it is convenient to write it that way, since we can think of a vector as the inner product of its components and basis vectors.
It's not really an inner product.
shinobi20 said:
As for the precise mathematical justification, though, I'm not sure. It is quite a common notation in physics circles, I guess.
Yes, I can see the temptation to write ##[\mathbf e_1, \dots \mathbf e_n]## and use that as a "vector" of sorts. But the physics texts I've seen have avoided this, in favour of the Einstein summation notation.
 
PeroK said:
It's not really an inner product.

Yes, I can see the temptation to write ##[\mathbf e_1, \dots \mathbf e_n]## and use that as a "vector" of sorts. But the physics texts I've seen have avoided this, in favour of the Einstein summation notation.
I see. But speaking of summation notation, is my rewrite in post #3 correct?
 
shinobi20 said:
I see. But speaking of summation notation, is my rewrite in post #3 correct?
Okay, I think I see what you are trying to do. First, we can write down, with implied summation:
$$\mathbf r = r^i \mathbf e_i = r'^i \mathbf e'_i$$That's just a vector expressed in two different bases.

Now, the primed and unprimed bases must be related. I.e. each basis vector itself can be expressed in the other basis. In general, we would have something like:
$$\mathbf e_i = T_i^j \mathbf e'_j$$Where, at this stage, the ##T_i^j## are just a set of numbers. You can then put these numbers into a matrix. We also have another set of numbers ##S_i^j##, where:
$$\mathbf e'_i = S_i^j \mathbf e_j$$Note that, in general:
$$\mathbf e_i = T_i^jS_j^k \mathbf e_k$$And we can see that the matrices ##T## and ##S## are inverses of each other.
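A quick numerical illustration of the inverse relationship; this is only a sketch, with the basis vectors stacked as the rows of an ordinary matrix and arbitrary numbers in ##S##:

```python
import numpy as np

rng = np.random.default_rng(0)

E = np.eye(3)                # old basis vectors e_i as the rows of E
S = rng.normal(size=(3, 3))  # arbitrary numbers S_i^j (almost surely invertible)

E_prime = S @ E              # e'_i = S_i^j e_j : row i mixes the rows of E
T = np.linalg.inv(S)         # e_i = T_i^j e'_j then forces T = S^{-1}

assert np.allclose(T @ E_prime, E)    # recovers e_i = T_i^j e'_j
assert np.allclose(T @ S, np.eye(3))  # T_i^j S_j^k = delta_i^k
```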

What you are doing is using these transformation matrices to write this vector equation in an informal matrix format:
$$e = Te'$$where ##e## and ##e'## are column "matrices" with the basis vectors as entries. That's purely notational, and I'm not sure what the problem is.

What's important is that we often want to know how the components of a vector transform under a change of basis. I.e. we want to relate the components ##r^i## to the components ##r'^i##. We can see that:
$$\mathbf r = r^i \mathbf e_i = r'^j \mathbf e'_j = r'^j S_j^i \mathbf e_i$$And, by uniqueness of the expansion in a given basis, we can see that:
$$r^i = r'^j S_j^i = (S^T)^i_jr'^j$$In other words, if we want to use this in matrix form, we have to use the transpose of the matrix ##S##.

This is what's tricky. We are using the "other" matrix, transposed. In particular, the components of a vector transform differently from the basis vectors themselves.

That's the important thing. The rest is notational.
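A numerical check that the components pick up the transpose; again only a sketch, with the basis vectors as the rows of a matrix and an arbitrary invertible ##S##:

```python
import numpy as np

rng = np.random.default_rng(1)

E = rng.normal(size=(3, 3))  # arbitrary basis: the e_i are the rows of E
S = rng.normal(size=(3, 3))  # arbitrary invertible S, with e'_i = S_i^j e_j
E_prime = S @ E

r_prime = np.array([1.0, -2.0, 0.5])  # components r'^i in the primed basis
v = E_prime.T @ r_prime               # the vector r'^i e'_i itself

# Components in the unprimed basis: r^i = (S^T)^i_j r'^j, i.e. r = S^T r'
r_unprimed = S.T @ r_prime
assert np.allclose(E.T @ r_unprimed, v)  # r^i e_i is the same vector
```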
 
