Einstein summation convention and rewriting as a matrix

I.e., it is a sum over ##j## and ##k##, and both ##j## and ##k## are summed over the same range, here ##1## to ##3##. With this range we would like to calculate the matrix product$$\begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} \cdot \begin{bmatrix} b_{11} & b_{12} & b_{13} \\ b_{21} & b_{22} & b_{23} \\ b_{31} & b_{32} & b_{33} \end{bmatrix} \cdot \begin{bmatrix} x_{1} \\ x_{2} \\ x_{3} \end{bmatrix}$$
  • #1
peterspencers

Homework Statement



The question asks us to write down the matrix represented by the following summation.

Homework Equations


The summation in question...
$$\sum_{j,k=1}^{3} a_{ij}b_{jk}x_{k}$$

The Attempt at a Solution



$$
\sum_{j,k=1}^{3} a_{ij}b_{jk}x_{k} = \begin{pmatrix}a_{1j}b_{jk}x_{k}\\a_{2j}b_{jk}x_{k}\\a_{3j}b_{jk}x_{k}\end{pmatrix}$$

$$\begin{pmatrix}a_{11}b_{11}x_{1}+a_{12}b_{21}x_{1}+a_{13}b_{31}x_{1}+a_{11}b_{12}x_{2}+a_{12}b_{22}x_{2}+a_{13}b_{32}x_{2}+a_{11}b_{13}x_{3}+a_{12}b_{23}x_{3}+a_{13}b_{33}x_{3}
\\a_{21}b_{11}x_{1}+a_{22}b_{21}x_{1}+a_{23}b_{31}x_{1}+a_{21}b_{12}x_{2}+a_{22}b_{22}x_{2}+a_{23}b_{32}x_{2}+a_{21}b_{13}x_{3}+a_{22}b_{23}x_{3}+a_{23}b_{33}x_{3}
\\a_{31}b_{11}x_{1}+a_{32}b_{21}x_{1}+a_{33}b_{31}x_{1}+a_{31}b_{12}x_{2}+a_{32}b_{22}x_{2}+a_{33}b_{32}x_{2}+a_{31}b_{13}x_{3}+a_{32}b_{23}x_{3}+a_{33}b_{33}x_{3}\end{pmatrix}$$
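As a sanity check on the first entry of this expansion (a minimal sketch assuming NumPy; the numerical values of ##a##, ##b## and ##x## are arbitrary and not part of the problem), the written-out terms agree with the double sum over ##j## and ##k##:

```python
import numpy as np

# Arbitrary example values, only for a numerical check of the expansion.
rng = np.random.default_rng(0)
a = rng.standard_normal((3, 3))
b = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# First entry of the expansion (i = 1), written out term by term (0-based indices).
row1 = (a[0, 0]*b[0, 0]*x[0] + a[0, 1]*b[1, 0]*x[0] + a[0, 2]*b[2, 0]*x[0]
        + a[0, 0]*b[0, 1]*x[1] + a[0, 1]*b[1, 1]*x[1] + a[0, 2]*b[2, 1]*x[1]
        + a[0, 0]*b[0, 2]*x[2] + a[0, 1]*b[1, 2]*x[2] + a[0, 2]*b[2, 2]*x[2])

# The double sum over j and k for i = 1 gives the same number.
assert np.isclose(row1, sum(a[0, j]*b[j, k]*x[k] for j in range(3) for k in range(3)))
```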
 
  • #2
I suggest writing it as ##\displaystyle \sum_k \left( \sum_j a_{ij}b_{jk} \right) x_k##. So what is the inner sum?
 
  • #3
So the inner sum has one repeated index and two free indices, and therefore gives nine expressions, forming a ##3\times 3## matrix with three terms in each entry, giving us...

$$
\begin{pmatrix}a_{1j}b_{j1}&a_{1j}b_{j2}&a_{1j}b_{j3}\\a_{2j}b_{j1}&a_{2j}b_{j2}&a_{2j}b_{j3}\\a_{3j}b_{j1}&a_{3j}b_{j2}&a_{3j}b_{j3}\end{pmatrix}$$
where ##j## is summed over ##1## to ##3## in the usual way?
 
  • #4
I don't see any Einstein summation here, by the way. Nonetheless, the inner sum is only a number: one sum for each pair ##(i,k)##. What do we get, e.g., for ##(i,k)=(2,3)\,##? In total we have all combinations ##\{(i,k)\,\vert \,1 \leq i,k \leq 3\}##, that is, ##9## combinations. It makes sense to arrange them as a matrix, yes, so what is in the ##i-##th row and ##k-##th column?
 
  • #5
peterspencers said:
where ##j## is summed over ##1## to ##3## in the usual way?
Sorry, I saw this a bit too late. Yes, but why didn't you sum it? Does ##a_{21}b_{13}+a_{22}b_{23}+a_{23}b_{33}## remind you of something? How else can it be written?
 
  • #6
To write the original question in Einstein summation, it would be.. ?
$$a_{ij}b_{jk}x_{k}$$
for (i,k) = (2,3) we get.. ?
$$a_{2j}b_{j3} = a_{21}b_{13}+a_{22}b_{23}+a_{23}b_{33}$$
And in the ith row and kth column we would have... ?
$$a_{i1}b_{1k}+a_{i2}b_{2k}+a_{i3}b_{3k}$$
 
  • #7
Yes. And this is the same as
$$
\begin{bmatrix}a_{i1} & a_{i2} & a_{i3}\end{bmatrix} \cdot \begin{bmatrix}b_{1k} \\ b_{2k} \\ b_{3k} \end{bmatrix}
$$
Do you see the matrix multiplication from here?
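For reference, a small sketch of this row-times-column product (assuming NumPy; the entries of ##A## and ##B## are arbitrary example values, not taken from the problem):

```python
import numpy as np

A = np.arange(1.0, 10.0).reshape(3, 3)    # example 3x3 matrix with entries 1..9
B = np.arange(10.0, 19.0).reshape(3, 3)   # example 3x3 matrix with entries 10..18

i, k = 1, 2  # e.g. the pair (i, k) = (2, 3) in 1-based notation

# sum_j a_{ij} b_{jk}: the i-th row of A times the k-th column of B.
explicit = sum(A[i, j] * B[j, k] for j in range(3))
row_times_column = A[i, :] @ B[:, k]

assert np.isclose(explicit, row_times_column)
```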

The Einstein summation would be ##\displaystyle \sum_j a_{ij}b_{jk} = a^i_j\, b^j_k##. It drops the summation sign: the placement of the indices indicates (implicitly) what has to be summed over. Here the inner ##j##, which is the lower index of the first factor and the upper index of the second factor, is the index that is summed.

Now write this as a matrix multiplication. And in the next step, do the same with the sum over ##k## and the vector ##(x_k)_k##.
 
  • #8
For the first task.
$$
\begin{bmatrix}a_{ij}&a_{ij}&a_{ij}\end{bmatrix}\cdot\begin{bmatrix}b_{jk}\\b_{jk}\\b_{jk}\end{bmatrix}$$
And for the second task
$$\left(\begin{bmatrix}a_{ij}&a_{ij}&a_{ij}\end{bmatrix}\cdot\begin{bmatrix}b_{jk}\\b_{jk}\\b_{jk}\end{bmatrix}\right)\cdot\begin{bmatrix}x_{k}\\x_{k}\\x_{k}\end{bmatrix}$$
$$=\begin{bmatrix}a_{ij}b_{jk}&a_{ij}b_{jk}&a_{ij}b_{jk}\end{bmatrix}\cdot\begin{bmatrix}x_{k}\\x_{k}\\x_{k}\end{bmatrix}$$
 
  • #9
I'm not sure whether you are abbreviating things or don't know matrix multiplication. A vector ##(x_k,x_k,x_k)## doesn't make sense here.

Let me try it the other way around. Assume we have a square matrix ##A## and a square matrix ##B##. What is at position ##(i,k)## of the product matrix, i.e. the entry in the ##i-##th row and ##k-##th column? Let us denote this entry by ##(AB)_{ik}##.

So what is ##(AB)_{ik}##?

If we further have a vector ##\begin{bmatrix}x_1 \\ \vdots \\ x_n\end{bmatrix}##, what is the ##i-##th entry of, say, ##A\cdot x##, or in coordinates, what is

##(A\cdot x)_{i}##?
 
  • #10
$$(AB)_{ik}=\begin{pmatrix}a_{11}b_{11}+a_{12}b_{21}&a_{11}b_{12}+a_{12}b_{22}\\a_{21}b_{11}+a_{22}b_{21}&a_{21}b_{12}+a_{22}b_{22}\end{pmatrix}$$
and
$$(A(\cdot)x)_{i}=a_{i1}x_{1}+a_{i2}x_{2}+\ldots+a_{in}x_{n}$$

May I ask what the correct latex code for A dot x is?
 
  • #11
peterspencers said:
$$(AB)_{ik}=\begin{pmatrix}a_{11}b_{11}+a_{12}b_{21}&a_{11}b_{12}+a_{12}b_{22}\\a_{21}b_{11}+a_{22}b_{21}&a_{21}b_{12}+a_{22}b_{22}\end{pmatrix}$$
and
$$(A(\cdot)x)_{i}=a_{i1}x_{1}+a_{i2}x_{2}+\ldots+a_{in}x_{n}$$

May I ask what the correct latex code for A dot x is?
Yes. It's the command \cdot, as in A \cdot x, which gives ##A \cdot x##. For matrices, \ldots (horizontal dots) and \vdots (vertical dots) can also be useful. The second equation is correct. The first one is also true for ##n=2##, where you used an abbreviation:

If we multiply two matrices, the entry at position ##(i,k)## is the sum we get if we multiply the ##i-##th row of ##A## by the ##k-##th column of ##B##. In formulas this reads
$$
(A\cdot B)_{ik} = \begin{bmatrix} a_{i1} & \ldots & a_{in} \end{bmatrix} \cdot \begin{bmatrix} b_{1k} \\ \vdots \\ b_{nk} \end{bmatrix} = \sum_{j=1}^n a_{ij} \, b_{jk} = a^i_j \, b^j_k
$$
and the matrix is ##AB = ((AB)_{ik})_{{}^{1\leq i \leq n}_{1 \leq k \leq n}}\,##. The outer indices are usually not written; people write ##(AB)_{ik}## for short but mean the entire matrix. Strictly speaking, this denotes only a single entry, and the outer indices are needed to show that we mean the entire number scheme, the matrix (as you did). Since it can hardly be confused, people simply drop the ranges of ##i## and ##k##. But keep this in mind if you see, e.g., ##M_{ij}##: it can mean a single entry (at position ##(i,j)\,##) or the entire matrix. I asked for the single entry, not the entire matrix, but o.k.

Now you know what ##(A\cdot B)_{ik}## is - as a single entry and as the entire matrix - as well as what ##(A \cdot x)_{k}## is. Now substitute the entries of the product into the formula for ##Ax##. Remember that this is again a sum; it is summed over ##k##.

If you want to write the entire expression in Einstein summation, it is
$$
\sum_{j,k}a_{ij} \, b_{jk} \,x_k = a^i_j \, b^j_k \, x^k
$$
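This is also exactly what NumPy's einsum notation expresses; a minimal sketch with arbitrary example values:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# 'ij,jk,k->i': the repeated indices j and k are summed implicitly,
# the free index i labels the entries of the result.
y = np.einsum('ij,jk,k->i', A, B, x)

assert np.allclose(y, A @ B @ x)  # the same as the matrix product A·B·x
```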
 
  • #12
So, just to make sure I am with you so far: the entry ##(A \cdot x)_{i}##
$$=\sum_{j=1}^{n} a_{ij}x_{j}$$
where ##j## runs over the columns of the matrix ##A## and the rows of ##x##,
and
$$(A \cdot x)=Ax$$
where A is a square matrix ..
$$A=\begin{pmatrix}a_{11}&\ldots&a_{1n}\\ \vdots & \ddots & \vdots \\a_{n1}&\ldots&a_{nn}\end{pmatrix}$$
and ..
$$x=\begin{pmatrix}x_{1}\\ \vdots \\x_{n}\end{pmatrix}$$
so... ##(A \cdot x)_{k}## asks for the ##k##-th position of the product of ##A## and ##x##, and ##k## represents the column in that product matrix?
But that product matrix only has one column. I can see what ##(A \cdot x)_{i}## is, but ##(A \cdot x)_{k}## does not make any sense to me as I have understood things so far. Have I misunderstood what ##A## and ##x## are in this context?
 
  • #13
##A## was a bad choice, as it's also a factor of ##AB##. But you've said
peterspencers said:
$$(A \cdot x)_{i}=a_{i1}x_{1}+a_{i2}x_{2}+\ldots+a_{in}x_{n}$$
Let's change the variables for clarity. Say
$$(P \cdot x)_{i}=p_{i1}x_{1}+p_{i2}x_{2}+\ldots + p_{in}x_{n}$$
Basically you have also calculated
$$
(A\cdot B)_{ik} = \begin{bmatrix} a_{i1} & \ldots & a_{in} \end{bmatrix} \cdot \begin{bmatrix} b_{1k} \\ \vdots \\ b_{nk} \end{bmatrix} = \sum_{j=1}^n a_{ij} \, b_{jk} = a^i_j \, b^j_k
$$
Now combine both equations to get ## ((AB) \cdot x)_i = (AB)_{i1}x_{1}+(AB)_{i2}x_{2}+\ldots + (AB)_{in}x_{n}## and insert what you have for ##(AB)_{ik}##. This sum over ##k## is exactly the sum over ##k## in the original formula in post #1.
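A numerical version of exactly this substitution (a sketch assuming NumPy, with arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

AB = A @ B  # the inner sum over j, collected into a matrix

# ((AB)·x)_i = (AB)_{i1} x_1 + ... + (AB)_{in} x_n, i.e. a sum over k ...
combined = np.array([sum(AB[i, k] * x[k] for k in range(3)) for i in range(3)])

# ... which reproduces the original double sum over j and k from post #1.
original = np.array([
    sum(A[i, j] * B[j, k] * x[k] for j in range(3) for k in range(3))
    for i in range(3)
])
assert np.allclose(combined, original)
```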
 
  • #14
Ok, I think I understand, many thanks for your patient help.

Just to make sure I've understood correctly, if we take another example:

$$\sum_{j=1}^{3} x_{j}b_{ij}$$

This would then equal..

$$x_{1}b_{i1}+x_{2}b_{i2}+x_{3}b_{i3}$$

Which is the dot product of the matrices...

$$\begin{pmatrix}x_{1}&x_{2}&x_{3}\end{pmatrix}\cdot\begin{pmatrix}b_{i1}&b_{i2}&b_{i3}\end{pmatrix}$$

Which gives the matrix..

$$\begin{pmatrix}x_{1}b_{11}+x_{2}b_{12}+x_{3}b_{13}\\x_{1}b_{21}+x_{2}b_{22}+x_{3}b_{23}\\x_{1}b_{31}+x_{2}b_{32}+x_{3}b_{33}\end{pmatrix}$$

Is this correct?
 
  • #15
peterspencers said:
Ok, I think I understand, many thanks for your patient help.

Just to make sure I've understood correctly, if we take another example:

$$\sum_{j,=1}^{3} x_{j}b_{ij}$$
I assume the comma is a leftover from a copy-and-paste error and does not indicate a forgotten summation over ##i##.
This would then equal..

$$x_{1}b_{i1}+x_{2}b_{i2}+x_{3}b_{i3}$$

Which is the product of the matrices...

$$\begin{pmatrix}x_{1}&x_{2}&x_{3}\end{pmatrix}\cdot\begin{pmatrix}b_{i1}&b_{i2}&b_{i3}\end{pmatrix}$$
Not quite.
$$x_{1}b_{i1}+x_{2}b_{i2}+x_{3}b_{i3}= \begin{pmatrix}x_{1}&x_{2}&x_{3}\end{pmatrix}\cdot\begin{pmatrix}b_{i1}\\ b_{i2}\\b_{i3}\end{pmatrix}=\begin{pmatrix}x_{1}&x_{2}&x_{3}\end{pmatrix}\cdot\begin{pmatrix}b_{i1}&b_{i2}&b_{i3}\end{pmatrix}^\tau$$
The product of two row vectors isn't defined as a matrix multiplication. However, things change if you interpret the free index ##i## as an abbreviation for three stacked row vectors ##\begin{pmatrix}b_{11}&b_{12}&b_{13}\end{pmatrix}\; , \;\begin{pmatrix}b_{21}&b_{22}&b_{23}\end{pmatrix}\; , \;\begin{pmatrix}b_{31}&b_{32}&b_{33}\end{pmatrix}##, in which case we have the product
$$
\begin{pmatrix}x_{1}&x_{2}&x_{3} \end{pmatrix} \cdot \begin{pmatrix}b_{11}&b_{12}&b_{13}\\b_{21}&b_{22}&b_{23}\\b_{31}&b_{32}&b_{33}\end{pmatrix}= \begin{pmatrix}x_1b_{11}+x_2b_{21}+x_3b_{31} & x_1b_{12}+x_2b_{22}+x_3b_{32} & x_1b_{13}+x_2b_{23}+x_3b_{33}\end{pmatrix}
$$
or three consecutive column vectors ##\begin{pmatrix}b_{11}&b_{12}&b_{13}\end{pmatrix}^\tau\; , \;\begin{pmatrix}b_{21}&b_{22}&b_{23}\end{pmatrix}^\tau\; , \;\begin{pmatrix}b_{31}&b_{32}&b_{33}\end{pmatrix}^\tau## in which case we have the product
$$
\begin{pmatrix}x_{1}&x_{2}&x_{3} \end{pmatrix} \cdot \begin{pmatrix}b_{11}&b_{21}&b_{31}\\b_{12}&b_{22}&b_{32}\\b_{13}&b_{23}&b_{33}\end{pmatrix}= \begin{pmatrix}x_1b_{11}+x_2b_{12}+x_3b_{13} & x_1b_{21}+x_2b_{22}+x_3b_{23} & x_1b_{31}+x_2b_{32}+x_3b_{33}\end{pmatrix}
$$
But who am I to know what you mean? Nobody can tell whether you are using hidden, secret meanings. Row times row isn't allowed. It's either row times column, column times row, row times matrix, or matrix times column. Whether the unspecified index ##i## represents something other than a single vector with a fixed ##i## cannot be known, or even guessed, in such a stripped-down context. That is why notation either has to be unambiguous, i.e. commonly used, or explicitly specified.
Which gives the matrix..

$$\begin{pmatrix}x_{1}b_{11}+x_{2}b_{12}+x_{3}b_{13}\\x_{1}b_{21}+x_{2}b_{22}+x_{3}b_{23}\\x_{1}b_{31}+x_{2}b_{32}+x_{3}b_{33}\end{pmatrix}$$
No. It's still only one sum with a fixed ##i##, i.e. a number, not a matrix - unless you used a notation which you forgot to tell me about. If you refer to Einstein summation, I really recommend using the upper and lower index notation I mentioned earlier, as it at least encodes the sum by the positioning of the indices, instead of writing everything with lower indices without saying what is meant. As long as you're not sure what you should write, it is better to write more rather than less. E.g.
$$
(b_{i1},b_{i2},b_{i3})^\tau = \begin{pmatrix}b_{i1}\\b_{i2}\\b_{i3}\end{pmatrix}
$$
is a single column vector. The same holds for row vectors. If you mean a matrix, then you should at least write something like
$$
\begin{pmatrix}b_{11}&b_{12}&b_{13}\\b_{21}&b_{22}&b_{23}\\b_{31}&b_{32}&b_{33}\end{pmatrix} = \begin{pmatrix}b_{i1}&b_{i2}&b_{i3}\end{pmatrix}_{1 \leq i \leq 3} = \begin{pmatrix}b_{i1}\\b_{i2}\\b_{i3}\end{pmatrix}^\tau_{1 \leq i \leq 3}=\begin{pmatrix}b_{i1}\\b_{i2}\\b_{i3}\end{pmatrix}^\tau_i
$$
or otherwise explicitly mention what is meant. As far as I know, Einstein notation is an abbreviation for summations, where the summation signs are omitted and encoded by the positioning of indices; see e.g. https://en.wikipedia.org/wiki/Einstein_notation
I've read on another site that an all-lower-index notation can also be used, but in my opinion this is not a good idea. In any case it is a sum, hidden behind the repeated indices - no stacks, neither as columns nor as rows. If a stack is meant, an additional index outside (as in my last example) is really useful.

Here I have written vectors as rows and their transposed form as columns, as you did.
Usually - by the more common convention - it is vice versa: ##v## is a column vector, and its transposed form ##v^\tau## is a row vector. It isn't important which one is used, as long as
  • the convention used is explained at the beginning, and
  • the convention does not change in the middle of the text.
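To see the two readings discussed above numerically, here is a short sketch (assuming NumPy; ##B## and ##x## are arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# Row vector x times the matrix B (rows b_{i1} b_{i2} b_{i3} stacked):
# entry k of the result is sum_i x_i b_{ik}.
assert np.allclose(x @ B, np.einsum('i,ik->k', x, B))

# Row vector x times B transposed (the columns reading): entry i of the
# result is x_1 b_{i1} + x_2 b_{i2} + x_3 b_{i3}, the sum with fixed i.
assert np.allclose(x @ B.T, np.einsum('j,ij->i', x, B))
```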
 
  • #16
Ok, let me just get a few things cleared up in my understanding..

Firstly, the questions I am attempting to solve are on Einstein summation convention, so the question should be written..

$$\sum_{j=1}^{3} x_{j}b_{ij} = x^{j}b^{i}_{j}$$

Which means that the sum is equal to ..

$$x_{1}b_{i1}+x_{2}b_{i2}+x_{3}b_{i3}$$

Where the free index ##i## is an abbreviation for three stacked row vectors, giving us the matrix..

$$\begin{pmatrix}x_{1}b_{11}+x_{2}b_{12}+x_{3}b_{13}\\x_{1}b_{21}+x_{2}b_{22}+x_{3}b_{23}\\x_{1}b_{31}+x_{2}b_{32}+x_{3}b_{33}\end{pmatrix}$$

.. and the ##j## index on ##x^{j}## is in the upper position so refers to the row position of the matrix ##x##. Giving..

$$\textbf{x}=\begin{pmatrix}x_{1}\\x_{2}\\x_{3}\end{pmatrix}$$

and..

$$\textbf{b}=b^{i}_{j}=\begin{pmatrix}b_{i1}&b_{i2}&b_{i3}\end{pmatrix}=\begin{pmatrix}b_{11}&b_{12}&b_{13}\\b_{21}&b_{22}&b_{23}\\b_{31}&b_{32}&b_{33}\end{pmatrix}$$

But ##\textbf{x}## and ##\textbf{b}## are matrices and do not commute, so I must perform ##\textbf{x}\textbf{b}##, which is not allowed, as the number of columns of the first factor does not match the number of rows of the second. I fear I have made a fundamental misunderstanding somewhere..
 
  • #17
This looks fine. However, if we go back to your first equation, you have
$$
\sum_{j=1}^3 x_jb_{ij} = x^jb_j^i = b_j^i x^j
$$
as they are ordinary scalars, which do commute. This gives you the matrix equation ##\mathbf{b}\mathbf{x}## with the matrix
$$
\mathbf{b}= (\mathbf{b})_{ij}= \mathbf{b}_{ij}= \begin{bmatrix}b_{11}&b_{12}&b_{13}\\b_{21}&b_{22}&b_{23}\\b_{31}&b_{32}&b_{33}\end{bmatrix}
$$
and a column vector
$$
\mathbf{x}=(\mathbf{x})_j = \mathbf{x}_j = \begin{bmatrix}x_1\\x_2\\x_3\end{bmatrix}
$$
The result is again a column vector ##\mathbf{b}\mathbf{x}##, the one you have written in the previous post.

Now we have written the vectors (as usual) as columns. If we want to write the equation by row vectors, we have to transpose it:
$$
(\mathbf{b}\mathbf{x})^\tau = \mathbf{x}^\tau \mathbf{b}^\tau = \left( \begin{bmatrix}x_1&x_2&x_3\end{bmatrix} \cdot \begin{bmatrix}b_{i1}&b_{i2}&b_{i3}\end{bmatrix}^\tau \right)_i
$$

So in this notation, this is basically what you have written, plus an additional transpose of the ##\mathbf{b}_i## row, because it has to be a column. The confusion with the indices in this transposed row notation comes from the fact that we first had "##i-##th row of ##\mathbf{b}##" times "column ##\mathbf{x}##", but transposed it is "row ##\mathbf{x}##" times "##i-##th column of ##\mathbf{b}^\tau##".
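As a final numerical check of this transposition rule (a sketch assuming NumPy; arbitrary example values):

```python
import numpy as np

rng = np.random.default_rng(4)
b = rng.standard_normal((3, 3))
x = rng.standard_normal(3)

# Column convention: (b·x)_i = sum_j b_{ij} x_j.
bx = b @ x

# Transposed (row) form: x^T · b^T contains the same numbers, read as a row.
assert np.allclose(bx, x @ b.T)
```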
 

1. What is the Einstein summation convention?

The Einstein summation convention is a mathematical notation used to simplify the writing and manipulation of equations involving tensor products and tensor contractions. It states that whenever there is a repeated index in a product of tensors, it is implicitly summed over all possible values.

2. How does the Einstein summation convention relate to matrix notation?

An expression written with the Einstein summation convention can often be rewritten in matrix notation by representing the tensors as matrices and vectors; a sum over a repeated index then becomes an ordinary matrix product. This allows for easier manipulation and computation of equations involving tensors.
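For example (a minimal sketch assuming NumPy), the sum over the repeated index ##j## in ##a_{ij}b_{jk}## is exactly an ordinary matrix product:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# The repeated index j is summed implicitly; i and k remain as free indices.
C = np.einsum('ij,jk->ik', A, B)

assert np.allclose(C, A @ B)
```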

3. What is the benefit of using the Einstein summation convention and matrix notation?

The benefit of using the Einstein summation convention and matrix notation is that it simplifies the writing and manipulation of equations involving tensors. It also allows for easier visualization and computation, and removes the need for explicit summation symbols.

4. Are there any limitations to using the Einstein summation convention and matrix notation?

One limitation of the Einstein summation convention and matrix notation is that they only express multilinear operations, i.e. sums of products of components. Operations that are not of this form, such as element-wise division or applying a nonlinear function to each component, cannot be represented directly in this notation.

5. Can the Einstein summation convention and matrix notation be used in all fields of science?

Yes, the Einstein summation convention and matrix notation can be used in various fields of science, including physics, mathematics, and engineering. It is particularly useful in fields involving tensor calculus, such as general relativity and fluid dynamics.
