Problem with notation of matrix elements

Lambda96
TL;DR Summary: What does this notation ##A_{\quad i}^j## and ##A_i^{\quad j}## mean?
Hi,

In one of my assignments, we had to prove that the Frobenius product corresponds to a complex scalar product. For one, we had to prove that the Frobenius product is hermitian symmetric.

I have now received the solution to the problem, but unfortunately I do not understand the notation used for the individual matrix elements. I only know the notation ##a_{ij}##. What does it mean when one of the indices of A or B is written with a space, and what is this space for? Which index is the row and which is the column in this kind of notation?

Here is the solution

[Attached image: Bildschirmfoto 2023-07-20 um 15.45.35.png (screenshot of the solution)]
 
Lambda96 said:
TL;DR Summary: What does this notation ##A_{\quad i}^j## and ##A_i^{\quad j}## mean?

Hi,

In one of my assignments, we had to prove that the Frobenius product corresponds to a complex scalar product. For one, we had to prove that the Frobenius product is hermitian symmetric.

I have now received the solution to the problem, but unfortunately I do not understand the notation used for the individual matrix elements. I only know the notation ##a_{ij}##. What does it mean when one of the indices of A or B is written with a space, and what is this space for? Which index is the row and which is the column in this kind of notation?

Here is the solution

View attachment 329456

It is called Einstein notation or Einstein summation. Physicists use it all the time.
https://en.wikipedia.org/wiki/Einstein_notation

You can deconstruct it using the image you posted:
\begin{align*}
(A^\dagger B)_{ij}&=\sum_{k=1}^n (A^\dagger )_{ik}\cdot B_{kj} =\sum_{k=1}^n (\overline{A_{ki}})\cdot B_{kj}\\
\operatorname{trace}(A^\dagger B)&=\sum_{p=1}^n (A^\dagger B)_{pp}\\
&=\sum_{p=1}^n \left(\sum_{k=1}^n (\overline{A})_{kp}\cdot B_{kp}\right)\\
&=\sum_{j=1}^n \left(\sum_{i=1}^n (\overline{A})_{ij}\cdot B_{ij}\right)\\
&= (\overline{{A_j}^i})\cdot {B^j}_i
\end{align*}

It is an abbreviation for the summation. The sum runs over every index that appears once on top and once at the bottom; here that happens twice, so there is a sum over ##i## and a sum over ##j##.
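As a sanity check, here is a minimal NumPy sketch (my own illustration, not part of the original solution; matrix size and seed are arbitrary) that verifies ##\operatorname{trace}(A^\dagger B)=\sum_{i,j}\overline{A_{ij}}\,B_{ij}## numerically for random complex matrices. Conveniently, `np.einsum` uses the same index convention as the summation above.

```python
import numpy as np

# Arbitrary complex test matrices (size and seed chosen only for illustration)
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Frobenius product via the trace definition: <A, B> = trace(A^dagger B)
frobenius_trace = np.trace(A.conj().T @ B)

# The same number as the double sum over matrix entries,
# i.e. the Einstein-style expression conj(A)_{ij} B_{ij}
frobenius_sum = np.einsum('ij,ij->', A.conj(), B)

print(np.isclose(frobenius_trace, frobenius_sum))  # True
```

Replacing `frobenius_trace` with `np.conj(np.trace(B.conj().T @ A))` prints `True` as well, which is exactly the Hermitian symmetry the assignment asks for.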
 
Lambda96 said:
TL;DR Summary: What does this notation ##A_{\quad i}^j## and ##A_i^{\quad j}## mean?

Hi,

In one of my assignments, we had to prove that the Frobenius product corresponds to a complex scalar product. For one, we had to prove that the Frobenius product is hermitian symmetric.

I have now received the solution to the problem, but unfortunately I do not understand the notation used for the individual matrix elements. I only know the notation ##a_{ij}##. What does it mean when one of the indices of A or B is written with a space, and what is this space for? Which index is the row and which is the column in this kind of notation?

Here is the solution

View attachment 329456
If you encounter a notation in an assignment that you have never seen before, then there must be a serious disconnect between your course syllabus and what you are studying.
 
Thanks fresh_42 for your help 👍
 