Scalar product and the Kronecker delta symbol

SUMMARY

The forum discussion centers on the proof that the scalar product \( A \cdot B \) is indeed a scalar, using the Kronecker delta symbol \( \delta_{jk} \) and the rotation-matrix elements \( \lambda_{ij} \). The discussion follows the transformation of vectors \( A \) and \( B \) into \( A' \) and \( B' \) under a rotation matrix, showing that the scalar product is invariant under these transformations. Key insights include the role of the summation index and the orthogonality of the direction cosines, which together connect the scalar product to the Kronecker delta.

PREREQUISITES
  • Understanding of vector transformations and rotation matrices
  • Familiarity with the scalar product in linear algebra
  • Knowledge of the Kronecker delta symbol and its properties
  • Basic concepts of orthogonality in vector spaces
NEXT STEPS
  • Study the properties of rotation matrices in linear algebra
  • Learn about the applications of the Kronecker delta in tensor analysis
  • Explore the implications of orthogonality in higher-dimensional spaces
  • Investigate the relationship between basis vectors and scalar products
USEFUL FOR

Mathematicians, physicists, and students studying linear algebra or quantum mechanics who seek to deepen their understanding of vector transformations and scalar products.

Karol
From a textbook, a proof that the scalar product ##A\centerdot B## is a scalar:
Vectors A' and B' are formed by rotating vectors A and B:
$$A'_i=\sum_j \lambda_{ij} A_j,\; B'_i=\sum_j \lambda_{ij} B_j$$
$$A' \centerdot B'=\sum_i A'_i B'_i =\sum_i \left( \sum_j \lambda_{ij} A_j \right)\left( \sum_k \lambda_{ik} B_k \right)$$
$$=\sum_{j,k} \left( \sum_{i} \lambda_{ij} \lambda_{ik} \right) A_j B_k=\sum_{j,k} \delta_{jk}\, A_j B_k=\sum_{j} A_j B_j=A \centerdot B$$
But:
$$\sum_j \lambda_{ij} \lambda_{kj} =\delta_{ik} $$
The order of the indices in ##\sum_i \lambda_{ij} \lambda_{ik}## is reversed: there the summation runs over the first index instead of the second.
$$\lambda_{ij}=\cos(x'_i,x_j)$$
And, for example, for i=2:
$$\lambda^2_{21}+\lambda^2_{22}+\lambda^2_{23}=1$$
The repeated index i=2 comes first, while in ##\sum_{i} \lambda_{ij} \lambda_{ik}## the indices j and k (which may or may not be equal) sit in the second position, and it is the first index, i, that is summed over.
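A quick numerical check makes the invariance itself concrete (a minimal sketch; the rotation angle and the vectors are arbitrary illustrative choices, not from the textbook):

```python
import numpy as np

# Rotation about the z-axis; the entries are the direction cosines
# lambda_ij = cos(x'_i, x_j). The angle 0.7 rad is arbitrary.
theta = 0.7
lam = np.array([[ np.cos(theta), np.sin(theta), 0.0],
                [-np.sin(theta), np.cos(theta), 0.0],
                [ 0.0,           0.0,           1.0]])

A = np.array([1.0, 2.0, 3.0])
B = np.array([-2.0, 0.5, 4.0])

A_p = lam @ A   # A'_i = sum_j lambda_ij A_j
B_p = lam @ B   # B'_i = sum_j lambda_ij B_j

# Both print 11.0 (up to float rounding): the scalar product is invariant.
print(np.dot(A_p, B_p), np.dot(A, B))
```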
 
Orodruin
The relation is true regardless of whether the first or the second index is summed over.
 
Karol
Orodruin said:
The relation is true regardless of whether the first or the second index is summed over.
I know, but why?
When the second index is summed over, the expression ##\lambda_{ij} \lambda_{kj}## makes sense: the three terms pair the direction cosines of the different, mutually orthogonal lines ##x'_i## and ##x'_k##:
$$\sum_j \lambda_{ij} \lambda_{kj}=\left\{ \begin{array}{l}
\cos^2(x'_1,x_1)+\cos^2(x'_1,x_2)+\cos^2(x'_1,x_3)=1 \\
\cos(x'_1,x_1)\cos(x'_2,x_1)+\cos(x'_1,x_2)\cos(x'_2,x_2)+\cos(x'_1,x_3)\cos(x'_2,x_3)=0 \\
\cos(x'_1,x_1)\cos(x'_3,x_1)+\cos(x'_1,x_2)\cos(x'_3,x_2)+\cos(x'_1,x_3)\cos(x'_3,x_3)=0 \\
\cos(x'_2,x_1)\cos(x'_1,x_1)+\cos(x'_2,x_2)\cos(x'_1,x_2)+\cos(x'_2,x_3)\cos(x'_1,x_3)=0 \\
\cos^2(x'_2,x_1)+\cos^2(x'_2,x_2)+\cos^2(x'_2,x_3)=1 \\
\cos(x'_2,x_1)\cos(x'_3,x_1)+\cos(x'_2,x_2)\cos(x'_3,x_2)+\cos(x'_2,x_3)\cos(x'_3,x_3)=0 \\
\cos(x'_3,x_1)\cos(x'_1,x_1)+\cos(x'_3,x_2)\cos(x'_1,x_2)+\cos(x'_3,x_3)\cos(x'_1,x_3)=0 \\
\cos(x'_3,x_1)\cos(x'_2,x_1)+\cos(x'_3,x_2)\cos(x'_2,x_2)+\cos(x'_3,x_3)\cos(x'_2,x_3)=0 \\
\cos^2(x'_3,x_1)+\cos^2(x'_3,x_2)+\cos^2(x'_3,x_3)=1
\end{array} \right.$$
But in ##\sum_i \lambda_{ij} \lambda_{ik}## each term multiplies two direction cosines of the same primed line ##x'_i##, which has no meaning:
$$\sum_i \lambda_{ij} \lambda_{ik} =\left\{ \begin{array}{l}
\cos^2(x'_1,x_1)+\cos^2(x'_2,x_1)+\cos^2(x'_3,x_1)=? \\
\cos(x'_1,x_1)\cos(x'_1,x_2)+\cos(x'_2,x_1)\cos(x'_2,x_2)+\cos(x'_3,x_1)\cos(x'_3,x_2)=? \\
\cos(x'_1,x_1)\cos(x'_1,x_3)+\cos(x'_2,x_1)\cos(x'_2,x_3)+\cos(x'_3,x_1)\cos(x'_3,x_3)=? \\
\cos(x'_1,x_2)\cos(x'_1,x_1)+\cos(x'_2,x_2)\cos(x'_2,x_1)+\cos(x'_3,x_2)\cos(x'_3,x_1)=? \\
\cos^2(x'_1,x_2)+\cos^2(x'_2,x_2)+\cos^2(x'_3,x_2)=? \\
\cos(x'_1,x_2)\cos(x'_1,x_3)+\cos(x'_2,x_2)\cos(x'_2,x_3)+\cos(x'_3,x_2)\cos(x'_3,x_3)=? \\
\cos(x'_1,x_3)\cos(x'_1,x_1)+\cos(x'_2,x_3)\cos(x'_2,x_1)+\cos(x'_3,x_3)\cos(x'_3,x_1)=? \\
\cos(x'_1,x_3)\cos(x'_1,x_2)+\cos(x'_2,x_3)\cos(x'_2,x_2)+\cos(x'_3,x_3)\cos(x'_3,x_2)=? \\
\cos^2(x'_1,x_3)+\cos^2(x'_2,x_3)+\cos^2(x'_3,x_3)=?
\end{array} \right.$$
This problem can be visualized: the multiplication in ##\sum_j \lambda_{ij} \lambda_{kj} =\delta_{ik}## is between two rows of the matrix ##\lambda##, while in ##\sum_i \lambda_{ij} \lambda_{ik}## the multiplication is between two columns.
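In matrix language, the row sums are the entries of ##\lambda \lambda^T## and the column sums are the entries of ##\lambda^T \lambda##. A short check (a sketch, reusing the illustrative rotation from above) shows that both products equal the identity:

```python
import numpy as np

theta = 0.7
lam = np.array([[ np.cos(theta), np.sin(theta), 0.0],
                [-np.sin(theta), np.cos(theta), 0.0],
                [ 0.0,           0.0,           1.0]])

# Row-row products:    (lam @ lam.T)[i, k] = sum_j lambda_ij lambda_kj
# Column-column products: (lam.T @ lam)[j, k] = sum_i lambda_ij lambda_ik
print(np.allclose(lam @ lam.T, np.eye(3)))  # True: the rows are orthonormal
print(np.allclose(lam.T @ lam, np.eye(3)))  # True: the columns are orthonormal too
```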
 
Orodruin
Are you familiar with the fact that any matrix commutes with its inverse? In this case, the inverse is the transpose (which follows from the relation you are familiar with).

Naturally, this relation also follows from the fact that it does not matter which system you decide to call the primed system and which the unprimed; the completeness relation must hold regardless.
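Spelled out in matrix form (with ##\lambda## denoting the matrix of direction cosines): row orthogonality says ##\lambda \lambda^T = I##, hence ##\lambda^T = \lambda^{-1}##, and therefore
$$\sum_i \lambda_{ij} \lambda_{ik}=\left( \lambda^T \lambda \right)_{jk}=\left( \lambda^{-1} \lambda \right)_{jk}=\delta_{jk}.$$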
Karol said:
which has no meaning:
This is wrong, it has a meaning. It is just the decomposition of the unprimed basis in the primed basis vectors instead of vice versa!
 
Karol
Orodruin said:
This is wrong, it has a meaning. It is just the decomposition of the unprimed basis in the primed basis vectors instead of vice versa!
Thanks Orodruin
 
