Doubt about the Kronecker delta

  • Thread starter jonjacson
  • #1
Well, I don't understand this equality:

[itex]A^{j}_{i}\,A_{j}^{k}=\delta_{i}^{k}[/itex]

It is true because it is the result of a calculation. But assuming it is true, what happens if one of the [itex]A_{i}^{j}[/itex] is zero?

Then it does not matter which value [itex]A_{j}^{k}[/itex] takes: the equality will be false, because 0 times any number is 0, but supposedly it should be 1 whenever the delta has i = k.

So the point is: what happens if the coefficients are zero? Is the equality really true?
 

Answers and Replies

  • #2
What is ##A^j_i##? Are these entries of a matrix? What is the matrix?
 
  • #4
I was writing a reply, but you found my original source XD. So you see the expression: the A are simply the coefficients that change one basis into another.

What would happen if one of those coefficients were zero? Would the equality with the Kronecker delta still hold?
 
  • #5
Speed of light issues: my response to wisuze got in first somehow.
 
  • #6
The ##A^i_{j'}## matrix cannot give you a zero vector if you input a basis element, since it represents a map that changes one basis into another. And the bases have the same dimension (they are bases for the same space).
 
  • #7
The ##A^i_{j'}## matrix cannot give you a zero vector if you input a basis element, since it represents a map that changes one basis into another. And the bases have the same dimension (they are bases for the same space).
They could be zero if it were only a scaling transform, right? (i.e. ##A^i_j = 0## where ##i \neq j##)
 
  • #8
I was working out an extraordinarily simple example:

e1 = (1,0) ; e2 = (0,1)

e1' = (2,0) ; e2' = (0,2)

I see that e' is not orthonormal, since e1'·e1' is neither 1 nor 0, so the expression is not the same as a Kronecker delta.

I am going to work through another numerical example and then post the result.

*Remark: I am tired of studying this topic without numerical examples to fix the ideas; it's really frustrating :( .
 
  • #9
They could be zero if it were only a scaling transform, right? (i.e. ##A^i_j = 0## where ##i \neq j##)
By scaling transform, I assume you mean a diagonal matrix? Anyway, these are specific matrices (ones that take you from the { e_j' } basis to the { e_i } basis); the matrix is determined by the relations e_j' = a1 e1 + ... + an en (and the a_i's then form a column of the matrix). Whatever this relation is, even if it is a scaling, the matrix's images of the e_j' must be linearly independent and spanning, since the result is another basis for the vector space V. We cannot have a zero vector in this image (so a scaling matrix with a 0 factor on the diagonal couldn't be a change-of-basis matrix from V to V).
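As a minimal sketch of this point (Python with NumPy; the specific matrices are just the scaling example from post #8), the columns of the change-of-basis matrix are the coordinates of the primed basis vectors in the unprimed basis, and a zero on the diagonal of a scaling matrix would make those columns linearly dependent:

[code]
# Sketch: the columns of the change-of-basis matrix are the coordinates of
# the primed basis vectors e_j' in the unprimed basis {e_i}.
import numpy as np

# e_1' = 2 e_1, e_2' = 2 e_2  (the scaling example from post #8)
A = np.column_stack([[2.0, 0.0], [0.0, 2.0]])

# A change of basis must send a basis to a basis, so A must be invertible.
print(np.linalg.det(A))      # 4.0 -> columns independent, valid change of basis

# A scaling with a zero on the diagonal sends e_2' to the zero vector,
# so its columns are not linearly independent and it cannot be a
# change-of-basis matrix from V to V.
bad = np.diag([2.0, 0.0])
print(np.linalg.det(bad))    # 0.0 -> not invertible
[/code]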
 
  • #10
Fredrik
Recall that the definition of matrix multiplication is ##(AB)^i_j=A^i_k B^k_j## (when the row indices are written upstairs, and the column indices downstairs). So the equality ##A^j_i A^k_j=\delta^k_i## is just saying that the matrix A satisfies ##A^2=I##. If this isn't immediately obvious, first note that the left-hand side can also be written as ##A^k_j A^j_i##, which according to the definition of matrix multiplication is equal to ##(A^2)^k_i##.

There are lots of examples of matrices that satisfy this condition, e.g. the 2×2 matrix
$$\begin{pmatrix}
0 & 1\\
1 & 0
\end{pmatrix}$$
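As a quick numerical check of this (a Python/NumPy sketch; the einsum index labels are just one way to spell out the contraction), the contraction ##A^j_i A^k_j## is ordinary matrix multiplication, and the example matrix above squares to the identity:

[code]
# Check: the contraction A^j_i A^k_j is ordinary matrix multiplication,
# and the example matrix squares to the identity.
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

print(np.allclose(A @ A, np.eye(2)))        # True: A^2 = I

# The same contraction written out explicitly, summing over j
# (convention: A[row, col] = A^row_col, so A^j_i = A[j, i]):
delta = np.einsum('ji,kj->ki', A, A)        # entry (k, i) = sum_j A^j_i A^k_j
print(np.allclose(delta, np.eye(2)))        # True: equals the Kronecker delta
[/code]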
 
  • #11
Fredrik
In the book's notation, ##A^i_{j'}## and ##A^{i'}_j## are the row i, column j components of two different matrices. (Zoom in if you find the primes hard to see). I would prefer a notation that uses different symbols instead of just primes. For example, suppose that ##\{e_i\}## and ##\{f_i\}## are two orthonormal bases for a vector space V. Since ##\{e_i\}## is a basis, every ##f_i## is a linear combination of the ##e_i##. $$f_i=A^j_i \,e_j.\qquad\text{(3.14)}$$ Since ##\{f_i\}## is a basis, every ##e_i## is a linear combination of the ##f_i##. $$e_i=B^j_i\, f_j\qquad\text{(3.15)}$$ Now use (3.14) in (3.15). $$e_i=B^j_i\, f_j=B^j_i\, A^k_j e_k.\qquad\text{(3.16)}$$ Since ##\{e_i\}## is linearly independent, this implies that $$B^j_i\, A^k_j=\delta^k_i.\qquad\text{(3.17)}$$ If we let A denote the matrix with ##A^k_j## on row k, column j, for all k and j, and B the matrix with ##B^j_i## on row j, column i, for all j and i, then (3.17) says that AB=I, or equivalently, that ##B=A^{-1}##.
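Tying this back to the example in post #8 (a Python/NumPy sketch, taking ##f_1 = 2e_1## and ##f_2 = 2e_2## so that ##A = \operatorname{diag}(2,2)## and ##B = A^{-1}##):

[code]
# Worked check of (3.14)-(3.17) for f_1 = 2 e_1, f_2 = 2 e_2.
import numpy as np

A = np.diag([2.0, 2.0])      # A^j_i: coordinates of f_i in the e basis
B = np.linalg.inv(A)         # B^j_i: coordinates of e_i in the f basis

# (3.17): B^j_i A^k_j = delta^k_i, i.e. AB = I.
print(np.allclose(A @ B, np.eye(2)))    # True

# The off-diagonal coefficients A^2_1 = A^1_2 = 0 cause no problem:
# the repeated index j is summed over, so a zero coefficient only drops
# one term from the sum; it does not make the whole sum zero.
delta = np.einsum('ji,kj->ki', B, A)    # entry (k, i) = sum_j B^j_i A^k_j
print(delta)                             # [[1. 0.] [0. 1.]]
[/code]

This also bears on the question in post #1: the repeated index j is summed over, so a single zero coefficient only removes one term from the sum rather than making the whole contraction zero.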
 
