What Happens if Coefficients of Kronecker Delta are Zero?

  • Context: Graduate
  • Thread starter: jonjacson
  • Tags: Delta, Doubt
Discussion Overview

The discussion revolves around the implications of having zero coefficients in the context of the equality \( A^{j}_{i}A_{j}^{k}=\delta_{i}^{k} \), which relates to matrix transformations and the Kronecker delta. Participants explore the conditions under which this equality holds, particularly focusing on the nature of the matrix \( A \) and its coefficients.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants question what happens if one of the coefficients \( A_{i}^{j} \) is zero, suggesting that it would invalidate the equality since \( 0 \times \text{any number} = 0 \).
  • There is a request for clarification on what \( A_{i,j} \) represents, with some suggesting it refers to entries of a matrix that transforms one basis to another.
  • One participant mentions constraints on the coefficients, proposing a relationship such as \( A_{ij} = -A_{ji}^{-1} \), indicating a potential inverse relationship.
  • Another participant provides a numerical example to illustrate the situation, noting that the resulting transformation does not yield orthonormal bases, thus questioning the validity of the Kronecker delta equality.
  • Some participants discuss the implications of scaling transformations and the necessity for the matrix to maintain linear independence and spanning properties, suggesting that a zero coefficient would contradict these requirements.
  • A later reply explains the definition of matrix multiplication and how the equality implies that the matrix \( A \) satisfies \( A^2 = I \), providing an example of a specific matrix that meets this condition.
  • There is a discussion about notation and the representation of different bases, with a participant clarifying how linear combinations relate to the transformation matrices involved.

Areas of Agreement / Disagreement

Participants express differing views on the implications of zero coefficients in the matrix \( A \). While some argue that zero coefficients invalidate the equality, others suggest that the context of the transformation may allow for certain conditions under which the equality could still hold. The discussion remains unresolved regarding the exact consequences of having zero coefficients.

Contextual Notes

Participants note that the discussion is limited by assumptions about the nature of the matrices and the transformations they represent. The relationship between the coefficients and the properties of the bases involved is also a point of contention, with various interpretations presented.

jonjacson
Well, I don't understand this equality:

[itex]A^{j}_{i}[/itex]*[itex]A_{j}^{k}[/itex]=[itex]\delta_{i}^{k}[/itex]

It is true because it is the result of a calculation. But assuming it is true, what happens if one of the [itex]A_{i}^{j}[/itex] is zero?

Then it does not matter what the value of [itex]A_{j}^{k}[/itex] is: the equality will be false, because 0 × any number = 0, but supposedly the result should be 1 when i = k, due to the delta.

So the point is: what happens if the coefficients are zero? Is the equality really true?
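One point worth noting here: with the summation convention, the repeated index j is summed over, so a single zero factor does not force the whole expression to zero. A minimal numerical sketch (my illustration, assuming NumPy, using a hypothetical matrix that happens to be its own inverse):

```python
import numpy as np

# A is its own inverse here (A @ A = I), so the sum over j of
# A[j, i] * A[k, j] equals delta(i, k) even though individual
# factors A[j, i] are zero.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# The repeated index j is summed over (Einstein convention):
delta = np.einsum('ji,kj->ik', A, A)
print(delta)  # the 2x2 identity: some terms vanish, but the sums give the delta
```

Individual products like A[0, 0] * A[0, 0] are indeed zero, but the summed expression still reproduces the Kronecker delta.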
 
What is A_{i,j}? Are these the entries of a matrix? Which matrix?
 
I think there are some constraints on the Aij, something like Aij = -1/Aji, right?

Anyway, I found this reference:

http://books.google.com/books?id=O2...d=0CB0Q6AEwAA#v=onepage&q=Aji*Akj=δki&f=false

which indicates that the Aij in your formula are not the same matrix; rather, they are inverses of each other.

As an aside: I cut and pasted your formula (markup and all) into Google and it found this reference - just amazing.
 
I was making a post, but you found my original source XD. So, as you can see from the expression, A is simply the matrix of coefficients that changes one basis to another.

What would happen if one of those coefficients were zero? Would the equality with the Kronecker delta still hold?
 
Speed-of-light issues: my response to wisvuze somehow got in first.
 
The A i,j' matrix cannot give you a zero vector if you input a basis element, since it represents a map that changes one basis into another. And the bases have the same dimension ( bases for the same space )
 
wisvuze said:
The A i,j' matrix cannot give you a zero vector if you input a basis element, since it represents a map that changes one basis into another. And the bases have the same dimension ( bases for the same space )

they could be zero if it was only a scaling transform, right? (i.e. Aij = 0 where i ≠ j)
 
I was working out an extraordinarily simple example:

e1 = (1,0); e2 = (0,1)

e1' = (2,0); e2' = (0,2)

I see that {e'} is not orthonormal, since e1'·e1' = 4, which is neither 1 nor 0, so the expression does not reduce to a Kronecker delta.

I am going to work through another numerical example and then post the result.

*Remark: I am tired of studying this topic without numerical examples to fix the ideas; it's really frustrating :( .
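For this particular example, the change-of-basis matrix is just 2I, and a short check (an illustrative sketch with NumPy, not part of the original post) makes both observations concrete: {e'} is not orthonormal, yet the delta identity still holds between the matrix and its inverse.

```python
import numpy as np

# The example bases from the post: e_i standard, e_i' = 2 e_i.
# Column j of A holds the coefficients of e_j' in the e basis.
A = np.array([[2.0, 0.0],
              [0.0, 2.0]])

e1p = A[:, 0]
print(e1p @ e1p)      # 4.0: e1'.e1' is not 1, so {e'} is not orthonormal

# But the Kronecker-delta identity holds between A and its inverse:
B = np.linalg.inv(A)  # B = A^{-1} = (1/2) I
print(B @ A)          # identity matrix, i.e. B^j_i A^k_j = delta^k_i
```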
 
jedishrfu said:
they could be zero if it was only a scaling transform, right? (i.e. Aij = 0 where i ≠ j)

By scaling transform, I assume you mean a diagonal matrix? In any case, these are specific matrices (ones that take you from the { e_j' } basis to the { e_i } basis). The matrix is determined by the relations e_j' = a1e1 + ... + anen (the ai's then form a column of the matrix). Whatever this relation is (even if it is a scaling), the resulting matrix images of the e_j' must be linearly independent and spanning, since the result is another basis for the vector space V. We cannot have a zero vector among these images, so a scaling matrix with a 0 factor on the diagonal cannot be a change-of-basis matrix from V to V.
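To illustrate the last point, a small sketch (my addition, assuming NumPy): a diagonal matrix with a zero on the diagonal is singular, so its columns cannot form a basis and it cannot be a change-of-basis matrix.

```python
import numpy as np

# A scaling (diagonal) matrix with a zero on the diagonal sends a
# basis vector to the zero vector, so its columns are not a basis:
S = np.diag([2.0, 0.0])
print(np.linalg.matrix_rank(S))  # 1, not 2: columns are not linearly independent
print(np.linalg.det(S))          # 0.0: S is singular, hence not invertible
```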
 
  • #10
Recall that the definition of matrix multiplication is ##(AB)^i_j=A^i_k B^k_j## (when the row indices are written upstairs and the column indices downstairs). So the equality ##A^j_i A^k_j=\delta^k_i## is just saying that the matrix A satisfies ##A^2=I##. If this isn't immediately obvious, first note that the left-hand side can also be written as ##A^k_j A^j_i##, which according to the definition of matrix multiplication is equal to ##(A^2)^k_i##.

There are lots of examples of matrices that satisfy this condition, e.g. the 2×2 matrix
\begin{pmatrix}
0 & 1\\
1 & 0
\end{pmatrix}
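A quick check (an illustrative sketch with NumPy, not part of the original post) confirms that this swap matrix squares to the identity, and that reflections give further examples of such involutions:

```python
import numpy as np

# The 2x2 swap matrix from the post satisfies A^2 = I:
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
print(A @ A)   # identity matrix

# A reflection is another example of an involution:
R = np.array([[1.0, 0.0],
              [0.0, -1.0]])
print(R @ R)   # also the identity
```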
 
  • #11
In the book's notation, ##A^i_{j'}## and ##A^{i'}_j## are the row i, column j components of two different matrices. (Zoom in if you find the primes hard to see.) I would prefer a notation that uses different symbols instead of just primes. For example, suppose that ##\{e_i\}## and ##\{f_i\}## are two orthonormal bases for a vector space V. Since ##\{e_i\}## is a basis, every ##f_i## is a linear combination of the ##e_i##: $$f_i=A^j_i \,e_j.\qquad\text{(3.14)}$$ Since ##\{f_i\}## is a basis, every ##e_i## is a linear combination of the ##f_i##: $$e_i=B^j_i\, f_j.\qquad\text{(3.15)}$$ Now use (3.14) in (3.15): $$e_i=B^j_i\, f_j=B^j_i\, A^k_j e_k.\qquad\text{(3.16)}$$ Since ##\{e_i\}## is linearly independent, this implies that $$B^j_i\, A^k_j=\delta^k_i.\qquad\text{(3.17)}$$ If we let A denote the matrix with ##A^k_j## on row k, column j, for all k and j, and B the matrix with ##B^j_i## on row j, column i, for all j and i, then (3.17) says that AB=I, or equivalently, that ##B=A^{-1}##.
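As a numerical sketch of (3.17) (my illustration, assuming NumPy; the bases and the rotation angle are arbitrary choices, not from the original post), take ##\{e_i\}## to be the standard basis of ##\mathbb{R}^2## and ##\{f_i\}## the standard basis rotated by 30°:

```python
import numpy as np

# Column j of A holds the coefficients A^k_j expressing f_j in
# terms of the e_k, as in eq. (3.14). A rotation keeps {f_i} orthonormal.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Eq. (3.17) says the B coefficients form the inverse matrix:
B = np.linalg.inv(A)
print(np.allclose(A @ B, np.eye(2)))  # True: B^j_i A^k_j = delta^k_i
```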
 
