Why is ##\partial_\sigma(\delta x^\sigma)## not equal to zero?

AI Thread Summary
The discussion centers on the derivation of Noether's theorem, specifically the equation ##J = 1 + \partial_\sigma(\delta x^\sigma)##. The confusion arises from the assumption that ##\partial_\sigma(\delta x^\sigma)## can be rewritten as ##\delta(\partial_\sigma x^\sigma)##, leading to the conclusion that it equals zero. However, ##\delta x^\sigma## is a function of ##x##, which means it cannot be treated as a constant during differentiation. Participants suggest consulting specific sections of the referenced literature for clarification on the transformation properties involved in the theorem. Understanding the distinction between the variables and their roles in the symmetry transformation is crucial for resolving the confusion.
GR191511
I'm studying Noether's theorem. In the derivation I came across the equation ##J = 1 + \partial_\sigma(\delta x^\sigma)##, where ##J## is the Jacobian. Since ##\delta## has the property of commuting with differentiation, why is ##\partial_\sigma(\delta x^\sigma)## not equal to ##\delta(\partial_\sigma x^\sigma) = \delta(1+1+1+1) = \delta(4) = 0##?
 
vanhees71 said:
Which book are you studying from? I cannot read your equations. Is it about Noether's theorem for fields? Then maybe Sec. 3.1 in

https://itp.uni-frankfurt.de/~hees/pf-faq/srt.pdf

is of some use.
https://physics.stackexchange.com/questions/534699/noethers-theorem-derivation-for-fields
The first answer there says "the Jacobian ##J = 1 + \partial_\sigma(\delta x^\sigma)##". But I think ##1 + \partial_\sigma(\delta x^\sigma) = 1 + \delta(\partial_\sigma x^\sigma) = 1 + \delta(1+1+1+1) = 1 + \delta(4) = 1 + 0 = 1##... I don't know what I did wrong. Thank you.
 
Could you please use LaTeX? I really can't read your formulae :-(. The derivation of Noether's theorem for fields given in the stackexchange article is also in the FAQ article I quoted above.

To get the determinant of a matrix ##\hat{A}=\hat{1} + \delta \hat{\omega}## to first order in ##\delta##, just use the definition of the determinant in terms of the Levi-Civita symbol (Einstein summation convention applies):
$$\mathrm{det} \hat{A} = \epsilon_{j_1 j_2 \cdots j_n} A_{1j_1} A_{2 j_2} \cdots A_{n j_n}.$$
It's also clear that all products occurring in this sum are of order ##\mathcal{O}(\delta^2)## or higher, except the product of the diagonal elements, i.e. (summation convention doesn't apply in the next formula),
$$\mathrm{det} \hat{A} =\prod_{j} A_{jj} + \mathcal{O}(\delta^2) = 1 + \sum_{j} \delta \omega_{jj} + \mathcal{O}(\delta^2) = 1 + \mathrm{Tr} \delta \hat{\omega} + \mathcal{O}(\delta^2).$$
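This first-order determinant formula is easy to check numerically (a quick sketch using NumPy; the 4x4 matrix ##\delta\hat\omega## below is an arbitrary choice for illustration, not from the thread):

```python
import numpy as np

# Check numerically that det(1 + t*omega) = 1 + t*Tr(omega) + O(t^2),
# as derived above from the Levi-Civita definition of the determinant.
omega = np.array([[0.5,  1.0, 0.0,  2.0],
                  [0.3, -1.0, 0.7,  0.0],
                  [0.0,  0.2, 0.4,  1.1],
                  [1.0,  0.0, 0.6, -0.8]])  # arbitrary 4x4 "generator"

for t in (1e-1, 1e-2, 1e-3):
    exact = np.linalg.det(np.eye(4) + t * omega)
    first_order = 1.0 + t * np.trace(omega)
    # The mismatch shrinks quadratically in t, confirming the O(t^2) remainder.
    print(f"t = {t:g}:  |det - (1 + t Tr)| = {abs(exact - first_order):.3e}")
```

Shrinking ##t## by a factor of 10 shrinks the mismatch by roughly a factor of 100, as expected for an ##\mathcal{O}(t^2)## remainder.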
 
vanhees71 said:
Could you please use LaTeX? I really can't read your formulae :-(. The derivation of Noether's theorem for fields, given in the stackexchange article is also in my above quoted FAQ article.
Thank you very much! I saw ##J = 1 + \partial_\sigma \delta x^\sigma## in that thread (the first answer). But ##\delta## has the property of commuting with differentiation, so I think it should continue as ##1 + \delta \partial_\sigma x^\sigma = 1 + \delta(1+1+1+1) = 1 + \delta(4) = 1 + 0 = 1##... What did I do wrong?
 
No, ##\delta x^{\sigma}## is a given function of ##x##: it defines the transformation of the space-time coordinates, which is part of a symmetry transformation (e.g., a Poincaré transformation) of the field theory. See Sect. 3.1 in

https://itp.uni-frankfurt.de/~hees/pf-faq/srt.pdf
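To make this concrete (an illustrative example of my own, not taken from the posts above): for a constant infinitesimal translation ##\delta x^\sigma = \epsilon^\sigma## one indeed gets ##\partial_\sigma(\delta x^\sigma) = 0##, but already for an infinitesimal dilatation the claimed identity fails,

$$\delta x^\sigma = \epsilon\, x^\sigma \quad\Rightarrow\quad \partial_\sigma(\delta x^\sigma) = \epsilon\, \partial_\sigma x^\sigma = 4\epsilon \neq 0.$$

The fallacy in writing ##\partial_\sigma(\delta x^\sigma) = \delta(\partial_\sigma x^\sigma)## is that here ##\delta x^\sigma## denotes a single fixed function of ##x## (the coordinate part of the symmetry transformation), not a variation ##\delta## applied to the function ##x^\sigma##, so there is no ##\delta## to commute through ##\partial_\sigma##.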
 
I get it! I appreciate your help very much.
 