Reversing Indices in Contractions: Can it be Done?

SUMMARY

The discussion centers on the interpretation of covariant derivatives inside contractions, specifically in the expression \(\left( \nabla_\mu \nabla_\beta - \nabla_\beta \nabla_\mu \right) V^\mu = R_{\nu \beta} V^\nu\). The contracted indices can be raised and lowered freely provided the connection is metric compatible. Participants clarify that the two derivative operators act sequentially on the vector \(V^\mu\), not on each other; the confusion stems from misapplying the product rule to the composition of operators, whereas the right-hand operator must be applied first and the left-hand operator then acts on the result. The thread concludes that if the connection is symmetric, \(F_{\mu \nu} = \nabla_\mu A_\nu - \nabla_\nu A_\mu\) reduces to partial derivatives alone.

PREREQUISITES
  • Understanding of covariant derivatives and their notation
  • Familiarity with Riemann curvature tensors
  • Knowledge of metric compatibility in differential geometry
  • Basic principles of tensor calculus
NEXT STEPS
  • Study the properties of covariant derivatives in differential geometry
  • Learn about Riemann curvature tensors and their implications
  • Explore the concept of metric compatibility in connections
  • Investigate the application of the chain rule in tensor calculus
USEFUL FOR

This discussion is beneficial for mathematicians, physicists, and students engaged in advanced studies of differential geometry, particularly those focusing on general relativity and tensor analysis.

unscientific
Suppose I have something like
$$\left( \nabla_\mu \nabla_\beta - \nabla_\beta \nabla_\mu \right) V^\mu = R_{\nu \beta} V^\nu .$$
Since all the terms involving ##\mu## on the left and ##\nu## on the right are contractions, can I simply do
$$\left( \nabla^\mu \nabla_\beta - \nabla_\beta \nabla^\mu \right) V_{\mu} = R^\nu{}_{\beta} V_{\nu} \, ?$$
 
As long as the connection is metric compatible.
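One way to see it explicitly: metric compatibility, ##\nabla_\alpha g^{\mu\nu} = 0##, lets the metric pass through the covariant derivatives, so raising the index on the operators while lowering it on ##V## changes nothing (##\alpha## is just a relabelled dummy index here):
$$\nabla^\mu \nabla_\beta V_\mu = g^{\mu\alpha} \nabla_\alpha \nabla_\beta V_\mu = \nabla_\alpha \nabla_\beta \left( g^{\mu\alpha} V_\mu \right) = \nabla_\alpha \nabla_\beta V^\alpha .$$
The same manipulation handles the second term on the left and, with ##g_{\nu\sigma}##, the contraction on the right-hand side.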
 
Orodruin said:
As long as the connection is metric compatible.

But my concern is that in the first term on the LHS, the gradient operator is acting on the other gradient operator, as in ##\nabla_\mu (\nabla_\beta) V^\mu##, and not on the vector ##V^\mu##. I thought that for contractions, these have to be acting on one another?
 
unscientific said:
But my concern is that in the first term on the LHS, the gradient operator is acting on the other gradient operator, as in ##\nabla_\mu (\nabla_\beta) V^\mu##, and not on the vector ##V^\mu##. I thought that for contractions, these have to be acting on one another?
No, this is not how to interpret the LHS. The operators are each acting on ##V^\mu##, one after the other.
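In symbols: the inner derivative first produces a rank-(1,1) tensor field, and the outer derivative then acts on that entire object,
$$\nabla_\mu \nabla_\beta V^\mu \equiv \nabla_\mu \left( \nabla_\beta V^\mu \right),$$
with the contraction over ##\mu## carried out only after both derivatives have been taken. There is no separate ##\nabla_\mu(\nabla_\beta)## piece.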
 
Orodruin said:
No, this is not how to interpret the LHS. The operators are each acting on ##V^\mu##, one after the other.
Ok, now I'm confused. I thought the chain rule applies:
$$\nabla^\mu \left( \nabla_\mu A_\nu \right) = \nabla^\mu \left( \nabla_\mu\right) A_\nu + \nabla_\mu \nabla^\mu A_\nu$$
 
Orodruin said:
No, this is not how to interpret the LHS. The operators are each acting on ##V^\mu##, one after the other.
If I have something like ##F_{\mu \nu} = \nabla_\mu A_\nu - \nabla_\nu A_\mu##, what will ##\nabla^\mu F_{\mu \nu}## look like?
 
What do you mean by ##\nabla_\mu(\nabla^\mu)##? This is an operator applied to an operator. The only way to make sense of this is to apply the right operator to something first and then apply the left to the result.
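Concretely, write ##T_{\mu\nu} \equiv \nabla_\mu A_\nu## (a label introduced here for clarity, with ##\Gamma^\lambda{}_{\sigma\mu}## the connection coefficients). The outer derivative acts on this rank-2 tensor as a whole,
$$\nabla^\mu \left( \nabla_\mu A_\nu \right) = g^{\mu\sigma} \nabla_\sigma T_{\mu\nu} = g^{\mu\sigma} \left( \partial_\sigma T_{\mu\nu} - \Gamma^\lambda{}_{\sigma\mu} T_{\lambda\nu} - \Gamma^\lambda{}_{\sigma\nu} T_{\mu\lambda} \right),$$
so there is no extra ##\nabla^\mu(\nabla_\mu) A_\nu## term: the Leibniz rule applies to products of tensors, not to the composition of the operators themselves.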
 
unscientific said:
If I have something like ##F_{\mu \nu} = \nabla_\mu A_\nu - \nabla_\nu A_\mu##, what will ##\nabla^\mu F_{\mu \nu}## look like?
You should write out this covariant derivative first in terms of F, and then in that expression write out F in terms of A. If your connection is symmetric, F can be expressed in terms of partial derivatives only.
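For instance, with a symmetric connection, ##\Gamma^\lambda{}_{\mu\nu} = \Gamma^\lambda{}_{\nu\mu}## (Christoffel symbols written out here for concreteness), the connection terms cancel in the antisymmetric combination:
$$F_{\mu\nu} = \nabla_\mu A_\nu - \nabla_\nu A_\mu = \left( \partial_\mu A_\nu - \Gamma^\lambda{}_{\mu\nu} A_\lambda \right) - \left( \partial_\nu A_\mu - \Gamma^\lambda{}_{\nu\mu} A_\lambda \right) = \partial_\mu A_\nu - \partial_\nu A_\mu .$$
After that, ##\nabla^\mu F_{\mu\nu}## is just the covariant divergence of this antisymmetric rank-2 tensor.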
 
