Decomposing Rank-2 Tensors in Dirac's "General Theory of Relativity"

  • Thread starter: Kostik
  • Tags: Tensors

Summary
Dirac's "General Theory of Relativity" states that a general rank-2 tensor can be decomposed into a sum of outer products, specifically expressed as T^{\mu\nu} = A^\mu B^\nu + A'^\mu B'^\nu + ... This concept is reiterated in the context of the covariant derivative, where Dirac emphasizes that a tensor T_{\mu\nu} can similarly be represented. The discussion explores whether this decomposition is obvious and seeks clarification on the notation used. A participant explains that by defining vectors A^\nu and B_\nu, the decomposition aligns with Dirac's definition of tensors. The conversation highlights the straightforward nature of this tensor decomposition while addressing LaTeX formatting issues.
Kostik
TL;DR
Dirac says that a general rank-2 tensor ## T^{\mu\nu} ## can be decomposed as ## A^\mu B^\nu + A'^\mu B'^\nu + A''^\mu B''^\nu + \cdots\, ##. Is this obvious?
Dirac's book "General Theory of Relativity" says on p. 2 that a general rank-2 tensor can be written as a sum of outer products: $$ T^{\mu\nu} = A^\mu B^\nu + A'^\mu B'^\nu + A''^\mu B''^\nu + \cdots $$ Importantly, he repeats this on p. 18, in developing the covariant derivative, where he mentions that a tensor ## T_{\mu\nu} ## is "expressible as a sum of terms like ## A_\mu B_\nu ##".

Is this obvious? Can someone show or explain this?
 
Kostik said:
TL;DR Summary: Dirac says that a general rank-2 tensor can be decomposed: ##T^{\mu\nu} = A^\mu B^\nu + A'^\mu B'^\nu + A''^\mu B''^\nu + \cdots##. Is this obvious?

Dirac's book "General Theory of Relativity" says on p. 2 that a general rank-2 tensor can be written as a sum of outer products:

$$T^{\mu\nu} = A^\mu B^\nu + A'^\mu B'^\nu + A''^\mu B''^\nu + \cdots$$

Importantly, he repeats this on p. 18, in developing the covariant derivative, where he mentions that a tensor ##T_{\mu\nu}## is "expressible as a sum of terms like ##A_\mu B_\nu##".

Is this obvious? Can someone show or explain this?
By definition, a tensor of rank two can be written as
$$
T = T^{\mu\nu} e_\mu \otimes e_\nu
$$
We can introduce the vectors ##A^\nu = T^{\mu\nu} e_\mu## and ##B_\nu = e_\nu## (note that here ##\nu## is being used as a counter rather than a component index) and therefore, summing over that counter,
$$
T = A^\nu \otimes B_\nu
$$
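Spelling the grouping out in two dimensions (a small worked case of the same idea, not from Dirac's text):
$$
T = \sum_{\mu,\nu=0}^{1} T^{\mu\nu}\, e_\mu \otimes e_\nu = \big(T^{00} e_0 + T^{10} e_1\big) \otimes e_0 + \big(T^{01} e_0 + T^{11} e_1\big) \otimes e_1 = A^0 \otimes B_0 + A^1 \otimes B_1
$$
With the ##B##'s chosen as basis vectors, a rank-2 tensor in ##n## dimensions therefore needs at most ##n## outer-product terms.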
 
I'm not familiar with your notation; I wonder whether Dirac's decomposition can be explained using only his definition of tensors.
 
Kostik said:
I added the missing braces, but the LaTeX still doesn't seem to be working in the original post.
It's a known issue when you make the first post to use LaTeX (OP or reply) on a page. The parser doesn't get loaded until you refresh the page. Your LaTeX looks fine to me, and will look fine to you once you've hit refresh.
 
Ibix said:
It's a known issue when you make the first post to use LaTeX (OP or reply) on a page. The parser doesn't get loaded until you refresh the page. Your LaTeX looks fine to me, and will look fine to you once you've hit refresh.
Aha, yes, I see it now.
 
Oh, I think it's actually fairly straightforward. Write (showing the summation explicitly): $$T^{\mu\nu}=\sum_{\lambda,\kappa}T^{\lambda\kappa}\,{\delta_\lambda}^\mu\,{\delta_\kappa}^\nu \qquad\text{(no Einstein summation)}$$ Then ##A^\mu = T^{\lambda\kappa}{\delta_\lambda}^\mu## (not summed over ##\lambda##) and ##B^\nu={\delta_\kappa}^\nu##. (Regard ##\lambda## and ##\kappa## as fixed.) Since everything in sight is a tensor, the ##A^\mu## and ##B^\nu## are obviously vectors (no need to worry about constructing a non-vector). Each fixed pair ##(\lambda,\kappa)## then contributes one outer-product term, and the sum over all pairs is exactly Dirac's ##A^\mu B^\nu + A'^\mu B'^\nu + \cdots##.
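For anyone who wants to see this construction concretely, here is a minimal numerical sketch in NumPy (the random 4×4 array `T`, the loop variables, and the use of rows of `np.eye` as basis vectors are my own illustration, not anything from Dirac):

```python
import numpy as np

# A generic rank-2 tensor in n = 4 dimensions: random components,
# no special structure assumed.
rng = np.random.default_rng(0)
n = 4
T = rng.standard_normal((n, n))

# Rebuild T as a sum of n^2 outer products, one per fixed (lam, kap):
#   A^mu = T^{lam kap} * delta_lam^mu   (a multiple of the basis vector e_lam)
#   B^nu = delta_kap^nu                 (the basis vector e_kap itself)
identity = np.eye(n)
reconstructed = np.zeros_like(T)
for lam in range(n):
    for kap in range(n):
        A = T[lam, kap] * identity[lam]   # the vector A^mu for this term
        B = identity[kap]                 # the vector B^nu for this term
        reconstructed += np.outer(A, B)   # add the term A^mu B^nu

# The sum of the 16 outer products reproduces T exactly
# (up to floating-point error).
assert np.allclose(T, reconstructed)
print("Decomposition verified for a random 4x4 tensor.")
```

That the check passes for a random ##T## illustrates the point above: the decomposition relies on no special structure in the tensor.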
 