cathalcummins
Thanks for the help on the other questions.
I am having trouble with another derivation. Unlike the others, it's not abstract whatsoever.
Okay, I wish to find the transformation law for the components of a rank-2 tensor.
Easy, I know: T: V^* \times V \to \mathbb{R}
So
T = T^i_{\phantom{i} j} e_i \otimes e^j
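Just to fix what these components do, here is a tiny numerical sketch (my own toy example, not from the notes), assuming a 2-dimensional V: feeding a covector \omega and a vector v into the expansion gives T(\omega, v)=T^i_{\phantom{i} j}\,\omega_i v^j.

import numpy as np

T = np.array([[3.0, -1.0],
              [0.5,  2.0]])    # components T^i_j
omega = np.array([1.0, 2.0])   # covector components omega_i
v = np.array([0.5, -1.0])      # vector components v^j

# T(omega, v) = T^i_j omega_i v^j, since (e_i tensor e^j)(omega, v) = omega_i v^j
print(np.einsum('ij,i,j->', T, omega, v))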
I wish to find
T^{i'}_{\phantom{i'} j'}
Where the following hold:
e_{i'}=a^{J}_{\phantom{J} i'} e_J
and
e^{i'}=b^{i'}_{\phantom{i'} J} e^J
where the coefficients are just real numbers. Now T^i_{\phantom{i} j}=T(e^i, e_j) so that:
T^{i'}_{\phantom{i'} j'}=T(e^{i'}, e_{j'})
=T(b^{i'}_{\phantom{i'} J} e^J, a^{L}_{\phantom{L} j'} e_L)
By bilinearity of T we have:
=b^{i'}_{\phantom{i'} J} a^{L}_{\phantom{L} j'} T( e^J, e_L)
=b^{i'}_{\phantom{i'} J} a^{L}_{\phantom{L} j'} T^{J}_{\phantom{J} L}
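Here is a minimal numerical sketch of this transformation law (again my own toy example), assuming a 2-dimensional V, an arbitrary invertible matrix a, the matrix convention I state at the end (superindex = row, lower index = column), and taking b to be the matrix inverse of a, as the duality relation below forces:

import numpy as np

# e_{i'} = a^J_{i'} e_J: new basis vectors are the columns of a
a = np.array([[2.0, 1.0],
              [1.0, 1.0]])
b = np.linalg.inv(a)           # b^{i'}_J, fixed by duality (see below)

# arbitrary components T^J_L in the original basis
T = np.array([[3.0, -1.0],
              [0.5,  2.0]])

# T^{i'}_{j'} = b^{i'}_J a^L_{j'} T^J_L
T_new = np.einsum('iJ,Lj,JL->ij', b, a, T)

# the same contraction written as a matrix product: T' = b T a
print(np.allclose(T_new, b @ T @ a))   # True
print(T_new)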
Now my lecturer did a funny thing and said,
"it may be shown that b=a"
which confuses the hell out of me, because
e_{i'}=a^{J}_{\phantom{J} i'} e_J
e^{i'}=b^{i'}_{\phantom{i'} J} e^J
abide by the duality relation so that:
e^{i'} e_{j'}=b^{i'}_{\phantom{i'} J} e^J a^{K}_{\phantom{K} j'} e_K=\delta^{i'}_{j'}
Now, the original bases obey their own duality relation (e^J e_K=\delta^J_K), so that:
e^{i'} e_{j'}=b^{i'}_{\phantom{i'} J} a^{K}_{\phantom{K} j'} \delta^J_K=\delta^{i'}_{j'}
So
e^{i'} e_{j'}=b^{i'}_{\phantom{i'} K} a^{K}_{\phantom{K} j'} =\delta^{i'}_{j'}
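As a quick numerical sanity check of this last line (same toy matrices and conventions as above): taking b to be the matrix inverse of a does indeed make b^{i'}_{\phantom{i'} K} a^{K}_{\phantom{K} j'} the identity.

import numpy as np

a = np.array([[2.0, 1.0],
              [1.0, 1.0]])
b = np.linalg.inv(a)

# b^{i'}_K a^K_{j'} = delta^{i'}_{j'}, i.e. the matrix product b a is the identity
print(np.allclose(np.einsum('iK,Kj->ij', b, a), np.eye(2)))   # True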
Is this not the definition of a being the inverse of b? Of course, I know a priori that I am wrong, as this would give T^{i'}_{\phantom{i'} j'}=T^{i}_{\phantom{i} j} regardless of the transformation.
In my definition of a and b, the superindex refers to row and the lower index refers to column.
My gut feeling is that I am using the matrix notation all wrong.
Any takers?