How to apply the tensor transformation rule

  • #1
guv
TL;DR Summary
how is the tensor transformation rule applied to a position vector? $$v^\alpha = v^{*\beta} \frac{\partial u^\alpha}{\partial u^{* \beta}}$$
Suppose I have a Cartesian coordinate system (x, y) and a polar coordinate system (##r, \theta##). The position vectors (3, 4) and (5, ##\arctan \frac{4}{3}##) are the same except for the representation. The position vector is a tensor, so how does it follow the tensor transformation rule? Surely I cannot write ##x = r \frac{\partial x}{\partial r} + \theta \frac{\partial x}{\partial \theta}##

It's clear that for a function ##f(x(r, \theta),y(r, \theta))##, its partial derivative ##\frac{\partial f}{\partial r}##, as a component of the gradient, follows the transformation rule.

Does the transformation rule apply to a position vector?
 
  • #2
guv said:
Surely I cannot write ##x = r \frac{\partial x}{\partial r} + \theta \frac{\partial x}{\partial \theta}##
You can obviously write it (you just did), but it would be very wrong in general.

guv said:
Does the transformation rule apply to a position vector?
Yes! Try it in polar coordinates on the Euclidean plane!

$$
x = r \cos(\theta), \quad y = r\sin(\theta)
$$
 
  • #3
Thanks, I know how ##x = r \cos \theta , y = r \sin \theta## works. What makes me wonder is why you can't use the tensor transformation rule on the position vector the way I initially wrote it.
 
  • #4
You can if you do it correctly.
 
  • #5
Would you mind showing how that works? I am very curious to see it. Thanks!
 
  • #6
For example, the position vector in polar coordinates is ##X = r\partial_r##. In other words, the only non-zero component is ##X^r##. Hence
$$
X^x = X^r \frac{\partial x}{\partial r}
= r \cos(\theta) = x
$$
Similarly for the y-component.
 
  • #7
Don't you need to include both ##r## and ##\theta##, i.e. exactly what I wrote initially? :cool: Why is the ##r## component non-zero but the ##\theta## component zero? ##\theta## is not necessarily zero, is it? Sorry, I am not getting it.
 
  • #8
guv said:
Don't you need to include both ##r## and ##\theta##, i.e. exactly what I wrote initially? :cool: Why is the ##r## component non-zero but the ##\theta## component zero? ##\theta## is not necessarily zero, is it? Sorry, I am not getting it.
I did include the ##\theta## component (it is zero).

The position vector in polar coordinates does not have ##r## and ##\theta## as its components. It only has a radial component with value ##r##. Whatever point you pick, its position vector is fully in the radial direction.
 
  • #9
If it makes you feel better we can always write
$$
X^x = X^r \frac{\partial x}{\partial r} +
\underbrace{X^\theta}_{= 0}\frac{\partial x}{\partial \theta}
= r \cos(\theta) = x
$$
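For anyone who wants to check this symbolically, here is a minimal Python/SymPy sketch of the computation above (the variable names are just illustrative):

```python
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)

# Cartesian coordinates expressed in terms of polar coordinates
x = r * sp.cos(theta)
y = r * sp.sin(theta)

# Components of the position vector in the polar basis:
# only the radial component is non-zero, X^r = r and X^theta = 0
X_r, X_theta = r, 0

# Contravariant transformation: X^x = X^r dx/dr + X^theta dx/dtheta
X_x = X_r * sp.diff(x, r) + X_theta * sp.diff(x, theta)
X_y = X_r * sp.diff(y, r) + X_theta * sp.diff(y, theta)

print(sp.simplify(X_x - x))  # 0, so X^x = x
print(sp.simplify(X_y - y))  # 0, so X^y = y
```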
 
  • #10
Silly me. I get it now. Thanks!
 

What is the tensor transformation rule?

The tensor transformation rule refers to how the components of a tensor change when you change the coordinate system. It ensures that the physical phenomena described by tensors remain consistent across different coordinate systems. This rule is crucial in fields like physics and engineering where different coordinate systems are used to describe the same physical situation.

How do you apply the tensor transformation rule to a vector?

To apply the tensor transformation rule to a vector, you use the relation ##V'^i = \frac{\partial x'^i}{\partial x^j} V^j##, where ##V^j## are the components of the vector in the original coordinate system, ##V'^i## are the components in the new coordinate system, and ##\frac{\partial x'^i}{\partial x^j}## is the Jacobian matrix of the transformation. This formula ensures that the vector's representation changes correctly according to the new coordinates.
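
As a concrete sketch of this formula, here is how the transformation could be evaluated numerically for the Cartesian-to-polar example in the thread (the function and variable names below are illustrative, not from the original posts):

```python
import numpy as np

def polar_to_cartesian_jacobian(r, theta):
    """Jacobian d(x, y)/d(r, theta) for x = r cos(theta), y = r sin(theta)."""
    return np.array([
        [np.cos(theta), -r * np.sin(theta)],
        [np.sin(theta),  r * np.cos(theta)],
    ])

# A contravariant vector given by its polar components (V^r, V^theta)
V_polar = np.array([2.0, 0.5])

# V'^i = (dx'^i / dx^j) V^j, evaluated at the point r = 5, theta = arctan(4/3)
r, theta = 5.0, np.arctan2(4.0, 3.0)
J = polar_to_cartesian_jacobian(r, theta)
V_cartesian = J @ V_polar

print(V_cartesian)  # the same vector, now in (x, y) components
```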

What is the difference between covariant and contravariant tensors?

Covariant tensors (also known as lower-index tensors) transform with the inverse of the Jacobian matrix, while contravariant tensors (upper-index tensors) transform with the Jacobian matrix itself. This distinction is important because it affects how the components of the tensor behave under coordinate transformations. Covariant components scale inversely with the change in coordinate scale, whereas contravariant components scale directly.
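
A short sketch of the difference, reusing the same polar-to-Cartesian Jacobian as above (the sample components are made up for illustration): a contravariant vector transforms with the Jacobian, a covariant one with the inverse (transposed), and their contraction comes out the same in both coordinate systems.

```python
import numpy as np

# Jacobian of the polar -> Cartesian map at a sample point
r, theta = 5.0, np.arctan2(4.0, 3.0)
J = np.array([
    [np.cos(theta), -r * np.sin(theta)],
    [np.sin(theta),  r * np.cos(theta)],
])
J_inv = np.linalg.inv(J)  # Jacobian of the inverse (Cartesian -> polar) map

V_polar = np.array([2.0, 0.5])   # contravariant components (upper index)
w_polar = np.array([1.0, -3.0])  # covariant components (lower index)

V_cart = J @ V_polar        # contravariant: transforms with the Jacobian
w_cart = J_inv.T @ w_polar  # covariant: transforms with the inverse Jacobian

# The scalar contraction V^i w_i is coordinate independent
print(np.dot(V_polar, w_polar), np.dot(V_cart, w_cart))  # equal up to rounding
```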

How do you apply the tensor transformation rule to a mixed tensor?

To apply the tensor transformation rule to a mixed tensor, which has both covariant and contravariant components, you use the transformation laws for each type of component separately. For example, a mixed tensor ##T^i{}_j## transforms as ##T'^i{}_j = \frac{\partial x'^i}{\partial x^k} \frac{\partial x^l}{\partial x'^j} T^k{}_l##. This ensures that each component of the tensor is transformed appropriately according to its variance.
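
The same index gymnastics can be written out numerically; here is a minimal sketch with NumPy's einsum, assuming the same polar-to-Cartesian map as above (the tensor components are arbitrary illustrative numbers):

```python
import numpy as np

# Polar -> Cartesian Jacobian at a sample point, and its inverse
r, theta = 5.0, np.arctan2(4.0, 3.0)
J = np.array([
    [np.cos(theta), -r * np.sin(theta)],
    [np.sin(theta),  r * np.cos(theta)],
])
J_inv = np.linalg.inv(J)

# A mixed tensor T^k_l given by its components in polar coordinates
T_polar = np.array([[1.0, 0.2],
                    [0.0, 3.0]])

# T'^i_j = (dx'^i/dx^k) (dx^l/dx'^j) T^k_l  ->  as matrices: J @ T @ J^{-1}
T_cart = np.einsum('ik,lj,kl->ij', J, J_inv, T_polar)

print(np.allclose(T_cart, J @ T_polar @ J_inv))  # True
```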

Why is the tensor transformation rule important?

The tensor transformation rule is crucial because it ensures that the mathematical description of physical laws is consistent in different coordinate systems. This is particularly important in physics, where the laws should be independent of the choice of coordinate system. By applying the tensor transformation rule, you can confirm that a tensor equation holds true in any coordinate system, thereby verifying its physical validity.
