davidge
Suppose we have defined a vector ##V## at a point ##x##, so it has components ##V^\mu(x)## there. Let ##y## be another point, such that ##y^\mu = x^\mu + \epsilon \zeta^\mu(x)##, where ##\epsilon## is a small scalar parameter. Now, since ##x## and ##y## are just coordinate labels, the vector ##V## itself should not depend on which of them we use. So we apply the familiar transformation law for a vector from one point to the other (a prime denotes quantities at ##y##):
$$ V'^\mu (y) = V^\nu (x) \frac{\partial y^\mu (x)}{\partial x^\nu}$$
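Writing out the Jacobian explicitly (assuming ##\zeta^\mu## is differentiable, so it can be expanded to first order in ##\epsilon##):
$$ \frac{\partial y^\mu (x)}{\partial x^\nu} = \frac{\partial}{\partial x^\nu}\left[ x^\mu + \epsilon\, \zeta^\mu(x) \right] = \delta^\mu_\nu + \epsilon\, \partial_\nu \zeta^\mu(x) $$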
We would end up with ##V'^\mu (y) = V^\mu (x) + \epsilon\, V^\nu (x)\,\partial_{\nu}\zeta^{\mu}(x)##. My question is whether it is correct to assume that relation between ##x## and ##y##.