
I Vectors and isometries on a manifold

  1. Feb 3, 2017 #1
    Hi. I've been thinking about vectors, coordinate systems and all things associated for a long time. I'd like to know if (at least in the context of General Relativity) my interpretation of these subjects is correct. I will try to summarize my thoughts as follows:

    - We start with a general manifold.
    - We assign a parameter value to points on the manifold through a map from a closed numerical interval, by a function [itex]c[/itex]. So [itex]c: [a, b] \longrightarrow M [/itex] (where [itex]M[/itex] is the manifold); [itex]t \mapsto c(t) = P[/itex]. Additionally, we impose the condition that no two points on the manifold have the same value.
    - Now we map [itex]c(t)[/itex] into [itex] \mathbb{R} ^m[/itex], where [itex]m[/itex] stands for the dimension of the manifold, through a function [itex]\Psi: M \longrightarrow \mathbb{R} ^m[/itex]; [itex] c(t) \mapsto \Psi (c(t)) \equiv x[/itex].

    Define a tangent vector V at P by V = [itex]\frac{d c(t)}{d t}[/itex]; then, using the chain rule, V = [itex]V^{\mu} \frac{\partial}{\partial x^{\mu}}[/itex], where [itex]V^{\mu} = f(V, x)^{\color{red}*} \frac{\partial x^{\mu}}{\partial t}[/itex]. Define the basis vectors to be [itex] \frac{\partial}{\partial x^{\mu}} [/itex].

    Now it is clear that the basis vector in this representation depends on the point [itex]x[/itex] of [itex]\mathbb{R} ^m[/itex] (though the vector itself depends only on the point [itex]P[/itex] on the manifold). So when we change the basis, is it as if we were transporting the vector through [itex]\mathbb{R} ^m[/itex]?

    * - We know that at [itex]P[/itex] we can construct as many vectors as we desire, by multiplying the basis vectors by scalars and adding them together. But since we define V = [itex]V^{\mu}\frac{\partial}{\partial x^{\mu}}[/itex] and [itex]V^{\mu} = \frac{\partial x^{\mu}}{\partial t}[/itex], we see that all vectors would lie on the same line, because [itex]\frac{\partial x^{\mu}}{\partial t}[/itex] is the same for all of them. But we must be able to construct a tangent plane (or hyperplane) of vectors at [itex]P[/itex], so I inserted the function [itex]f(V, x)[/itex] in *, so that the vector components can be independent of each other and the vectors can fill an entire plane at [itex]P[/itex]. Is this correct? If not, what would be a solution to this problem?

    Now, I'd like to talk about isometries of a metric. My doubt here is: when people talk about rotations in General Relativity, are they talking about rotations as we know them from elementary math courses, like "rotation of the coordinate axes through an angle about a point x"?
     
    Last edited: Feb 3, 2017
  3. Feb 3, 2017 #2

    PeterDonis


    Staff: Mentor

    This step is not included, at least not in GR. We just go straight to the next step, where you assign m-tuples of real numbers, i.e., members of ##\mathbb{R}^m##, to points ##P## on the manifold, in such a way that no two points have the same values and neighboring points have neighboring values.

    This step is not included either. We just define the coordinate basis vectors ##e_\mu = \partial / \partial x^\mu##.

    This is a bit misstated. Vectors don't "live" in the actual manifold, because the actual manifold is not a vector space. Vectors at each point ##P## on the manifold "live" in the tangent space at ##P##, which we denote as ##T(P)##, which is a vector space that is "attached" to ##P##. Any vector ##V## in ##T(P)## is defined independently of any choice of basis on ##T(P)##; but once we have chosen a coordinate chart ##x^\mu## on the manifold, that chart induces a basis on ##T(P)## at each point ##P## on the manifold, which is called the "coordinate basis" for that chart. The basis depends on the choice of chart, so the representation of a given vector ##V## in the coordinate basis on ##T(P)## will depend on the choice of chart, even though ##V## itself does not.

    I don't think this is a useful way to look at it.

    No. ##t## is just a coordinate; it has no special significance. The correct expression for the coordinate basis vectors is ##e_\mu = \partial / \partial x^\mu##. See above.

    You might be confusing the basis vectors with 4-velocity, which is a unit vector tangent to a particular worldline at a particular event. This vector is defined as ##U^\mu = dx^\mu / d\tau##, where ##\tau## is proper time along the worldline (i.e., an affine parameter on the worldline defined such that the change in ##\tau## between two events on the worldline is the same as the proper time elapsed on a clock traveling on the worldline). But ##\tau## is not the same as the "time" coordinate ##t##. (Note that in general you can't even assume that a coordinate chart will have a timelike coordinate.)

    There is no problem; you have just incorrectly understood how a coordinate basis is constructed. See above.

    Are you asking if this is true? That's not answerable without more specific references to "people" talking about rotations. For example, is there a statement in a textbook or paper you have read that you find confusing?
     
  4. Feb 3, 2017 #3
    I used ##t## just as an element of ##[a, b]## to map ##[a, b]## into ##P##. It is not the time coordinate ##t## of the four-dimensional manifold of General Relativity.

    For example, I attended a lecture last month where the lecturer talked about infinitesimal rotations, using Killing vectors. What did he mean by infinitesimal rotations?
     
  5. Feb 3, 2017 #4

    PeterDonis


    But you don't need to do that, and GR doesn't do it. GR just does what I said; it maps points ##P## to m-tuples of real numbers. There is no intermediate step where elements of [a, b] are mapped into points ##P##.

    Which I wasn't at, so I don't know what he said. Is there a link? I can't interpret what someone said if I don't know what they said.

    I don't know since I wasn't there and don't have the text of the lecture. I could guess that he was talking about something like this:

    https://en.wikipedia.org/wiki/Rotation_group_SO(3)#Infinitesimal_rotations

    The rotation group SO(3) defines a 3-parameter group of Killing vector fields.
     
  6. Feb 3, 2017 #5
    Ok, can you show me a numerical example of such a rotation in 2-d space?

    I was defining a tangent vector and not a coordinate basis.

    Again, I was not trying to construct a coordinate basis directly; instead, I was trying to define the coordinate basis by constructing a general tangent vector. Please read my first post again.
     
  7. Feb 3, 2017 #6

    PeterDonis


    As far as I can tell, you are defining a tangent vector incorrectly. For one thing, you are not saying what it is tangent to.

    You are defining the coordinate basis incorrectly because you are constructing a general tangent vector incorrectly.
     
  8. Feb 3, 2017 #7

    PeterDonis


    Sure. It's easiest if you pick coordinates that match up with the rotation symmetry, e.g., polar coordinates on the 2-d Euclidean plane. Then we pick a vector, which for simplicity we will assume to point from the origin in the radial direction along ##\theta = 0##, i.e., it is the column vector

    $$
    \begin{bmatrix}
    1 \\
    0
    \end{bmatrix}
    $$

    An infinitesimal rotation is the operator ##I + A d\theta##, where ##A## is an element of the Lie algebra so(2) (not so(3) because we are only looking at 2-d rotations, not 3-d rotations). I.e., ##A## is a 2 x 2 skew-symmetric matrix. So the infinitesimal rotation operator will be a matrix that looks like this:

    $$
    \begin{bmatrix}
    1 & - d\theta \\
    d\theta & 1
    \end{bmatrix}
    $$

    Multiplying our vector on the left by this matrix gives us the vector

    $$
    \begin{bmatrix}
    1 \\
    d\theta
    \end{bmatrix}
    $$

    which, as you can see, is just a vector that is shifted in the positive ##\theta## direction by an infinitesimal angle.

    A finite rotation by an angle ##\theta## is just the exponential of this operator, which as a matrix looks like this (I'll leave you to verify the calculation):

    $$
    \begin{bmatrix}
    \cos \theta & - \sin \theta \\
    \sin \theta & \cos \theta
    \end{bmatrix}
    $$

    Applying this to our vector gives

    $$
    \begin{bmatrix}
    \cos \theta \\
    \sin \theta
    \end{bmatrix}
    $$

    which is obviously a unit vector pointing radially in the direction labeled by ##\theta##.
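    In case a numerical check helps, here is a quick sketch in plain Python (my own illustration, nothing from a textbook): composing many copies of the infinitesimal rotation ##I + A \, d\theta## reproduces the finite rotation matrix, which is exactly the exponentiation described above.

    ```python
    import math

    def matmul2(M, N):
        """Product of two 2x2 matrices stored as nested lists."""
        return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
                for i in range(2)]

    theta = math.pi / 3        # finite rotation angle
    steps = 100_000            # number of infinitesimal steps
    dtheta = theta / steps

    # Infinitesimal rotation I + A*dtheta, with generator A = [[0, -1], [1, 0]]
    step = [[1.0, -dtheta],
            [dtheta, 1.0]]

    # Compose the infinitesimal rotations: (I + A dtheta)^steps -> exp(A theta)
    R = [[1.0, 0.0],
         [0.0, 1.0]]
    for _ in range(steps):
        R = matmul2(step, R)

    # The finite rotation matrix exp(A theta)
    exact = [[math.cos(theta), -math.sin(theta)],
             [math.sin(theta),  math.cos(theta)]]

    # Apply R to the radial unit vector (1, 0)
    v = (R[0][0] * 1.0 + R[0][1] * 0.0,
         R[1][0] * 1.0 + R[1][1] * 0.0)
    ```

    With enough steps, `R` agrees with `exact` to within the discretization error, and the rotated vector lands at ##(\cos\theta, \sin\theta)##.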
     
  9. Feb 4, 2017 #8
    Thank you. How does one work on this same example using Killing Vectors?
     
  10. Feb 4, 2017 #9

    PeterDonis


    The Killing vector field in this case is ##\partial / \partial \theta##. The integral curves of this KVF are the same as the integral curves of the rotation operator--this should be obvious from the fact that that operator is parameterized by ##\theta##.

    At the infinitesimal level, note that the difference of the operator from the identity is based on a small increment ##d\theta## of the coordinate ##\theta##. This is equivalent to saying that we are taking whatever vector we start with and adding to it a small increment of the vector ##\partial / \partial \theta##, i.e., a small increment of the Killing vector. This is often expressed as the Killing vector field "generating" rotations.

    The only element missing in the 2-d case is commutation relations, because there is only one Killing vector. In the 3-d case, there is, as I said before, a 3-parameter group of Killing vectors, and if we pick 3 mutually orthogonal vectors from this group, we find that they obey a particular set of commutation relations, the ones that characterize the Lie group SO(3) (the group of 3-d rotations). This additional property is necessary for us to be able to generalize the claim that the Killing vector fields generate rotations to the 3-d case (and to higher dimensions).
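    To make the commutation-relations remark concrete, here is a small check (plain Python, purely illustrative) that the three standard skew-symmetric generators of 3-d rotations satisfy ##[L_x, L_y] = L_z## and its cyclic permutations:

    ```python
    def matmul3(M, N):
        """Product of two 3x3 matrices stored as nested lists."""
        return [[sum(M[i][k] * N[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    def commutator(M, N):
        """Matrix commutator [M, N] = MN - NM."""
        MN = matmul3(M, N)
        NM = matmul3(N, M)
        return [[MN[i][j] - NM[i][j] for j in range(3)] for i in range(3)]

    # Standard basis of the Lie algebra so(3): skew-symmetric generators
    # of rotations about the x, y and z axes.
    Lx = [[0, 0, 0], [0, 0, -1], [0, 1, 0]]
    Ly = [[0, 0, 1], [0, 0, 0], [-1, 0, 0]]
    Lz = [[0, -1, 0], [1, 0, 0], [0, 0, 0]]

    # The so(3) commutation relations: [Lx, Ly] = Lz, and cyclically.
    ```

    These are the same relations obeyed by the three rotational Killing vector fields mentioned above.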
     
  11. Feb 4, 2017 #10
    Is this equivalent to saying that before the rotation we had a vector V = [itex]\frac{\partial}{\partial r} [/itex] and after the rotation the vector transformed to V* = [itex]\frac{\partial}{\partial r} + d \theta \frac{ \partial}{\partial \theta} [/itex]?
     
  12. Feb 4, 2017 #11

    PeterDonis


    No. The vector ##V## itself is always a multiple of ##\partial / \partial r##, because it always points in the radial direction. The only change is in the value of ##\theta## that describes which radial direction it points in. Remember that in polar coordinates the vectors ##\partial / \partial r## and ##\partial / \partial \theta## are not constant; the rotation is not just changing the vector ##V## but the basis vectors as well.

    If you want to do the analysis using coordinates in which the basis vectors are constant, you can do it in Cartesian coordinates. The expressions for the Killing vector and the rotation matrix will be more complicated, but it might help in understanding what is being done to the vector ##V## by a rotation.
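    As a sketch of what that looks like (illustrative plain Python, my own example): in Cartesian coordinates the rotational Killing vector ##\partial / \partial \theta## has components ##\xi = (-y, x)##, and Killing's equation ##\partial_\mu \xi_\nu + \partial_\nu \xi_\mu = 0## (flat metric, so covariant derivatives reduce to partials) can be checked by finite differences:

    ```python
    def xi(x, y):
        """Rotational Killing vector of the Euclidean plane in Cartesian
        coordinates: d/dtheta = -y d/dx + x d/dy."""
        return (-y, x)

    def killing_residual(x, y, h=1e-6):
        """Finite-difference check of d_mu xi_nu + d_nu xi_mu at (x, y).
        With the flat metric, lowered and raised components agree."""
        d = [[0.0, 0.0], [0.0, 0.0]]  # d[mu][nu] = partial_mu xi^nu
        for mu in range(2):
            plus = xi(x + h * (mu == 0), y + h * (mu == 1))
            minus = xi(x - h * (mu == 0), y - h * (mu == 1))
            for nu in range(2):
                d[mu][nu] = (plus[nu] - minus[nu]) / (2 * h)
        # Symmetrized combination that must vanish for a Killing vector
        return [[d[mu][nu] + d[nu][mu] for nu in range(2)] for mu in range(2)]

    res = killing_residual(1.7, -0.3)  # residual at an arbitrary point
    ```

    Since ##\xi## is linear, the residual vanishes (up to rounding) everywhere, and at the point ##(1, 0)## the components are ##(0, 1)##, matching the polar-coordinate expression.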
     
  13. Feb 4, 2017 #12
    I see. Can we identify the elements of the matrix [itex]A[/itex] in your post #7 as the first derivatives of the Killing vectors? The requirement is that the components [itex]\xi ^\mu = 0[/itex] at the point where we are performing the rotation, so the covariant derivative in this case is just equal to the ordinary derivative, and I noticed that, re-writing [itex]A[/itex] as [itex] \begin{pmatrix}\xi_{r,r}&\xi_{r,\theta}\\\xi_{\theta,r} &\xi_{\theta,\theta}\end{pmatrix} [/itex], we can reproduce the Killing equation, since [itex] \xi_{r,\theta} + \xi_{\theta,r} = d\theta - d\theta = 0[/itex] and [itex]\xi_{r,r} + \xi_{\theta,\theta} = 0[/itex], where I have lowered the indices because [itex]\xi ^\mu = \xi _\mu = 0[/itex].
     
  14. Feb 4, 2017 #13

    PeterDonis


    I don't understand. The matrix ##A## is just

    $$
    \begin{bmatrix}
    0 & -1 \\
    1 & 0
    \end{bmatrix}
    $$

    The infinitesimal rotation matrix I wrote down is ##I + A d\theta##, i.e., the identity matrix plus the matrix ##A## times ##d\theta##.

    Also, the Killing vector is ##\partial / \partial \theta##, i.e., it has components ##\xi^\mu = (0, 1)## in polar coordinates. So I don't understand how you're obtaining the partial derivatives you appear to be using.

    I don't understand where you're getting this from either.
     
  15. Feb 5, 2017 #14
    I'm sorry, I meant the matrix [itex]Ad \theta[/itex] and not just [itex]A[/itex].

    According to Weinberg, G&C, if we are to perform an infinitesimal rotation at a point, we must be able to find Killing vectors whose components are all equal to zero at that point, and whose first derivatives are non-zero.
     
  16. Feb 5, 2017 #15

    PeterDonis


    This doesn't sound right as a general statement; either you're misinterpreting something or I'm not understanding the context. Can you give a specific chapter and page reference? I don't have this book but I can probably look up a specific reference. Or a quote to give the context.

    One possibility is that Weinberg is referring to points on the axis. In the example we are discussing, the "axis" is the origin--and at the origin, ##\partial / \partial \theta## does vanish, and "rotation" does nothing (because the axis is left invariant by rotations). But that does not mean ##\partial / \partial \theta## vanishes anywhere else--in particular, it doesn't vanish at the point ##(1, 0)## at the "tip" of the vector we were rotating.
     
  17. Feb 5, 2017 #16
    "A metric space is said to be isotropic about a given point [itex]X[/itex] if there exist infinitesimal isometries that leave the point [itex]X[/itex] fixed, so that [itex]\xi ^{\lambda}(X) = 0[/itex], and for which the first derivatives [itex]\xi_{\lambda ; \ \nu}(X)[/itex] take all possible values [...]. In particular, in N dimensions we can choose a set of N(N-1)/2 Killing vectors [...]"

    "As an example of a maximally symmetric space, consider an N-dimensional flat space, with vanishing curvature tensor. [...]
    We can choose a set of N(N+1)/2 Killing vectors as follows:

    [itex]\xi_{\mu}^{(\nu)}(X) = \delta_{\mu}^{\nu}[/itex]
    [itex]\xi_{\mu}^{(\nu \lambda)}(X) = \delta_{\mu}^{\nu} x^{\lambda} - \delta_{\mu}^{\lambda} x^{\nu}[/itex]

    and the general Killing vector is

    [itex] \xi_{\mu}(X) = a_{\nu} \xi_{\mu}^{(\nu)}(X) + b_{\nu \lambda} \xi_{\mu}^{(\nu \lambda)}(X)[/itex]

    The N vectors [itex]\xi_{\mu}^{(\nu)}(X)[/itex] represent translations, whereas the N(N-1)/2 vectors [itex] \xi_{\mu}^{(\nu \lambda)}(X) [/itex] represent infinitesimal rotations [...]"
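    As an aside, these flat-space Killing vectors can be checked directly (illustrative plain Python; with the flat Euclidean metric, covariant derivatives are ordinary partials): each translation ##\xi_{\mu}^{(\nu)} = \delta_{\mu}^{\nu}## and each rotation ##\xi_{\mu}^{(\nu \lambda)} = \delta_{\mu}^{\nu} x^{\lambda} - \delta_{\mu}^{\lambda} x^{\nu}## satisfies ##\partial_\mu \xi_\nu + \partial_\nu \xi_\mu = 0##, and the rotations vanish at the origin:

    ```python
    N = 3  # dimension of the flat space

    def translation(nu):
        """Killing vector xi_mu^(nu) = delta_mu^nu (constant)."""
        return lambda x: [1.0 if mu == nu else 0.0 for mu in range(N)]

    def rotation(nu, lam):
        """Killing vector xi_mu^(nu lam) = delta_mu^nu x^lam - delta_mu^lam x^nu."""
        return lambda x: [(x[lam] if mu == nu else 0.0) -
                          (x[nu] if mu == lam else 0.0) for mu in range(N)]

    def killing_residual(xi, x, h=1e-6):
        """Max of |d_mu xi_nu + d_nu xi_mu| at x (flat metric, partials only)."""
        d = [[0.0] * N for _ in range(N)]
        for mu in range(N):
            xp = list(x); xp[mu] += h
            xm = list(x); xm[mu] -= h
            fp, fm = xi(xp), xi(xm)
            for nu in range(N):
                d[mu][nu] = (fp[nu] - fm[nu]) / (2 * h)
        return max(abs(d[mu][nu] + d[nu][mu]) for mu in range(N) for nu in range(N))

    point = [0.4, -1.2, 2.5]
    # N translations plus N(N-1)/2 rotations = N(N+1)/2 Killing vectors
    fields = [translation(nu) for nu in range(N)] + \
             [rotation(nu, lam) for nu in range(N) for lam in range(nu + 1, N)]
    residuals = [killing_residual(xi, point) for xi in fields]
    ```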
     
    Last edited: Feb 5, 2017
  18. Feb 5, 2017 #17

    PeterDonis


    Ok, that's what I thought. In the 2-d polar coordinates example, the point ##X## is the origin, and the Killing vector vanishes there. But the expressions I wrote down for transforming the vector ##(1, 0)## were not at the origin; they were at the point ##(1, 0)##. The Killing vector does not vanish there, because the isometry it generates does not leave that point fixed.
     
  19. Feb 5, 2017 #18
    Oh ok. But I don't understand how the Killing vector can vanish at the origin if it is [itex]1 \cdot \frac{\partial}{\partial \theta}[/itex], that is, its component is constant at every point. Or does it have this form only at the point (1,0)?
     
    Last edited: Feb 5, 2017
  20. Feb 5, 2017 #19
    I'm still trying to find some possible derivative for the Killing vector. Let [itex]y[/itex] denote the polar coordinate system and a prime denote the transformed metric.

    [itex]
    g'_{\mu \nu}(y) =
    \frac{\partial y^{\sigma}}{\partial x^{\mu}}
    \frac{\partial y^{\kappa}}{\partial x^{\nu}}
    g_{\sigma \kappa}(x)
    [/itex] (1)

    Now I will omit the arguments because we know what we are working with. Working from (1), one gets the relation:

    [itex]
    g'_{\mu \nu} =
    g_{\mu \nu} + g_{\mu \kappa} \xi{^\kappa}_{,\nu}
    + g_{\nu \kappa} \xi{^\kappa}_{,\mu}[/itex]
    so that

    [itex]
    g'_{1 1} \equiv g_{r r} =
    g_{1 1} + 2g_{1 1} \xi{^1}_{,1}[/itex]
    [itex]
    g'_{2 2} \equiv g_{\theta \theta} =
    g_{2 2} + 2g_{2 2} \xi{^2}_{,2}[/itex]

    but we know
    [itex]g_{1 1} = g_{2 2} = 1; g_{r r} = 1; g_{\theta \theta} = r^2[/itex]
    therefore
    [itex]\xi{^1}_{,1} = 0 [/itex] and [itex] \xi{^2}_{,2} = \frac{r^2 - 1}{2}[/itex].

    Is this right?
     
    Last edited: Feb 5, 2017
  21. Feb 5, 2017 #20

    PeterDonis


    Heuristically, at the origin, moving in the ##\theta## direction means not moving at all. The only direction in which you can actually move is the ##r## direction.

    Mathematically, the norm of the vector ##\partial / \partial \theta## is ##r##; you can see that by looking at the metric of the 2-d plane in polar coordinates. So at ##r = 0##, the origin, the norm of ##\partial / \partial \theta## is zero.

    Its component in coordinate terms is constant, but that does not mean its norm is constant. See above.

    Also, strictly speaking, polar coordinates have a coordinate singularity in ##\theta## at the origin, so the component of ##\partial / \partial \theta## being 1 is misleading there.
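    A quick computation of the norm described above (plain Python, illustrative): with the polar metric ##g = \mathrm{diag}(1, r^2)##, the vector with components ##(0, 1)##, i.e. ##\partial / \partial \theta##, has norm ##\sqrt{g_{\mu\nu} \xi^\mu \xi^\nu} = r##, which goes to zero at the origin:

    ```python
    import math

    def polar_metric(r):
        """Metric of the Euclidean plane in polar coordinates (r, theta):
        ds^2 = dr^2 + r^2 dtheta^2."""
        return [[1.0, 0.0],
                [0.0, r * r]]

    def norm(g, v):
        """Norm of a vector v with respect to the metric g."""
        return math.sqrt(sum(g[mu][nu] * v[mu] * v[nu]
                             for mu in range(2) for nu in range(2)))

    xi = [0.0, 1.0]  # components of d/dtheta: constant in polar coordinates

    # The norm equals r even though the components never change
    norms = {r: norm(polar_metric(r), xi) for r in (0.0, 0.5, 1.0, 2.0)}
    ```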
     