
Differential as generalized directional deriv (Munkres Analysis on Manifolds)

  1. Jul 22, 2012 #1
    1. The problem statement, all variables and given/known data

    Let ##A## be open in ##\mathbb{R}^n##; let ##\omega## be a ##(k-1)##-form in ##A##. Given ##v_1, \dots, v_k \in \mathbb{R}^n##, define
    ##h(x) = d\omega(x)((x;v_1),...,(x;v_k)),##
    ##g_j(x) = \omega (x)((x;v_1),...,\widehat{(x;v_j)},...,(x;v_k)),##
    where ##\hat{a}## means that the component ##a## is to be omitted.

    Prove that ##h(x) = \sum _{j=1}^k (-1)^{j-1} Dg_j (x) \cdot v_j . ##
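    (As a sanity check, not part of the exercise: for ##k = 1##, ##\omega## is a 0-form, i.e. a scalar function ##f##, and ##g_1 = f##, so the claim reduces to ##h(x) = df(x)((x;v_1)) = Df(x)\cdot v_1 = Dg_1(x)\cdot v_1##, the directional derivative of ##f## in the direction ##v_1##.)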


    2. Relevant equations

    The problem is broken into 3 parts:
    (a) Let ##X = \begin{bmatrix} v_1 & \cdots & v_k \end{bmatrix}##. For each ##j##, let ##Y_j = \begin{bmatrix} v_1 & \cdots & \hat{v}_j & \cdots & v_k \end{bmatrix}##. Given ##(i, i_1, \dots, i_{k-1})##, show that

    ##\det X(i, i_1, \dots, i_{k-1}) = \sum_{j=1}^k (-1)^{j-1} v_{ij} \det Y_j(i_1, \dots, i_{k-1}).##
    (A ##k = 2## example is written out after part (c).)
    (b) Verify the theorem in the case ##\omega = fdx_I##.
    (c) Complete the proof.
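    For what it's worth, part (a) looks like the cofactor (Laplace) expansion of ##\det X(i, i_1, \dots, i_{k-1})## along the row indexed by ##i##, reading ##v_{ij}## as the ##i##-th component of ##v_j##, i.e. the ##(i,j)## entry of ##X##. For example, with ##k = 2##:

    ##\det X(i, i_1) = \det \begin{bmatrix} v_{i1} & v_{i2} \\ v_{i_1 1} & v_{i_1 2} \end{bmatrix} = v_{i1} \det Y_1(i_1) - v_{i2} \det Y_2(i_1),##

    since ##Y_1 = \begin{bmatrix} v_2 \end{bmatrix}## and ##Y_2 = \begin{bmatrix} v_1 \end{bmatrix}##.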

    3. The attempt at a solution

    I'm stuck on part (b). By the definition given in the text, if ##\omega = f\,dx_I## then ##d\omega = df \wedge dx_I##, but I'm not quite sure how to link the result of part (a) to part (b). If anyone can shed any light on this problem I'd be really grateful, thanks! Here is how far I get:
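    With ##I = (i_1, \dots, i_{k-1})## ascending, and assuming I'm applying the evaluation formula for elementary forms correctly, ##d\omega = df \wedge dx_I = \sum_{i=1}^n (D_i f)\, dx_i \wedge dx_I##, so

    ##h(x) = \sum_{i=1}^n D_i f(x)\, (dx_i \wedge dx_I)(x)((x;v_1), \dots, (x;v_k)) = \sum_{i=1}^n D_i f(x) \det X(i, i_1, \dots, i_{k-1}).##

    This looks like where part (a) should enter, but I can't see how the ##g_j## come out of it.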
     
  3. Jul 22, 2012 #2
    Sorry, but I am not familiar with your notation. What is [itex] (x;v_i ) [/itex]? Also, what is [itex] v_{ij} [/itex]? The [itex] j^{th} [/itex] component of [itex] v_i [/itex]? What is [itex] Dg_j [/itex]? The Jacobian? The gradient (which is possible under the identification [itex] T_p^* \mathbb R^n \cong \mathbb R^n [/itex])? What is [itex] Dg_j \cdot v [/itex]? Is this the standard Euclidean inner product? Is [itex] I [/itex] a multi-index or a typo?

    I will assume that [itex] Dg_j [/itex] is the gradient so that [itex] Dg_j \cdot v_j [/itex] is the directional derivative.

    I'm not sure what you are and are not allowed to use, but if [itex] \omega = f \ dx_I [/itex] then you are correct that [itex] d\omega = df \wedge dx_I [/itex]. Thus for two vector fields [itex] v,w [/itex] we have that
    [tex]
    \begin{align*}
    d \omega(v,w) &= (df \wedge dx_I)(v,w) \\
    &= df(v) dx_I(w) - dx_I(v)df(w) \\
    &= w_i Df\cdot v - v_i Df\cdot w.
    \end{align*}
    [/tex]
    The second equality is just the definition of the wedge product (here I'm treating [itex] dx_I [/itex] as a single 1-form [itex] dx_i [/itex], so [itex] dx_I(w) = w_i [/itex]). Appropriate substitution of your vectors yields the desired equality. Perhaps induction will now work?
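    To spell out the substitution under my assumptions above (single index, so [itex] \omega = f\,dx_i [/itex] and [itex] k = 2 [/itex], with [itex] v = v_1 [/itex] and [itex] w = v_2 [/itex]): here [itex] g_1(x) = \omega(x)((x;v_2)) = f(x)\,(v_2)_i [/itex] and [itex] g_2(x) = \omega(x)((x;v_1)) = f(x)\,(v_1)_i [/itex], so
    [tex]
    Dg_1(x) \cdot v_1 - Dg_2(x) \cdot v_2 = (v_2)_i \, Df(x) \cdot v_1 - (v_1)_i \, Df(x) \cdot v_2 = d\omega(x)((x;v_1),(x;v_2)) = h(x).
    [/tex]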

    Edit: Had to do some craziness with a misplaced tex wrapper.
     
    Last edited: Jul 22, 2012