Is this true about integrals?

  1. Oct 4, 2013 #1
    Sorry, the title doesn't match up 100% with the content of the topic, but that's because I've decided to be a little bit more explicit about my question.

    I am trying to walk through the proof of Euler's Equation from Calculus of Variations, and I'm a little bit confused by the final step.

    Right now I have this:
    [itex]\int^{x_{2}}_{x_{1}}{(\frac{\partial f}{\partial y}-\frac{d}{dx}\frac{\partial f}{\partial y'})\eta(x) dx}=0[/itex]

    and then they proceed to say that this implies:
    [itex]\frac{\partial f}{\partial y}-\frac{d}{dx}\frac{\partial f}{\partial y'}=0[/itex]

    Could someone please explain why this implication holds? Isn't the left side of Euler's equation really just a function of x? And since eta is also a function of x, couldn't their product technically be an odd function on a symmetric interval? Wouldn't that mean the left side of Euler's equation does not have to equal zero in those rare circumstances?
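    The worry can be checked numerically. Below is a sketch with assumed choices: an odd "bracket" G(x) = x on the symmetric interval [-1, 1], and two admissible variations η (both vanishing at the endpoints). One particular η does make the integral vanish by symmetry, but another does not, which is why the implication needs the integral to vanish for *every* η, not just one.

```python
import numpy as np

# Assumed example on the symmetric interval [-1, 1]:
# suppose the bracket in the integrand were G(x) = x,
# which is odd and not identically zero.
x = np.linspace(-1.0, 1.0, 200001)
dx = x[1] - x[0]
G = x

# Admissible variations must vanish at the endpoints x = -1, 1.
eta_even = 1.0 - x**2        # even: the product G*eta_even is odd
eta_odd = x * (1.0 - x**2)   # odd:  the product G*eta_odd is even

# Simple Riemann sums for the two integrals
I_even = np.sum(G * eta_even) * dx   # vanishes by symmetry
I_odd = np.sum(G * eta_odd) * dx     # approx 4/15, not zero

print(abs(I_even) < 1e-8, I_odd)
```

    So a single lucky η is not enough to conclude anything; the argument in the replies below uses the freedom to choose η.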

    [Original Post]
    If [itex]\int^{x_{2}}_{x_{1}}{F(x)G(x) dx}=0[/itex] and F(x) is arbitrary, does it imply that G(x) = 0?
    Last edited: Oct 4, 2013
  3. Oct 4, 2013 #2



    Yes, it does. For simplicity let's assume G(x) is continuous. If there is some x0 such that G(x0) > 0, then by continuity there is some interval
    [tex] (x_0-\epsilon,x_0+\epsilon) [/tex]
    on which G(x) is positive. Let F(x) be a continuous function that is nonnegative, zero outside this interval, and strictly positive on some subinterval (you should be able to convince yourself that such functions exist, and sketch a graph of one). Then necessarily
    [tex] \int_{x_1}^{x_2} F(x) G(x) dx > 0 [/tex]
    which is a contradiction. Similarly if G(x) is negative somewhere we can develop a contradiction.
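    This argument can be sketched numerically with concrete (assumed) choices: take G(x) = 1 + x², so G(x₀) > 0 at x₀ = 0, pick the interval (x₀ − ε, x₀ + ε) with ε = 0.5, and let F be a polynomial bump that is zero outside that interval and positive inside it.

```python
import numpy as np

x0, eps = 0.0, 0.5

def F(x):
    # Polynomial bump: (eps^2 - (x - x0)^2)^2 inside the interval,
    # zero outside; continuous and nonnegative everywhere.
    bump = (eps**2 - (x - x0)**2)**2
    return np.where(np.abs(x - x0) < eps, bump, 0.0)

def G(x):
    # Sample function with G(x0) > 0
    return 1.0 + x**2

# Integrate over [x1, x2] = [-1, 1] with a simple Riemann sum.
# The integrand F*G is nonnegative everywhere and strictly
# positive near x0, so the integral must come out positive.
x = np.linspace(-1.0, 1.0, 200001)
dx = x[1] - x[0]
integral = np.sum(F(x) * G(x)) * dx
print(integral > 0.0)   # True
```

    The same bump, reflected in sign, produces the contradiction when G(x₀) < 0.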
  4. Oct 5, 2013 #3



    The way you have stated it, it is NOT true. If, however, f(t) is continuous and [itex]\int_{x_1}^{x_2} f(t)dt= 0[/itex] for all [itex]x_1[/itex] and [itex]x_2[/itex] in a given interval, then it must be true that f(t)= 0 in that interval. You can prove that by contradiction: if there exists [itex]x_0[/itex] such that [itex]f(x_0)\ne 0[/itex], then there exists some interval around [itex]x_0[/itex] on which [itex]f(x)[/itex] is always positive (or always negative), and integrating over that interval will give a non-zero result.
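    The last step can be made quantitative (a sketch of the standard estimate, assuming f(x_0) > 0): by continuity there is a [itex]\delta > 0[/itex] such that [itex]f(x) > f(x_0)/2[/itex] on [itex](x_0-\delta, x_0+\delta)[/itex], so
    [tex] \int_{x_0-\delta}^{x_0+\delta} f(t)\,dt \ge 2\delta\cdot\frac{f(x_0)}{2} = \delta\, f(x_0) > 0, [/tex]
    contradicting the assumption that all such integrals vanish. The case f(x_0) < 0 is handled the same way with the inequalities reversed.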