Integration and the Jacobian

  1. Aug 9, 2012 #1
    Before I ask my question, I'll lead up to it through an example. Just for reference, I have only taken up to Calc 3 and haven't taken Vector Calc. Let's look at this definite integral:

    [tex]\int\!\!\int \cos(x^2+y^2)\,dx\,dy[/tex]

    The bounds on the outer integral are from 0 to 1, while the bounds on the inner integral are from 0 to [itex]\sqrt{1-y^2}[/itex]. I don't know how to include that in the LaTeX. If you have taken Calc 3, or at least Calc 2, you will notice that this integral is impossible to evaluate directly in Cartesian coordinates: everything learned in Calc 2 (u-substitution, integration by parts, trig substitution, etc.) fails on it. However, if you switch to polar coordinates, it becomes possible to evaluate. You can see the full problem done out here in example 5. The polar form of integration can be derived from the Jacobian matrix, and it is simple to show this (in example 2).
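
    Worked out, the region 0 ≤ y ≤ 1, 0 ≤ x ≤ [itex]\sqrt{1-y^2}[/itex] is the quarter of the unit disk in the first quadrant, so in polar coordinates the integral becomes

    [tex]\int_0^{\pi/2}\!\!\int_0^1 \cos(r^2)\,r\,dr\,d\theta=\frac{\pi}{2}\cdot\frac{\sin(1)}{2}=\frac{\pi}{4}\sin(1),[/tex]

    where the extra factor of [itex]r[/itex] is the Jacobian determinant of the polar map.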

    Here is my question. There are a ton of integrals that look impossible in certain coordinate systems, but become possible if we switch coordinate systems, as in this example. Does this mean that every integral is potentially possible to do if we switch coordinate systems using Jacobian matrices?
     
  2. Aug 9, 2012 #2
    [tex]\int_0^1\int_0^{\sqrt{1-y^2}}\cos(x^2+y^2)dxdy[/tex]

    DonAntonio



     
  3. Aug 10, 2012 #3
    That depends on what you mean by "possible". Let's take a step back here and look simply at a function [itex] f:A\rightarrow \mathbb{R} [/itex], where [itex]A[/itex] is some open subset of [itex]\mathbb{R}[/itex]. By the fundamental theorem of calculus, if f is continuous, then it has an antiderivative.

    But let's consider the function [itex] f(x)=e^{-x^2}[/itex]. This function is continuous everywhere, so it admits an antiderivative. However, there is a rather famous theorem by Liouville (see http://en.wikipedia.org/wiki/Liouville%27s_theorem_(differential_algebra)) which asserts that no closed form expression (a finite sum of functions like x, e^x, trig functions, powers of x, and their compositions and products, etc.) for said antiderivative exists.

    This is why we have this thing called the "error function". Up to the normalizing factor [itex]\frac{2}{\sqrt{\pi}}[/itex], it is a particular antiderivative of [itex]e^{-x^2}[/itex]. Specifically,

    [tex] erf(x)=\frac{2}{\sqrt{\pi}}\int_{0}^{x}e^{-t^2}dt .[/tex]
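
    (As an aside, and purely as an illustration on my part: a computer algebra system such as SymPy will also hand you the antiderivative only in terms of erf, never in terms of elementary functions.)

    [code]
    import sympy

    x = sympy.symbols('x')
    # SymPy expresses the antiderivative via the error function:
    print(sympy.integrate(sympy.exp(-x**2), x))   # prints sqrt(pi)*erf(x)/2
    [/code]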

    That isn't to say it doesn't have a series representation. In fact, we have

    [tex]erf(x)=\frac{2}{\sqrt{\pi}}\sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{n! (2n+1)},[/tex]

    which is valid for all x. However, since this is an infinite sum, it is not a closed-form expression.
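
    If you want to see the series in action numerically, here is a minimal check in Python (my own illustration; the 30-term cutoff is arbitrary, and the standard library's math.erf is used only for comparison):

    [code]
    import math

    def erf_series(x, terms=30):
        # Partial sum of (2/sqrt(pi)) * sum_n (-1)^n x^(2n+1) / (n! (2n+1))
        total = 0.0
        for n in range(terms):
            total += (-1)**n * x**(2*n + 1) / (math.factorial(n) * (2*n + 1))
        return 2.0 / math.sqrt(math.pi) * total

    for x in (0.5, 1.0, 2.0):
        print(x, erf_series(x), math.erf(x))
    [/code]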

    So back to your question... If you could find a coordinate transformation (expressible in closed form) which would allow you to find a closed-form antiderivative for [itex]e^{-x^2}[/itex], then in theory, after you were finished, you should be able to transform back to the original coordinates. But then you would have a closed form expression for the antiderivative, which contradicts Liouville's theorem.

    In conclusion, it is not always possible to find a coordinate transformation which will allow you to integrate a continuous function using traditional methods (integration by parts, etc.), because these methods yield closed form expressions, and closed form expressions don't always exist.

    [EDIT] It is worth mentioning that the integral

    [tex]\int_{-\infty}^{\infty}e^{-t^2}dt [/tex]

    can be calculated, and is found to be [itex]\sqrt{\pi}[/itex]. This is found by a little bit of trickery and, you guessed it, a coordinate transformation (see Gaussian Integral). However, for arbitrary limits of integration, you're out of luck.
     
  4. Aug 10, 2012 #4

    Bacle2

    Science Advisor

    Just to add a bit to Cristoff's post:

    I guess this is the transform you're referring to:

    [tex]\int_{-\infty}^{\infty}e^{-x^2}dx [/tex]

    Multiply it by

    [tex]\int_{-\infty}^{\infty}e^{-y^2}dy [/tex]

    to end up with a double integral over the whole plane with integrand [tex] e^{-(x^2+y^2)}\,dx\,dy, [/tex] and then switch to polar coordinates.
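
    Written out, the calculation is

    [tex]\left(\int_{-\infty}^{\infty}e^{-x^2}dx\right)^2=\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty}e^{-(x^2+y^2)}\,dx\,dy=\int_0^{2\pi}\!\!\int_0^{\infty}e^{-r^2}\,r\,dr\,d\theta=2\pi\cdot\frac{1}{2}=\pi,[/tex]

    so the original integral is [itex]\sqrt{\pi}[/itex].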

    Also, after rescaling, [itex]e^{-x^2}[/itex] becomes the probability density function of a normally-distributed random variable with mean 0 and σ = 1 (erf itself, after a similar rescaling, gives the cumulative distribution function):

    http://en.wikipedia.org/wiki/Normal_distribution

    And so the rescaled density must integrate to 1 on (-∞, ∞).
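
    Concretely, substituting [itex]x=\sqrt{2}\,t[/itex]:

    [tex]\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}e^{-x^2/2}\,dx=\frac{\sqrt{2}}{\sqrt{2\pi}}\int_{-\infty}^{\infty}e^{-t^2}\,dt=\frac{\sqrt{2}\cdot\sqrt{\pi}}{\sqrt{2\pi}}=1.[/tex]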



     