## Integration and the Jacobian

Before I ask my question, I'll lead up to it through an example. Just for reference, I have only taken up to Calc 3 and haven't taken Vector Calc. Let's look at this definite integral:

$$\iint\cos(x^2+y^2)\,dx\,dy$$

The bounds on the outer integral are from 0 to 1, while the bounds on the inner integral are from 0 to $\sqrt{1-y^2}$; I don't know how to include that in the LaTeX. If you have taken Calc 3, or at least Calc 2, you will notice that this integral is impossible to evaluate in Cartesian coordinates: everything learned in Calc 2 (u-substitution, integration by parts, trig substitution, etc.) fails here. However, if you switch to polar coordinates, the integral becomes tractable. You can see the full problem worked out here in example 5. The polar form of integration can be derived from the Jacobian matrix, and it is simple to show this (in example 2).
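(Not part of the original post, just a sanity check one can run: in polar coordinates the region is a quarter disk, and the integral evaluates to $\frac{\pi}{4}\sin(1)$. A minimal midpoint-rule sketch in plain Python confirms the Cartesian and polar answers agree.)

```python
import math

# Polar result: ∫_0^{π/2} ∫_0^1 cos(r^2) r dr dθ = (π/2) * (sin(1)/2) = (π/4) sin(1)
exact = math.pi * math.sin(1.0) / 4.0

# Midpoint rule in Cartesian coordinates over 0 <= y <= 1, 0 <= x <= sqrt(1 - y^2)
n = 400
h = 1.0 / n
total = 0.0
for i in range(n):
    y = (i + 0.5) * h
    xmax = math.sqrt(1.0 - y * y)
    m = max(1, int(n * xmax))   # x-steps proportional to the strip width
    hx = xmax / m
    for j in range(m):
        x = (j + 0.5) * hx
        total += math.cos(x * x + y * y) * hx * h

print(total, exact)  # the two numbers agree to a few decimal places
```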

Here is my question. There are a ton of integrals that look impossible in certain coordinate systems, but become possible if we switch coordinate systems, like in this example. Does this mean that all integrals are potentially possible if we switch coordinate systems using Jacobian matrices?

 Quote by hover Before I ask my question, I'll lead up to it through an example. Just for reference, I have only taken up to Calc 3 and haven't taken Vector Calc. Let's look at this definite integral: $$\iint\cos(x^2+y^2)\,dx\,dy$$ The bounds on the outer integral are from 0 to 1, while the bounds on the inner integral are from 0 to $\sqrt{1-y^2}$. I don't know how to include that in the LaTeX.
$$\int_0^1\int_0^{\sqrt{1-y^2}}\cos(x^2+y^2)dxdy$$

DonAntonio

 If you have taken Calc 3, or at least Calc 2, you will notice that this integral is impossible to evaluate in Cartesian coordinates: everything learned in Calc 2 (u-substitution, integration by parts, trig substitution, etc.) fails here. However, if you switch to polar coordinates, the integral becomes tractable. You can see the full problem worked out here in example 5. The polar form of integration can be derived from the Jacobian matrix, and it is simple to show this (in example 2). Here is my question. There are a ton of integrals that look impossible in certain coordinate systems, but become possible if we switch coordinate systems, like in this example. Does this mean that all integrals are potentially possible if we switch coordinate systems using Jacobian matrices?
 That depends on what you mean by "possible". Let's take a step back and look simply at a function $f:A\rightarrow \mathbb{R}$ for $A$ some open subset of $\mathbb{R}$. By the fundamental theorem of calculus, if $f$ is continuous, then it has an antiderivative. But consider the function $f(x)=e^{-x^2}$. This function is continuous everywhere, so it admits an antiderivative. However, there is a rather famous theorem by Liouville (see http://en.wikipedia.org/wiki/Liouvil...ential_algebra) which asserts that no closed-form expression (a finite sum of functions like $x$, $e^x$, trig functions, powers of $x$, and their compositions and products, etc.) for said antiderivative exists.

This is why we have this thing called the "error function". It is defined as a particular (rescaled) antiderivative of $e^{-x^2}$. Specifically, $$\operatorname{erf}(x)=\frac{2}{\sqrt{\pi}}\int_{0}^{x}e^{-t^2}dt.$$ That isn't to say it doesn't have a series representation. We have the following representation, $$\operatorname{erf}(x)=\frac{2}{\sqrt{\pi}}\sum_{n=0}^{\infty} \frac{(-1)^n x^{2n+1}}{n! (2n+1)},$$ which is valid for all $x$. However, since this is an infinite sum, it is not a closed-form expression.

So, back to your question. If you could find a coordinate transformation (in closed form) which allowed you to find an antiderivative (in closed form) for $e^{-x^2}$, then in theory, after you were finished, you could transform back to the original coordinates. But then you would have a closed-form expression for the antiderivative, which contradicts Liouville's theorem. In conclusion, it is not always possible to find a coordinate transformation which will allow you to integrate a continuous function using traditional methods (integration by parts, etc.), because these methods yield closed-form expressions, and closed-form expressions don't always exist.

[EDIT] It is worth mentioning that the integral $$\int_{-\infty}^{\infty}e^{-t^2}dt$$ can be calculated, and is found to be $\sqrt{\pi}$. This is found by a little bit of trickery and, you guessed it, a coordinate transformation (see Gaussian Integral). However, for arbitrary limits of integration, you're out of luck.
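(A quick check, not from the original post: the series above converges fast enough that summing a few dozen terms in plain Python reproduces the built-in `math.erf`.)

```python
import math

# erf(x) = (2/sqrt(pi)) * sum_{n>=0} (-1)^n x^(2n+1) / (n! (2n+1))
def erf_series(x, terms=40):
    s = 0.0
    for n in range(terms):
        s += (-1) ** n * x ** (2 * n + 1) / (math.factorial(n) * (2 * n + 1))
    return 2.0 / math.sqrt(math.pi) * s

for x in (0.5, 1.0, 2.0):
    print(x, erf_series(x), math.erf(x))  # series and built-in agree
```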


 Quote by christoff [...] In conclusion, it is not always possible to find a coordinate transformation which will allow you to integrate a continuous function using traditional methods (integration by parts, etc), because these methods yield closed form expressions, and closed form expressions don't always exist.
Just to add a bit to christoff's post:

I guess this is the transform you're referring to:

$$\int_{-\infty}^{\infty}e^{-x^2}dx$$

Multiply it by:

$$\int_{-\infty}^{\infty}e^{-y^2}dy$$

to end up with the double integral

$$\iint e^{-(x^2+y^2)}\,dx\,dy,$$

and then switch to polar coordinates.
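(An illustration, not part of the original reply: one can verify numerically that squaring the one-dimensional integral gives $\pi$, as the polar-coordinate trick predicts.)

```python
import math

# (∫ e^{-x^2} dx)^2 = ∫∫ e^{-(x^2+y^2)} dx dy = π, so ∫ e^{-x^2} dx = sqrt(π).
n = 200000
a = 10.0                     # e^{-x^2} is negligible beyond |x| = 10
h = 2 * a / n
one_d = sum(math.exp(-((-a + (i + 0.5) * h) ** 2)) for i in range(n)) * h

print(one_d ** 2, math.pi)   # the square of the 1-D integral is π
```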

Also, after rescaling, the Gaussian $e^{-x^2}$ becomes the probability density function of a normally distributed random variable with mean 0 and σ=1:

http://en.wikipedia.org/wiki/Normal_distribution

and so the rescaled version must integrate to 1 over (-∞, ∞).
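(A small check, not from the original reply: the standard normal CDF can be written in terms of erf as $\Phi(x)=\tfrac{1}{2}\left(1+\operatorname{erf}(x/\sqrt{2})\right)$, and the total probability over the whole line is indeed 1.)

```python
import math

# Standard normal CDF via the error function
def phi_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Φ(∞) - Φ(-∞) = 1; at |x| = 10 the tails are already negligible
total_prob = phi_cdf(10.0) - phi_cdf(-10.0)
print(total_prob)
```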

