Automatic differentiation for numerical integration

In summary, the poster has written a Levenberg-Marquardt nonlinear optimization routine that uses reverse-mode automatic differentiation to build the Jacobian. However, for fitting functions that contain integrals which cannot be evaluated analytically and must be computed numerically, the AD implementation gets stuck on the remaining integral term of the differentiation identity.
  • #1
raul_l
I've written a Levenberg-Marquardt nonlinear optimization routine that employs the reverse-mode automatic differentiation algorithm for building the Jacobian. So far it has worked marvelously for me.
However, now I have to use functions that contain integrals that cannot be evaluated analytically and have to be computed numerically, i.e.
[tex] f(x,\mathbf{p}) = g(x,\mathbf{p}) \int^{b(\mathbf{p})}_{a(\mathbf{p})}{h(x,\mathbf{p},t)dt} [/tex]
where p is the parameter vector, f is the fitting function, g is some function of x and p, and h is the integrand, which is also a function of x and p and is integrated over t.
Now, most of this can be easily differentiated if I just define the derivatives of elementary functions and take some time to write the rest of the AD implementation (which I have done in C++). However, when I use the identity
[tex] \nabla_{ \mathbf{p} } \int^{b(\mathbf{p})}_{a(\mathbf{p})}{h(x,\mathbf{p},t)dt} = h(x,\mathbf{p},b(\mathbf{p}))\nabla_{\mathbf{p}}b(\mathbf{p}) - h(x,\mathbf{p},a(\mathbf{p}))\nabla_{\mathbf{p}}a(\mathbf{p}) + \int^{b(\mathbf{p})}_{a(\mathbf{p})}{\nabla_{ \mathbf{p} } h(x,\mathbf{p},t)dt} [/tex]
to attack the integral I get stuck with the last term. This is my question: Is it possible to use the reverse-mode automatic differentiation to calculate integrals of type
[tex] \int^{b(\mathbf{p})}_{a(\mathbf{p})}{\nabla_{ \mathbf{p} } h(x,\mathbf{p},t)dt} [/tex]
where I first have to calculate partial derivatives with respect to the fitting parameters and then integrate? (The bounds could also be constants, it doesn't matter.)
I've done some googling and come across articles that discuss the application of automatic differentiation to numerical integration algorithms for ordinary differential equations (ODEs). But this is something else because here I explicitly have to calculate the value of the integral. So no help so far. I've done some thinking and it appears that reverse-mode can't be used to handle the integrals, but forward-mode might be doable. Any thoughts from experts?
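For concreteness, here is a minimal forward-mode sketch of that last term (an illustration under simplifying assumptions, not production code). If the quadrature nodes and weights do not depend on p, then differentiating the quadrature sum with respect to a parameter is the same as applying the quadrature rule to the partial derivative of h, so pushing dual numbers through the integrand at every node yields the integral and its parameter derivative in one pass. The Dual type, the toy integrand h(x,p,t) = exp(-p t^2) and the trapezoidal rule are assumptions chosen only for illustration, and the bounds are taken constant so that the two boundary terms of the identity above vanish.
[code]
// Minimal sketch: forward-mode dual numbers pushed through a fixed quadrature
// rule. Dual, h and the trapezoidal rule are illustrative assumptions.
#include <cmath>
#include <cstdio>

struct Dual {                 // value + derivative w.r.t. one chosen parameter
    double v;                 // value
    double d;                 // d(value)/dp
};

Dual operator+(Dual a, Dual b) { return {a.v + b.v, a.d + b.d}; }
Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.d * b.v + a.v * b.d}; }
Dual dexp(Dual a)              { return {std::exp(a.v), std::exp(a.v) * a.d}; }

// Toy integrand h(x,p,t) = exp(-p*t*t), differentiated w.r.t. p.
Dual h(double x, Dual p, double t) {
    (void)x;                                   // x unused in this toy example
    return dexp(Dual{-t * t, 0.0} * p);
}

// Trapezoidal rule over fixed bounds [a,b]; every node carries the derivative,
// so the sum gives both the integral and d(integral)/dp at once.
Dual integrate(double x, Dual p, double a, double b, int n) {
    double dt = (b - a) / n;
    Dual sum{0.0, 0.0};
    for (int i = 0; i <= n; ++i) {
        double w = (i == 0 || i == n) ? 0.5 : 1.0;
        sum = sum + Dual{w, 0.0} * h(x, p, a + i * dt);
    }
    return Dual{dt, 0.0} * sum;
}

int main() {
    Dual p{2.0, 1.0};                          // seed dp/dp = 1
    Dual I = integrate(0.0, p, 0.0, 1.0, 1000);
    std::printf("integral = %.6f  d(integral)/dp = %.6f\n", I.v, I.d);
}
[/code]
Since a fixed quadrature rule is just a finite sum of elementary evaluations of h, either forward or reverse mode can in principle be applied to it; the sketch uses forward mode because each parameter direction is simply seeded and propagated independently.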
 
  • #2
I see that my question was too complicated.
So I'll ask something simpler. Does anyone know of any AD tools that are capable of calculating the derivatives of integrals (like above)? I just want to make sure that I'm not reinventing the wheel by writing my own implementation of this.
 

1. What is automatic differentiation for numerical integration?

Automatic differentiation is a technique used in numerical analysis to efficiently compute the derivatives of a function with respect to its input parameters. It is commonly used together with numerical integration, the process of approximating the area under a curve, i.e. the value of a definite integral.

2. How does automatic differentiation differ from other methods of computing derivatives?

Automatic differentiation differs from traditional methods of computing derivatives, such as symbolic differentiation and numerical (finite-difference) differentiation, in that it applies the chain rule to the elementary operations of a program and propagates exact derivative values alongside the ordinary function values, rather than building symbolic expressions or approximating derivatives with difference quotients. This generally results in faster and more accurate calculations.
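As a concrete illustration of that chain-rule propagation (a toy example, not taken from the thread above), consider differentiating y = sin(x^2) at x = 3:
[tex] u = x^2 = 9,\quad \frac{du}{dx} = 2x = 6,\qquad \frac{dy}{du} = \cos u = \cos 9,\qquad \frac{dy}{dx} = \frac{dy}{du}\frac{du}{dx} = 6\cos 9 \approx -5.467 [/tex]
Forward mode carries the pair (value, derivative) through each elementary operation in this order; reverse mode records the operations and accumulates the same local factors in a backward sweep.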

3. What are the benefits of using automatic differentiation for numerical integration?

Using automatic differentiation for numerical integration has several benefits. It allows for more accurate and efficient computation of derivatives, which can improve the overall accuracy of the integration. It also reduces the potential for human error in manual calculations and can handle complex and high-dimensional functions more easily than traditional methods.

4. Are there any limitations to using automatic differentiation for numerical integration?

While automatic differentiation has many advantages, it does have some limitations. It may not be suitable for functions with discontinuities or sharp changes, since it relies on the function being differentiable along the evaluated path. It also requires a certain level of programming knowledge and may not be as intuitive as other methods for those who are not familiar with programming.

5. How is automatic differentiation implemented in practice?

Automatic differentiation is implemented through specialized software libraries or packages that perform the calculations automatically. These tools typically provide reverse-mode differentiation, which is more efficient for functions with many input parameters and few outputs, and often forward mode as well. Some languages and frameworks offer built-in or first-class support, for example Julia (through packages) and the Python frameworks TensorFlow and JAX.
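For illustration only (a toy sketch, not the API of any particular library), a reverse-mode implementation can be reduced to a tape of recorded operations plus a backward sweep that accumulates adjoints. The Tape and Var names below are invented for this example, which recomputes the derivative of sin(x^2) from question 2.
[code]
// Minimal reverse-mode tape sketch (illustrative). Each overloaded operation
// records its parents and local partial derivatives; a backward sweep then
// accumulates adjoints via the chain rule.
#include <cmath>
#include <cstdio>
#include <vector>

struct Tape {
    struct Node { int a, b; double da, db; };  // parent indices, local partials
    std::vector<Node> nodes;

    int variable() {                            // independent variable (no parents)
        nodes.push_back({-1, -1, 0.0, 0.0});
        return (int)nodes.size() - 1;
    }
    int record(int a, double da, int b, double db) {
        nodes.push_back({a, b, da, db});
        return (int)nodes.size() - 1;
    }
    // Backward sweep: adjoint of the output is 1, propagate to the parents.
    std::vector<double> gradient(int output) {
        std::vector<double> adj(nodes.size(), 0.0);
        adj[output] = 1.0;
        for (int i = output; i >= 0; --i) {
            if (nodes[i].a >= 0) adj[nodes[i].a] += nodes[i].da * adj[i];
            if (nodes[i].b >= 0) adj[nodes[i].b] += nodes[i].db * adj[i];
        }
        return adj;
    }
};

struct Var { Tape* t; int idx; double v; };

Var operator*(Var x, Var y) {
    return {x.t, x.t->record(x.idx, y.v, y.idx, x.v), x.v * y.v};
}
Var vsin(Var x) {
    return {x.t, x.t->record(x.idx, std::cos(x.v), -1, 0.0), std::sin(x.v)};
}

int main() {
    Tape tape;
    Var x{&tape, tape.variable(), 3.0};
    Var y = vsin(x * x);                            // y = sin(x^2)
    std::vector<double> adj = tape.gradient(y.idx);
    std::printf("dy/dx = %.6f\n", adj[x.idx]);      // expect 6*cos(9) ~ -5.467
}
[/code]
The backward sweep visits the recorded nodes once, in reverse order, so the cost of the whole gradient is a small constant multiple of the cost of one function evaluation, regardless of the number of inputs.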
