Doing Automatic Differentiation

In summary, this thread is a brief interactive session using the ODE Playground, a repository of Automatic Differentiation code. It demonstrates two solutions of ##\sqrt{2}## for a user-specified function, first with Newton's method and then with bisection. The user only needs to supply the function itself, with no step sizes or other parameters. The thread also includes an example of using dual numbers in the code.
  • #1
m4r35n357
This is a brief interactive session using my ODE Playground, which is my repository of Automatic Differentiation code. It illustrates two solutions of ##\sqrt{2}##, with the user-specified function supplied via a lambda, using Newton's method and then bisection. Note that the user needs to supply no step sizes or other artifacts of numerical differencing, just the function itself!

Python:
$ ipython3
Python 3.7.1 (default, Oct 22 2018, 11:21:55)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.2.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: from ad import *                                                      
ad module loaded

In [2]: from playground import *                                              
playground module loaded

In [3]: Newton(lambda x: (x**2), x0=1.0, target=2.0)                          
Out[3]: ResultType(count=7, sense='', mode='ROOT', x=1.414213562373095, f=2.0000000000000004, dx=-1.5700924586837747e-16)

In [4]: bisect(lambda x: (x**2), xa=1.4, xb=1.5, target=2.0)                  
Out[4]: ResultType(count=38, sense='', mode='ROOT', x=1.4142135623733338, f=2.0000000000006755, dx=7.276401703393276e-13)
Enjoy!
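For readers wondering where the derivative in the Newton iteration comes from when no step size is supplied: the solver can evaluate the user's lambda on a dual number, which carries the exact derivative alongside the value. Below is a minimal sketch of that idea with a toy dual-number class; it is only an illustration, not the playground's actual implementation (the names D and newton here are made up).

Python:
# Toy forward-mode dual number (illustration only, not the playground's class).
# val carries f(x); der carries f'(x), propagated by the usual derivative rules.
class D:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, D) else D(other)
        return D(self.val + other.val, self.der + other.der)
    def __mul__(self, other):
        other = other if isinstance(other, D) else D(other)
        return D(self.val * other.val,
                 self.val * other.der + self.der * other.val)
    def __pow__(self, n):  # constant integer exponents only, for brevity
        return D(self.val ** n, n * self.val ** (n - 1) * self.der)

def newton(f, x0, target=0.0, tol=1.0e-12, max_steps=100):
    """Solve f(x) = target; the derivative comes from the dual number, no h."""
    x = x0
    for _ in range(max_steps):
        y = f(D(x, 1.0))               # seed der = 1.0 to differentiate w.r.t. x
        dx = (y.val - target) / y.der  # Newton step
        x -= dx
        if abs(dx) < tol:
            break
    return x

print(newton(lambda x: x**2, 1.0, target=2.0))   # ~1.4142135623730951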
 
  • #2
OK, here are some dual numbers:

Code:
$ ipython3
Python 3.7.1 (default, Oct 22 2018, 11:21:55)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.2.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: from ad import *                                                      
ad module loaded

In [2]: a = Dual.get(3.0)                                                      

In [3]: b = Dual.get(5.0)                                                      

In [4]: c = Dual.get(7.0, variable=True)                                      

In [5]: print(a)                                                              
3.0 0.0

In [6]: print(b)                                                              
5.0 0.0

In [7]: print(c)                                                              
7.0 1.0

In [8]: print(c**2 - a * c)                                                    
28.0 11.0
 
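To check that last line by hand: with ##a = 3## held constant and ##c = 7## flagged as the variable, the expression is ##f(c) = c^2 - 3c##, so ##f(7) = 49 - 21 = 28## and ##f'(7) = 2 \cdot 7 - 3 = 11##, exactly the value-derivative pair printed above.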

1. What is automatic differentiation?

Automatic differentiation (AD) is a method for computing the derivatives of a function with respect to its input variables. It is also known as algorithmic differentiation or autodiff. Unlike numerical differentiation, which approximates the derivative by evaluating the function at nearby points, AD applies the chain rule to the elementary operations the program actually performs, so the derivative at a given point is exact up to floating-point rounding. This allows for more accurate and efficient computation of derivatives, especially for complex functions with many input variables.
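As a concrete contrast (an illustrative example, not taken from the session above): to differentiate ##f(x) = \sin(x^2)## at ##x = 1##, a numerical scheme evaluates something like ##(f(1 + h) - f(1))/h## for some small step ##h## and accepts the resulting truncation error, whereas AD propagates the chain rule ##f'(x) = 2x\cos(x^2)## through the program itself and returns ##2\cos(1) \approx 1.0806## to machine precision.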

2. How does automatic differentiation work?

Automatic differentiation works by breaking a function down into elementary operations (such as addition, multiplication, and standard functions like sin and exp) and applying the derivative rule of each operation as the function is evaluated. Derivative information is propagated alongside the values: in forward mode the sweep runs from the input variables towards the output, while in reverse mode the values are computed first and the derivatives are then accumulated backwards from the output. The individual derivatives are combined using the chain rule, and the final result is the derivative (or gradient vector) of the function at the point of evaluation.
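Here is a minimal sketch of what such a forward sweep looks like when written out by hand for ##f(x) = \sin(x^2) + 3x## (illustrative only, not the playground's code): every intermediate quantity carries its derivative, and each line applies the derivative rule of one elementary operation.

Python:
import math

# Forward-mode AD written out "by hand" as a trace of elementary operations.
# Each intermediate is a (value, derivative) pair; each line applies the
# derivative rule of one elementary operation (the chain rule in action).
def f_and_df(x):
    v1, d1 = x, 1.0                            # v1 = x         dv1/dx = 1
    v2, d2 = v1 * v1, 2.0 * v1 * d1            # v2 = x**2      product rule
    v3, d3 = math.sin(v2), math.cos(v2) * d2   # v3 = sin(v2)   chain rule
    v4, d4 = 3.0 * v1, 3.0 * d1                # v4 = 3x        linearity
    v5, d5 = v3 + v4, d3 + d4                  # v5 = v3 + v4   sum rule
    return v5, d5

print(f_and_df(1.0))   # value and exact derivative of sin(x**2) + 3x at x = 1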

3. What are the benefits of using automatic differentiation?

There are several benefits of using automatic differentiation, including accuracy, efficiency, and flexibility. AD computes derivatives that are exact up to floating-point rounding, whereas numerical differentiation carries truncation error and forces an awkward choice of step size. It also avoids the numerical instability (catastrophic cancellation at very small steps) that affects finite differences, and reverse-mode AD remains efficient even for functions with very many input variables. In addition, AD can handle functions with complex and nested operations, making it a powerful tool for scientific computing and machine learning.
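A small illustration of the step-size dilemma that AD sidesteps (again just a sketch, not from the thread above): a one-sided finite difference of ##\sin(x^2)## at ##x = 1## first improves and then degrades as the step ##h## shrinks, because truncation error falls while rounding error grows; AD has no ##h## at all.

Python:
import math

def fd(f, x, h):
    # One-sided finite difference: truncation error O(h), rounding error O(eps / h).
    return (f(x + h) - f(x)) / h

f = lambda x: math.sin(x ** 2)
exact = 2.0 * math.cos(1.0)   # analytic derivative of sin(x**2) at x = 1

for h in (1.0e-4, 1.0e-8, 1.0e-12):
    print(h, abs(fd(f, 1.0, h) - exact))   # error shrinks, then grows again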

4. Are there different types of automatic differentiation?

Yes, there are two main types of automatic differentiation: forward mode and reverse mode. Forward mode (also known as forward accumulation or tangent-linear mode) propagates derivatives with respect to one input direction per pass, alongside the normal evaluation of the function. Reverse mode (also known as reverse accumulation or adjoint mode) evaluates the function first and then computes the derivatives of one output with respect to all of the inputs in a single backward pass, which makes it the method of choice when there are many inputs and few outputs, as in training neural networks. Both methods have their own advantages and are sometimes combined, for example to obtain higher-order derivatives efficiently.
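For a feel of reverse mode, here is a deliberately tiny sketch (illustrative only; the names Var and backward are invented for this example, not the playground's API). Each operation records its parents and local derivatives on the way forward; a single reverse sweep from the output then accumulates ##\partial z/\partial x## and ##\partial z/\partial y## for ##z = xy + x##.

Python:
# Deliberately tiny reverse-mode sketch (illustration only).
class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local derivative)
        self.grad = 0.0
    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))
    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))
    def backward(self, seed=1.0):
        # Accumulate d(output)/d(self), then push contributions to parents.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(5.0)
z = x * y + x            # z = x*y + x
z.backward()             # one reverse sweep from the single output
print(x.grad, y.grad)    # dz/dx = y + 1 = 6.0, dz/dy = x = 3.0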

5. How is automatic differentiation used in scientific research?

Automatic differentiation has a wide range of applications in scientific research, particularly in fields such as optimization, machine learning, and physics. It is used to compute gradients for training neural networks, to solve optimization problems in engineering and economics, and to simulate and analyze physical systems. AD is also used in the development of numerical methods and algorithms, as it allows for more accurate and efficient computation of derivatives.
