Automatic differentiation (AD), also called algorithmic differentiation or computational differentiation, is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, accurately to working precision, and using at most a small constant factor more arithmetic operations than the original program.
Automatic differentiation is neither symbolic differentiation nor numerical differentiation (the method of finite differences). Symbolic differentiation can lead to inefficient code and faces the difficulty of converting a computer program into a single mathematical expression, while numerical differentiation can introduce round-off errors in the discretization process and cancellation.
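To make the chain-rule mechanism concrete, here is a minimal forward-mode sketch using dual numbers. This is an illustration only: the Dual class and the dsin helper are hypothetical names introduced here, not part of any library. Each quantity carries both its value and its derivative, and every elementary operation propagates both through the corresponding differentiation rule.

import math

class Dual:
    """Pair (value, derivative) propagated through elementary operations."""
    def __init__(self, value, deriv=0.0):
        self.value = value   # f(x)
        self.deriv = deriv   # f'(x)

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u v)' = u' v + u v'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def dsin(u):
    # Chain rule for sin: (sin u)' = cos(u) * u'
    return Dual(math.sin(u.value), math.cos(u.value) * u.deriv)

# Differentiate f(x) = sin(2*pi*x + x**2) at x = 0.5:
x = Dual(0.5, 1.0)                 # seed with dx/dx = 1
fx = dsin(2*math.pi*x + x*x)
print(fx.value, fx.deriv)          # f(0.5) and f'(0.5), both to working precision

The derivative emerges purely from applying these elementary rules in the order the program executes them; no symbolic expression for f is ever formed and no finite-difference step is involved.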
Python has tools for so-called automatic differentiation. Consider the following example
$$ f(x) = \sin\left(2\pi x + x^2\right), $$
which has the derivative
$$ f'(x) = \cos\left(2\pi x + x^2\right)\left(2\pi + 2x\right). $$
Using autograd we have
import autograd.numpy as np
# To do elementwise differentiation:
from autograd import elementwise_grad as egrad
# To plot:
import matplotlib.pyplot as plt
def f(x):
    return np.sin(2*np.pi*x + x**2)

def f_grad_analytic(x):
    return np.cos(2*np.pi*x + x**2)*(2*np.pi + 2*x)
# Do the comparison:
x = np.linspace(0,1,1000)
f_grad = egrad(f)
computed = f_grad(x)
analytic = f_grad_analytic(x)
plt.title('Derivative computed from Autograd compared with the analytical derivative')
plt.plot(x,computed,label='autograd')
plt.plot(x,analytic,label='analytic')
plt.xlabel('x')
plt.ylabel('y')
plt.legend()
plt.show()
print("The max absolute difference is: %g"%(np.max(np.abs(computed - analytic))))