
Explanation of several automatic differentiation libraries in Python

Preamble

A brief introduction to a few of Python's automatic differentiation tools: tangent, autograd, and sympy.

Many machine learning and deep learning frameworks include automatic differentiation. Differentiation methods mainly fall into four categories: manual differentiation, numerical differentiation, symbolic differentiation, and automatic differentiation. Below is a simple, hello-world-style walkthrough of several differentiation frameworks.
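
Numerical differentiation is the only one of the four not covered by the libraries below, so as a point of comparison, here is a minimal sketch (not from any of these libraries) that approximates a derivative with a central finite difference; the step size h is an arbitrary illustrative choice:

 def numerical_grad(f, x, h=1e-6):
     # Central finite difference: (f(x+h) - f(x-h)) / (2h)
     return (f(x + h) - f(x - h)) / (2 * h)

 print(numerical_grad(lambda x: x**2, 3.0))  # approximately 6.0, since d(x^2)/dx = 2x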

sympy is a powerful scientific computing library that uses symbolic differentiation: it produces a symbolic expression for the derivative. The resulting derivative is not necessarily in its simplest form, and when the function is complex the resulting expression tree can become extremely large.
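
As a rough illustration of that expression swell (an assumed example, not taken from the article), differentiating a repeatedly nested expression with sympy yields a derivative with noticeably more operations than the original:

 from sympy import symbols, diff, count_ops

 x = symbols('x')
 expr = x
 for _ in range(5):
     expr = expr**2 + expr  # nest the expression a few levels deep
 # The derivative's operation count is typically several times that of expr itself
 print(count_ops(expr), count_ops(diff(expr, x)))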

autograd performs automatic differentiation: it applies symbolic differentiation rules to the basic operators, substitutes in numeric values and saves the intermediate results, and then applies this to the whole function. Automatic differentiation is essentially a graph computation and lends itself to many optimizations, which is why it is so widely used in machine learning and deep learning frameworks.
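
To make the idea concrete, here is a minimal hand-rolled sketch of reverse-mode automatic differentiation (not part of any of these libraries): the forward pass records each basic operation and its inputs, and the backward pass walks that record to accumulate derivatives.

 class Var:
     """A value that records how it was computed, for reverse-mode AD."""
     def __init__(self, value, parents=()):
         self.value = value
         self.parents = parents   # list of (parent Var, local derivative)
         self.grad = 0.0

     def __mul__(self, other):
         return Var(self.value * other.value,
                    [(self, other.value), (other, self.value)])

     def __add__(self, other):
         return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

     def backward(self, seed=1.0):
         # Accumulate the derivative of the output with respect to this node,
         # then push the scaled seed back to each parent (chain rule).
         self.grad += seed
         for parent, local in self.parents:
             parent.backward(seed * local)

 # f(x, y) = x*x + 3*x*y + y*y at x=2, y=1
 x, y = Var(2.0), Var(1.0)
 three = Var(3.0)
 z = x * x + three * x * y + y * y
 z.backward()
 print(x.grad, y.grad)  # 7.0 and 8.0, matching df/dx = 2x+3y and df/dy = 3x+2y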

tangent is a source-to-source automatic differentiation framework: it computes the derivative of a function f by generating a new function f_grad, which sets it apart from all other existing automatic differentiation frameworks. Because the derivative is computed by a freshly generated function, the result is very readable and easy to debug, and this is, as its documentation emphasizes, the major difference from current automatic differentiation frameworks (the Tangent Derivation section below shows this in action).

sympy Derivation

 from sympy import symbols, diff

 def grad():
     # Declare the symbolic variables used in the expression
     x, y = symbols('x y')
     # Define the expression
     z = x**2 + y**2
     # Compute the partial derivative of z with respect to y
     return diff(z, y)

 func = grad()

Printing the result shows the derivative of the expression z, namely z' = 2*y:

print(func) 

Substituting y = 6 gives 12:

print(func.evalf(subs={'y': 6}))
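
For repeated numeric evaluation, the symbolic derivative can also be compiled into an ordinary numeric function with sympy.lambdify (a small sketch, not from the original article):

 from sympy import symbols, diff, lambdify

 x, y = symbols('x y')
 dz_dy = diff(x**2 + y**2, y)  # symbolic derivative: 2*y
 f = lambdify(y, dz_dy)        # turn it into a plain Python function
 print(f(6))                   # 12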

Autograd for partial derivatives

 import autograd.numpy as np
 from autograd import grad

 # Expression: f(x, y) = x^2 + 3xy + y^2
 # df/dx = 2x + 3y
 # df/dy = 3x + 2y
 # At x = 2, y = 1: df/dx = 7, df/dy = 8
 def fun(x, y):
     z = x**2 + 3*x*y + y**2
     return z

 # grad differentiates with respect to the first argument (x) by default
 fun_grad = grad(fun)
 fun_grad(2., 1.)

Output: 7.0
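
By default grad differentiates with respect to the first argument; a different argument can be selected by passing its index (autograd's argnum). A small sketch:

 from autograd import grad

 def fun(x, y):
     return x**2 + 3*x*y + y**2

 dfun_dy = grad(fun, 1)  # 1 selects the second argument, y
 print(dfun_dy(2., 1.))  # 8.0, matching df/dy = 3x + 2y at x=2, y=1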

Tangent Derivation

 import tangent

 def fun(x, y):
     z = x**2 + y**2
     return z

By default, tangent.grad finds the partial derivative of z with respect to the first argument, x:

df = tangent.grad(fun)

The output is 8; z' = 2 * x, so the result is the same for any value of y:

df(4, y=1)

The argument to differentiate with respect to can be selected with the wrt parameter; the following computes the partial derivative of z with respect to y:

df = tangent.grad(fun, wrt=(1,))

The output is 10; z' = 2 * y, so the result is the same for any value of x:

df(x=0, y=5)

None of the above shows the core of tangent: source-to-source transformation.

When generating the derivative function, pass verbose=1 to see the function tangent generates to compute the derivative. The default is 0, which is why, so far, tangent has not felt any different from other automatic differentiation frameworks:

 def df(x):
     z = x**2
     return z

 df = tangent.grad(df, verbose=1)
 df(x=2)

After executing the above code, we see the function that tangent generated for us to use to find the derivative:

 def ddfdx(x, bz=1.0):
     z = x ** 2
     assert tangent.shapes_match(z, bz), 'Shape mismatch between return value (%s) and seed derivative (%s)' % (numpy.shape(z), numpy.shape(bz))
     # Grad of: z = x ** 2
     _bx = 2 * x * bz
     bx = _bx
     return bx

ddfdx is the generated function; from it we can directly read off the derivative of the expression, z' = 2 * x, and tangent computes the derivative by executing this generated function.
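
The bz parameter is the seed derivative that the result is multiplied by, which is what allows generated derivative functions to be chained together by the chain rule. A hand-copied, simplified sketch (assert omitted) shows that it really is plain Python:

 # Hand-copied, simplified version of the generated ddfdx (assert omitted).
 def ddfdx(x, bz=1.0):
     _bx = 2 * x * bz  # grad of z = x ** 2, scaled by the seed derivative bz
     return _bx

 print(ddfdx(2.0))          # 4.0, i.e. z' = 2 * x at x = 2
 print(ddfdx(2.0, bz=0.5))  # 2.0, the seed derivative scales the result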

Automatic differentiation is only one of sympy's many powerful capabilities. autograd, as its name suggests, was built specifically for automatic differentiation. tangent is a relative newcomer, released by Google at the end of 2017; since v0.1.8 came out in 2017 there has been no new release, and its source code is not very actively updated. sympy and autograd are more mature, while tangent remains to be seen.

This is the whole content of this article.