pyADiff
is a (yet) very basic algorithmic differentiation package that implements forward and adjoint/reverse mode differentiation. If you are looking for a fully featured and faster library, have a look at google/jax, autograd or dco/c++ (or many more), but if you are interested in a package where you can quickly "look under the hood", this may be the right place.
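To give a rough idea of what forward mode means, here is a minimal, purely illustrative sketch based on dual numbers that carry a value together with a tangent (derivative) value. This is not pyADiff's actual implementation, just a sketch of the underlying concept.

# Minimal sketch of forward-mode AD via dual numbers.
# Illustrative only -- not pyADiff's actual implementation.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value  # primal value
        self.deriv = deriv  # tangent (derivative) value

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u*v)' = u'*v + u*v'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __radd__ = __add__
    __rmul__ = __mul__

# f(x0, x1) = 2*x0*x1**2, written with * only to keep the sketch short
def f(x0, x1):
    return 2. * x0 * x1 * x1

# seed x1 with tangent 1 to obtain df/dx1 at (0.5, 2.0)
y = f(Dual(0.5), Dual(2.0, 1.0))
print(y.value, y.deriv)  # prints 4.0 4.0, i.e. df/dx1 = 4*x0*x1 = 4.0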
My motivation to start this project arose from curiosity while listening to the lecture "Computational Differentiation" by Prof. Naumann at RWTH Aachen University. So basically I tried to understand the concepts from the lecture by implementing them myself. In the end I was (positively) surprised by the outcome and decided to bundle it into a Python package. Additionally, this gave me the chance to learn about Python packaging, distribution, documentation, ...
Suppose we want to compute the gradient of the function
f(x₀, x₁) = 2 x₀ x₁².
This is a rather trivial task, because by simple calculus, the gradient is:
∇f(x₀, x₁) = (2 x₁², 4 x₀ x₁)
Nevertheless, we use this example to illustrate the use of pyADiff.
import pyADiff as ad
# define the function f
def f(x):
    return 2.*x[0]*x[1]**2.
# call the gradient function of pyADiff
df = ad.gradient(f)
x = [0.5, 2.0]
# Call the function f and the gradient function df
y = f(x)
dy = df(x)
print("f({}) = {}".format(x, y)) # prints f([0.5, 2.0]) = 4.0
print("f'({}) = {}".format(x, dy)) # prints f'([0.5, 2.0]) = [8. 4.]
This corresponds to the evaluation of the analytic gradient:
∇f(0.5, 2) = (2*2², 4*0.5*2) = (8, 4)
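For an independent sanity check, the AD result can also be compared against central finite differences. The small helper below is purely illustrative (fd_gradient is not part of pyADiff) and reuses the f and x defined in the example above.

# Hypothetical sanity check: approximate the gradient of f with
# central finite differences and compare with the AD result [8., 4.]
def fd_gradient(f, x, h=1e-6):
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h  # shifted point x + h*e_i
        xm = list(x); xm[i] -= h  # shifted point x - h*e_i
        grad.append((f(xp) - f(xm)) / (2.*h))
    return grad

print(fd_gradient(f, [0.5, 2.0]))  # approximately [8.0, 4.0]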
For more sophisticated examples, see the Documentation or have a look at the .ipynb notebooks.
> pip install pyADiff
Alternatively, clone the repository and install the pyADiff package using the setup.py script:
> git clone https://github.com/tam724/pyADiff
> python pyADiff/setup.py install
Available on readthedocs.org
- Uwe Naumann, lecture "Computational Differentiation", RWTH Aachen University