Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/mattjj/autodidact
A pedagogical implementation of Autograd
- Host: GitHub
- URL: https://github.com/mattjj/autodidact
- Owner: mattjj
- License: MIT
- Created: 2018-01-30T14:01:12.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2020-05-26T14:11:59.000Z (over 4 years ago)
- Last Synced: 2024-10-30T03:43:08.359Z (14 days ago)
- Language: Jupyter Notebook
- Homepage:
- Size: 62.5 KB
- Stars: 953
- Watchers: 17
- Forks: 100
- Open Issues: 5
Metadata Files:
- Readme: README.md
- License: license.txt
Awesome Lists containing this project
README
# Autodidact: a pedagogical implementation of [Autograd](https://github.com/hips/autograd)
This is a tutorial implementation based on [the full version of Autograd](https://github.com/hips/autograd).

Example use:
```python
>>> import autograd.numpy as np # Thinly-wrapped numpy
>>> from autograd import grad # The only autograd function you may ever need
>>>
>>> def tanh(x): # Define a function
... y = np.exp(-2.0 * x)
... return (1.0 - y) / (1.0 + y)
...
>>> grad_tanh = grad(tanh) # Obtain its gradient function
>>> grad_tanh(1.0) # Evaluate the gradient at x = 1.0
0.41997434161402603
>>> (tanh(1.0001) - tanh(0.9999)) / 0.0002 # Compare to finite differences
0.41997434264973155
```

We can continue to differentiate as many times as we like, and use numpy's
vectorization of scalar-valued functions across many different input values:

```python
>>> import matplotlib.pyplot as plt
>>> x = np.linspace(-7, 7, 200)
>>> plt.plot(x, tanh(x),
... x, grad(tanh)(x), # first derivative
... x, grad(grad(tanh))(x), # second derivative
... x, grad(grad(grad(tanh)))(x), # third derivative
... x, grad(grad(grad(grad(tanh))))(x), # fourth derivative
... x, grad(grad(grad(grad(grad(tanh)))))(x), # fifth derivative
... x, grad(grad(grad(grad(grad(grad(tanh))))))(x)) # sixth derivative
>>> plt.show()
```

Autograd was written by [Dougal Maclaurin](https://dougalmaclaurin.com),
[David Duvenaud](https://www.cs.toronto.edu/~duvenaud/)
and [Matt Johnson](http://people.csail.mit.edu/mattjj/).
See [the main page](https://github.com/hips/autograd) for more information.
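
Since the repository's goal is to teach how a trace-based reverse-mode system like Autograd works, a rough standalone sketch of that idea may help orient a reader before diving into the source. This is not autodidact's actual code or API; the `Node`, `add`, `mul`, `exp`, and `grad` names below are invented for illustration. The core recipe: record each primitive operation, together with its vector-Jacobian products, during the forward pass, then sweep the recorded graph in reverse to accumulate gradients.

```python
import math

class Node:
    """One traced value: the forward result, its parent nodes, and per-parent VJP functions."""
    def __init__(self, value, parents=(), vjps=()):
        self.value, self.parents, self.vjps = value, parents, vjps
        self.grad = 0.0  # accumulated during the reverse sweep

# A few scalar primitives; each records how to map an output gradient back to its inputs.
def add(a, b):
    return Node(a.value + b.value, (a, b), (lambda g: g, lambda g: g))

def mul(a, b):
    return Node(a.value * b.value, (a, b), (lambda g: g * b.value, lambda g: g * a.value))

def exp(a):
    out = math.exp(a.value)
    return Node(out, (a,), (lambda g: g * out,))

def grad(f):
    """Return a function computing df/dx: one forward trace, then one reverse sweep."""
    def gradfun(x_value):
        x = Node(x_value)
        out = f(x)
        # Topologically order the trace so every node's gradient is complete
        # before it is pushed back to its parents.
        order, seen = [], set()
        def visit(node):
            if id(node) not in seen:
                seen.add(id(node))
                for parent in node.parents:
                    visit(parent)
                order.append(node)
        visit(out)
        out.grad = 1.0
        for node in reversed(order):
            for parent, vjp in zip(node.parents, node.vjps):
                parent.grad += vjp(node.grad)
        return x.grad
    return gradfun

# d/dx [x * exp(x)] = (1 + x) * exp(x); both lines print roughly 22.17 at x = 2.0
f = lambda x: mul(x, exp(x))
print(grad(f)(2.0))
print(3.0 * math.exp(2.0))
```

Autodidact and Autograd do roughly this bookkeeping behind the thinly-wrapped numpy module imported above, extended from scalars to arrays and a much larger set of primitives.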