# Autograd  [![Checks status][checks-badge]][checks-url] [![Tests status][tests-badge]][tests-url] [![Publish status][publish-badge]][publish-url] [![asv][asv-badge]](#)

[publish-badge]: https://github.com/HIPS/autograd/actions/workflows/publish.yml/badge.svg
[checks-badge]: https://github.com/HIPS/autograd/actions/workflows/check.yml/badge.svg
[tests-badge]: https://github.com/HIPS/autograd/actions/workflows/test.yml/badge.svg
[asv-badge]: http://img.shields.io/badge/benchmarked%20by-asv-green.svg?style=flat
[publish-url]: https://github.com/HIPS/autograd/actions/workflows/publish.yml
[checks-url]: https://github.com/HIPS/autograd/actions/workflows/check.yml
[tests-url]: https://github.com/HIPS/autograd/actions/workflows/test.yml

Autograd can automatically differentiate native Python and NumPy code. It can
handle a large subset of Python's features, including loops, ifs, recursion, and
closures, and it can even take derivatives of derivatives of derivatives. It
supports reverse-mode differentiation (a.k.a. backpropagation), which means it
can efficiently take gradients of scalar-valued functions with respect to
array-valued arguments, as well as forward-mode differentiation, and the two can
be composed arbitrarily. The main intended application of Autograd is
gradient-based optimization. For more information, check out the
[tutorial](docs/tutorial.md) and the [examples directory](examples/).

Example use:

```python
>>> import autograd.numpy as np  # Thinly-wrapped NumPy
>>> from autograd import grad    # The only autograd function you may ever need
>>>
>>> def tanh(x):                 # Define a function
...     return (1.0 - np.exp(-2 * x)) / (1.0 + np.exp(-2 * x))
...
>>> grad_tanh = grad(tanh)       # Obtain its gradient function
>>> grad_tanh(1.0)               # Evaluate the gradient at x = 1.0
np.float64(0.419974341614026)
>>> (tanh(1.0001) - tanh(0.9999)) / 0.0002  # Compare to finite differences
np.float64(0.41997434264973155)
```

We can continue to differentiate as many times as we like, and use NumPy's
vectorization of scalar-valued functions across many different input values:

```python
>>> from autograd import elementwise_grad as egrad  # for functions that vectorize over inputs
>>> import matplotlib.pyplot as plt
>>> x = np.linspace(-7, 7, 700)
>>> plt.plot(x, tanh(x),
...          x, egrad(tanh)(x),                                     # first  derivative
...          x, egrad(egrad(tanh))(x),                              # second derivative
...          x, egrad(egrad(egrad(tanh)))(x),                       # third  derivative
...          x, egrad(egrad(egrad(egrad(tanh))))(x))                # fourth derivative
>>> plt.show()
```

<img src="examples/tanh.png" width="600">

See the [tanh example file](examples/tanh.py) for the code.

## Documentation

You can find a tutorial [here](docs/tutorial.md).

## End-to-end examples

* [Simple neural net](examples/neural_net.py)
* [Convolutional neural net](examples/convnet.py)
* [Recurrent neural net](examples/rnn.py)
* [LSTM](examples/lstm.py)
* [Neural Turing Machine](https://github.com/DoctorTeeth/diffmem/blob/512aadeefd6dbafc1bdd253a64b6be192a435dc3/ntm/ntm.py)
* [Backpropagating through a fluid simulation](examples/fluidsim/fluidsim.py)

<img src="examples/fluidsim/animated.gif" width="400">

* [Variational inference in a Bayesian neural network](examples/bayesian_neural_net.py)
* [Gaussian process regression](examples/gaussian_process.py)
* [Sampyl, a pure Python MCMC package with HMC and NUTS](https://github.com/mcleonard/sampyl)

## How to install

Install Autograd using pip:

```shell
pip install autograd
```

Some features require SciPy, which you can install separately or as an
optional dependency along with Autograd:

```shell
pip install "autograd[scipy]"
```

## Authors and maintainers

Autograd was written by [Dougal Maclaurin](https://dougalmaclaurin.com),
[David Duvenaud](https://www.cs.toronto.edu/~duvenaud/),
[Matt Johnson](http://people.csail.mit.edu/mattjj/),
[Jamie Townsend](https://github.com/j-towns),
and many other contributors. The package is currently maintained by
[Agriya Khetarpal](https://github.com/agriyakhetarpal),
[Fabian Joswig](https://github.com/fjosw), and
[Jamie Townsend](https://github.com/j-towns).
Please feel free to submit any bugs or
feature requests. We'd also love to hear about your experiences with Autograd
in general. Drop us an email!

We want to thank Jasper Snoek and the rest of the HIPS group (led by Prof. Ryan
P. Adams) for helpful contributions and advice; Barak Pearlmutter for
foundational work on automatic differentiation and for guidance on our
implementation; and Analog Devices Inc. (Lyric Labs) and Samsung Advanced Institute
of Technology for their generous support.