{"id":16107293,"url":"https://github.com/neonwatty/autograd_tutorials","last_synced_at":"2025-03-18T08:32:19.846Z","repository":{"id":106970926,"uuid":"159950783","full_name":"neonwatty/autograd_tutorials","owner":"neonwatty","description":"A set of autograd tutorial notebooks  ","archived":false,"fork":false,"pushed_at":"2019-04-16T00:41:16.000Z","size":4983,"stargazers_count":7,"open_issues_count":0,"forks_count":2,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-03-16T18:21:29.508Z","etag":null,"topics":["autograd","autograd-tutorials","automatic-differentiation","backpropagation","jupyter-notebook","lecture-notes"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/neonwatty.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-12-01T14:07:49.000Z","updated_at":"2024-10-05T05:17:01.000Z","dependencies_parsed_at":null,"dependency_job_id":"d8410693-0b83-49b3-8e1d-5bbfe720a26b","html_url":"https://github.com/neonwatty/autograd_tutorials","commit_stats":null,"previous_names":["neonwatty/autograd_tutorials"],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neonwatty%2Fautograd_tutorials","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neonwatty%2Fautograd_tutorials/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/neonwatty%2Fautograd_tutorials/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub
/repositories/neonwatty%2Fautograd_tutorials/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/neonwatty","download_url":"https://codeload.github.com/neonwatty/autograd_tutorials/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":244184300,"owners_count":20412188,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["autograd","autograd-tutorials","automatic-differentiation","backpropagation","jupyter-notebook","lecture-notes"],"created_at":"2024-10-09T19:15:56.697Z","updated_at":"2025-03-18T08:32:19.280Z","avatar_url":"https://github.com/neonwatty.png","language":"Jupyter Notebook","readme":"# A few `autograd` tutorial notebooks\n\nThis repo contains a set of Jupyter notebooks describing how to use various [autograd](https://github.com/HIPS/autograd) functionalities, complementing the excellent tutorial in the repo itself, including:\n\n- [**basic_autograd_examples.ipynb**](https://nbviewer.jupyter.org/github/jermwatt/autograd_tutorials/blob/b6d264a62d3f3028406c76db4d3f476c6337fdff/basic_examples.ipynb) covering basic functionalities such as: derivative computation using standard and lambda functions, subtleties involved in automatic differentiation and the array of gradient prototypes provided by `autograd`, and computing partial derivatives of multi-input functions\n\n- [**flattening_functions_using_autograd.ipynb**](https://nbviewer.jupyter.org/github/jermwatt/autograd_tutorials/blob/b775b089460e2204a5d37dcaada5e0842ca3f0de/flattening_functions.ipynb) covering usage of 
`autograd`'s [flatten_func](https://github.com/HIPS/autograd/blob/master/autograd/misc/flatten.py) function\n\nThese notebooks were produced as supplementary material for the second edition of the textbook Machine Learning Refined, published by Cambridge University Press and set for release in mid-2019. You can find a host of examples employing `autograd` and - in particular - `flatten_func` on the main repository for the textbook [located here](https://github.com/jermwatt/mlrefined) (see for example the drafts on [multi-class classification](https://jermwatt.github.io/mlrefined/blog_posts/7_Linear_multiclass_classification/7_2_Perceptron.html) and [fully connected networks](https://jermwatt.github.io/mlrefined/blog_posts/13_Multilayer_perceptrons/13_1_Multi_layer_perceptrons.html)).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fneonwatty%2Fautograd_tutorials","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fneonwatty%2Fautograd_tutorials","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fneonwatty%2Fautograd_tutorials/lists"}