{"id":17793011,"url":"https://github.com/evcu/numpy_autograd","last_synced_at":"2025-09-21T10:32:15.771Z","repository":{"id":79244279,"uuid":"137539621","full_name":"evcu/numpy_autograd","owner":"evcu","description":"a simple implementation of autograd engine","archived":false,"fork":false,"pushed_at":"2018-09-22T01:10:15.000Z","size":66,"stargazers_count":24,"open_issues_count":0,"forks_count":4,"subscribers_count":3,"default_branch":"master","last_synced_at":"2025-04-01T23:52:06.202Z","etag":null,"topics":["autograd","ml","numpy","pytorch","variable"],"latest_commit_sha":null,"homepage":null,"language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/evcu.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-06-15T23:24:38.000Z","updated_at":"2024-03-25T14:30:56.000Z","dependencies_parsed_at":null,"dependency_job_id":"f6c57d31-0d20-4dfa-a8f0-ee04a814bd2a","html_url":"https://github.com/evcu/numpy_autograd","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/evcu/numpy_autograd","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/evcu%2Fnumpy_autograd","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/evcu%2Fnumpy_autograd/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/evcu%2Fnumpy_autograd/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/evcu%2Fnumpy_autograd/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/evcu","download_url":"https://codeload.github.com/evcu/numpy_autograd/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/evcu%2Fnumpy_autograd/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":276229080,"owners_count":25606942,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-09-21T02:00:07.055Z","response_time":72,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["autograd","ml","numpy","pytorch","variable"],"created_at":"2024-10-27T11:03:43.392Z","updated_at":"2025-09-21T10:32:15.552Z","avatar_url":"https://github.com/evcu.png","language":"Jupyter Notebook","readme":"# numpy_autograd\nIn this repo I aim to motivate and show how to write an automatic differentiation library. There are various strategies to perform automatic differentiation and they each have different strengths and weaknesses. For a an overview of various methods used please refer to [1]. 
PyTorch uses graph-based automatic differentiation. Every operation performed on tensors can be represented as a node in a DAG (directed acyclic graph); in the case of neural networks, the loss value calculated for a given mini-batch is the last node of the graph. The chain rule is a very simple yet very powerful rule. Thinking in terms of the DAG, the chain rule tells us that we can take the derivative at a node once the gradients flowing into it from its outputs have been fully accumulated. So if we make each node in the graph remember its parents, we can topologically sort the DAG and call the derivative function of each node in reverse topological order, starting from the loss. That's a very simple overview of how autograd in [PyTorch](https://pytorch.org/) works, and it is very simple to implement! Let's do it.
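To make the idea concrete, here is a minimal sketch, not the repo's actual code: the `Variable` class name, the two operators, and every helper below are illustrative assumptions. Each operation records its parent nodes and a closure that routes gradients back to them; `backward()` topologically sorts the graph and fires those closures in reverse order:

```python
import numpy as np

class Variable:
    """A node in the computation graph; remembers its parents and
    how to propagate gradients back to them (illustrative sketch)."""
    def __init__(self, data, parents=()):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents      # nodes this one was computed from
        self.backward_fn = None     # accumulates grads into parents

    def __add__(self, other):       # assumes `other` is a Variable
        out = Variable(self.data + other.data, parents=(self, other))
        def backward_fn():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out.backward_fn = backward_fn
        return out

    def __mul__(self, other):       # assumes `other` is a Variable
        out = Variable(self.data * other.data, parents=(self, other))
        def backward_fn():
            # product rule: route out.grad scaled by the other factor
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out.backward_fn = backward_fn
        return out

    def backward(self):
        # Topologically sort the DAG so each node's derivative runs
        # only after its output gradient is fully accumulated.
        order, visited = [], set()
        def visit(node):
            if node not in visited:
                visited.add(node)
                for p in node.parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = np.ones_like(self.data)  # d(loss)/d(loss) = 1
        for node in reversed(order):
            if node.backward_fn is not None:
                node.backward_fn()

x = Variable(2.0)
y = Variable(3.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0, i.e. dz/dx = y + 1, dz/dy = x
```

Note that gradients are accumulated with `+=` rather than assigned; that is what makes nodes used more than once, like `x` above, receive the correct total gradient.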