{"id":17167886,"url":"https://github.com/ericmjl/dl-workshop","last_synced_at":"2025-09-05T18:34:51.130Z","repository":{"id":45591349,"uuid":"162638246","full_name":"ericmjl/dl-workshop","owner":"ericmjl","description":"Crash course to master gradient-based machine learning. Also secretly a JAX course in disguise!","archived":false,"fork":false,"pushed_at":"2024-02-12T23:57:55.000Z","size":9936,"stargazers_count":224,"open_issues_count":9,"forks_count":54,"subscribers_count":9,"default_branch":"master","last_synced_at":"2025-04-02T06:08:04.576Z","etag":null,"topics":["binder","deep-learning","workshop"],"latest_commit_sha":null,"homepage":"https://ericmjl.github.io/dl-workshop","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/ericmjl.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2018-12-20T22:34:43.000Z","updated_at":"2024-12-04T08:43:20.000Z","dependencies_parsed_at":"2023-01-28T04:00:25.480Z","dependency_job_id":"b78e32dd-d26f-4534-a22c-e364d0995468","html_url":"https://github.com/ericmjl/dl-workshop","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ericmjl%2Fdl-workshop","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ericmjl%2Fdl-workshop/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ericmjl%2Fdl-workshop/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/ericmjl%2Fdl-workshop/manif
ests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/ericmjl","download_url":"https://codeload.github.com/ericmjl/dl-workshop/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248008630,"owners_count":21032556,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["binder","deep-learning","workshop"],"created_at":"2024-10-14T23:10:21.099Z","updated_at":"2025-04-09T09:07:43.002Z","avatar_url":"https://github.com/ericmjl.png","language":"Jupyter Notebook","readme":"# deep-learning-workshop\n\nIn this workshop, I will build your intuition in deep learning, without using a framework.\n\n## Getting Started\n\nYou can get started using one of the following methods.\n\n### 1. Setup using `conda` environments\n\n```bash\nconda env create -f environment.yml\nconda activate dl-workshop  # older versions of conda use `source activate` rather than `conda activate`\npython -m ipykernel install --user --name dl-workshop\njupyter labextension install @jupyter-widgets/jupyterlab-manager\n```\n\nIf you want `jax` with GPU, you will need to build from source, or follow the [installation instructions](https://github.com/google/jax#installation)\n\n### 2. 
\"just click Binder\"\n\n[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/ericmjl/dl-workshop/master)\n\n### Notes\n\nIf you are using Jupyter Lab, you will also want to ensure that `ipywidgets` is installed:\n\n```bash\n# only if you don't have ipywidgets installed.\nconda install -c conda-forge ipywidgets\n# the next line is necessary.\njupyter labextension install @jupyter-widgets/jupyterlab-manager\n```\n\n## Key Ideas\n\nThe key idea for this tutorial is that if we really study deep learning's fundamental model, linear regression, then we can get a better understanding of the components: a model with parameters, a loss function, and an optimizer that changes the parameters to minimize the loss. Most of us who become practitioners (rather than researchers) can then take for granted that the same ideas apply to any more complex/deeper model.\n\n## Feedback\n\nI'd love to hear how well this workshop went for you. Please consider [leaving feedback so I can improve the workshop](https://ericma1.typeform.com/to/Tv185B).\n\n## Further Reading\n\n- [Demystifying Different Variants of Gradient Descent Optimization Algorithm](https://hackernoon.com/demystifying-different-variants-of-gradient-descent-optimization-algorithm-19ae9ba2e9bc)\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fericmjl%2Fdl-workshop","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fericmjl%2Fdl-workshop","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fericmjl%2Fdl-workshop/lists"}