https://github.com/ericmjl/dl-workshop
Crash course to master gradient-based machine learning. Also secretly a JAX course in disguise!
- Host: GitHub
- URL: https://github.com/ericmjl/dl-workshop
- Owner: ericmjl
- License: MIT
- Created: 2018-12-20T22:34:43.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2024-02-12T23:57:55.000Z (over 1 year ago)
- Last Synced: 2025-04-02T06:08:04.576Z (6 months ago)
- Topics: binder, deep-learning, workshop
- Language: Jupyter Notebook
- Homepage: https://ericmjl.github.io/dl-workshop
- Size: 9.48 MB
- Stars: 224
- Watchers: 9
- Forks: 54
- Open Issues: 9
Metadata Files:
- Readme: README.md
- License: LICENSE
# deep-learning-workshop
In this workshop, I will build your intuition for deep learning without using a framework.
## Getting Started
You can get started using one of the following methods.
### 1. Setup using `conda` environments
```bash
conda env create -f environment.yml
conda activate dl-workshop # older versions of conda use `source activate` rather than `conda activate`
python -m ipykernel install --user --name dl-workshop
jupyter labextension install @jupyter-widgets/jupyterlab-manager
```

If you want `jax` with GPU support, you will need to build from source or follow the [installation instructions](https://github.com/google/jax#installation).
### 2. "just click Binder"
[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/ericmjl/dl-workshop/master)
### Notes
If you are using Jupyter Lab, you will want to also ensure that `ipywidgets` is installed:
```bash
# only if you don't have ipywidgets installed.
conda install -c conda-forge ipywidgets
# the next line is necessary.
jupyter labextension install @jupyter-widgets/jupyterlab-manager
```

## Key Ideas
The key idea of this tutorial is that if we really study deep learning's fundamental model, linear regression, we gain a better understanding of its components: a model with parameters, a loss function, and an optimizer that changes the parameters to minimize the loss. Most of us who become practitioners (rather than researchers) can then take for granted that the same ideas carry over to any more complex or deeper model.
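Those three components can be made concrete in a few lines of JAX. This is a minimal sketch, not code from the workshop itself: the synthetic data, learning rate, and iteration count are illustrative choices.

```python
import jax.numpy as jnp
from jax import grad

def model(params, x):
    """Linear regression: the fundamental model."""
    w, b = params
    return w * x + b

def mse_loss(params, x, y):
    """Loss function: mean squared error between predictions and targets."""
    return jnp.mean((model(params, x) - y) ** 2)

# Synthetic data generated from y = 3x + 1 (illustrative).
x = jnp.linspace(-1.0, 1.0, 50)
y = 3.0 * x + 1.0

# Optimizer: plain gradient descent on the parameters.
params = (0.0, 0.0)
dloss = grad(mse_loss)  # JAX differentiates the loss w.r.t. params
learning_rate = 0.1
for _ in range(500):
    grads = dloss(params, x, y)
    params = tuple(p - learning_rate * g for p, g in zip(params, grads))

# params should now be close to the true (w, b) = (3.0, 1.0)
```

Swapping `model` for a deeper network leaves the loss and the update loop unchanged, which is exactly the point above.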
## Feedback
I'd love to hear how well this workshop went for you. Please consider [leaving feedback so I can improve the workshop](https://ericma1.typeform.com/to/Tv185B).
## Further Reading
- [Demystifying Different Variants of Gradient Descent Optimization Algorithm](https://hackernoon.com/demystifying-different-variants-of-gradient-descent-optimization-algorithm-19ae9ba2e9bc)