https://github.com/pilotleoyan/inside-deep-learning
Inside deep learning, a repository to explain and apply deep learning concepts.
- Host: GitHub
- URL: https://github.com/pilotleoyan/inside-deep-learning
- Owner: PilotLeoYan
- License: MIT
- Created: 2025-01-22T23:34:42.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2025-04-04T02:57:39.000Z (3 months ago)
- Last Synced: 2025-04-04T03:27:40.022Z (3 months ago)
- Topics: ai, deep-learning, jupyter-notebook, learning, machine-learning, mathematics, neuronal-network, perceptron, python3, pytorch
- Language: Jupyter Notebook
- Homepage:
- Size: 5.76 MB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
README
This repository is a collection of Jupyter notebooks aimed at exploring the vast field of machine learning. It is sometimes difficult to find implementations of important concepts or ideas, so here we try to implement and explain those ideas using Markdown and PyTorch.
This repository is not just for beginners or ML specialists, but for anyone who is curious.
> [!NOTE]
> Some formulas in $\LaTeX$ may not render well on GitHub.

> [!TIP]
> All notebooks can be opened in Colab.

## Table of Contents
1. [Linear regression 📈](1-linear-regression)
   1. [Simple linear regression](1-linear-regression/1-1-simple-linear-regression.ipynb)
   2. [Multivariate linear regression](1-linear-regression/1-2-multivariate-linear-regression.ipynb)
   3. [Weight decay (L2 regularization)](1-linear-regression/1-3-weight-decay.ipynb)
   4. [Interpretability and Generalization](1-linear-regression/1-4-interpretability-generalization.ipynb)
   + [Weight decay and Normal equation](1-linear-regression/weight-decay-and-normal-equation.ipynb)
2. [Classification 📊](2-classification)
   1. [Multiclass classification](2-classification/2-1-multiclass-classification.ipynb)
   + [Softmax function and its derivative](2-classification/softmax-function-and-its-derivative.ipynb)
3. [Multilayer Perceptron 🧠](3-multilayer-perceptron)
   1. [Multilayer perceptron (MLP)](3-multilayer-perceptron/3-1-mlp.ipynb)
   + [Gradients and activation functions](3-multilayer-perceptron/gradients-and-activation-functions.ipynb)
   + [MLP for classification](3-multilayer-perceptron/mlp-for-classification.ipynb)
   + [MLP like PyTorch](3-multilayer-perceptron/mlp-like-pytorch.ipynb)
> [!TIP]
> The numbered notebooks build on the previous ones.
> The unnumbered entries develop the supporting mathematics.

## How to Use
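As a taste of the math notebooks (for example, the softmax-derivative one), here is a minimal sketch of the softmax function and its Jacobian. NumPy is used here only to keep the example dependency-light; the notebooks themselves use PyTorch:

```python
import numpy as np

def softmax(z):
    # Subtract the max before exponentiating for numerical stability
    e = np.exp(z - z.max())
    return e / e.sum()

def softmax_jacobian(z):
    # d softmax_i / d z_j = s_i * (delta_ij - s_j)
    s = softmax(z)
    return np.diag(s) - np.outer(s, s)
```

Because softmax outputs sum to one, each row of the Jacobian sums to zero, which is a handy sanity check.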
1. Clone the repository:
   ```
   git clone https://github.com/PilotLeoYan/inside-deep-learning.git
   ```
2. Install the dependencies:
   - With CUDA:
     ```
     pip install -r requirements-cuda.txt
     ```
   - Without CUDA:
     ```
     pip install -r requirements.txt
     ```

## Examples
[Interpretability and Generalization](1-linear-regression/1-4-interpretability-generalization.ipynb)
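To give a flavor of the linear-regression notebooks, here is a minimal sketch of weight decay (ridge regression) solved via the closed-form normal equation, in the spirit of the weight-decay notebooks. The synthetic data and NumPy usage are illustrative assumptions; the notebooks themselves work through this in PyTorch:

```python
import numpy as np

# Synthetic data: 100 samples, 3 features, known weights plus small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

# Ridge (weight-decay) closed form: w = (X^T X + lambda * I)^(-1) X^T y
lam = 0.1
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
```

With a small regularization strength the recovered weights stay close to the true ones; increasing `lam` shrinks them toward zero, which is the trade-off the weight-decay notebook explores.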
## Used hardware
* CPU: AMD A6-9500
* GPU: Nvidia GeForce RTX 2070 SUPER (8 GB VRAM)
* RAM: 16 GB DDR4

## Contributing
Contributions are welcome! If you have suggestions, improvements, or new topics to add, feel free to open an issue. Please follow the [contributing guidelines](CONTRIBUTING.md).
Please keep in mind that this repository is maintained by a single person.

## Main Bibliography
[1] **Goodfellow, I., Bengio, Y., & Courville, A.** (2016). *Deep Learning*. MIT Press. [URL](http://www.deeplearningbook.org)

[2] **Zhang, A., Lipton, Z. C., Li, M., & Smola, A. J.** (2023). *Dive into Deep Learning*. Cambridge University Press. [URL](https://D2L.ai)

[3] **Deisenroth, M. P., Faisal, A. A., & Ong, C. S.** (2020). *Mathematics for Machine Learning*. Cambridge University Press. [URL](https://mml-book.github.io/)

---
If you would like to contact me you can send me an [email](mailto:[email protected]).