https://github.com/notshrirang/optim
A repo that contains source code for my blog "Deep Learning Optimizers: A Comprehensive Guide for Beginners (2024)"
- Host: GitHub
- URL: https://github.com/notshrirang/optim
- Owner: NotShrirang
- License: apache-2.0
- Created: 2024-07-23T03:27:54.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-07-26T04:16:20.000Z (about 1 year ago)
- Last Synced: 2025-02-11T12:36:28.529Z (8 months ago)
- Topics: adagrad, adam, adamw, deep-learning, gradient-descent, neural-network, rmsprop, sgd
- Language: Python
- Homepage: https://optimizer-visualizer.streamlit.app/
- Size: 29.3 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
# Optim: A Simple Beginner's Guide to Deep Learning Optimizers
[Read the blog on Medium](https://medium.com/@shrirangmahajan123/optimizers-a-simple-beginners-guide-8ab6942880dd)
[Try the visualizer live](https://optimizer-visualizer.streamlit.app/)

This repository is dedicated to providing a comprehensive yet beginner-friendly guide to various deep learning optimizers. You'll find detailed explanations and implementations of popular optimizers like SGD, Adam, and RMSProp, along with a visualizer to help you understand their behaviors. Whether you're new to deep learning or looking to solidify your understanding, this resource is designed to make the learning process easier and more interactive.
## Table of Contents
- [Overview](#overview)
- [Getting Started](#getting-started)
- [Installation](#installation)
- [Streamlit App](#streamlit-app)
- [License](#license)
- [Contributing](#contributing)
- [Support](#support)

## Overview
### Features:
- Comprehensive Optimizer Implementations: Includes class-based and functional implementations of popular deep learning optimizers such as SGD, Adam, RMSProp, and more.
- Interactive Visualizer: `app.py` provides a visual tool to observe and compare the behavior and performance of different optimizers in real-time.
- Beginner-Friendly Guide: Detailed explanations and step-by-step tutorials make complex concepts accessible for beginners in deep learning.
- Practical Examples: Real-world examples and use cases to demonstrate the practical application of each optimizer in various deep learning scenarios.
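For a flavor of what these implementations cover, here is a minimal, self-contained sketch of two of the update rules mentioned above (SGD with momentum and Adam). The function names and signatures are illustrative only and are not the repository's actual API.

```python
import numpy as np

def sgd_momentum_step(param, grad, velocity, lr=0.01, momentum=0.9):
    """One SGD-with-momentum update: accumulate a velocity term, then move the parameter along it."""
    velocity = momentum * velocity - lr * grad
    return param + velocity, velocity

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: bias-corrected running averages of the gradient and its square."""
    m = beta1 * m + (1 - beta1) * grad           # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: minimize f(x) = x**2 starting from x = 5.
x = np.array([5.0])
m, v = np.zeros_like(x), np.zeros_like(x)
for t in range(1, 201):
    grad = 2 * x                                 # gradient of x**2
    x, m, v = adam_step(x, grad, m, v, t, lr=0.1)
print(x)  # approaches the minimum at 0
```

Each step function returns the updated parameter together with its optimizer state, so the same driving loop can be reused to compare optimizers side by side.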
## Getting Started

### Installation
1. Clone the repository
```sh
git clone https://github.com/NotShrirang/optim.git
cd optim
```

2. Install the required dependencies
```sh
pip install -r requirements.txt
```

### Streamlit App
To run the Streamlit app locally:
```sh
streamlit run app.py
```

## License
MIT © [Shrirang Mahajan](https://github.com/NotShrirang)

## Contributing
Feel free to submit pull requests, create issues, or spread the word!

## Support
Support me by simply starring this repository and liking the blog! ⭐