https://github.com/mohammedsaqibms/optimization_methods
Description: This repository implements a 3-layer neural network to compare the performance of Gradient Descent, Momentum, and Adam optimization algorithms on a dataset, highlighting their training accuracy and convergence behavior.
- Host: GitHub
- URL: https://github.com/mohammedsaqibms/optimization_methods
- Owner: MohammedSaqibMS
- Created: 2024-09-28T12:54:20.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2025-02-19T15:53:23.000Z (3 months ago)
- Last Synced: 2025-02-19T16:41:02.491Z (3 months ago)
- Topics: adam-optimizer, deep-learning, gradient-descent, machine-learning, momentum-gradient-descent, neural-networks, optimization-algorithms, python3
- Language: Jupyter Notebook
- Homepage:
- Size: 343 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Neural Network Optimization Algorithms 🌐
## Overview 📈
This repository implements a 3-layer neural network and trains it with three optimization algorithms: Gradient Descent, Momentum (gradient descent with momentum), and Adam. The goal is to compare their training accuracy and convergence behavior on the same dataset.
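The three optimizers differ only in their parameter-update rules. Below is a minimal NumPy sketch of the standard formulations taught in the Deep Learning Specialization that this repository credits; the function names, hyperparameter defaults, and the toy quadratic demo are illustrative assumptions, not code taken from this repository.

```python
import numpy as np

def gd_update(w, dw, lr=0.01):
    """Vanilla gradient descent: step straight along the negative gradient."""
    return w - lr * dw

def momentum_update(w, dw, v, lr=0.01, beta=0.9):
    """Momentum: smooth the gradient with an exponentially weighted average."""
    v = beta * v + (1 - beta) * dw          # running average of gradients
    return w - lr * v, v

def adam_update(w, dw, v, s, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum plus per-parameter scaling by an RMS of past gradients,
    with bias correction for early iterations (t starts at 1)."""
    v = beta1 * v + (1 - beta1) * dw        # first moment (mean of gradients)
    s = beta2 * s + (1 - beta2) * dw ** 2   # second moment (mean of squares)
    v_hat = v / (1 - beta1 ** t)            # bias-corrected first moment
    s_hat = s / (1 - beta2 ** t)            # bias-corrected second moment
    return w - lr * v_hat / (np.sqrt(s_hat) + eps), v, s

# Toy comparison (hypothetical): minimize f(w) = w^2, gradient 2w,
# starting all three optimizers from the same point.
w_gd = w_mom = w_adam = np.array([5.0])
v_mom = v_adam = s_adam = np.zeros(1)
for t in range(1, 101):
    w_gd = gd_update(w_gd, 2 * w_gd, lr=0.1)
    w_mom, v_mom = momentum_update(w_mom, 2 * w_mom, v_mom, lr=0.1)
    w_adam, v_adam, s_adam = adam_update(w_adam, 2 * w_adam, v_adam, s_adam, t, lr=0.1)
print("GD:", w_gd, "Momentum:", w_mom, "Adam:", w_adam)
```

All three start from the same point; the differences in how quickly each drives `w` toward 0 mirror the convergence behavior the notebook compares on its dataset.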
## Acknowledgments 🙏
- Special thanks to the [Deep Learning Specialization](https://www.deeplearning.ai/courses/deep-learning-specialization/) for providing the foundational knowledge and skills behind this project.