
https://github.com/mohammedsaqibms/optimization_methods

Description: This repository implements a 3-layer neural network to compare the performance of Gradient Descent, Momentum, and Adam optimization algorithms on a dataset, highlighting their training accuracy and convergence behavior.

Topics: adam-optimizer, deep-learning, gradient-descent, machine-learning, momentum-gradient-descent, neural-networks, optimization-algorithms, python3



# Neural Network Optimization Algorithms 🌐

## Overview 📈
This repository implements a 3-layer neural network and uses it to compare three optimization algorithms: Gradient Descent, Momentum, and Adam. The goal is to evaluate their effectiveness in terms of training accuracy and convergence behavior on the same dataset.
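
The three optimizers differ only in how a computed gradient is turned into a parameter step. The sketch below is a minimal NumPy illustration of those update rules, not the repository's own code; the function names, hyperparameter defaults, and the toy loss in the usage example are illustrative assumptions.

```python
import numpy as np

def gd_update(w, grad, lr=0.01):
    """Plain gradient descent: step directly against the gradient."""
    return w - lr * grad

def momentum_update(w, grad, v, lr=0.01, beta=0.9):
    """Momentum: accumulate an exponentially weighted average of past gradients."""
    v = beta * v + (1 - beta) * grad
    return w - lr * v, v

def adam_update(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: momentum on the gradient plus a per-parameter adaptive step size."""
    m = beta1 * m + (1 - beta1) * grad        # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Example: one Adam step on a toy quadratic loss L(w) = 0.5 * ||w||^2, whose gradient is w.
w = np.array([1.0, -2.0])
m, v = np.zeros_like(w), np.zeros_like(w)
w, m, v = adam_update(w, grad=w, m=m, v=v, t=1)
print(w)
```

In a full training loop, an update of this form is applied to each weight and bias of the network after every gradient computation, which is what makes the accuracy and convergence comparison across the three methods possible.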

## Acknowledgments 🙏
- Special thanks to the [Deep Learning Specialization](https://www.deeplearning.ai/courses/deep-learning-specialization/) for providing the foundational knowledge and skills necessary for implementing this project.