An open API service indexing awesome lists of open source software.

Projects in Awesome Lists tagged with optimizers

A curated list of projects in awesome lists tagged with optimizers.

https://github.com/labmlai/annotated_deep_learning_paper_implementations

🧑‍🏫 60+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠

attention deep-learning deep-learning-tutorial gan literate-programming lora machine-learning neural-networks optimizers pytorch reinforcement-learning transformer transformers

Last synced: 17 Nov 2025

https://github.com/lucidrains/lion-pytorch

🦁 Lion, a new optimizer discovered by Google Brain via evolutionary search that purportedly outperforms Adam(W), in PyTorch

artificial-intelligence deep-learning evolutionary-search optimizers

Last synced: 25 Sep 2025
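The core Lion update is compact enough to sketch. This minimal NumPy version is illustrative only (the function name and hyperparameter defaults are assumptions, not the repo's API): take the sign of an interpolation between momentum and gradient, then update the momentum with a second interpolation.

```python
import numpy as np

def lion_step(w, g, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update (sketch, not the lion-pytorch API).
    The parameter step is always +/- lr per coordinate."""
    update = np.sign(beta1 * m + (1 - beta1) * g)  # sign of interpolated momentum
    w = w - lr * (update + wd * w)                 # decoupled weight decay
    m = beta2 * m + (1 - beta2) * g                # momentum update
    return w, m
```

Because the update is a pure sign, Lion's memory footprint is one momentum buffer and its step size is uniform across coordinates, which is why it is typically run with a smaller learning rate than Adam.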

https://github.com/elixir-nx/axon

Nx-powered Neural Networks

deep-learning elixir neural-networks nx optimizers

Last synced: 13 May 2025

https://github.com/gugarosa/opytimizer

🐦 Opytimizer is a Python library of meta-heuristic optimization algorithms.

artificial-intelligence bioinspired meta-heuristic meta-heuristic-optimization optimization optimizers python

Last synced: 14 May 2025

https://github.com/cyberzhg/keras-radam

RAdam implemented in Keras & TensorFlow

adam keras optimizers radam rectified-adam tensorflow

Last synced: 21 Oct 2025

https://github.com/overlordgolddragon/keras-adamw

Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers

adamw adamwr keras learning-rate-multipliers nadam optimizers sgd tensorflow warm-restarts

Last synced: 04 Sep 2025
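AdamW's defining trait, decoupled weight decay, fits in a few lines. This NumPy sketch is illustrative and independent of the repo's Keras API (names and defaults are assumptions):

```python
import numpy as np

def adamw_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8, wd=1e-2):
    """One AdamW update (sketch). Weight decay is applied directly to the
    weights rather than folded into the gradient as L2 regularization."""
    m = b1 * m + (1 - b1) * g          # first moment
    v = b2 * v + (1 - b2) * g * g      # second moment
    m_hat = m / (1 - b1 ** t)          # bias correction
    v_hat = v / (1 - b2 ** t)
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + wd * w)
    return w, m, v
```

Keeping the decay term outside the adaptive rescaling is what distinguishes AdamW from Adam-with-L2: the decay strength no longer depends on the gradient history of each coordinate.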

https://github.com/lucidrains/adam-atan2-pytorch

Implementation of the proposed Adam-atan2 optimizer from Google DeepMind, in PyTorch

adam artificial-intelligence deep-learning optimizers stability

Last synced: 07 Apr 2025

https://github.com/palle-k/DL4S

Accelerated tensor operations and dynamic neural networks based on reverse mode automatic differentiation for every device that can run Swift - from watchOS to Linux

autograd automatic-differentiation convolutional-neural-networks deep-learning deep-neural-networks derivatives gradient-descent machine-learning neural-networks optimizers recurrent-networks recurrent-neural-networks swift swift-machine-learning tensor

Last synced: 06 Aug 2025

https://github.com/warner-benjamin/optimi

Fast, Modern, Memory Efficient, and Low Precision PyTorch Optimizers

deep-learning optimizers pytorch

Last synced: 27 Dec 2025

https://github.com/epfml/llm-optimizer-benchmark

Benchmarking Optimizers for LLM Pretraining

benchmarking llm optimizers

Last synced: 14 Feb 2026

https://github.com/shreyansh26/ml-optimizers-jax

Toy implementations of some popular ML optimizers using Python/JAX

adam adam-optimizer gradient-descent jax machine-learning momentum optimization-algorithms optimizers

Last synced: 10 Apr 2025

https://github.com/devzhk/cgds-package

Package for CGD and ACGD optimizers

optimizers pytorch

Last synced: 30 Apr 2025

https://github.com/harshalmittal4/hypergradient_variants

Improved Hypergradient optimizers for ML, providing better generalization and faster convergence.

adam-optimizer hypergradient learning-rate momentum optimizers step-size

Last synced: 04 Jul 2025
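Vanilla hypergradient descent, which this project builds on, adapts the learning rate online using the dot product of consecutive gradients. A minimal sketch with illustrative names and defaults (not this repo's code):

```python
import numpy as np

def hypergradient_sgd_step(w, g, g_prev, lr, beta=1e-3):
    """One SGD step where the learning rate itself takes a gradient step.
    The hypergradient of the loss w.r.t. lr is estimated as -g . g_prev,
    so lr grows when consecutive gradients agree and shrinks when they
    oppose each other."""
    lr = lr + beta * np.dot(g, g_prev)
    w = w - lr * g
    return w, lr
```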

https://github.com/origamidream/lion-tf

Lion (EvoLved Sign Momentum) with the new optimizer API in TensorFlow 2.11+

deep-learning optimizers tensorflow

Last synced: 26 Jul 2025

https://github.com/avnlp/hyperparameter-tuning

Effect of Optimizer Selection and Hyperparameter Tuning on Training Efficiency and LLM Performance

adam-optimizer hyperparameter-tuning optimizers rmsprop-optimizer sgd-momentum sgd-optimizer

Last synced: 24 Feb 2026

https://github.com/tensor-fusion/sophia-jax

JAX implementation of 'Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training'

deep-learning jax large-language-models llm machine-learning optimization optimizers sophia

Last synced: 14 Jul 2025

https://github.com/ndoll1998/lightgrad

Lightweight automatic differentiation library

accelerator autograd educational-project lightweight opencl optimizers tensor

Last synced: 26 Feb 2025

https://github.com/the-swarm-corporation/multimodeloptimizer

MultiModelOptimizer: A Hierarchical Parameter Synchronization Approach for Joint Training of Multiple Transformer Models

agents ai attention gpt2 gpt3 jax ml multi-agent optimizers transformers

Last synced: 26 Dec 2025

https://github.com/sameetasadullah/neural-network-implementation

Neural network implemented with a choice of activation functions (sigmoid, ReLU, leaky ReLU, softmax), optimizers (gradient descent, AdaGrad, RMSProp, Adam), and loss functions (cross-entropy loss, hinge loss, mean squared error (MSE))

activation-functions adagrad adam-optimizer cross-entropy-loss gradient-descent hinge-loss jupyter-notebook leaky-relu loss-functions mean-squared-error neural-network optimizers pycharm pycharm-ide python python3 relu-activation rmsprop sigmoid-activation softmax-activation

Last synced: 30 Jan 2026
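As a taste of the optimizers this project compares, here is a minimal NumPy RMSProp step. The function is illustrative, not the project's code:

```python
import numpy as np

def rmsprop_step(w, g, v, lr=1e-3, rho=0.9, eps=1e-8):
    """One RMSProp update: scale each coordinate's step by a running
    root-mean-square of its recent gradients."""
    v = rho * v + (1 - rho) * g * g      # exponential moving average of g^2
    w = w - lr * g / (np.sqrt(v) + eps)  # per-coordinate adaptive step
    return w, v
```

AdaGrad accumulates the full sum of squared gradients instead of a moving average, and Adam adds a momentum term on top of this rescaling, so the three are small variations on the same per-coordinate scheme.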

https://github.com/basileioskal/neuralnet

A neural network framework built from scratch in Python.

neural-networks optimizers

Last synced: 06 Sep 2025

https://github.com/robjsliwa/adventures_in_dspy

Learn the DSPy framework by coding a text adventure game

dspy-ai metrics ollama optimizers prompt-engineering signatures text-adventure-game

Last synced: 30 Oct 2025

https://github.com/vishal815/deep-learing-notes

Welcome to the Deep Learning Notes repository! This collection of notes is designed to provide a deep understanding of deep learning, along with intuition and real-world implications. Whether you're a beginner or preparing for exams and interviews, these comprehensive, colorful notes are a go-to resource.

ai ann backpropagation cnn computer-science deep-learning education exam-preparation gan interview-preparation learning-resources loss-functions machine-learning neural-networks notes optimizer optimizers rnn transformer

Last synced: 06 Jan 2026