https://github.com/bvsam/nnfs
Basic neural network built using Python and NumPy. Created to better understand neural networks.
- Host: GitHub
- URL: https://github.com/bvsam/nnfs
- Owner: bvsam
- License: MIT
- Created: 2022-12-29T05:56:40.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2023-01-05T04:59:46.000Z (about 3 years ago)
- Last Synced: 2025-03-04T03:43:03.199Z (11 months ago)
- Topics: artificial-intelligence, deep-learning, machine-learning, neural-network, neural-networks, numpy, python3
- Language: Python
- Size: 23.4 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Neural Networks from Scratch (nnfs)
This project implements a basic neural network in plain Python and NumPy and runs it on provided inputs. It was created to develop a better understanding of how neural networks work.
This project is based on the [neural networks from scratch tutorial](https://www.youtube.com/watch?v=Wo5dMEP_BbI&list=PLQVvvaa0QuDcjD5BAw2DxE6OF2tius3V3) by Sentdex.
## Features
The neural network uses NumPy wherever possible so that matrix calculations are vectorized rather than performed in Python loops.
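
As a toy illustration of why this matters (not code from this repository; the shapes and variable names are assumptions), a dense-layer forward pass for an entire batch reduces to a single matrix product:

```python
import numpy as np

# Hypothetical shapes, for illustration only: 32 samples with 4 features each,
# feeding a dense layer of 8 neurons.
inputs = np.random.randn(32, 4)
weights = 0.01 * np.random.randn(4, 8)
biases = np.zeros((1, 8))

# One vectorized matrix product handles the whole batch at once,
# instead of looping over samples and neurons in Python.
outputs = inputs @ weights + biases  # shape (32, 8)
```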
So far, the following features of neural networks have been implemented from scratch (a minimal sketch of a few of these components follows the list):
- Dense layers
- ReLU and Softmax activation functions
- Categorical Cross-Entropy Loss
- Forward propagation
- Backpropagation and gradient calculation
- Optimizers with learning rate decay:
  - Stochastic Gradient Descent (SGD) with momentum
  - Adaptive Gradient (AdaGrad)
  - Root Mean Square Propagation (RMSProp)
  - Adaptive Moment Estimation (Adam)
- L1 and L2 regularization
- Dropout
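
To give a flavour of how these pieces fit together, here is a minimal sketch of a dense layer, a ReLU activation, and an SGD optimizer with momentum and learning-rate decay. The class and attribute names are illustrative assumptions, not necessarily the ones used in this repository:

```python
import numpy as np

class DenseLayer:
    """Fully connected layer with a forward pass and gradient calculation."""
    def __init__(self, n_inputs, n_neurons):
        self.weights = 0.01 * np.random.randn(n_inputs, n_neurons)
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        self.inputs = inputs  # cached for backpropagation
        self.output = inputs @ self.weights + self.biases

    def backward(self, dvalues):
        # Chain rule: gradients w.r.t. parameters and w.r.t. the layer's inputs.
        self.dweights = self.inputs.T @ dvalues
        self.dbiases = dvalues.sum(axis=0, keepdims=True)
        self.dinputs = dvalues @ self.weights.T

class ReLU:
    def forward(self, inputs):
        self.inputs = inputs
        self.output = np.maximum(0, inputs)

    def backward(self, dvalues):
        self.dinputs = dvalues.copy()
        self.dinputs[self.inputs <= 0] = 0  # gradient is zero where the input was negative

class SGDMomentum:
    """SGD with momentum and 1/t learning-rate decay."""
    def __init__(self, learning_rate=1.0, decay=0.0, momentum=0.9):
        self.learning_rate = learning_rate
        self.decay = decay
        self.momentum = momentum
        self.iterations = 0

    def update(self, layer):
        # Decay the learning rate as training progresses.
        lr = self.learning_rate / (1.0 + self.decay * self.iterations)
        if not hasattr(layer, "weight_momentums"):
            layer.weight_momentums = np.zeros_like(layer.weights)
            layer.bias_momentums = np.zeros_like(layer.biases)
        # Velocity update: keep a fraction of the previous step, add the new gradient step.
        layer.weight_momentums = self.momentum * layer.weight_momentums - lr * layer.dweights
        layer.bias_momentums = self.momentum * layer.bias_momentums - lr * layer.dbiases
        layer.weights += layer.weight_momentums
        layer.biases += layer.bias_momentums
        self.iterations += 1

# Example usage: one forward/backward/update step on random data.
layer = DenseLayer(4, 8)
act = ReLU()
opt = SGDMomentum(learning_rate=0.05, decay=1e-3)

x = np.random.randn(32, 4)
layer.forward(x)
act.forward(layer.output)
act.backward(np.ones_like(act.output))  # stand-in for the upstream loss gradient
layer.backward(act.dinputs)
opt.update(layer)
```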