https://github.com/professornova/ann-scratch
This repository implements a simple Artificial Neural Network (ANN) from scratch using only NumPy.
- Host: GitHub
- URL: https://github.com/professornova/ann-scratch
- Owner: ProfessorNova
- Created: 2023-09-10T14:11:03.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-08-24T06:02:45.000Z (over 1 year ago)
- Last Synced: 2025-02-08T15:38:04.713Z (about 1 year ago)
- Topics: artificial-intelligence, neural-network, numpy, scratch
- Language: Jupyter Notebook
- Homepage:
- Size: 954 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# ANN-Scratch
This repository implements a simple Artificial Neural Network (ANN) from scratch
using only NumPy. It was tested on the MNIST dataset and reached an accuracy of
around 95% after 50 epochs (the hyperparameters were not tuned, so there is room
for improvement).
---
## Getting Started
### Installation
The only dependency you need is NumPy (plus Matplotlib if you want to plot the
data). You can install both using pip:
```bash
pip install numpy matplotlib
```
Then clone the repository:
```bash
git clone https://github.com/ProfessorNova/ANN-Scratch.git
cd ANN-Scratch
```
The code was tested on Python 3.10.11.
### Usage
The functionality is demonstrated with visualisations in [train.ipynb](https://github.com/ProfessorNova/ANN-Scratch/blob/main/train.ipynb). You can also run the code in [train.py](https://github.com/ProfessorNova/ANN-Scratch/blob/main/train.py)
with the following command (in that case only the loss and accuracy are printed to the console):
```bash
python train.py
```
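For orientation, here is a rough sketch of how the pieces described under Components below might fit together in a training script. The function and method names (`load_data`, `train`, `predict`) and their signatures are assumptions made purely for illustration and may not match the repository's actual API; train.py and train.ipynb show the real usage.
```python
# Illustrative sketch only; the actual class and method signatures may differ.
import numpy as np

from lib.data_loader import load_data          # hypothetical function name
from lib.neural_layer import NeuralLayer
from lib.neural_network import NeuralNetwork

# Load MNIST: the loader normalizes the pixels and one-hot encodes the labels.
x_train, y_train = load_data("mnist_train.csv")
x_test, y_test = load_data("mnist_test.csv")

# A small 784-128-10 network: one hidden ReLU layer, softmax output.
network = NeuralNetwork([
    NeuralLayer(784, 128, activation="relu"),
    NeuralLayer(128, 10, activation="softmax"),
])

# Train with stochastic gradient descent, then evaluate on the test set.
network.train(x_train, y_train, epochs=50, learning_rate=0.01)
predictions = network.predict(x_test)
accuracy = np.mean(np.argmax(predictions, axis=1) == np.argmax(y_test, axis=1))
print(f"Test accuracy: {accuracy:.2%}")
```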
---
## Components
The repository is divided into the following components:
- [lib/activation_functions.py](https://github.com/ProfessorNova/ANN-Scratch/blob/main/lib/activation_functions.py): Contains the activation functions and their derivatives. The following activation
functions are implemented:
- Sigmoid
- Linear
- Softmax
- ReLU
- [lib/neural_layer.py](https://github.com/ProfessorNova/ANN-Scratch/blob/main/lib/neural_layer.py): Contains the NeuralLayer class, which represents a single layer in the neural network. It provides
the forward and backward methods as well as a method to update the weights and biases (a generic sketch of these operations follows this list).
- [lib/neural_network.py](https://github.com/ProfessorNova/ANN-Scratch/blob/main/lib/neural_network.py): Contains the NeuralNetwork class which represents the neural network. It implements the
backpropagation algorithm and stochastic gradient descent. It also has methods to save and load the model.
- [lib/data_loader.py](https://github.com/ProfessorNova/ANN-Scratch/blob/main/lib/data_loader.py): Contains a function to load the given `mnist_test.csv` and `mnist_train.csv` files.
Furthermore, it automatically preprocesses the data by normalizing it and converting the labels to one-hot encoding.
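To make the division of labour above concrete, here is a minimal, self-contained sketch of what a fully connected layer with a sigmoid activation typically computes in its forward, backward, and update steps. The class and attribute names are chosen for this example and are not taken from lib/neural_layer.py.
```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_derivative(z):
    """Derivative of the sigmoid with respect to its input."""
    s = sigmoid(z)
    return s * (1.0 - s)

class DenseLayer:
    """Minimal fully connected layer (illustrative, not the repository's class)."""

    def __init__(self, n_inputs, n_outputs):
        # Small random weights and zero biases.
        self.weights = np.random.randn(n_inputs, n_outputs) * 0.01
        self.biases = np.zeros(n_outputs)

    def forward(self, x):
        # Cache the input and pre-activation for the backward pass.
        self.x = x
        self.z = x @ self.weights + self.biases
        return sigmoid(self.z)

    def backward(self, grad_output):
        # Chain rule: gradient w.r.t. the pre-activation, then the parameters
        # and finally the input (which is passed to the previous layer).
        grad_z = grad_output * sigmoid_derivative(self.z)
        self.grad_weights = self.x.T @ grad_z
        self.grad_biases = grad_z.sum(axis=0)
        return grad_z @ self.weights.T

    def update(self, learning_rate):
        # Plain (stochastic) gradient descent step.
        self.weights -= learning_rate * self.grad_weights
        self.biases -= learning_rate * self.grad_biases
```
Stacking several such layers, running the data forward through all of them, propagating the loss gradient backward, and then calling update on each layer is essentially the backpropagation-plus-stochastic-gradient-descent loop that the NeuralNetwork class implements.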