https://github.com/mattmoony/convnet_mnist
Simple convolutional neural network (purely numpy) to classify the original MNIST dataset. My first project with a convnet. 🖼
- Host: GitHub
- URL: https://github.com/mattmoony/convnet_mnist
- Owner: MattMoony
- Created: 2019-08-02T22:37:00.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2021-09-16T17:26:32.000Z (about 4 years ago)
- Last Synced: 2025-01-19T10:31:16.715Z (9 months ago)
- Topics: ann, artificial-neural-network, batch-normalization, convnet, convolutional-neural-network, datascience, dropout, machine-learning, mnist, neural-network, pooling, stochastic-gradient-descent
- Language: Jupyter Notebook
- Homepage:
- Size: 905 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# ConvNet - MNIST Dataset
_Simple ConvNet classifying MNIST data_

---
## About
This is a little test project. I want to play around with convolutional layers, pooling layers, regularization and normalization techniques (dropout, batch normalization), optimization algorithms (vanilla SGD, SGD with momentum, etc.) and much more.
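As a rough illustration of the kind of forward step this project plays with (this is a sketch, not the repository's actual code; the function names and shapes here are assumptions), a naive numpy convolution followed by ReLU might look like:

```python
import numpy as np

def conv2d(x, w, stride=1):
    """Naive 'valid' convolution (cross-correlation) of a single-channel
    image x of shape (H, W) with a square filter w of shape (k, k)."""
    H, W = x.shape
    k = w.shape[0]
    out_h = (H - k) // stride + 1
    out_w = (W - k) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = x[i * stride:i * stride + k, j * stride:j * stride + k]
            out[i, j] = np.sum(patch * w)
    return out

def relu(z):
    """Element-wise rectified linear activation."""
    return np.maximum(0.0, z)

# Xavier-style initialization: scale by sqrt(1 / fan_in), fan_in = k * k
k = 3
w = np.random.randn(k, k) * np.sqrt(1.0 / (k * k))

img = np.random.rand(28, 28)   # an MNIST-sized input
a = relu(conv2d(img, w))       # activation map of shape (26, 26)
```

In practice the loops would be vectorized (e.g. via im2col), but the loop form makes the sliding-window arithmetic explicit.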
## To-Do
* [x] Dataset preparation
* [x] Simple weight initialization
* [x] Advanced weight initialization (Xavier initialization, etc.)
* [x] Convolution-Function
* [ ] Pooling Layers (Max-Pooling, Average-Pooling, etc.)
* [ ] Dropout
* [ ] Batch Normalization
* [x] Activation-Function (ReLU)
* [x] Loss-Function (Cross Entropy)
* [x] Gradient-Computation Function
* [x] Stochastic Mini Batch Gradient Descent
* [x] Advanced SGD (Momentum, RMSprop, Adam, etc.)
* [x] J/epoch-Graph
* [x] Graphical representation of convolutional Layers
* [x] Prediction-Function
* [x] Model evaluation (Accuracy)
* _... probably more to come ..._

## Results
Best accuracy so far: **93.14%**
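The accuracy metric here is presumably the usual top-1 classification accuracy; a minimal numpy sketch (illustrative only, not the repository's evaluation code) would be:

```python
import numpy as np

def accuracy(scores, labels):
    """Fraction of samples whose highest-scoring class matches the label.
    scores: (N, C) class scores; labels: (N,) integer class indices."""
    preds = np.argmax(scores, axis=1)
    return np.mean(preds == labels)

# toy check with hand-built scores for 3 samples, 2 classes
scores = np.array([[0.1, 0.9],
                   [0.8, 0.2],
                   [0.3, 0.7]])
labels = np.array([1, 0, 0])
acc = accuracy(scores, labels)   # 2 of 3 predictions correct
```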

_J/Epoch-Graph over 1024 iterations ..._

_Convolutional weights & activations (examples: 8, 5)_

---
... MattMoony (August, 2019)