Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ayooshkathuria/mnist-handwritten-digit-recognition
Classification of MNIST Handwritten Digits Database using Deep Learning
- Host: GitHub
- URL: https://github.com/ayooshkathuria/mnist-handwritten-digit-recognition
- Owner: ayooshkathuria
- Created: 2016-11-11T21:44:28.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2017-04-10T01:06:43.000Z (almost 8 years ago)
- Last Synced: 2024-11-26T07:19:06.090Z (about 2 months ago)
- Language: Python
- Homepage:
- Size: 92.9 MB
- Stars: 7
- Watchers: 1
- Forks: 3
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
## Classification of MNIST Handwritten Digits Database using Deep Learning
This repository contains code meant to classify MNIST handwritten digits using neural networks. I have used the data from http://deeplearning.net/data/mnist/ as well as an algorithmically expanded version of this dataset for training the neural networks. There are 3 code files, which are described below.

#### NeuralNetBasic.py

A very simple neural network implemented in Python using stochastic gradient descent and backpropagation.
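To make the idea concrete, here is a minimal sketch (not the repository's code) of a one-hidden-layer sigmoid network trained with per-example stochastic gradient descent and backpropagation, on the toy XOR problem rather than MNIST:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy data: XOR, a classic problem a network with no hidden layer cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.standard_normal((2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)   # hidden -> output
lr = 0.5                                             # learning rate

def forward(x):
    h = sigmoid(x @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

mse_before = np.mean((forward(X)[1] - y) ** 2)

for epoch in range(5000):
    for i in rng.permutation(len(X)):        # "stochastic": one example at a time
        x, t = X[i:i + 1], y[i:i + 1]
        h, out = forward(x)                  # forward pass
        # backward pass (quadratic cost; sigmoid' = a * (1 - a))
        d_out = (out - t) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # gradient-descent step
        W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * x.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

mse_after = np.mean((forward(X)[1] - y) ** 2)
print(f"MSE before: {mse_before:.3f}  after: {mse_after:.3f}")
```

The real thing adds mini-batches, many more neurons and the MNIST loading code, but the forward/backward/update loop is the same.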
#### NeuralNetOptimised

An optimised version of the neural network above. It incorporates a cross-entropy cost function instead of the quadratic cost function, L2 regularisation, an algorithmically expanded dataset and better support for performance analysis. The weights have been initialised with mean zero and standard deviation 1/sqrt(number of inputs) rather than 1.
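A short sketch of two of those tweaks (an illustration under the assumptions above, not the repository's exact code): scaling the initial weights by 1/sqrt(n_in) so the weighted input stays small and the sigmoid starts off unsaturated, and the cross-entropy cost, whose output-layer error is simply (a - y) with no sigmoid-prime factor:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in = 784                      # e.g. 28x28 MNIST pixels feeding one neuron

# Default initialisation: standard normal, standard deviation 1
W_default = rng.standard_normal(n_in)
# Improved initialisation: standard deviation 1/sqrt(n_in) keeps z = w.x small,
# so the sigmoid neuron does not start out saturated
W_improved = rng.standard_normal(n_in) / np.sqrt(n_in)

def cross_entropy(a, y):
    """Cross-entropy cost, clipped to avoid log(0)."""
    a = np.clip(a, 1e-12, 1 - 1e-12)
    return -np.sum(y * np.log(a) + (1 - y) * np.log(1 - a))

# For a sigmoid output layer with cross-entropy, the delta is just (a - y),
# so a confidently wrong, saturated output still receives a large gradient
a = np.array([0.9999, 0.0001])  # network output...
y = np.array([0.0, 1.0])        # ...versus the target
delta = a - y
print(cross_entropy(a, y), delta)
```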
#### NeuralNetTheano

Unlike the previous two versions, this neural network is implemented using Theano. It also incorporates a couple of convolutional layers and a softmax output layer in addition to fully connected layers. Dropout has also been implemented in the fully connected layers to address the problem of overfitting.

The best accuracy I obtained was 98.87%, with a couple of convolutional layers, a fully connected layer of 640 neurons, an output softmax layer of 10 neurons and a learning rate of 0.1. The network trained over 50 epochs, and took a long while on my MacBook Air (I'm never doing that again).

This code has been derived from code samples from the book http://neuralnetworksanddeeplearning.com/ authored by Michael Nielsen. If you're looking for an introduction to deep learning, the book can be a great starting point.