https://github.com/vidhi1290/dcnn
Achieving 98.43% Accuracy in Handwritten Digit Recognition. Experience the potential of deep learning with our project on handwritten digit recognition using the MNIST dataset: our deep neural network achieves 98.43% accuracy on the test set, showcasing the effectiveness of advanced machine learning techniques.
artificial-intelligence artificial-neural-networks cnn-keras deep-learning deep-neural-networks machine-learning-algorithms mnist-handwriting-recognition
- Host: GitHub
- URL: https://github.com/vidhi1290/dcnn
- Owner: Vidhi1290
- Created: 2023-08-17T12:17:08.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2023-08-17T12:21:40.000Z (about 2 years ago)
- Last Synced: 2025-02-02T18:33:27.441Z (9 months ago)
- Topics: artificial-intelligence, artificial-neural-networks, cnn-keras, deep-learning, deep-neural-networks, machine-learning-algorithms, mnist-handwriting-recognition
- Homepage:
- Size: 126 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# DCNN
## Deep Neural Network for Handwritten Digit Recognition using MNIST Dataset: Achieving 98.43% Accuracy
### MNIST Handwritten Digit Recognition with Deep Learning
This repository contains the code for building a deep neural network model to recognize handwritten digits from the famous MNIST dataset. The MNIST dataset consists of 28x28 grayscale images of handwritten digits (0-9) along with their corresponding labels.
### Key Features:
- Utilizes the Keras library to create a deep neural network for digit recognition.
- Achieves an accuracy of 98.43% on the test dataset.
- Implements data preprocessing techniques such as normalization and one-hot encoding (a minimal sketch follows this list).
- Includes visualization of training loss and accuracy for model evaluation.
- Provides code to display a few sample images from the dataset.
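The repository's exact preprocessing code isn't reproduced here, but a minimal sketch of the steps listed above (loading MNIST through Keras, scaling pixel values to [0, 1], and one-hot encoding the labels) might look like this:

```python
from tensorflow import keras

# Load MNIST: 60,000 training and 10,000 test images of 28x28 grayscale digits
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()

# Flatten each 28x28 image into a 784-dimensional vector and
# normalize pixel values from [0, 255] to [0, 1]
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# One-hot encode the integer labels (0-9) into 10-dimensional vectors
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)
```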
The model architecture consists of multiple layers of densely connected neurons, using the Rectified Linear Unit (ReLU) activation function for the hidden layers and the Softmax activation function for the output layer. The Adam optimizer is used to optimize the model's weights and biases during training.
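As a rough illustration of such a model (the layer widths, epoch count, and batch size below are assumptions, not necessarily the repository's exact configuration), the network can be defined, trained, and evaluated in Keras as follows, reusing the preprocessed arrays from the sketch above:

```python
from tensorflow import keras
from tensorflow.keras import layers
import matplotlib.pyplot as plt

# Densely connected network: ReLU hidden layers, softmax output over the 10 digit classes
model = keras.Sequential([
    keras.Input(shape=(784,)),
    layers.Dense(512, activation="relu"),
    layers.Dense(256, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# Adam optimizer with categorical cross-entropy, matching the one-hot encoded labels
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

history = model.fit(x_train, y_train,
                    epochs=10, batch_size=128,
                    validation_split=0.1)

# Evaluate on the held-out test set
test_loss, test_acc = model.evaluate(x_test, y_test, verbose=0)
print(f"Test accuracy: {test_acc:.4f}")

# Plot training loss and accuracy curves for model evaluation
plt.plot(history.history["loss"], label="training loss")
plt.plot(history.history["accuracy"], label="training accuracy")
plt.xlabel("epoch")
plt.legend()
plt.show()
```

With settings along these lines, fully connected networks on MNIST typically reach test accuracies in the high 90s, consistent with the 98.43% reported above.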
Feel free to explore the code and adapt it for your own projects or learning purposes. If you find this repository helpful, don't forget to give it a star!