Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jai0212/koach-sample-project
Last synced: about 4 hours ago
JSON representation
- Host: GitHub
- URL: https://github.com/jai0212/koach-sample-project
- Owner: Jai0212
- Created: 2024-09-16T19:15:11.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2024-09-17T00:49:08.000Z (2 months ago)
- Last Synced: 2024-09-18T01:54:50.260Z (2 months ago)
- Language: Python
- Size: 2 MB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# Koach Sample Project
A CNN trained on the **MNIST dataset** to classify handwritten digit images with an accuracy of **99.18%**, created in Python using TensorFlow.
## Model Architecture
* Convolutional Layer
* Pooling Layer
* Batch Normalization
* Dropout Layer
* Flattening Layer
* Dense Layer

### Adam Optimizer:
* Learning Rate: 0.001
* Loss: sparse_categorical_crossentropy

## Packages
* TensorFlow
* Keras
* scikit-learn
* NumPy
* Matplotlib
* Pickle

## Files:
* **main.py** - model created and saved
* **predict.py** - predicts the digit in any image; set the image's filename as `image_path` on line 9 of the code (ensure the digit is a lighter shade than the background)
* **plot.py** - used to plot the training and validation loss graph
* **mnist_cnn_model.h5** - the CNN model
* **model_history.pkl** - stores the model's training history for plotting the graph
* **test.png** - sample testing file
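The prediction flow described for **predict.py** could be sketched as follows. The function names and helper structure here are assumptions for illustration, not the repository's exact code:

```python
import numpy as np
from tensorflow import keras


def preprocess(image_path):
    """Load an image as 28x28 grayscale, scaled to [0, 1], with a batch dim.

    The model expects a lighter digit on a darker background, matching MNIST.
    """
    img = keras.utils.load_img(
        image_path, color_mode="grayscale", target_size=(28, 28)
    )
    arr = keras.utils.img_to_array(img) / 255.0  # shape (28, 28, 1)
    return arr[np.newaxis, ...]                  # shape (1, 28, 28, 1)


def predict_digit(model_path, image_path):
    """Load the saved CNN and return the most probable digit for one image."""
    model = keras.models.load_model(model_path)
    probs = model.predict(preprocess(image_path), verbose=0)
    return int(np.argmax(probs, axis=1)[0])
```

Usage would then be a one-liner such as `predict_digit("mnist_cnn_model.h5", "test.png")`.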
### Batch Normalization
This layer normalizes its inputs, which stabilizes and speeds up training and helps prevent overfitting.

### Dropout Layers
This layer randomly drops out some neurons during training, which reduces overfitting and improves generalization.

### Use of these Layers
I used these layers in this project to reduce overfitting and improve generalization, and thereby accuracy. Since this is a simple project on a simple dataset, they were only slightly beneficial here; for more complex models, these layers are of much greater use.
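Putting the layers above together, a minimal sketch of such a model might look like this. The filter counts, kernel sizes, and dropout rate are assumptions for illustration; the README only names the layer types and the Adam settings:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),               # MNIST: 28x28 grayscale
    layers.Conv2D(32, (3, 3), activation="relu"),  # convolutional layer
    layers.MaxPooling2D((2, 2)),                   # pooling layer
    layers.BatchNormalization(),                   # batch normalization
    layers.Dropout(0.25),                          # dropout layer
    layers.Flatten(),                              # flattening layer
    layers.Dense(10, activation="softmax"),        # dense layer: 10 digits
])

# Adam optimizer and loss as listed in the README.
model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=0.001),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

Placing `BatchNormalization` after pooling and `Dropout` before the dense head is one common arrangement; the repository's actual ordering may differ.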