# 🖊️ **DigitRecognizer**
A handcrafted neural network built from scratch using NumPy to classify handwritten digits from the MNIST dataset. Perfect for learning and experimenting with core neural network concepts like forward propagation, backpropagation, and gradient descent.
---
## 🚀 **Features**
✅ **Custom Neural Network**
A single-hidden-layer network with sigmoid activations, trained entirely from scratch.

✅ **MNIST Dataset Integration**
Seamlessly loads and preprocesses the MNIST dataset for efficient training and testing.

✅ **Interactive Testing**
Test the model on MNIST data or on your own handwritten digits (`custom.png`); a preprocessing sketch follows this list.

✅ **Training Visualization**
Monitor the model's performance through loss and accuracy updates after every epoch.

✅ **Minimal Dependencies**
Uses only Python, NumPy, and Matplotlib; no heavy frameworks required!
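How `test.py` actually reads `custom.png` is defined in the repository; the sketch below only illustrates the kind of preprocessing involved (the helper name, the grayscale averaging, and the 28×28 requirement are assumptions based on MNIST conventions, not the project's code):
```python
import numpy as np
from matplotlib import image as mpimg  # Matplotlib is already a project dependency

def load_custom_digit(path="custom.png"):  # hypothetical helper, not the repo's API
    """Load an image file and flatten it into the 784-value vector MNIST models expect."""
    img = mpimg.imread(path)            # PNGs load as floats in [0, 1]
    if img.ndim == 3:                   # collapse RGB(A) channels to grayscale
        img = img[..., :3].mean(axis=2)
    assert img.shape == (28, 28), "resize the image to 28x28 before loading"
    return img.reshape(784, 1)          # column vector for a 784-unit input layer
```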
---
## 📁 **Project Structure**
```plaintext
DigitRecognizer/
│
├── main.py            # Main script for training and testing
├── test.py            # Interactive testing script
├── utils.py           # Utility functions for loading the MNIST dataset
├── mnist.npz          # Preloaded MNIST dataset
├── custom.png         # Custom image for testing
├── tests/
│   ├── test2.png      # Additional test image 1
│   └── test3.png      # Additional test image 2
├── demo/
│   └── demo           # Directory for demo files
├── .gitignore         # Git ignore file for unnecessary files
├── requirements.txt   # List of required Python packages
└── README.md          # Repository documentation
```
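`utils.py` is responsible for loading the dataset; a minimal equivalent using NumPy's `.npz` support could look like the sketch below (the `x_train`/`y_train` key names follow the common Keras-style `mnist.npz` layout and are an assumption, not the repository's verified schema):
```python
import numpy as np

def load_mnist(path="mnist.npz"):  # hypothetical stand-in for utils.py
    """Load MNIST train/test arrays from a compressed .npz archive."""
    with np.load(path) as data:    # assumed keys: x_train, y_train, x_test, y_test
        x_train, y_train = data["x_train"], data["y_train"]
        x_test, y_test = data["x_test"], data["y_test"]
    # Flatten each 28x28 image to a 784-length vector and scale pixels to [0, 1]
    x_train = x_train.reshape(-1, 784).astype(np.float32) / 255.0
    x_test = x_test.reshape(-1, 784).astype(np.float32) / 255.0
    return (x_train, y_train), (x_test, y_test)
```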
---
## 🛠️ **Getting Started**
### **1. Clone the Repository**
```bash
git clone https://github.com/MansurPro/DigitRecognizer.git
cd DigitRecognizer
```
### **2. Install Dependencies**
Ensure you have Python installed, then install NumPy and Matplotlib:
```bash
pip install -r requirements.txt
```
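If you prefer to skip the requirements file, installing the two libraries directly is equivalent:
```bash
pip install numpy matplotlib
```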
### **3. Train the Model**
Run the `main.py` script to train the neural network on the MNIST dataset:
```bash
python main.py
```
### **4. Test the Model**
Use `test.py` to interactively test the model with MNIST data or custom images:
```bash
python test.py
```
---
## 🎨 **Preview**
### Training Visualization:
*(demo image: loss and accuracy per epoch)*
### Interactive Testing (success):
*(demo image)*
### Interactive Testing (fail):
*(demo image)*
---
## 🔍 **How It Works**
1. **Input Layer**: Processes 784-pixel flattened grayscale images (28 × 28).
2. **Hidden Layer**: Applies a sigmoid activation for feature extraction.
3. **Output Layer**: Produces a score for each of the ten digits, again through a sigmoid activation.
4. **Backpropagation**: Optimizes weights and biases via gradient descent to minimize the error; see the sketch below.
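The exact architecture and hyperparameters live in `main.py`; the sketch below only illustrates this forward/backward cycle in miniature (the hidden-layer size, learning rate, and squared-error loss are illustrative assumptions, not the project's actual choices):
```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class TinyNet:
    """One hidden layer, sigmoid activations, plain gradient descent."""

    def __init__(self, hidden=20, lr=0.1, seed=0):  # sizes and rate are assumptions
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.1, (hidden, 784)); self.b1 = np.zeros((hidden, 1))
        self.W2 = rng.normal(0, 0.1, (10, hidden));  self.b2 = np.zeros((10, 1))
        self.lr = lr

    def train_step(self, x, y):  # x: (784, 1) image, y: (10, 1) one-hot label
        # Forward propagation through the hidden and output layers
        a1 = sigmoid(self.W1 @ x + self.b1)
        a2 = sigmoid(self.W2 @ a1 + self.b2)
        # Backpropagate the squared-error gradient through both sigmoids
        d2 = (a2 - y) * a2 * (1 - a2)
        d1 = (self.W2.T @ d2) * a1 * (1 - a1)
        # Gradient-descent update of weights and biases
        self.W2 -= self.lr * d2 @ a1.T; self.b2 -= self.lr * d2
        self.W1 -= self.lr * d1 @ x.T;  self.b1 -= self.lr * d1
        return int(a2.argmax())  # predicted digit
```
A training loop would call `train_step` once per image per epoch, tracking loss and accuracy as in the training visualization above.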
---
## 🧑‍💻 **Contributions**
We welcome contributions! Feel free to fork this repository, open issues, or submit pull requests.
---
## 📜 **License**
This project is licensed under the MIT License. See the [LICENSE](./LICENSE) file for more details.
---
## 🌟 **Acknowledgments**
Special thanks to the creators of the MNIST dataset and the open-source community for their resources and support.