https://github.com/264gaurav/deep-learning
Learning, building, and exploring deep learning and neural networks
- Host: GitHub
- URL: https://github.com/264gaurav/deep-learning
- Owner: 264Gaurav
- Created: 2025-08-31T12:12:47.000Z (5 months ago)
- Default Branch: main
- Last Pushed: 2025-09-20T21:57:06.000Z (4 months ago)
- Last Synced: 2025-09-20T23:35:38.591Z (4 months ago)
- Topics: artificial-neural-networks, dagshub, deep-learning, deep-neural-networks, dvc, experiment-tracking, hyperparameter-tuning, keras-tensorflow, keras-tuner, mlflow-tracking, numpy, sklearn, tensorflow, testing, training-data
- Language: Jupyter Notebook
- Homepage:
- Size: 506 KB
- Stars: 1
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: Readme.md
# Deep Learning: Underfitting, Overfitting & Regularization
## 📌 Core Libraries
- **NumPy** → Arrays & matrix operations
- **Matplotlib** → Visualization
- **Pandas** → Data handling (CSV, DataFrames)
- **TensorFlow & Keras** → Neural networks (see the sketch below)
  - **Sequential** → Linear stack of layers
  - **Dense** → Fully connected layer
  - **Dropout** → Reduces overfitting
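A minimal sketch of these pieces together, assuming a toy binary-classification setup (the 20-feature input shape and layer sizes are illustrative, not taken from the repo):

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

# A linear stack of layers: fully connected layers with dropout in between.
model = Sequential([
    tf.keras.Input(shape=(20,)),     # 20 input features (illustrative)
    Dense(64, activation="relu"),    # fully connected hidden layer
    Dropout(0.5),                    # randomly drops 50% of units during training
    Dense(1, activation="sigmoid"),  # binary-classification output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```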
---
## ⚖️ Underfitting vs Overfitting
### 🔹 Underfitting
Model is too simple → poor performance on both training and test data.
**Fix:** Add layers/neurons, train longer, lower the learning rate, or reduce regularization.
### 🔹 Overfitting
Model is too complex → memorizes the training data and fails on test data.
**Fix:** Dropout, L2 regularization, early stopping, data augmentation, or reduced model complexity.
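As a sketch of how two of these fixes look in Keras (hyperparameters here are illustrative, not tuned values from the repo), dropout and an L2 weight penalty can be combined in one model:

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),  # L2 penalty on weights
    layers.Dropout(0.3),                                     # dropout regularization
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```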
---
## 📊 Comparison
| Feature | Underfitting 🟡 | Good Fit 🟢 | Overfitting 🔴 |
| ----------------- | --------------- | ----------- | -------------- |
| Training Accuracy | Low | High | Very High |
| Test Accuracy | Low | High | Low |
| Model Complexity | Too simple | Balanced | Too complex |
| Generalization | Poor | Good | Poor |
---
## 🎯 Dropout
- Randomly drops (zeroes) each neuron with probability _p_ during training.
- Prevents reliance on specific neurons → encourages robust, redundant features.
- **Training:** surviving activations are scaled by `1/(1-p)` (inverted dropout).
- **Testing:** dropout is off; no extra scaling is needed, since the scaling was already applied during training.
- Best for large models; avoid overuse on small datasets (see the sketch below).
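A tiny NumPy illustration of the scaling above (the activation values and seed are made up); survivors are scaled by `1/(1-p)` during training so the expected activation matches test time:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                                   # drop probability
activations = np.array([0.2, 0.9, 0.4, 0.7])

# Training: zero out units with probability p, scale survivors by 1/(1-p).
mask = rng.random(activations.shape) >= p
train_out = activations * mask / (1 - p)

# Testing: dropout is off; activations pass through unchanged.
test_out = activations

print(train_out)  # survivors doubled, dropped units zeroed
print(test_out)
```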
---
## 🛠️ Ways to Reduce Overfitting
- Dropout
- L1/L2 Regularization
- Early Stopping
- Data Augmentation
- Batch Normalization
- Reduce Model Complexity
- Cross-Validation
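Two of these fixes as they commonly appear in Keras code (a hedged sketch; the parameters are illustrative):

```python
import tensorflow as tf

# Early stopping: halt training once validation loss stops improving,
# and roll back to the best weights seen so far.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True)

# Data augmentation (for image inputs): random flips and rotations enlarge
# the effective training set without collecting new labels.
augment = tf.keras.Sequential([
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
])

# Typical usage (X_train / y_train are hypothetical):
# model.fit(X_train, y_train, validation_split=0.2,
#           epochs=100, callbacks=[early_stop])
```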
---
## ✅ Rule of Thumb
- **Underfitting:** Increase complexity, train longer.
- **Overfitting:** Add regularization, dropout, early stopping, augmentation.
📌 **Goal:** Achieve balance → model should **generalize well** to unseen data.
# Keras Tuner: Hyperparameter Tuning of Models
### For an overview of KerasTuner, see the [KerasTuner Readme](./2-kerasTuner/Readme.md).
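For context, a minimal KerasTuner random search might look like this (a sketch assuming the `keras_tuner` package and a toy tabular task; the search space, ranges, and names are illustrative, not the repo's own setup):

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    # hp defines the search space: layer width, dropout rate, learning rate.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(20,)),
        tf.keras.layers.Dense(hp.Int("units", 32, 256, step=32), activation="relu"),
        tf.keras.layers.Dropout(hp.Float("dropout", 0.0, 0.5, step=0.1)),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(hp.Choice("lr", [1e-2, 1e-3, 1e-4])),
        loss="binary_crossentropy", metrics=["accuracy"])
    return model

tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=10)
# tuner.search(X_train, y_train, validation_split=0.2, epochs=20)
# best_model = tuner.get_best_models(num_models=1)[0]
```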
# Embeddings with Keras and TensorFlow
### For an overview of embeddings, see the [Embeddings Readme](./5-WordEmbeddings/Readme.md).
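As a taste, a minimal embedding model in Keras (a sketch with illustrative vocabulary size, sequence length, and dimensions; the linked Readme covers the details):

```python
import tensorflow as tf

vocab_size = 10_000   # number of distinct tokens (illustrative)
embed_dim = 8         # size of each learned word vector (illustrative)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(50,), dtype="int32"),        # sequences of 50 token ids
    tf.keras.layers.Embedding(vocab_size, embed_dim),  # (50,) -> (50, 8)
    tf.keras.layers.GlobalAveragePooling1D(),          # average the word vectors
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```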