https://github.com/pngo1997/neural-network-backpropagation-optimization
Explores key concepts in Neural Network training.
- Host: GitHub
- URL: https://github.com/pngo1997/neural-network-backpropagation-optimization
- Owner: pngo1997
- Created: 2025-02-01T14:29:33.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-02-01T14:42:04.000Z (4 months ago)
- Last Synced: 2025-02-01T15:31:10.817Z (4 months ago)
- Topics: backpropagation, cross-entropy, deep-learning, neural-network, python, quadratic-cost, stochastic-gradient-descent
- Language: Jupyter Notebook
- Homepage:
- Size: 21.5 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
# 🧠 Neural Network Backpropagation & Optimization
## 📜 Overview
This project explores key concepts in **Neural Network Training**, including:
1. **Forward Propagation in Neural Networks**
2. **Alternative Syntax Optimization**
3. **Backpropagation Code Explanation**
4. **Derivation for Cross-Entropy Cost Function Optimization**

📌 **Programming Language**: `Python 3`
📌 **Frameworks Used**: `NumPy`, `Jupyter Notebook`

## 🚀 1️⃣ Forward Propagation in Neural Networks
Analyze and compare **two implementations of forward propagation**:
- `feedforward()` – Used for evaluation/testing.
- `backprop()` – Performs forward propagation as part of backpropagation.

## 🛠 2️⃣ Alternative Syntax for Mini-Batch Creation
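The repository's exact `SGD()` is not reproduced here, but Nielsen-style implementations typically build mini-batches by shuffling the training data and slicing it at a fixed stride. A minimal sketch of that pattern (the function name `make_mini_batches` is illustrative, not from the repo):

```python
import random

def make_mini_batches(training_data, mini_batch_size):
    """Shuffle the data, then slice it into consecutive mini-batches.

    A Nielsen-style sketch; the repo's actual SGD() may differ.
    The last batch is smaller when the data size is not a multiple
    of mini_batch_size.
    """
    data = list(training_data)
    random.shuffle(data)
    n = len(data)
    return [data[k:k + mini_batch_size] for k in range(0, n, mini_batch_size)]

# Example: 10 samples with batch size 3 yield batches of sizes 3, 3, 3, 1.
batches = make_mini_batches(range(10), 3)
```

The list-comprehension slice is the usual target for "alternative syntax" rewrites, e.g. replacing the explicit `range`-and-slice loop with an iterator-based grouping.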
Optimize the mini-batch creation syntax from the `SGD()` function.

## 🔢 3️⃣ Derivation of Cross-Entropy Cost Function
Compare Quadratic vs. Cross-Entropy Cost Functions and derive why bias terms vanish in optimization.
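The derivation itself lives in the notebook; the standard result it builds on, sketched here for a single sigmoid output neuron, is that the quadratic cost gives an output error of `(a - y) * sigmoid_prime(z)`, so learning stalls when the neuron saturates, while for the cross-entropy cost the `sigmoid_prime(z)` factor cancels, leaving just `a - y`:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

# Output-layer error for one sigmoid neuron (illustrative sketch):
#   Quadratic cost  C = (a - y)^2 / 2          ->  delta = (a - y) * sigmoid'(z)
#   Cross-entropy   C = -[y ln a + (1-y) ln(1-a)] ->  delta = a - y
# A saturated neuron (large z) makes sigmoid'(z) tiny, so the quadratic
# error nearly vanishes while the cross-entropy error stays large.
z, y = 5.0, 0.0
a = sigmoid(z)                       # ~0.993: confidently wrong output
delta_quad = (a - y) * sigmoid_prime(z)
delta_xent = a - y
```

Running this shows `delta_quad` is two orders of magnitude smaller than `delta_xent`, which is the practical reason cross-entropy avoids the learning slowdown.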