https://github.com/pngo1997/recurrent-neural-networks-with-pytorch
Explores Recurrent Neural Networks using PyTorch.
- Host: GitHub
- URL: https://github.com/pngo1997/recurrent-neural-networks-with-pytorch
- Owner: pngo1997
- Created: 2025-02-05T22:26:30.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-02-05T22:37:06.000Z (4 months ago)
- Last Synced: 2025-02-21T05:16:33.578Z (3 months ago)
- Topics: backpropagation, python, pytorch, recurrent-neural-networks, rnn, sequential-data, text-prediction, text-processing
- Language: Jupyter Notebook
- Homepage:
- Size: 534 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Recurrent Neural Networks (RNN) with PyTorch
## Overview
This project explores **Recurrent Neural Networks (RNNs)** using **PyTorch** to process sequential data.

**Key Concepts Covered**:
- **Recurrent Neural Networks (RNNs)**
- **Handling Sequential Data**
- **Training an RNN in PyTorch**
- **Backpropagation Through Time (BPTT)**
- **Evaluating RNN Predictions**

## Implementation Details
### **Step 1: Data Preprocessing**
✅ **Load and preprocess sequential data**.
✅ **Convert data into tensors for PyTorch**.

### **Step 2: Building the RNN Model**
✅ **Define an RNN model** using `torch.nn.RNN()`.
✅ **Customize hidden layers and activation functions**.
✅ **Initialize weights and biases**.

### **Step 3: Training the RNN**
✅ **Define loss function and optimizer**.
✅ **Train the model over multiple epochs**.
✅ **Backpropagate errors using PyTorch's autograd system**.

### **Step 4: Model Evaluation & Visualization**
✅ **Evaluate performance on test sequences**.
✅ **Visualize predictions vs. actual data**.
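The four steps above can be sketched end-to-end in a few dozen lines. This is a minimal illustration, not the notebook's actual code: the toy character sequence, the `CharRNN` class, and all hyperparameters are assumptions chosen to keep the example self-contained.

```python
# Minimal sketch of the pipeline: preprocess a toy character sequence,
# build a model around torch.nn.RNN, train it with autograd, and
# evaluate next-character predictions. (Illustrative, not the notebook.)
import torch
import torch.nn as nn

torch.manual_seed(0)

# Step 1: preprocess sequential data into index tensors.
text = "hello world hello world "
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text])
inputs, targets = data[:-1], data[1:]  # task: predict the next character

# Step 2: define the model using torch.nn.RNN.
class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.RNN(hidden_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.fc(out), h

model = CharRNN(len(chars))

# Step 3: loss function, optimizer, and a short training loop.
# Backpropagation through time happens inside loss.backward(): autograd
# unrolls the recurrence over the whole sequence.
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
x = inputs.unsqueeze(0)  # batch dimension: shape (1, seq_len)
for epoch in range(100):
    optimizer.zero_grad()
    logits, _ = model(x)
    loss = criterion(logits.squeeze(0), targets)
    loss.backward()
    optimizer.step()

# Step 4: compare predictions against the actual next characters.
with torch.no_grad():
    preds = model(x)[0].squeeze(0).argmax(dim=1)
accuracy = (preds == targets).float().mean().item()
print(f"final loss {loss.item():.3f}, train accuracy {accuracy:.2f}")
```

Because the hidden state carries context, the RNN can disambiguate characters (like `l`) whose successor depends on position in the word, which a context-free predictor could not.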
- Overall, the model overfits: there is a significant gap between training and validation accuracy. Training accuracy quickly reaches near 100%, indicating that the model effectively learns the training data, while validation accuracy remains substantially lower and relatively flat across epochs. Correspondingly, the training loss decreases rapidly toward zero, whereas the validation loss fluctuates and slowly increases over time. Together these curves indicate that the model fits the training set extremely well but fails to generalize to unseen data.
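The diagnosis above can be checked numerically from per-epoch histories. A small sketch, where the `train_acc` and `val_acc` lists are made-up stand-ins (not the notebook's actual values) that mirror the pattern described: training saturates while validation stays flat.

```python
# Illustrative per-epoch accuracy histories (invented numbers that
# follow the pattern described above, not real training results).
train_acc = [0.62, 0.81, 0.93, 0.98, 0.99, 1.00, 1.00, 1.00]
val_acc   = [0.55, 0.60, 0.62, 0.63, 0.62, 0.63, 0.62, 0.61]

# The generalization gap is the per-epoch train/val difference; a large
# final gap alongside flat validation accuracy is the classic
# overfitting signature.
gaps = [t - v for t, v in zip(train_acc, val_acc)]
final_gap = gaps[-1]
val_improvement = max(val_acc) - val_acc[0]
print(f"final generalization gap: {final_gap:.2f}")
print("overfitting suspected:", final_gap > 0.1 and val_improvement < 0.1)
```

Typical remedies (not explored here) would reduce that gap rather than push training accuracy higher.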
## Summary
✅ Implemented an RNN from scratch using PyTorch.
✅ Trained the model on sequential data.
✅ Visualized model performance and predictions.