RNN, GRU, LSTM implementation using PyTorch and Numpy
https://github.com/legalaspro/rnn_gru_lstm_experiments

# RNN, GRU, and LSTM Experiments

In this project, I worked through Andrej Karpathy's blog post on Recurrent Neural Networks (RNNs), and out of my own interest also explored Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM) networks. I implemented all three models in PyTorch and also performed backpropagation from scratch in Numpy. A valuable lesson from the Numpy implementations was the importance of setting the initial weights correctly.
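As a small illustration of that weight-initialization lesson, here is a minimal Numpy sketch (the sizes and the 1/sqrt(n) scaling are my own choices for the example, not taken from the repository) showing why zero initial weights fail and why small, scaled random values behave better:

```python
import numpy as np

rng = np.random.default_rng(0)
hidden_size = 64

# Zero (or identical) initial weights make every hidden unit compute the
# same value and receive the same gradient, so the network can never
# break symmetry during training.
W_bad = np.zeros((hidden_size, hidden_size))

# Small random values scaled by the fan-in keep hidden activations in a
# reasonable range as the recurrence is unrolled (a common heuristic).
W_good = rng.standard_normal((hidden_size, hidden_size)) / np.sqrt(hidden_size)

h = rng.standard_normal(hidden_size)
for _ in range(50):  # unroll 50 time steps
    h = np.tanh(W_good @ h)

# With the scaled init the activations stay finite and non-degenerate;
# unscaled large weights would saturate tanh, zeros would collapse to 0.
print(h.std())
```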

## Implementations

1. **Recurrent Neural Networks (RNNs)**

- Studied the theoretical concepts from Karpathy's blog.
- Implemented RNNs using PyTorch.
- Performed backpropagation from scratch using Numpy.

2. **Gated Recurrent Units (GRUs)**

- Explored the architecture and benefits of GRUs.
- Implemented GRUs using PyTorch.
- Conducted backpropagation from scratch using Numpy.

3. **Long Short-Term Memory (LSTM) Networks**
- Investigated the structure and advantages of LSTMs.
- Implemented LSTMs using PyTorch.
- Executed backpropagation from scratch using Numpy.
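The PyTorch side of the three implementations above can be sketched with the built-in recurrent layers, which all share one interface (the sizes here are arbitrary, not the repository's settings):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
batch, seq_len, n_in, n_hid = 4, 10, 8, 16
x = torch.randn(batch, seq_len, n_in)

# PyTorch ships all three recurrent layers with the same call signature.
rnn = nn.RNN(n_in, n_hid, batch_first=True)
gru = nn.GRU(n_in, n_hid, batch_first=True)
lstm = nn.LSTM(n_in, n_hid, batch_first=True)

out_rnn, h_rnn = rnn(x)                # h_rnn: final hidden state
out_gru, h_gru = gru(x)
out_lstm, (h_lstm, c_lstm) = lstm(x)   # LSTM also returns a cell state

print(out_rnn.shape, out_gru.shape, out_lstm.shape)  # all (4, 10, 16)
```

The gating is what differs: the GRU adds update/reset gates to the vanilla RNN, and the LSTM additionally carries a separate cell state, which is why it returns a `(h, c)` tuple.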
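The from-scratch side can be sketched as a vanilla RNN forward pass plus backpropagation through time in Numpy. This is a minimal illustration with a toy loss and made-up parameter names, not the repository's actual code:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid, T = 3, 5, 4

# Vanilla RNN: h_t = tanh(Wx @ x_t + Wh @ h_{t-1})
Wx = rng.standard_normal((n_hid, n_in)) * 0.1
Wh = rng.standard_normal((n_hid, n_hid)) * 0.1
xs = [rng.standard_normal(n_in) for _ in range(T)]

# Forward pass, caching every hidden state for backprop through time.
hs = [np.zeros(n_hid)]
for x in xs:
    hs.append(np.tanh(Wx @ x + Wh @ hs[-1]))

# Backward pass: gradient of a toy loss L = sum(h_T) w.r.t. the weights.
dWx, dWh = np.zeros_like(Wx), np.zeros_like(Wh)
dh = np.ones(n_hid)                  # dL/dh_T
for t in reversed(range(T)):
    dz = dh * (1 - hs[t + 1] ** 2)   # backprop through tanh
    dWx += np.outer(dz, xs[t])       # accumulate across time steps
    dWh += np.outer(dz, hs[t])
    dh = Wh.T @ dz                   # carry the gradient one step back
```

The key points the from-scratch exercise makes concrete: hidden states must be cached during the forward pass, and weight gradients are *summed* over time steps because the same `Wx` and `Wh` are reused at every step.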

## Conclusion

Through this project, I gained a deeper understanding of recurrent neural network architectures and their implementations. The hands-on experience with PyTorch and Numpy reinforced my learning and provided valuable insight into how RNNs, GRUs, and LSTMs work.