Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/alireza-akhavan/rnn-notebooks
RNN (SimpleRNN, LSTM, GRU) TensorFlow 2.0 & Keras notebooks (workshop materials)
- Host: GitHub
- URL: https://github.com/alireza-akhavan/rnn-notebooks
- Owner: Alireza-Akhavan
- Created: 2019-12-02T18:36:15.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2024-05-30T11:27:35.000Z (7 months ago)
- Last Synced: 2024-05-30T12:59:50.036Z (7 months ago)
- Topics: deep-learning, gru, jupyter-notebook, keras, lstm, rnn, tensorflow2
- Language: Jupyter Notebook
- Homepage: http://class.vision
- Size: 6.87 MB
- Stars: 97
- Watchers: 8
- Forks: 39
- Open Issues: 3
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# rnn-notebooks
RNN (SimpleRNN, LSTM, GRU) TensorFlow 2.0 & Keras notebooks (workshop materials). [class.vision](http://class.vision)
# Slides
[RNN.pdf](./Slides/RNN.pdf)
# Video
Some parts are freely available from our [Aparat channel](https://www.aparat.com/v/qD1Mi?playlist=287685), or you can purchase a full package including 32 videos in Persian from [class.vision](http://class.vision/deeplearning2/).

# Notebooks
## Intro to RNN:
[01_simple-RNN.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/01_simple-RNN.ipynb)
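A minimal sketch of the kind of model the intro notebook builds: a `SimpleRNN` layer followed by a dense head, run on toy data. Shapes, layer sizes, and the random data are illustrative, not taken from the notebook.

```python
import numpy as np
from tensorflow.keras import layers, models

# Toy data: 32 sequences, each with 10 timesteps of 8 features (illustrative shapes).
x = np.random.rand(32, 10, 8).astype("float32")
y = np.random.rand(32, 1).astype("float32")

model = models.Sequential([
    layers.SimpleRNN(16, input_shape=(10, 8)),  # returns only the final hidden state
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(x, y, epochs=2, verbose=0)
print(model.predict(x[:1]).shape)  # (1, 1)
```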
## How can we run inference with different sequence lengths?

[02_1_simple-RNN-diffrent-sequence-length.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/02_1_simple-RNN-diffrent-sequence-length.ipynb)

[02_2_simple-RNN-diffrent-sequence-length.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/02_2_simple-RNN-diffrent-sequence-length.ipynb)
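One common way to handle this, sketched below under the assumption that the notebooks use a similar trick: declare the time dimension as `None` so the same weights accept any sequence length at inference time.

```python
import numpy as np
from tensorflow.keras import layers, models

# Declaring the time dimension as None lets the RNN accept any sequence length.
model = models.Sequential([
    layers.SimpleRNN(16, input_shape=(None, 8)),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Train on 10-step sequences ...
model.fit(np.random.rand(32, 10, 8), np.random.rand(32, 1), epochs=1, verbose=0)

# ... then run inference on a 25-step sequence with the same weights.
print(model.predict(np.random.rand(1, 25, 8)).shape)  # (1, 1)
```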
## Cryptocurrency predicting
- When do we use `return_sequences=True`? (see the sketch after the notebook links below)
- Stacked RNN (deep RNN)
- Using an LSTM layer
[03_1_Cryptocurrency-predicting.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/03_1_Cryptocurrency-predicting.ipynb)

[03_2_Cryptocurrency-predicting.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/03_2_Cryptocurrency-predicting.ipynb)
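A sketch of a stacked (deep) LSTM like the one these notebooks build: `return_sequences=True` is needed on every recurrent layer that feeds another recurrent layer, so the next layer receives the full output sequence rather than only the last state. The window length, feature count, and layer sizes below are placeholders, not the notebook's actual crypto pipeline.

```python
from tensorflow.keras import layers, models

SEQ_LEN, N_FEATURES = 60, 5   # e.g. 60 past timesteps of price/volume features (illustrative)

model = models.Sequential([
    # return_sequences=True so the next LSTM receives the whole output sequence,
    # not just the final hidden state.
    layers.LSTM(128, return_sequences=True, input_shape=(SEQ_LEN, N_FEATURES)),
    layers.Dropout(0.2),
    layers.LSTM(128),                       # last recurrent layer: only the final state is kept
    layers.Dense(32, activation="relu"),
    layers.Dense(2, activation="softmax"),  # e.g. price goes up vs. down
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```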
## CNN + LSTM for Ball movement classification
- What is the TimeDistributed layer in Keras? (see the sketch below)
- Introduction to video classification
- CNN + LSTM
[04_simple-CNN-LSTM.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/04_simple-CNN-LSTM.ipynb)
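A sketch of the `TimeDistributed` idea: the same small CNN is applied independently to every frame of a clip, and an LSTM then models the temporal dynamics of the resulting per-frame feature vectors. Frame count, image size, and the number of classes are assumptions.

```python
from tensorflow.keras import layers, models

FRAMES, H, W, C = 16, 64, 64, 3   # illustrative clip shape: 16 RGB frames of 64x64
NUM_CLASSES = 4                   # e.g. ball movement directions (assumed)

model = models.Sequential([
    # The same Conv2D/pooling stack is applied independently to each of the 16 frames.
    layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu"),
                           input_shape=(FRAMES, H, W, C)),
    layers.TimeDistributed(layers.MaxPooling2D()),
    layers.TimeDistributed(layers.Conv2D(64, 3, activation="relu")),
    layers.TimeDistributed(layers.GlobalAveragePooling2D()),
    # The tensor is now (batch, FRAMES, 64): a sequence of per-frame feature vectors.
    layers.LSTM(64),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```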
## Action Recognition with pre-trained CNN and LSTM

- How to use a pre-trained CNN as a feature extractor for an RNN (see the sketch below)
- Using a GRU layer
[05-1-video-action-recognition-train-extract-features-with-cnn](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/05-1-video-action-recognition-train-extract-features-with-cnn.ipynb)
[05-2_video-action-recognition-train-rnn.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/05-2_video-action-recognition-train-rnn.ipynb)
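A sketch of the two-stage pipeline described above: a frozen ImageNet CNN turns each frame into a feature vector, and a GRU is trained on the resulting feature sequences. The choice of MobileNetV2 and all shapes here are assumptions, not necessarily what the notebooks use.

```python
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import MobileNetV2

# Stage 1: a frozen ImageNet CNN turns each video frame into a feature vector.
cnn = MobileNetV2(include_top=False, pooling="avg", input_shape=(224, 224, 3))
cnn.trainable = False

frames = np.random.rand(16, 224, 224, 3).astype("float32")   # one 16-frame clip (dummy data)
features = cnn.predict(frames, verbose=0)                     # shape (16, 1280)

# Stage 2: an RNN (here a GRU) classifies the sequence of per-frame features.
rnn = models.Sequential([
    layers.GRU(256, input_shape=(None, features.shape[-1])),
    layers.Dense(10, activation="softmax"),   # number of action classes is illustrative
])
rnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
print(rnn.predict(features[np.newaxis, ...], verbose=0).shape)   # (1, 10)
```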
## Word Embedding and Analogy

- Using GloVe
- Cosine similarity (see the sketch below)
- Analogy
[06_analogy-using-embeddings.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/06_analogy-using-embeddings.ipynb)
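A sketch of cosine similarity and the classic analogy computation over GloVe vectors (`b - a + c`, then nearest neighbour by cosine similarity). The GloVe file name is an assumption, and the file must be downloaded separately.

```python
import numpy as np

def load_glove(path):
    """Parse a GloVe text file into {word: vector}. The path is an assumption."""
    emb = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            emb[parts[0]] = np.asarray(parts[1:], dtype="float32")
    return emb

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c, emb):
    """Return the word d that best completes 'a is to b as c is to d'."""
    target = emb[b] - emb[a] + emb[c]
    best, best_sim = None, -2.0
    for w, vec in emb.items():
        if w in (a, b, c):
            continue
        sim = cosine(target, vec)
        if sim > best_sim:
            best, best_sim = w, sim
    return best

# emb = load_glove("glove.6B.100d.txt")        # file must be downloaded separately
# print(analogy("man", "woman", "king", emb))  # expected: "queen"
```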
## Text Classification

- What is a bag of embeddings?
- Using the Embedding layer in Keras
- Initializing the Embedding layer with pre-trained embeddings (see the sketch below)
- Using RNNs for NLP tasks
[07_text-classification-Emojify.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/07_text-classification-Emojify.ipynb)
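A sketch of a Keras `Embedding` layer initialized from a pre-trained embedding matrix and kept frozen, followed by an LSTM classifier. The vocabulary size, sequence length, the five output classes, and the random matrix standing in for GloVe are all placeholders.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE, EMB_DIM, MAX_LEN, NUM_CLASSES = 10000, 100, 20, 5   # illustrative sizes

# In the notebook this matrix would be filled from GloVe vectors; a random stand-in here.
embedding_matrix = np.random.rand(VOCAB_SIZE, EMB_DIM).astype("float32")

model = models.Sequential([
    layers.Embedding(
        VOCAB_SIZE, EMB_DIM,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        trainable=False),                  # keep the pre-trained embeddings frozen
    layers.LSTM(128),
    layers.Dense(NUM_CLASSES, activation="softmax"),   # e.g. one class per emoji
])
model.build(input_shape=(None, MAX_LEN))
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```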
## Language Model and Text Generation (on Persian poetry, the Shahnameh)

- What is a TF Dataset (`tf.data`)?
- Stateful vs. stateless RNNs
- When do we need `batch_input_shape`? (see the sketch below)

[08_shahnameh-text-generation-language-model.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/08_shahnameh-text-generation-language-model.ipynb)
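A sketch of the pieces named above: a `tf.data` pipeline that slices a long id sequence into (input, target) pairs, and a stateful GRU that carries its hidden state across batches, which is why Keras needs the fixed batch size via `batch_input_shape`. The corpus, vocabulary, and sizes are placeholders, not the Shahnameh setup.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE, SEQ_LEN, BATCH = 64, 100, 16   # illustrative; not the Shahnameh setup

# tf.data pipeline: slice a long id sequence into (input, target) pairs shifted by one step.
ids = np.random.randint(0, VOCAB_SIZE, size=10000)
ds = (tf.data.Dataset.from_tensor_slices(ids)
        .batch(SEQ_LEN + 1, drop_remainder=True)
        .map(lambda chunk: (chunk[:-1], chunk[1:]))
        .batch(BATCH, drop_remainder=True))

# Stateful RNN: the hidden state is carried between batches, so Keras must know the
# fixed batch size up front, hence batch_input_shape instead of input_shape.
model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 128, batch_input_shape=(BATCH, None)),
    layers.GRU(256, return_sequences=True, stateful=True),
    layers.Dense(VOCAB_SIZE),              # logits over the vocabulary at every timestep
])
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.fit(ds, epochs=1, verbose=0)
model.reset_states()                       # reset the carried state before generating text
```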
# Seq2Seq networks (Encoder-Decoder)
## Understanding mathematical strings with seq2seq
- Using RepeatVector to connect the encoder to the decoder (see the sketch below)
- Using the encoder's hidden state as input to the decoder
[09_add-numbers-with-seq2seq.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/09_add-numbers-with-seq2seq.ipynb)
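A sketch of the `RepeatVector` pattern named above: the encoder LSTM compresses the one-hot input string into a single vector, `RepeatVector` copies it once per output character, and a decoder LSTM with a `TimeDistributed` softmax emits the answer. The character vocabulary and string lengths are assumptions.

```python
from tensorflow.keras import layers, models

N_CHARS = 12             # e.g. digits 0-9, '+', and padding (illustrative vocabulary)
IN_LEN, OUT_LEN = 7, 4   # e.g. "123+456" -> "579 " (lengths are assumptions)

model = models.Sequential([
    # Encoder: read the one-hot input string, keep only the final state vector.
    layers.LSTM(128, input_shape=(IN_LEN, N_CHARS)),
    # RepeatVector copies that single vector OUT_LEN times so the decoder
    # has one input per output character.
    layers.RepeatVector(OUT_LEN),
    # Decoder: unroll over the repeated vector and emit a distribution per character.
    layers.LSTM(128, return_sequences=True),
    layers.TimeDistributed(layers.Dense(N_CHARS, activation="softmax")),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```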
## NMT (Neural Machine Translation) with Attention in Keras

[10_Neural-machine-translation-with-attention-for-date-convert.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/10_Neural-machine-translation-with-attention-for-date-convert.ipynb)
## NMT with Attention and teacher forcing in TF2.0
- Teacher forcing (see the sketch below)
- Loss with a mask for zero padding
- Using model subclassing
[11_nmt-with-attention.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/11_nmt-with-attention.ipynb)
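A simplified sketch of the two training tricks named above, teacher forcing and a padding-masked loss, using a plain GRU decoder without attention. The token ids, pad/start ids, and vocabulary size are made up for illustration.

```python
import tensorflow as tf

loss_object = tf.keras.losses.SparseCategoricalCrossentropy(
    from_logits=True, reduction="none")

def masked_loss(real, pred):
    """Cross-entropy that ignores positions where the target is the pad id 0."""
    mask = tf.cast(tf.not_equal(real, 0), pred.dtype)
    loss = loss_object(real, pred) * mask
    return tf.reduce_sum(loss) / tf.maximum(tf.reduce_sum(mask), 1.0)

# Teacher forcing: the decoder input is the target sentence shifted right,
# so at step t the model sees the true token t-1 instead of its own prediction.
# decoder_input  = [<start>, y1, y2, ..., y_{n-1}]
# decoder_target = [y1, y2, ..., y_n]
targets = tf.constant([[5, 6, 7, 0, 0]])                      # 0 = padding (illustrative ids)
decoder_input = tf.concat([[[1]], targets[:, :-1]], axis=1)   # 1 = <start> token

VOCAB = 50
decoder = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB, 32),
    tf.keras.layers.GRU(64, return_sequences=True),
    tf.keras.layers.Dense(VOCAB),                   # logits over the target vocabulary
])
logits = decoder(decoder_input)                     # shape (1, 5, VOCAB)
print(masked_loss(targets, logits).numpy())
```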
## Image Captioning with Attention

[12_image-captioning-with-attention.ipynb](https://nbviewer.jupyter.org/github/Alireza-Akhavan/rnn-notebooks/blob/master/12_image-captioning-with-attention.ipynb)