# Recurrent Neural Networks course project: time series prediction and text generation

## Run the app

`jupyter notebook --NotebookApp.iopub_data_rate_limit=100000000 RNN_project.ipynb`

## Accelerating the Training Process

If your code is taking too long to run, you will need to either reduce the complexity of your chosen RNN architecture or switch to running your code on a GPU. If you'd like to use a GPU, you have two options:

#### Build Your Own Deep Learning Workstation

If you have access to a GPU, you should follow the Keras instructions for [running Keras on GPU](https://keras.io/getting-started/faq/#how-can-i-run-keras-on-gpu).
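
As a quick sanity check before a long training run (a minimal sketch, assuming Keras is configured with the TensorFlow backend), you can confirm that TensorFlow actually sees your GPU:

```python
# Minimal sketch, assuming the TensorFlow backend for Keras.
from tensorflow.python.client import device_lib

# Lists the CPU and GPU devices TensorFlow can see; an entry such as
# "/device:GPU:0" means Keras will place its operations on the GPU.
print(device_lib.list_local_devices())
```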

#### Amazon Web Services

Instead of a local GPU, you could use Amazon Web Services to launch an EC2 GPU instance. (This costs money.)


## Rubric items

#### Files Submitted

| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Submission Files | RNN_project.ipynb and my_answers.py: the completed notebook RNN_project.ipynb, plus all completed Python functions requested in the notebook (the TODO items) copied into the script my_answers.py and submitted for grading. |

#### Documentation

| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Comments | The submission includes comments that describe the functionality of the code. Every line of code is preceded by a meaningful comment that 1. describes the input parameters to Keras module functions, 2. describes function calls, and 3. explains the thought process in plain language. |

#### Step 1: Implement a function to window time series
| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Window time series data. | The submission returns the properly windowed version of the input time series, with the dimensions listed in the notebook. |
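
For reference, one plausible shape of such a windowing function is sketched below; the function name `window_transform_series` and the exact output shapes are assumptions and should be checked against the TODO stub in the notebook.

```python
import numpy as np

def window_transform_series(series, window_size):
    """Slide a window of length `window_size` over a 1-D series and return
    (inputs, outputs), where each input row is one window and each output is
    the value immediately following that window."""
    X = [series[i:i + window_size] for i in range(len(series) - window_size)]
    y = [series[i + window_size] for i in range(len(series) - window_size)]

    # Reshape to 2-D: X is (num_samples, window_size), y is (num_samples, 1).
    X = np.asarray(X).reshape(-1, window_size)
    y = np.asarray(y).reshape(-1, 1)
    return X, y
```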

#### Step 2: Create a simple RNN model using Keras to perform regression

| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Build an RNN model to perform regression. | The submission constructs an RNN model in Keras with an LSTM module of the dimension defined in the notebook. |
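
A minimal sketch of such a model is shown below; the layer sizes are placeholders, since the notebook fixes the actual LSTM dimension.

```python
from keras.models import Sequential
from keras.layers import Dense, LSTM

window_size = 7    # placeholder; use the value from the notebook
hidden_units = 5   # placeholder; use the LSTM dimension from the notebook

model = Sequential()
# LSTM layer over windows shaped (window_size, 1): one scalar per time step,
# so X must be reshaped to (num_samples, window_size, 1) before training.
model.add(LSTM(hidden_units, input_shape=(window_size, 1)))
# Single linear output unit for the regression target.
model.add(Dense(1))
model.compile(loss='mean_squared_error', optimizer='adam')
```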

#### Step 3: Clean up a large text corpus

| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Find and remove all non-English / non-punctuation characters from the input text data. | The submission removes all non-English / non-punctuation characters. |
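
One common way to implement this cleaning step is sketched below; the exact set of punctuation characters to keep is defined in the notebook, so the list here is illustrative.

```python
import string

def cleaned_text(text):
    """Replace every character that is not a lowercase English letter,
    a space, or an allowed punctuation mark with a space."""
    punctuation = ['!', ',', '.', ':', ';', '?']   # illustrative; check the notebook
    allowed = set(string.ascii_lowercase) | set(punctuation) | {' '}
    return ''.join(ch if ch in allowed else ' ' for ch in text.lower())
```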

#### Step 4: Implement a function to window a large text corpus

| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Implement a function to window input text data. | The submission returns the properly windowed version of the input text, with the dimensions listed in the notebook. |
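
A sketch of the character-level windowing function follows; the name `window_transform_text` and the `step_size` argument are assumptions based on the notebook's TODO description.

```python
def window_transform_text(text, window_size, step_size):
    """Slide a window of `window_size` characters across `text`, advancing
    `step_size` characters at a time; the character immediately after each
    window is its label."""
    inputs, outputs = [], []
    for i in range(0, len(text) - window_size, step_size):
        inputs.append(text[i:i + window_size])
        outputs.append(text[i + window_size])
    return inputs, outputs
```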

#### Step 5: Create a simple RNN model using Keras to perform multiclass classification

| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Build an RNN model to perform multiclass classification. | The submission constructs an RNN model in Keras with an LSTM module of the dimension defined in the notebook. |
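
A minimal sketch of the classification model is below; again, the layer sizes and the vocabulary size `num_chars` are placeholders for the values fixed in the notebook.

```python
from keras.models import Sequential
from keras.layers import Dense, LSTM

window_size = 100   # placeholder; use the value from the notebook
num_chars = 33      # placeholder; number of distinct characters in the cleaned corpus
hidden_units = 200  # placeholder; use the LSTM dimension from the notebook

model = Sequential()
# Each input is a window of one-hot encoded characters: (window_size, num_chars).
model.add(LSTM(hidden_units, input_shape=(window_size, num_chars)))
# Softmax over the character vocabulary for next-character prediction.
model.add(Dense(num_chars, activation='softmax'))
model.compile(loss='categorical_crossentropy', optimizer='rmsprop')
```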

#### Step 6: Generate text using a fully trained RNN model and a variety of input sequences
| Criteria | Meets Specifications |
|:---------------------:|:---------------------------------------------------------:|
| Generate text using a trained RNN classifier. | The submission presents examples of generated text from a trained RNN model. The majority of this generated text should consist of real English words. |
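
Text generation then amounts to repeatedly predicting one character and sliding the input window forward. The sketch below illustrates the idea; every argument name here is illustrative rather than the notebook's exact signature.

```python
import numpy as np

def predict_next_chars(model, input_chars, num_to_predict,
                       chars_to_indices, indices_to_chars, window_size):
    """Generate `num_to_predict` characters, one at a time, from a trained model.
    `input_chars` is a seed string of length `window_size`."""
    predicted = ''
    for _ in range(num_to_predict):
        # One-hot encode the current window of characters.
        x = np.zeros((1, window_size, len(chars_to_indices)))
        for t, ch in enumerate(input_chars):
            x[0, t, chars_to_indices[ch]] = 1.0

        # Take the most probable next character and slide the window forward.
        probs = model.predict(x, verbose=0)[0]
        next_char = indices_to_chars[np.argmax(probs)]
        predicted += next_char
        input_chars = input_chars[1:] + next_char
    return predicted
```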