https://github.com/hassanalgoz/text-generation
Generate and predict text, using Recurrent Neural Networks. (Keras+Tensorflow+Gensim)
- Host: GitHub
- URL: https://github.com/hassanalgoz/text-generation
- Owner: HassanAlgoz
- Created: 2018-04-24T15:14:36.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-05-12T18:14:15.000Z (over 7 years ago)
- Last Synced: 2025-04-07T16:15:17.575Z (6 months ago)
- Topics: gensim-word2vec, gru, keras-tensorflow, lstm, machine-learning, nlp, rnn, text-processing, word2vec
- Language: Jupyter Notebook
- Homepage:
- Size: 104 KB
- Stars: 6
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
### About the Project
The model can be trained on stories, books, articles, and the like, to learn to produce similar text. After training, the model can be given a sequence of words as input and asked to predict the next word. So, if the model is trained on stories, for example, you can ask it to complete an unfinished story.
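For intuition, next-word prediction turns a corpus into (context, target) pairs: each fixed-length window of words is an input, and the word that follows it is the label. A toy sketch of that framing (the window length and text are illustrative, not taken from this repo's preprocessing):

```
# Toy illustration of next-word training pairs; not the repo's actual code.
text = "once upon a time there was a brave knight".split()
seq_len = 4  # illustrative context window

pairs = []
for i in range(len(text) - seq_len):
    context = text[i:i + seq_len]  # input sequence of words
    target = text[i + seq_len]     # word the model learns to predict
    pairs.append((context, target))

print(pairs[0])  # (['once', 'upon', 'a', 'time'], 'there')
```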
### Training

To train the model, simply run the following:
```
python 1-preprocess.py && python 2-word2vec.py && python 3-train.py
```
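Judging by the filenames, the pipeline tokenizes the corpus, trains word embeddings, then fits the recurrent model. As a rough idea of the embedding stage, training Word2Vec with Gensim (4.x API) looks something like the sketch below; the sentences and hyperparameters are placeholders, not the repo's settings:

```
from gensim.models import Word2Vec

# Placeholder input: tokenized sentences such as the preprocessing step
# might emit. The repo's real hyperparameters live in ./helper/args.py.
sentences = [
    ["once", "upon", "a", "time"],
    ["there", "was", "a", "brave", "knight"],
]

w2v = Word2Vec(sentences, vector_size=100, window=5, min_count=1, workers=2)
w2v.save("./processed/word2vec.model")  # illustrative path

print(w2v.wv["knight"][:5])  # first five dimensions of one embedding
```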
### Prediction

For prediction you will need the following (see the sketch after this list):
1. The Trained model: `./results/model.h5`
2. Tokenizer: `./processed/tokenizer.pkl`
3. Sequences (to sample from): `./processed/sequences.txt`
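With those three artifacts in hand, a single prediction step might look roughly like this. It assumes the tokenizer is a Keras `Tokenizer` and that the model expects a fixed-length sequence of word indices; the seed text and sequence length are placeholders:

```
import pickle

import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing.sequence import pad_sequences

model = load_model("./results/model.h5")
with open("./processed/tokenizer.pkl", "rb") as f:
    tokenizer = pickle.load(f)

seed = "once upon a time"  # placeholder seed text
seq_len = 30               # assumed; must match the training sequence length

# Encode the seed as word indices and pad/truncate to the model's input length.
encoded = tokenizer.texts_to_sequences([seed])[0]
encoded = pad_sequences([encoded], maxlen=seq_len)

# Greedy decoding: pick the single most probable next word.
probs = model.predict(encoded, verbose=0)[0]
next_index = int(np.argmax(probs))
index_word = {i: w for w, i in tokenizer.word_index.items()}
print(index_word.get(next_index, "<unk>"))
```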
### Configuration

The training configuration, though not exhaustive, can be found in `./helper/args.py`. Feel free to modify the model's architecture and functions in `./3-train.py`.
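As a starting point for such modifications, a typical embedding-plus-recurrent stack in Keras looks like the sketch below. It is not the repo's actual network: the vocabulary size, embedding width, and layer sizes are placeholders, and the `LSTM` layers could just as well be `GRU` (both appear in the repo's topics):

```
from tensorflow.keras.layers import Dense, Embedding, Input, LSTM
from tensorflow.keras.models import Sequential

vocab_size = 10000  # placeholder; derived from the tokenizer in practice
embed_dim = 100     # would typically match the word2vec vector size
seq_len = 30        # placeholder context length

model = Sequential([
    Input(shape=(seq_len,)),
    Embedding(vocab_size, embed_dim),
    LSTM(128, return_sequences=True),
    LSTM(128),
    Dense(vocab_size, activation="softmax"),  # distribution over next word
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
model.summary()
```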
### Team

This work was part of our senior project in 2018 at KFUPM. Thanks to my colleagues Saleh Alresaini and Faris Alasmari, and to our supervisor Dr. Lahouari Ghouti, for this great learning experience.