https://github.com/sk-g/sequencegan
Working on a blackbox GAN model to generate text sequences based on given closed corpus.
- Host: GitHub
- URL: https://github.com/sk-g/sequencegan
- Owner: sk-g
- Created: 2018-04-05T18:58:48.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-04-05T19:17:06.000Z (over 7 years ago)
- Last Synced: 2025-02-09T01:18:22.950Z (8 months ago)
- Language: Python
- Size: 498 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Changes
Two more approaches will be added in addition to the REINFORCE algorithm, starting with a vanilla D vs. G setup trained simultaneously (an LSTM discriminator and a CNN generator); a rough sketch of such a pairing follows below. Soon to be added: a comparison of existing GANs for text generation.
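The repository does not yet contain this variant, so the following is only a minimal sketch of what an LSTM discriminator paired with a CNN generator could look like, assuming a Keras-style TensorFlow API; the vocabulary size, sequence length, and layer widths (`VOCAB_SIZE`, `SEQ_LEN`, `EMBED_DIM`, `NOISE_DIM`) are illustrative placeholders, not values from this project.

```python
import tensorflow as tf

# Illustrative hyperparameters (assumptions, not taken from the repository).
VOCAB_SIZE = 5000
SEQ_LEN = 20
EMBED_DIM = 64
NOISE_DIM = 100

def build_discriminator():
    """LSTM discriminator: reads a token-ID sequence, outputs P(real)."""
    return tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        tf.keras.layers.LSTM(128),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

def build_generator():
    """CNN generator: maps a noise vector to per-position token logits."""
    return tf.keras.Sequential([
        tf.keras.layers.Dense(SEQ_LEN * EMBED_DIM, activation="relu"),
        tf.keras.layers.Reshape((SEQ_LEN, EMBED_DIM)),
        tf.keras.layers.Conv1D(128, kernel_size=3, padding="same", activation="relu"),
        tf.keras.layers.Conv1D(VOCAB_SIZE, kernel_size=3, padding="same"),  # logits per step
    ])
```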
# sequence_gan
Tensorflow implementation of generative adversarial networks
(GAN) applied to sequential data via recurrent neural networks
(RNN). See simple_demo.py for a demonstration of the model on toy data.

The basic idea of a generator and discriminator alternately optimizing their own objectives is maintained. Because the sequential data is discrete, standard backpropagation from the discriminator to the generator is not possible. Instead, the REINFORCE algorithm is used to encourage the generator to choose the correct discrete output at each point in the sequence.
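As a rough illustration of this idea (not the repository's actual training loop, which rolls out an RNN generator token by token), here is a minimal policy-gradient sketch in which the generator emits logits for every position at once, one token is sampled per position, and the discriminator's probability of "real" on the sampled sequence is used as the reward; `generator`, `discriminator`, `optimizer`, and `noise` are assumed to match the earlier sketch.

```python
import tensorflow as tf

def reinforce_step(generator, discriminator, optimizer, noise):
    """One illustrative REINFORCE update for the generator.

    The discriminator's score on the sampled sequence acts as a
    sequence-level reward for every sampled token.
    """
    with tf.GradientTape() as tape:
        logits = generator(noise)                                    # (batch, seq_len, vocab)
        vocab = tf.shape(logits)[-1]
        flat_logits = tf.reshape(logits, [-1, vocab])                # (batch*seq_len, vocab)
        samples = tf.random.categorical(flat_logits, num_samples=1)  # sampled token IDs
        samples = tf.squeeze(samples, axis=-1)                       # (batch*seq_len,)
        # log pi(a_t): negative sparse cross-entropy of the sampled token.
        log_probs = -tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=samples, logits=flat_logits)
        log_probs = tf.reshape(log_probs, tf.shape(logits)[:2])      # (batch, seq_len)
        tokens = tf.reshape(samples, tf.shape(logits)[:2])           # (batch, seq_len)
        reward = tf.squeeze(discriminator(tokens), axis=-1)          # (batch,) ~ P(real)
        # REINFORCE objective: maximize E[reward * sum_t log pi(a_t)].
        loss = -tf.reduce_mean(reward * tf.reduce_sum(log_probs, axis=1))
    grads = tape.gradient(loss, generator.trainable_variables)
    optimizer.apply_gradients(zip(grads, generator.trainable_variables))
    return loss
```

Subtracting a baseline (for example, a running average of recent discriminator scores) from `reward` is a standard way to reduce the variance of this estimator.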
The REINFORCE algorithm is prone to issues with credit assignment. To alleviate this, the model provides 'supervised training' (as opposed to the 'unsupervised training' via the discriminator). During supervised training, the generator is trained to predict the correct tokens from a ground-truth sequence, optimizing a cross-entropy loss; see the sketch at the end of this README.

Original code by [ofirnachum](https://github.com/ofirnachum/sequence_gan)
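As a closing illustration of the supervised step described above (again a sketch under the same assumptions, not code from either repository), the generator can be fit directly to ground-truth token sequences with a per-position cross-entropy loss:

```python
import tensorflow as tf

def supervised_step(generator, optimizer, noise, target_tokens):
    """Illustrative supervised (cross-entropy) update against ground-truth tokens."""
    with tf.GradientTape() as tape:
        logits = generator(noise)                         # (batch, seq_len, vocab)
        # Per-position cross-entropy between generator output and ground truth.
        loss = tf.reduce_mean(
            tf.nn.sparse_softmax_cross_entropy_with_logits(
                labels=target_tokens, logits=logits))
    grads = tape.gradient(loss, generator.trainable_variables)
    optimizer.apply_gradients(zip(grads, generator.trainable_variables))
    return loss
```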