https://github.com/naver/gdc
Code accompanying our papers on the "Generative Distributional Control" framework
- Host: GitHub
- URL: https://github.com/naver/gdc
- Owner: naver
- License: other
- Created: 2021-03-05T08:39:56.000Z (about 4 years ago)
- Default Branch: master
- Last Pushed: 2022-12-07T17:29:32.000Z (over 2 years ago)
- Last Synced: 2025-03-28T11:21:18.920Z (about 2 months ago)
- Topics: ai, controlled-nlg, exponential-family, fairness-ml, gpt-2, gpt3, information-geometry, language-model, machine-learning, nlg, nlp, reinforcement-learning
- Language: Python
- Homepage:
- Size: 26.4 MB
- Stars: 118
- Watchers: 9
- Forks: 20
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Generative Distributional Control
Generative Distributional Control (GDC) is a general framework for imposing constraints on samples from pretrained language models. Constraints can be either pointwise (e.g., every sample must be non-offensive) or distributional (e.g., a specified percentage of samples must mention females).
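The distinction between the two constraint types can be sketched with binary feature functions: a pointwise constraint demands a feature hold for every sample (target moment 1.0), while a distributional constraint only fixes the feature's average over samples. The following is a minimal illustration, not the repo's API; the feature functions and target moments are toy stand-ins for the classifiers used in the papers.

```python
# Toy illustration of pointwise vs. distributional constraints as
# binary feature functions over model samples (not the repo's API).

def is_non_offensive(text):
    # Pointwise constraint: must hold for every sample (target moment = 1.0).
    # Toy stand-in for a real offensiveness classifier.
    return "offensive" not in text.lower()

def mentions_female(text):
    # Distributional constraint: only its average over samples is fixed
    # (e.g., target moment = 0.5).
    words = set(text.lower().split())
    return bool(words & {"she", "her", "woman", "female"})

def moments(samples, features):
    # Empirical moments E_p[phi(x)] estimated from a batch of samples.
    return {name: sum(f(s) for s in samples) / len(samples)
            for name, f in features.items()}

samples = [
    "She is a talented engineer.",
    "The committee approved the proposal.",
    "Her research on fairness won an award.",
    "The weather was mild all week.",
]
features = {"non_offensive": is_non_offensive,
            "mentions_female": mentions_female}
targets = {"non_offensive": 1.0, "mentions_female": 0.5}

print(moments(samples, features))
# → {'non_offensive': 1.0, 'mentions_female': 0.5}
```

GDC fine-tunes the model so that the empirical moments match the targets while staying close (in KL divergence) to the original pretrained distribution.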
This repo contains code accompanying the following three papers:
* [`/dpg`](/dpg): [A Distributional Approach to Controlled Text Generation](https://arxiv.org/abs/2012.11635) (ICLR 2021)
* [`/cdpg`](/cdpg): [Controlling Conditional Language Models without Catastrophic Forgetting](https://arxiv.org/abs/2112.00791) (ICML 2022)
* [`/rm_vs_dm`](/rm_vs_dm): [On Reinforcement Learning and Distribution Matching for Fine-Tuning Language Models with no Catastrophic Forgetting](https://arxiv.org/abs/2206.00761) (NeurIPS 2022)