https://github.com/nishant2018/text-generation-gpt-1-and-gpt-2

Generative Pre-trained Transformer 1 and Generative Pre-trained Transformer 2 models
## GPT-1 and GPT-2 Models

### GPT-1
GPT-1 (Generative Pre-trained Transformer 1), released by OpenAI in 2018, was the first model in the GPT series. Built on the transformer architecture and trained on a large corpus of text, its key innovation was pre-training a language model on a large unlabeled dataset and then fine-tuning it for specific tasks. This approach yielded significant improvements across a range of NLP tasks and laid the foundation for subsequent models (see the usage sketch after the list below).

- **Architecture:** Transformer
- **Training Data:** BooksCorpus (over 7,000 unpublished books)
- **Parameters:** 117 million
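
As a concrete illustration of generating text with a pre-trained GPT-1 checkpoint, here is a minimal sketch. It assumes the Hugging Face `transformers` library and its `openai-gpt` checkpoint, neither of which is part of this repository:

```python
# Minimal GPT-1 text-generation sketch (assumes: pip install transformers torch).
# "openai-gpt" is the Hugging Face checkpoint of the original GPT-1 weights.
from transformers import pipeline

generator = pipeline("text-generation", model="openai-gpt")

# Generate one continuation of up to 40 tokens from a short prompt.
result = generator("The transformer architecture", max_length=40, num_return_sequences=1)
print(result[0]["generated_text"])
```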

### GPT-2
GPT-2 (Generative Pre-trained Transformer 2), the successor to GPT-1, was released in 2019. It is much larger, with 1.5 billion parameters, and was trained on a larger and more diverse dataset. The model can generate coherent, contextually relevant text from minimal input (see the sampling sketch after the list below). Its release sparked discussion of the ethical implications of advanced AI models, given its ability to generate realistic, human-like text.

- **Architecture:** Transformer
- **Training Data:** WebText dataset (8 million web pages)
- **Parameters:** 1.5 billion
- **Notable Features:** Zero-shot task transfer, coherent long-form text generation
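
A sketch of sampled generation with GPT-2, again assuming the Hugging Face `transformers` library; the `"gpt2"` checkpoint below is the small 124M-parameter release, while the full 1.5B-parameter model is published as `"gpt2-xl"`:

```python
# GPT-2 sampling sketch (assumes: pip install transformers torch).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # 124M-parameter checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")    # swap in "gpt2-xl" for 1.5B

inputs = tokenizer("In a shocking finding,", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,    # total length of prompt + continuation, in tokens
    do_sample=True,   # sample instead of greedy decoding
    top_k=50,         # truncate the distribution to the 50 most likely tokens
    temperature=0.9,  # soften the distribution slightly
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Top-k sampling with a moderate temperature is a common default for open-ended generation with GPT-2; greedy decoding tends to produce repetitive text.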

GPT-2 remains a significant milestone in the development of NLP models and has inspired further advancements in the field, leading to even more powerful models like GPT-3 and GPT-4.

For more detailed information, you can explore the [OpenAI GPT-2 paper](https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf).