https://github.com/krisharul26/text-generation-using-gpt2
GPT-2 is a pre-trained language model that can be used for various NLP tasks such as text generation, summarization, and translation. Language models are statistical tools that predict or generate the next word(s) in a sequence from the preceding word(s). The GPT-2 architecture is based on the Transformer, which uses attention to model dependencies between input and output. At every step, the model takes the previously generated tokens as additional input when generating the next token. GPT-2 has outperformed other language models at generating articles from small amounts of input content.
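The core language-model idea described above (predict the next word from the preceding words, then feed the prediction back in) can be illustrated with a toy bigram model in plain Python. GPT-2 does the same thing with a large Transformer instead of raw counts; the corpus and helper below are illustrative only and are not taken from the repository.

```python
from collections import defaultdict, Counter

corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each preceding word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(prev):
    """Return the most frequent follower of `prev`, or None if unseen."""
    counts = bigrams[prev]
    return counts.most_common(1)[0][0] if counts else None

print(next_word("the"))  # -> cat  ("cat" follows "the" 2 of 4 times)
```

A real language model replaces the frequency table with learned probabilities over a whole context window, but the generation loop (predict, append, repeat) is the same.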
- Host: GitHub
- URL: https://github.com/krisharul26/text-generation-using-gpt2
- Owner: KrishArul26
- Created: 2021-11-07T14:23:19.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2021-11-07T20:37:03.000Z (almost 4 years ago)
- Last Synced: 2025-01-04T18:25:31.290Z (9 months ago)
- Homepage:
- Size: 4.88 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
Text-Generation-using-GPT2
Introduction
Text generation is the task of automatically producing coherent, human-readable text as a continuation of a given prompt. A pre-trained language model such as GPT-2 predicts the most likely next token given the preceding ones, appends it to the sequence, and repeats, extending a short input into a longer passage. This project serves a pre-trained GPT-2 Large model behind a small web API so that a user can submit a prompt and receive generated text back.
Technologies Used
```
1. IDE - PyCharm
2. Model - GPT-2 Large pre-trained model
3. GPU - P4000
4. Google Colab - text analysis
5. Flask - API framework
6. Postman - API testing
```
Prerequisites
All the dependencies and required libraries are included in the file requirements.txt. Python 3.6 is required.
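The requirements file itself is not reproduced on this page; a minimal requirements.txt consistent with the technologies listed above might look like the following (package names are assumptions, pinned versions unknown):

```
# Hypothetical requirements.txt - contents assumed, not from the repo
transformers
torch
flask
```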
Installation
1. Clone the repo
```
git clone https://github.com/KrishArul26/Text-Generation-using-GPT2.git
```
2. Change your directory to the cloned repo
```
cd Text-Generation-using-GPT2
```
3. Create a Python 3.6 virtual environment named 'gpt2' and activate it
```
pip install virtualenv
virtualenv gpt2
gpt2\Scripts\activate
```
4. Now, run the following command in your Terminal/Command Prompt to install the required libraries:
```
pip install -r requirements.txt
```
Working
Type the following command:
```
python app.py
```
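Under the hood, app.py is presumably a small Flask application. The repository's actual code is not shown here; the sketch below is a hypothetical version in which the route name, form field, and `generate_text` helper are all assumptions. In the real app, a GPT-2 model would produce the continuation.

```python
# Hypothetical sketch of app.py: a small Flask app exposing a /predict
# endpoint that returns generated text for a submitted prompt.
from flask import Flask, request, jsonify

app = Flask(__name__)

def generate_text(prompt):
    # Placeholder: the real app would run a GPT-2 Large model here.
    return prompt + " ..."

@app.route("/predict", methods=["POST"])
def predict():
    prompt = request.form.get("text", "")
    return jsonify({"input": prompt, "generated": generate_text(prompt)})

if __name__ == "__main__":
    # Serve on all interfaces so the printed IP can be opened in a browser.
    app.run(host="0.0.0.0", port=5000)
```

With a layout like this, Postman (listed under Technologies Used) can POST a `text` field to `/predict` and inspect the JSON response.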
After that, you will see the running IP address. Copy and paste it into your browser, enter or upload your input text, and click the Predict button.
Result