https://github.com/snigdho8869/text-summarization-with-transformers
A collection of text summarization projects using ensemble methods and T5 model.
- Host: GitHub
- URL: https://github.com/snigdho8869/text-summarization-with-transformers
- Owner: Snigdho8869
- Created: 2025-03-16T18:27:08.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-03-16T19:35:26.000Z (3 months ago)
- Last Synced: 2025-03-16T19:44:42.205Z (3 months ago)
- Topics: ai-projects, ai-summarization, bart-model, css, deep-learning, ensemble-learning, flask, flask-application, gpt-2, html, html-css-javascript, javascript, natural-language-processing, nlp, python, t5-model, text-summarization, text-summarization-with-transformers, transformers, web-development
- Language: Jupyter Notebook
- Homepage:
- Size: 126 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Text Summarization Projects
This repository contains two text summarization projects built using natural language processing (NLP) models. The projects are implemented using Flask for the backend and HTML/CSS/JavaScript for the frontend.
---
## Projects Overview
### 1. **Ensemble Summarization**
This project uses an ensemble of two pre-trained models, **BART** and **GPT-2**, to generate summaries of input text. The ensemble method combines the outputs of both models and selects the most frequent summary as the final result.

- **Models Used**:
- **BART (Bidirectional and Auto-Regressive Transformers)**: A transformer-based model fine-tuned for summarization tasks.
- **GPT-2 (Generative Pre-trained Transformer 2)**: A transformer-based model capable of generating coherent and contextually relevant text.

- **Features**:
- Input text is summarized using both models.
- The final summary is selected based on the most frequent output.
- Interactive web interface for input and output.

---
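The "most frequent summary" selection described above can be sketched with a simple vote over candidate summaries. This is an illustrative sketch, not the repository's actual code; the `candidates` list stands in for summaries that would come from sampling the BART and GPT-2 pipelines several times:

```python
from collections import Counter

def select_most_frequent(candidates):
    """Pick the summary string that appears most often among the
    candidates; ties resolve to the earliest-generated candidate
    (Counter preserves insertion order for equal counts)."""
    best, _count = Counter(candidates).most_common(1)[0]
    return best

# In the app, these would be multiple sampled generations from the
# BART and GPT-2 models for the same input text.
candidates = [
    "The cat sat on the mat.",   # e.g. BART, sample 1
    "A cat sat on a mat.",       # e.g. GPT-2, sample 1
    "The cat sat on the mat.",   # e.g. BART, sample 2
]
print(select_most_frequent(candidates))  # "The cat sat on the mat."
```

Voting on exact string matches only works if the models are sampled more than once each; with a single deterministic output per model, a tie-break rule (here: first generated) decides the result.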
### 2. **T5 Summarization**
This project uses the **T5 (Text-To-Text Transfer Transformer)** model for text summarization. T5 is a versatile transformer model that treats all NLP tasks as a text-to-text problem, making it highly effective for summarization.

- **Model Used**:
- **T5 (Text-To-Text Transfer Transformer)**: A transformer-based model fine-tuned for summarization tasks.

- **Features**:
- Input text is summarized using the T5 model.
- Interactive web interface for input and output.

---
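Because T5 treats every task as text-to-text, the task itself is encoded in the input string: for summarization, the text is prefixed with `summarize: ` before being fed to the model. A minimal sketch of that preprocessing step (the helper name is illustrative; the commented generation calls show the standard Hugging Face usage for `t5-base` and require downloading the model):

```python
def build_t5_input(text: str, prefix: str = "summarize: ") -> str:
    """Encode the summarization task in the input string itself,
    as T5's text-to-text framing requires."""
    return prefix + text.strip()

# Illustrative generation with Hugging Face Transformers (not run here):
#   from transformers import T5Tokenizer, T5ForConditionalGeneration
#   tokenizer = T5Tokenizer.from_pretrained("t5-base")
#   model = T5ForConditionalGeneration.from_pretrained("t5-base")
#   ids = tokenizer(build_t5_input(article),
#                   return_tensors="pt", truncation=True).input_ids
#   out = model.generate(ids, max_length=150, num_beams=4)
#   summary = tokenizer.decode(out[0], skip_special_tokens=True)
print(build_t5_input("Long article text ..."))  # "summarize: Long article text ..."
```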
## Technologies Used
- **Backend**:
- Python
- Flask (Web framework)
- Hugging Face Transformers (for pre-trained models)
- PyTorch (for model inference)

- **Frontend**:
- HTML
- CSS
- JavaScript

- **Pre-trained Models**:
- BART (`facebook/bart-large-cnn`)
- GPT-2 (`gpt2`)
- T5 (`t5-base`)

---
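Wiring the pieces above together, the Flask backend can expose summarization as a small JSON endpoint that the HTML/CSS/JavaScript frontend calls. This is a hypothetical sketch, not the repository's actual routes; the `/summarize` path and the `summarize_fn` stub are assumptions, with the stub standing in for a call to the ensemble or T5 model:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def summarize_fn(text: str) -> str:
    # Placeholder: the real app would invoke the BART/GPT-2 ensemble
    # or the T5 model here instead of truncating.
    return text[:100]

@app.route("/summarize", methods=["POST"])
def summarize():
    payload = request.get_json(force=True) or {}
    text = payload.get("text", "")
    if not text:
        return jsonify({"error": "no text provided"}), 400
    return jsonify({"summary": summarize_fn(text)})

# app.run(debug=True)  # uncomment to serve locally
```

The frontend would POST `{"text": "..."}` to `/summarize` (e.g. via `fetch`) and render the returned `summary` field.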
## Application Interface
**T5 Summarization**
**Ensemble Summarization**