https://github.com/muhamadazharrudin/bleu-metrix-evaluation
This repository contains Python scripts that calculate BLEU (Bilingual Evaluation Understudy) scores to assess the quality of automatic translation systems. BLEU is a popular metric for measuring how close a model's output is to a reference translation.
- Host: GitHub
- URL: https://github.com/muhamadazharrudin/bleu-metrix-evaluation
- Owner: MuhamadAzharrudin
- Created: 2025-07-08T17:49:13.000Z (3 months ago)
- Default Branch: main
- Last Pushed: 2025-07-08T17:53:18.000Z (3 months ago)
- Last Synced: 2025-07-08T18:58:51.619Z (3 months ago)
- Topics: bleu-metric, bleu-score, ipynb-jupyter-notebook, lstm-neural-networks
- Language: Jupyter Notebook
- Homepage:
- Size: 20.5 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
# BLEU Metric Evaluation
This repository contains Python scripts that calculate BLEU (Bilingual Evaluation Understudy) scores to assess the quality of automatic translation systems. BLEU is a popular metric for measuring how close a model's output is to a reference translation.
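In practice, such an evaluation boils down to comparing tokenized hypothesis and reference sentences with an n-gram overlap score. The minimal sketch below uses NLTK's `bleu_score` module with smoothing; the choice of NLTK and every name in the snippet are illustrative assumptions, not code taken from this repository's notebook.

```python
# Illustrative sketch of BLEU scoring with smoothing (assumes NLTK is installed).
from nltk.translate.bleu_score import sentence_bleu, corpus_bleu, SmoothingFunction

# Smoothing avoids zero scores when short outputs miss higher-order n-grams.
smoothie = SmoothingFunction().method1

# Single reference/hypothesis pair, tokenized into words
reference = ["the", "cat", "is", "on", "the", "mat"]
hypothesis = ["the", "cat", "sat", "on", "the", "mat"]
print(f"Sentence BLEU: {sentence_bleu([reference], hypothesis, smoothing_function=smoothie):.4f}")

# Multiple pairs (e.g. rows loaded from a CSV/Excel file) scored together
references = [[["a", "quick", "brown", "fox"]], [reference]]  # one list of references per hypothesis
hypotheses = [["a", "fast", "brown", "fox"], hypothesis]
print(f"Corpus BLEU: {corpus_bleu(references, hypotheses, smoothing_function=smoothie):.4f}")
```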
## 🎯 Features
- Calculates BLEU scores
- Supports **smoothing** for short outputs
- Evaluates multiple reference and hypothesis pairs
- Input format: CSV/Excel
- Output: BLEU scores and summary statistics

## 📦 Installation
```bash
git clone https://github.com/MuhamadAzharrudin/BLEU-metrix-evaluation.git