Bleurt metric deployment
- Host: GitHub
- URL: https://github.com/0x7o/bleurt-deploy
- Owner: 0x7o
- License: apache-2.0
- Created: 2023-01-19T02:40:13.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-02-26T12:24:04.000Z (over 2 years ago)
- Last Synced: 2025-01-17T12:46:06.541Z (5 months ago)
- Topics: bert-model, bleurt, metrics, transformer
- Language: Python
- Homepage:
- Size: 8.79 KB
- Stars: 0
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Bleurt Model API
This project is a REST API that exposes the pre-trained "Elron/bleurt-large-512" model for predicting similarity scores between pairs of reference and candidate sentences. The API is built with the FastAPI framework and the Hugging Face Transformers library.
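The repository's own application code is not reproduced on this page, but a minimal sketch of such a service might look like the following. It assumes the model is loaded with `AutoModelForSequenceClassification` (as recommended on the model card) and that the endpoint name `/predict` and the `references`/`candidates` request fields match the examples further down; the actual `app.py` in the repository may differ.

```python
# sketch only -- an illustrative FastAPI service, not the repository's code
from typing import List

import torch
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "Elron/bleurt-large-512"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)
model.eval()

app = FastAPI()


class PredictRequest(BaseModel):
    references: List[str]
    candidates: List[str]


@app.post("/predict")
def predict(request: PredictRequest):
    # Tokenize each (reference, candidate) pair and run one forward pass;
    # the single regression logit per pair is the BLEURT score.
    inputs = tokenizer(
        request.references,
        request.candidates,
        return_tensors="pt",
        padding=True,
        truncation=True,
    )
    with torch.no_grad():
        scores = model(**inputs).logits.flatten().tolist()
    return {"scores": scores}
```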
## Requirements
- Python 3.6 or higher
- PyTorch 1.5 or higher
- FastAPI
- Transformers

## Docker
You can run the API using Docker. You must have Docker installed on your machine to do this.
First, build the Docker image:
```bash
docker build -t bleurt-api .
```

Then, run the Docker container:
```bash
docker run -d -p 5000:5000 --gpus=all bleurt-api
```

Test the API by sending a POST request to http://localhost:5000/predict (the host port published by the `docker run` command above) with a JSON payload containing the references and candidates.
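For example, using Python's `requests` library (an illustrative client, not part of the repository; adjust the port if you publish a different one):

```python
# example_client.py -- sends the sample payload from the Example section below
import requests

payload = {
    "references": ["This is a great product", "This is a terrible product"],
    "candidates": ["This is a fantastic product", "This is a horrible product"],
}

response = requests.post("http://localhost:5000/predict", json=payload)
print(response.json())  # expected shape: {"scores": [<float>, <float>]}
```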
## Example
```json
{
"references": ["This is a great product", "This is a terrible product"],
"candidates": ["This is a fantastic product", "This is a horrible product"]
}
```

The API returns a BLEURT score for each reference/candidate pair:

```json
{
"scores": [0.9656828045845032, 0.04987005889415741]
}
```

## Note
Make sure that a GPU is available if you are running the API in a container with `--gpus=all`. If you are running the API on a machine without a GPU, remove the line `model.to("cuda")` from the code.
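If you would rather not edit the code per machine, a common device-agnostic pattern (a sketch under the same model-loading assumption as above, not the repository's code) is to pick the device at startup:

```python
import torch
from transformers import AutoModelForSequenceClassification

# Use the GPU when one is available, otherwise fall back to the CPU
device = "cuda" if torch.cuda.is_available() else "cpu"

model = AutoModelForSequenceClassification.from_pretrained("Elron/bleurt-large-512")
model.to(device)

# Tokenized inputs must then be moved to the same device before calling the model.
```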