https://github.com/veralvx/babellica
Command-line tool for text translation using LLMs or ArgosTranslate. Optional Gradio Web-UI.
- Host: GitHub
- URL: https://github.com/veralvx/babellica
- Owner: veralvx
- License: mit
- Created: 2025-03-01T21:15:31.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2025-07-28T19:44:43.000Z (5 months ago)
- Last Synced: 2025-07-28T21:39:23.711Z (5 months ago)
- Topics: ai, argos, artificial-intelligence, bash, bashscript, docker, dockerfile, gradio, llm, llms, ollama, podman, python, shellscript, translate, translation, translations, translator, webui
- Language: Python
- Homepage:
- Size: 41 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Babellica
Command-line tool for text translation using LLMs or ArgosTranslate. Optional Gradio Web-UI.
## Summary
- [Usage](#usage)
- [Requirements](#requirements)
- [Packages](#packages)
- [Argos Translate](#argos-translate)
- [Docker/Podman](#dockerpodman)
## Usage
`./babellica.sh input_file output_file from_lang to_lang [llm|argos]`
`from_lang` and `to_lang` may be given as [ISO 639](https://en.wikipedia.org/wiki/List_of_ISO_639_language_codes) codes or as language names; 50 language names and their codes are specified in `babellica.sh`. If an argument has no matching entry, the program uses it as given, whether or not it is a valid language.
The last argument can be either `llm` or `argos`. If it is omitted, it defaults to `llm`.
Example:
`./babellica.sh input.srt output.srt en pt llm`
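The name-to-code fallback described above can be sketched as follows. This is an illustrative stand-in, not the actual table in `babellica.sh`; only the pass-through behavior for unmatched arguments is taken from the text.

```sh
# Illustrative sketch of a language-name lookup: map a handful of
# names to ISO 639-1 codes; unknown inputs pass through unchanged,
# whether they are valid languages or not.
lang_code() {
  case "$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')" in
    english)    echo "en" ;;
    portuguese) echo "pt" ;;
    spanish)    echo "es" ;;
    german)     echo "de" ;;
    *)          echo "$1" ;;  # no correspondence: use the argument as given
  esac
}
```

So `lang_code English` yields `en`, while an unrecognized value such as `xx` is passed through unchanged.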
### Set a specific model
`./babellica.sh setmodel [model_name] [temperature] [system_message]`
Example:
`./babellica.sh setmodel aya-expanse:32b 0.2`
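One way such settings could be persisted is a small key=value file. The sketch below is hypothetical — babellica's actual storage format is not documented here, and `model.conf` and the `0.7` default are assumed names/values for illustration only.

```sh
# Hypothetical sketch: persist setmodel arguments as key=value pairs.
# The model name is required; the temperature falls back to an assumed
# default and the system message to empty when omitted.
save_model_config() {
  printf 'MODEL=%s\nTEMPERATURE=%s\nSYSTEM=%s\n' \
    "$1" "${2:-0.7}" "${3:-}" > model.conf
}
```

For instance, `save_model_config aya-expanse:32b 0.2` would write `MODEL=aya-expanse:32b` and `TEMPERATURE=0.2` with an empty `SYSTEM` line.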
## Requirements
### Packages
#### System Packages
- [`ollama`](https://github.com/ollama/ollama)
- [`pandoc`](https://github.com/jgm/pandoc)
- `poppler-utils`
- `perl`
#### Pip Packages
- [`ollama`](https://github.com/ollama/ollama)
- [`ebooklib`](https://github.com/aerkalov/ebooklib)
- [`argostranslate`](https://github.com/argosopentech/argos-translate)
- [`torch`](https://pytorch.org/get-started/locally/)
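Before running, it can help to confirm that the pip-side modules import cleanly. The following is a generic sketch (not part of babellica) that prints one line per missing module:

```sh
# Sketch: report any Python module that fails to import.
check_pymods() {
  for mod in "$@"; do
    python3 -c "import $mod" 2>/dev/null || echo "missing: $mod"
  done
}
# e.g.: check_pymods ollama ebooklib argostranslate torch
```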
## Docker/Podman
### CLI:
```
podman build -f Dockerfile.cli -t babellica:cli .
```
Then run it, mounting your working directory at `/workspace`:
```
podman run -it --rm --gpus=all --volume "$(pwd)":/workspace babellica:cli
```
Alias:
```
echo "alias babellica:cli='podman run -it --rm --volume \$(pwd):/workspace --gpus=all babellica:cli'" >> ~/.bashrc
```
### Gradio:
```
podman build -f Dockerfile.gradio -t babellica:gradio .
```
```
podman run -it --rm -p 7860:7860 --gpus=all babellica:gradio
```
Alias:
```
echo "alias babellica:gradio='podman run -it --rm -p 7860:7860 --gpus=all babellica:gradio'" >> ~/.bashrc
```