https://github.com/explosion/spacy-ray
☄️ Parallel and distributed training with spaCy and Ray
- Host: GitHub
- URL: https://github.com/explosion/spacy-ray
- Owner: explosion
- License: mit
- Created: 2020-06-12T00:23:34.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2023-07-31T14:57:37.000Z (almost 2 years ago)
- Last Synced: 2025-01-29T18:38:17.284Z (5 months ago)
- Topics: distributed-computing, machine-learning, natural-language-processing, parallel-training, ray, spacy, training
- Language: Python
- Homepage:
- Size: 95.7 KB
- Stars: 53
- Watchers: 10
- Forks: 9
- Open Issues: 5
- Metadata Files:
- Readme: README.md
- License: LICENSE
# spacy-ray: Parallel and distributed training with spaCy and Ray
> ⚠️ This repo is still a work in progress and requires the new **spaCy v3.0**.
[Ray](https://ray.io/) is a fast and simple framework for building and running
**distributed applications**. This very lightweight extension package lets you
use Ray for parallel and distributed training with [spaCy](https://spacy.io). If
`spacy-ray` is installed in the same environment as spaCy, it will automatically
add `spacy ray` commands to your spaCy CLI.

The main command is `spacy ray train` for parallel and distributed training, but
we expect to add `spacy ray pretrain` and `spacy ray parse` as well.

## 🚀 Quickstart
You can install `spacy-ray` via pip:
```bash
pip install spacy-ray
```

To check that the command has been registered successfully:
```bash
python -m spacy ray --help
```

Train a model using the same API as `spacy train`:
```bash
python -m spacy ray train config.cfg --n-workers 2
```
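For training across several machines, `spacy-ray` builds on Ray's cluster support. The sketch below is a hedged example, not verified against this repo: `ray start` is Ray's own CLI, but the `--address` option on `spacy ray train` is an assumption here; check `python -m spacy ray train --help` for the flags your installed version actually accepts.

```shell
# Sketch: multi-machine training on a Ray cluster (assumptions noted below).

# On the head node, start a Ray cluster:
ray start --head --port=6379

# On each worker machine, join the cluster.
# HEAD_IP is a placeholder for the head node's IP address:
ray start --address=HEAD_IP:6379

# Then launch distributed training against the cluster.
# NOTE: --address on `spacy ray train` is an assumption; verify with --help.
python -m spacy ray train config.cfg --n-workers 2 --address HEAD_IP:6379
```

For a single machine, the plain `spacy ray train config.cfg --n-workers 2` invocation shown above is sufficient; Ray will start a local cluster automatically.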