Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/d-one/d-one-mlops
Repository with sample code and instructions for creating a complete MLOps training pipeline.
- Host: GitHub
- URL: https://github.com/d-one/d-one-mlops
- Owner: d-one
- License: MIT
- Created: 2022-04-07T08:10:50.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2022-11-16T18:24:47.000Z (about 2 years ago)
- Last Synced: 2023-06-13T06:28:28.841Z (over 1 year ago)
- Language: Jupyter Notebook
- Size: 16.3 MB
- Stars: 19
- Watchers: 2
- Forks: 15
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
README
# D ONE MLOps
Full Machine Learning Lifecycle using open source technologies. This repository offers a fully functioning end-to-end MLOps training pipeline that runs with Docker Compose. The goal is to (1) provide you with an MLOps training tool and (2) give you a head start when building the production machine learning ("ML") pipeline for your own project.
The built pipeline uses:
- DVC to track data
- MLflow to track experiments and register models
- Apache Airflow to orchestrate the MLOps pipeline
- Docker

## How to work with this repo
1. Clone the repository to your machine
```
git clone git@github.com:d-one/d-one-mlops.git
```
2. Install Docker: see https://docs.docker.com/get-docker/ and install according to your OS.
Make sure that Docker Desktop is running before continuing.
3. Run
```
echo -e "AIRFLOW_UID=$(id -u)" > .env
```
4. Run
```
pip install docker-compose
```
5. Run
```
docker-compose up
```
6. Open handout.md
## Requirements
Please find the requirements of the Airflow environment [here](dockerfiles/airflow/requirements.txt).

## Access
- http://localhost:8080 Airflow, credentials airflow/airflow
- http://localhost:8888 JupyterLab, token cd4ml
- http://localhost:5000 MLflow
- http://localhost:9001 MinIO S3 server, credentials mlflow_access/mlflow_secret
- http://localhost:5555 Flower, for monitoring the Celery cluster

## Cleanup
Run the following to stop all running Docker containers through Docker Compose:
```
docker-compose stop
```
or run the following to stop and delete all Docker containers through Docker directly:
```
docker stop $(docker ps -q)
```
```
docker rm $(docker ps -aq)
```
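As an alternative to the separate stop/remove steps, Docker Compose can tear the whole stack down in one command. A sketch, assuming it is run from the repo root; the `-v` flag also deletes the named volumes, covering the volume-removal step as well:

```shell
# Stop and remove the compose-managed containers and networks;
# -v additionally deletes the named volumes declared in the compose file
docker-compose down -v
```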
Finally, run the following to delete all (named) volumes:
```
docker volume rm $(docker volume ls -q)
```

## Disclaimer
This repo has been tested on macOS and Linux with:
```
1. Python 3.10.6
2. Docker version 20.10.10
3. docker-compose version 1.29.2
```