https://github.com/cesar97107/ssis-dispatcher
A Kubernetes serving manager for machine learning inference systems, with NVIDIA MIG/MPS GPU-sharing support
docker go knative knative-serving kubernetes mlops mps multi-instance-gpu nvidia
- Host: GitHub
- URL: https://github.com/cesar97107/ssis-dispatcher
- Owner: Cesar97107
- License: mit
- Created: 2025-03-30T08:47:10.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2025-03-30T09:42:42.000Z (6 months ago)
- Last Synced: 2025-03-30T10:20:28.174Z (6 months ago)
- Topics: docker, go, knative, knative-serving, kubernetes, mlops, mps, multi-instance-gpu, nvidia
- Language: Go
- Size: 52.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# SSIS-Dispatcher - Kubernetes Serving Manager for ML Inference System
Welcome to the SSIS-Dispatcher repository! SSIS-Dispatcher is a Kubernetes serving manager designed for machine learning inference systems, with support for NVIDIA Multi-Instance GPU (MIG) and Multi-Process Service (MPS) GPU sharing.
## Features 🚀
- **Kubernetes Integration:** Easily deploy and manage your machine learning inference systems on Kubernetes clusters.
- **NVIDIA MIG/MPS Support:** Take advantage of NVIDIA's Multi-Instance GPU and Multi-Process Service for efficient GPU resource management (see the sketch after this list for how a MIG slice is requested from Kubernetes).
- **Containerization:** Utilize Docker containers for deploying and scaling your inference systems.
- **Knative Serving:** Benefit from Knative serving capabilities for serverless deployment of your models.
- **MLOps:** Improve your machine learning operations with integrated MLOps workflows.
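MIG slices surface to Kubernetes as extended resources advertised by the NVIDIA device plugin, so placing an inference workload comes down to requesting one of those resources. The following is a minimal, hypothetical client-go sketch (not code from this repository) of such a placement; the pod name, namespace, container image, and MIG profile name are assumptions and depend on your cluster and GPU configuration.

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the local kubeconfig; a controller running in-cluster would use
	// rest.InClusterConfig() instead.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Hypothetical inference worker asking the scheduler for one MIG 1g.5gb slice.
	// The resource name is what the NVIDIA device plugin advertises under the
	// "mixed" MIG strategy; the exact profile depends on the GPU and its MIG layout.
	pod := &corev1.Pod{
		ObjectMeta: metav1.ObjectMeta{Name: "inference-worker", Namespace: "default"},
		Spec: corev1.PodSpec{
			Containers: []corev1.Container{{
				Name:  "model-server",
				Image: "example.com/model-server:latest", // placeholder image
				Resources: corev1.ResourceRequirements{
					Limits: corev1.ResourceList{
						"nvidia.com/mig-1g.5gb": resource.MustParse("1"),
					},
				},
			}},
			RestartPolicy: corev1.RestartPolicyNever,
		},
	}

	created, err := client.CoreV1().Pods("default").Create(context.TODO(), pod, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Println("created pod:", created.Name)
}
```

Under MPS-based sharing, the request would instead name whatever shared-GPU resource the device plugin is configured to advertise (for example, time-sliced replicas of `nvidia.com/gpu`).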
## Repository Topics 📦

- docker
- go
- golang
- k8s
- knative
- knative-serving
- kubernetes
- mlops
- mps
- multi-instance-gpu
- nvidia
- nvidia-gpu

## Getting Started
To download and execute the latest version of SSIS-Dispatcher, visit the [Releases](https://github.com/Cesar97107/SSIS-Dispatcher/releases) section.
For detailed instructions on installation and usage, please refer to the documentation provided in the repository.
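As an illustration of the Knative serving path (again a hedged sketch, not SSIS-Dispatcher's own API), the snippet below uses the Knative Serving Go client to declare a serverless inference Service whose revisions each request one MIG slice, so that scale-to-zero frees the slice when traffic stops. The service name, namespace, image, and MIG profile are placeholders, and a reasonably recent Knative Serving and client-go version is assumed.

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/tools/clientcmd"

	servingv1 "knative.dev/serving/pkg/apis/serving/v1"
	servingclient "knative.dev/serving/pkg/client/clientset/versioned"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := servingclient.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// A hypothetical Knative Service wrapping a model server; each revision's pods
	// request one MIG slice, so Knative's scale-to-zero releases the slice when idle.
	svc := &servingv1.Service{
		ObjectMeta: metav1.ObjectMeta{Name: "resnet-infer", Namespace: "default"},
		Spec: servingv1.ServiceSpec{
			ConfigurationSpec: servingv1.ConfigurationSpec{
				Template: servingv1.RevisionTemplateSpec{
					Spec: servingv1.RevisionSpec{
						PodSpec: corev1.PodSpec{
							Containers: []corev1.Container{{
								Image: "example.com/model-server:latest", // placeholder
								Resources: corev1.ResourceRequirements{
									Limits: corev1.ResourceList{
										"nvidia.com/mig-1g.5gb": resource.MustParse("1"),
									},
								},
							}},
						},
					},
				},
			},
		},
	}

	created, err := client.ServingV1().Services("default").Create(context.TODO(), svc, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	fmt.Println("created Knative service:", created.Name)
}
```

Pairing Knative's request-driven autoscaling with MIG-sized resource requests is one way to keep partitioned GPUs from sitting idle between bursts of inference traffic.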
## Support
If you encounter any issues or have any questions, feel free to open an issue in the repository. Our team is here to assist you.
---
By incorporating SSIS-Dispatcher into your machine learning workflow, you can streamline the deployment and management of inference systems on Kubernetes clusters while leveraging NVIDIA MIG and MPS for efficient GPU utilization. Happy serving! 🌟
---
### Disclaimer:
This README has been written as per the guidelines provided, focusing on clarity and directness in conveying information about the SSIS-Dispatcher repository.