Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/blacksujit/ml-deploy-lite
ml-deploy-lite is a Python library designed to simplify the deployment of machine learning models. It allows developers to quickly turn their models into REST APIs or gRPC services with minimal configuration. The library integrates seamlessly with Docker and Kubernetes, providing built-in monitoring and logging for performance and error tracking.
- Host: GitHub
- URL: https://github.com/blacksujit/ml-deploy-lite
- Owner: Blacksujit
- Created: 2024-11-21T10:01:10.000Z (about 2 months ago)
- Default Branch: main
- Last Pushed: 2024-11-26T15:26:16.000Z (about 2 months ago)
- Last Synced: 2024-12-22T01:59:56.229Z (21 days ago)
- Topics: deployement-strategy, docker, kubernesties-deployement, kubernetes-deployment, ml-deployment, ml-deployment-developers-experiment, ml-if-git, mlops, my-library, own-library, python-3, python-lambda, python-library
- Language: Python
- Homepage: https://pypi.org/project/ml-deploy-lite/
- Size: 198 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
README
# ML-Deploy-Lite
`ml_deploy_lite` is a Python library designed to simplify the deployment of machine learning models. It was created to address the common challenges faced during the deployment process, such as the complexity of setting up REST APIs or gRPC services, the need for Docker and Kubernetes integration, and the lack of built-in monitoring and logging for performance and error tracking.
## Table of Contents

- [Challenges I Have Faced in Deployments](#challenges-i-have-faced-in-deployments)
- [Installation](#installation)
- [Usage](#usage)
- [Features](#features)
- [Why `ml_deploy_lite`?](#why-ml_deploy_lite)
- [Creating a Sample Model](#creating-a-sample-model)
- [Docker Integration](#docker-integration)
- [Kubernetes Integration](#kubernetes-integration)
- [Testing](#testing)
- [Contributing](#contributing)
- [License](#license)

## Challenges I Have Faced in Deployments
- **Complexity**: Setting up REST APIs or gRPC services for machine learning models can be complex and time-consuming.
- **Docker and Kubernetes Integration**: Integrating machine learning models with Docker and Kubernetes can be challenging, especially for developers new to these technologies.
- **Monitoring and Logging**: Without built-in support for monitoring and logging, it can be difficult to track the performance of the deployed models and identify errors.
## Why `ml_deploy_lite`?
`ml_deploy_lite` was created to simplify the deployment process for machine learning models. It provides a user-friendly interface for quickly converting models into REST APIs or gRPC services, automatically generates Dockerfiles for containerization, and simplifies the generation of Kubernetes deployment configurations. It also includes built-in support for monitoring performance and logging errors, making it easier for developers to track the performance of their deployed models.
## Features
- **Easy Deployment**: Quickly convert machine learning models into REST APIs.
- **Docker Integration**: Automatically generate a Dockerfile for containerization.
- **Kubernetes Support**: Generate Kubernetes deployment configurations easily.
- **Monitoring and Logging**: Built-in support for monitoring performance and logging errors.
- **User-Friendly**: Designed to be easy to use for developers of all skill levels.
## Installation
To install `ml_deploy_lite`, you can use pip. Make sure you have Python 3.6 or higher installed.
```bash
pip install ml_deploy_lite
```
## Usage
Here’s a simple example of how to use `ml_deploy_lite` to deploy a machine learning model.
1. **Import the Library**:

```python
from ml_deploy_lite import MLDeployLite
```

2. **Create an Instance**:

```python
deployer = MLDeployLite('path/to/your/model.pkl')
```

3. **Run the API**:

```python
deployer.run()
```

4. **Make Predictions**:
### Test with curl

Open a terminal and run the following command:

```bash
curl -X POST http://127.0.0.1:5000/predict \
  -H "Content-Type: application/json" \
  -d '{"features": [5.1, 3.5, 1.4, 0.2]}'
```
You should see a JSON response like:

```json
{
  "prediction": 0
}
```
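The same request can also be sent from Python using only the standard library's `urllib` (a minimal sketch; it assumes the API started by `deployer.run()` is listening on `127.0.0.1:5000`, as in the curl example above):

```python
import json
import urllib.request

def predict(features, url="http://127.0.0.1:5000/predict"):
    """POST a feature vector to the /predict endpoint and return the parsed JSON response."""
    payload = json.dumps({"features": features}).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Requires the API to be running:
# predict([5.1, 3.5, 1.4, 0.2])
```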
Alternatively, you can make predictions by sending a POST request to the `/predict` endpoint with the following JSON body:
```json
{
"features": [5.1, 3.5, 1.4, 0.2]
}
```

## Creating a Sample Model
To create a sample machine learning model, you can use the following script:
```python
import joblib
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Load dataset
iris = load_iris()
X, y = iris.data, iris.target

# Train a model
model = RandomForestClassifier()
model.fit(X, y)

# Save the model
joblib.dump(model, 'ml_deploy_lite/model/sample_model.pkl')
```

Run this script to generate a sample model that you can use with `ml_deploy_lite`.
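Before wiring the saved file into the API, it helps to confirm it round-trips through `joblib`. The sketch below retrains a small model and checks one prediction; the temporary path is illustrative:

```python
import joblib
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train and save a small model, then verify it survives a round trip through joblib.
iris = load_iris()
model = RandomForestClassifier(n_estimators=10, random_state=0)
model.fit(iris.data, iris.target)
joblib.dump(model, '/tmp/sample_model.pkl')

restored = joblib.load('/tmp/sample_model.pkl')
prediction = restored.predict([[5.1, 3.5, 1.4, 0.2]])[0]
print(prediction)  # 0 (Iris setosa) -- the same sample used in the curl example
```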
## Docker Integration
To create a Docker image for your application, you can use the provided `create_dockerfile` function in `ml_deploy_lite/docker.py`. This will generate a `Dockerfile` in the root directory of your project.
1. **Generate the Dockerfile**:
```python
from ml_deploy_lite.docker import create_dockerfile

create_dockerfile()
```

2. **Build the Docker Image**:
Run the following command in the terminal:
```bash
docker build -t your_docker_image:latest .
```

3. **Run the Docker Container**:
After building the image, you can run the container with:
```bash
docker run -p 5000:5000 your_docker_image:latest
```

## Kubernetes Integration
To create a Kubernetes deployment configuration, you can use the `create_k8s_deployment` function in `ml_deploy_lite/k8s.py`. This will generate a `k8s_deployment.yaml` file that you can apply to your Kubernetes cluster.
1. **Generate the Kubernetes Deployment File**:
```python
from ml_deploy_lite.k8s import create_k8s_deployment

create_k8s_deployment()
```

2. **Apply the Configuration**:
Run the following command to deploy to your Kubernetes cluster:
```bash
kubectl apply -f k8s_deployment.yaml
```

## Testing
To run the tests for the API, you can use the following command:
```bash
python -m unittest discover -s tests
```

Make sure you have the necessary test data and models in place before running the tests.
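If you are writing your first test, a small self-contained example might validate the `/predict` payload shape before exercising the live endpoint (the `validate_payload` helper and the file name `tests/test_payload.py` are illustrative, not part of `ml_deploy_lite`):

```python
import unittest

def validate_payload(payload):
    """Return True if the request body matches the /predict schema used above."""
    return (
        isinstance(payload, dict)
        and isinstance(payload.get("features"), list)
        and all(isinstance(x, (int, float)) for x in payload["features"])
    )

class TestPredictPayload(unittest.TestCase):
    def test_valid_payload(self):
        self.assertTrue(validate_payload({"features": [5.1, 3.5, 1.4, 0.2]}))

    def test_missing_features(self):
        self.assertFalse(validate_payload({}))

    def test_wrong_type(self):
        self.assertFalse(validate_payload({"features": "oops"}))

# Saved under tests/, this is picked up by:
#   python -m unittest discover -s tests
```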
## Contributing
Contributions are welcome! If you have suggestions for improvements or new features, please open an issue or submit a pull request.
1. Fork the repository.
2. Create a new branch (`git checkout -b feature-branch`).
3. Make your changes and commit them (`git commit -m 'Add new feature'`).
4. Push to the branch (`git push origin feature-branch`).
5. Open a pull request.

## License
This project is licensed under the MIT License.
---
For more information, please refer to the [documentation](https://flask.palletsprojects.com/) for Flask and the [Docker SDK for Python](https://docker-py.readthedocs.io/en/stable/).