{"id":13753969,"url":"https://github.com/yu-iskw/machine-learning-microservice-python","last_synced_at":"2025-05-09T21:36:15.525Z","repository":{"id":27449142,"uuid":"113961101","full_name":"yu-iskw/machine-learning-microservice-python","owner":"yu-iskw","description":"Example to implement machine learning microservice with gRPC and Docker in Python","archived":true,"fork":false,"pushed_at":"2021-12-27T07:12:49.000Z","size":42,"stargazers_count":81,"open_issues_count":0,"forks_count":21,"subscribers_count":13,"default_branch":"master","last_synced_at":"2024-11-16T06:31:15.622Z","etag":null,"topics":["grpc","machine-learning","microservice","python"],"latest_commit_sha":null,"homepage":"https://medium.com/@yuu.ishikawa/machine-learning-as-a-microservice-in-python-16ba4b9ea4ee","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/yu-iskw.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2017-12-12T08:01:28.000Z","updated_at":"2024-08-05T10:39:00.000Z","dependencies_parsed_at":"2022-09-17T16:12:14.292Z","dependency_job_id":null,"html_url":"https://github.com/yu-iskw/machine-learning-microservice-python","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/yu-iskw%2Fmachine-learning-microservice-python","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/yu-iskw%2Fmachine-learning-microservice-python/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/yu-iskw%2Fmachine-learning-microservice-python/releases","manifests_url":"https://repos.ecosyste.ms/api
/v1/hosts/GitHub/repositories/yu-iskw%2Fmachine-learning-microservice-python/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/yu-iskw","download_url":"https://codeload.github.com/yu-iskw/machine-learning-microservice-python/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253329068,"owners_count":21891572,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["grpc","machine-learning","microservice","python"],"created_at":"2024-08-03T09:01:36.020Z","updated_at":"2025-05-09T21:36:15.207Z","avatar_url":"https://github.com/yu-iskw.png","language":"Python","readme":"# Machine learning as a microservice in python\n\nThis is an example of serving machine learning as a microservice in python.\nThe model predicts the iris species from given sepal length, sepal width, petal length and petal width.\n\n## Requirements\n\n- Docker\n- Anaconda\n- Make\n\n## Implement the files\n\n1. Train a model on the iris data with `./model/train.py`.\n  - As a result, it saves a model to predict iris species in `iris_model.pickle`.\n2. Define the protocol buffer in `iris.proto`.\n3. Implement a command to generate python files from `iris.proto` in `codegen.py`.\n  - `iris_pb2.py` and `iris_pb2_grpc.py` are generated.\n4. Implement `grpc_server.py`.\n  - It predicts the iris species from the given features.\n5. 
Implement `iris_client.py`.\n  - The file is just a client that requests an iris species prediction with fixed values for sepal length, sepal width, petal length and petal width.\n\n\n## How to set up an environment on our local machine\nThe following command creates an anaconda environment.\nWe can activate the environment with `source activate iris-predictor`, since the environment name is `iris-predictor`.\n```\n# Create an anaconda environment.\nconda env create -f environment.yml -n iris-predictor\n\n# Verify that the new environment was installed correctly; the active environment is marked with '*'.\nconda env list\n\n# Remove the anaconda environment.\nconda env remove -y -n iris-predictor\n```\n\n## How to run the server and the client on our local machine\nBefore running the predictor as a docker container, we can run the server and the client on our local machine.\n```\n# Run the server.\npython grpc_server.py\n\n# Run the client.\npython iris_client.py\n```\n\n## How to build and run a docker image\nWe put the python files and the saved model in the docker image, and the docker image is used to run `grpc_server.py`.\n\nThe host name depends on your environment.\nIf you use `docker-machine`, you can see the IP address with `docker-machine ip YOUR_DOCKER_MACHINE`.\n\nThe docker image exposes port `50052` for the gRPC server, and the gRPC server listens on `50052` as well.\nThat's why we pass `-p 50052:50052` to the `docker run` command.\n```\n# Build a docker image.\ndocker build . 
-t iris-predictor\n\n# Run a docker container.\ndocker run --rm -d -p 50052:50052 --name iris-predictor iris-predictor\n\n# Kill the docker container.\ndocker kill iris-predictor\n```\n\nThen check whether the client can reach the server running on docker:\n\n```\n# Execute it on your local machine, not in a docker container.\npython iris_client.py --host HOST_NAME --port 50052\nPredicted species number: 0\n```\n\n## Appendix: HTTP/REST API\nSometimes you need to offer both a gRPC API and a RESTful API.\nTo avoid duplicated work, we can define the HTTP/REST API as just a proxy to the gRPC API.\nProxying requests internally adds some overhead, but it also means we don't need to develop two different prediction functions.\n\nThe REST API proxy is `rest_proxy.py`.\nIt is simply implemented with [Flask\\-RESTful](https://flask-restful.readthedocs.io/en/latest/).\n\nThe definition to launch both the gRPC API and the RESTful API is in `docker-compose.yml`.\n\n![docker architecture](./docs/docker-architecture.png)\n\n```\n# Launch the gRPC server and the REST server on docker.\ndocker-compose up -d\n\n# Send a request to the REST API.\nDOCKER_HOST=\"...\"\ncurl http://${DOCKER_HOST}:5000/ -X POST \\\n  -d \"sepal_length=6.8\" \\\n  -d \"sepal_width=3.2\" \\\n  -d \"petal_length=5.9\" \\\n  -d \"petal_width=2.3\"\n\n{\"species\": \"2\"}\n```\n","funding_links":[],"categories":["Python","python"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fyu-iskw%2Fmachine-learning-microservice-python","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fyu-iskw%2Fmachine-learning-microservice-python","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fyu-iskw%2Fmachine-learning-microservice-python/lists"}