{"id":27786177,"url":"https://github.com/angel-ml/serving","last_synced_at":"2025-04-30T15:57:42.625Z","repository":{"id":35526961,"uuid":"150675191","full_name":"Angel-ML/serving","owner":"Angel-ML","description":"A stand alone industrial serving system for angel. ","archived":false,"fork":false,"pushed_at":"2022-04-12T21:55:41.000Z","size":3971,"stargazers_count":62,"open_issues_count":6,"forks_count":34,"subscribers_count":9,"default_branch":"master","last_synced_at":"2025-04-30T15:57:41.502Z","etag":null,"topics":["machine-learning","serving","serving-recommendation"],"latest_commit_sha":null,"homepage":null,"language":"Java","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Angel-ML.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2018-09-28T02:33:44.000Z","updated_at":"2024-06-17T04:19:49.000Z","dependencies_parsed_at":"2022-08-28T15:02:07.700Z","dependency_job_id":null,"html_url":"https://github.com/Angel-ML/serving","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Angel-ML%2Fserving","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Angel-ML%2Fserving/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Angel-ML%2Fserving/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Angel-ML%2Fserving/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Angel-ML","download_url":"https://codeload.github.com/Angel-ML/serving/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com
","kind":"github","repositories_count":251737471,"owners_count":21635604,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["machine-learning","serving","serving-recommendation"],"created_at":"2025-04-30T15:57:41.395Z","updated_at":"2025-04-30T15:57:42.611Z","avatar_url":"https://github.com/Angel-ML.png","language":"Java","readme":"## Angel Serving\n**Angel Serving** is a standalone industrial serving system for machine/deep learning models; it is designed\nto be flexible and high-performance.\n\n### Architecture\n\n----\n\n![Angel Serving Architecture][1]\n\n### Features\n- One can access Angel Serving through gRPC and a RESTful API.\n\n- Angel Serving is a general machine learning serving framework, which means models from other training platforms can be served on Angel Serving.\nThere is a pluggable mechanism for third-party platforms to join in; currently we support Angel, PyTorch and the PMML format. 
Through the PMML format, Angel can serve Spark and XGBoost models.\n\n- Similar to TensorFlow Serving, we provide fine-grained version control: earliest, latest and specified versions.\n\n- Apart from version control, Angel Serving also provides fine-grained service monitoring:\n  - QPS: queries per second\n  - Success/total requests\n  - Response time distribution\n  - Average response time\n\n### Serve a model in 60 seconds\n\n```bash\n# Download the Angel Serving Docker image and repo\ndocker pull tencentangel/serving\n\ngit clone https://github.com/Angel-ML/serving.git\n# Location of demo models\nTESTDATA=\"$(pwd)/serving/models/angel/lr/lr-model\"\n\n# Start Angel Serving container and open the REST API port\ndocker run -t --rm -p 8501:8501 \\\n    -v \"$TESTDATA:/models\" \\\n    -e MODEL_NAME=lr \\\n    -e MODEL_PLATFORM=angel \\\n    tencentangel/serving \u0026\n\n# Query the model using the predict API\ncurl -H \"Content-Type: application/json\" -X POST -d '{\"instances\": [[0.51483303, 0.99900955, 0.9477888, 0.6912188, 0.41446745, 0.2525878, 0.6014038, 0.46847868, 0.12854028, 0.8306037, 0.3461753, 0.1129151, 0.6229094, 0.90299904, 0.50834644, 0.34843314, 0.95900637, 0.9437762, 0.31707388, 0.73501045, 0.05600065, 0.47225082, 0.28908283, 0.7371853, 0.55928135, 0.81367457, 0.91782594, 0.008230567, 0.9317811, 0.0061050057, 0.7060979, 0.51740277, 0.07297987, 0.34826308, 0.43395072, 0.5017575, 0.73248106, 0.7576818, 0.43087876, 0.9380423, 0.5226082, 0.9813176, 0.20717019, 0.42229313, 0.8274106, 0.6791944, 0.48174334, 0.77374876, 0.56179315, 0.6584269, 0.7635249, 0.9949779, 0.84034514, 0.7586089, 0.74443096, 0.21172583, 0.7850719, 0.5341459, 0.84134424, 0.06459451, 0.1270392, 0.41439575, 0.98234355, 0.5515572, 0.9594097, 0.18379861, 0.8221523, 0.23739898, 0.07713032, 0.66251403, 0.84977543, 0.905998, 0.21836805, 0.40002906, 0.6271626, 0.37708586, 0.20958215, 0.051997364, 0.6841619, 0.22454417, 0.34285623, 0.19205093, 0.35783356, 0.29280972, 0.19194472, 0.42898583, 
0.27232456, 0.12662607, 0.74165606, 0.43464816, 0.8310301, 0.012846947, 0.9810947, 0.43377626, 0.3608846, 0.22756284, 0.6404164, 0.7243295, 0.68765146, 0.12439847, 0.25675082, 0.26143825, 0.41246158, 0.867953, 0.2895738, 0.3916427, 0.93816304, 0.27819514, 0.1989426, 0.62377095, 0.9969712, 0.4159639, 0.70966166, 0.29150474, 0.6492832, 0.10598481, 0.44674253, 0.03885162, 0.25127923, 0.60202503, 0.6067293, 0.94750637, 0.97315085]]}' \\\n  localhost:8501/v1/models/lr/versions/6:predict\n\n# Returns =\u003e {\"predictions\": [{\"trueLabel\":\"0.0\",\"proba\":\"0.024818711775534966\",\"pred\":\"-3.671025514602661\",\"attached\":\"NaN\",\"predLabel\":\"-1.0\",\"sid\":\"0\"}]}\n```\n\n### Setup\n1. **Compile Environment Requirements**\n   - jdk \u003e= 1.8\n   - maven \u003e= 3.0.5\n   - protobuf \u003e= 3.5.1\n\n2. **Source Code Download**\n\n   ```bash\n   git clone https://github.com/Angel-ML/serving.git\n   ```\n\n3. **Compile**\n\n   Run the following command in the root directory of the source code:\n   ```bash\n   mvn clean package -Dmaven.test.skip=true\n   ```\n   After compiling, a distribution package named `serving-0.1.0-bin.zip` will be generated under dist/target in the root directory.\n\n4. **Distribution Package**\n   After unpacking the distribution package, the following subdirectories will be created under its root directory:\n   - bin: contains Angel Serving start scripts.\n   - conf: contains system config files.\n   - lib: contains jars for Angel Serving and dependencies.\n   - models: contains trained example models.\n   - docs: contains user manual and restful api documentation.\n\n### Deployment Guide\n1. **Execution Environment Requirements**\n   - jdk \u003e= 1.8\n   - set JAVA_HOME\n\n2. 
**Start Server**\n\n   Run the `serving-submit` script with arguments to start Angel Serving, for example:\n   ```bash\n   $SERVING_HOME/bin/serving-submit \\\n      --port 8500 \\\n      --rest_api_port 8501 \\\n      --model_base_path $SERVING_HOME/models/angel/lr/lr-model/ \\\n      --model_name lr \\\n      --model_platform angel \\\n      --enable_metric_summary true\n   ```\n\n### Documentation\n\n* [User Manual](./docs/serving_doc.md)\n* [RESTful API](./docs/restful-api.md)\n\n[1]: ./docs/img/AngelServing_framework.png\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fangel-ml%2Fserving","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fangel-ml%2Fserving","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fangel-ml%2Fserving/lists"}