{"id":15646765,"url":"https://github.com/alexellis/tensorflow-serving-openfaas","last_synced_at":"2025-04-30T12:23:22.169Z","repository":{"id":87754293,"uuid":"171439673","full_name":"alexellis/tensorflow-serving-openfaas","owner":"alexellis","description":"Example of using TensorFlow Serving with OpenFaaS","archived":false,"fork":false,"pushed_at":"2019-02-19T15:24:28.000Z","size":5,"stargazers_count":46,"open_issues_count":0,"forks_count":6,"subscribers_count":3,"default_branch":"master","last_synced_at":"2025-02-25T07:45:19.279Z","etag":null,"topics":["ai","docker","function","machine-learning","openfaas","serverless","tensorflow","tensorflow-serving","tf"],"latest_commit_sha":null,"homepage":null,"language":"Dockerfile","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/alexellis.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-02-19T08:55:09.000Z","updated_at":"2024-11-15T17:43:22.000Z","dependencies_parsed_at":null,"dependency_job_id":"0ae378d1-caf5-4654-a4f9-51e429ed242d","html_url":"https://github.com/alexellis/tensorflow-serving-openfaas","commit_stats":null,"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alexellis%2Ftensorflow-serving-openfaas","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alexellis%2Ftensorflow-serving-openfaas/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alexellis%2Ftensorflow-serving-openfaas/releases","manifests_url":"https://repos.eco
syste.ms/api/v1/hosts/GitHub/repositories/alexellis%2Ftensorflow-serving-openfaas/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/alexellis","download_url":"https://codeload.github.com/alexellis/tensorflow-serving-openfaas/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":242650897,"owners_count":20163610,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai","docker","function","machine-learning","openfaas","serverless","tensorflow","tensorflow-serving","tf"],"created_at":"2024-10-03T12:14:39.308Z","updated_at":"2025-03-09T05:30:46.576Z","avatar_url":"https://github.com/alexellis.png","language":"Dockerfile","readme":"# tensorflow-serving-openfaas\n\nExample of packaging TensorFlow Serving as an OpenFaaS function, so that it can be deployed and managed through OpenFaaS with auto-scaling, scale-from-zero, and a sane configuration for Kubernetes.\n\nThis example was adapted from: https://www.tensorflow.org/serving\n\n## Pre-reqs\n\n* [OpenFaaS](https://docs.openfaas.com/)\n* OpenFaaS CLI\n* Docker\n\n## Instructions\n\n* Clone the repo\n\n```sh\n$ mkdir -p ~/dev/\n$ cd ~/dev/\n$ git clone https://github.com/alexellis/tensorflow-serving-openfaas\n```\n\n* Clone the sample model and copy it to the function's build context\n\n```sh\n$ cd ~/dev/tensorflow-serving-openfaas\n\n$ git clone https://github.com/tensorflow/serving\n\n$ cp -r serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu ./ts-serve/saved_model_half_plus_two_cpu\n```\n\n* Edit the Docker Hub username\n\nYou need to 
edit the stack.yml file and replace `alexellis2` with your Docker Hub account.\n\n* Build the function image\n\n```sh\n$ faas-cli build\n```\n\nYou should now have a Docker image in your local library, which you can deploy to a cluster with `faas-cli up`.\n\n* Test the function locally\n\nAll OpenFaaS images can be run stand-alone without OpenFaaS installed. Let's do a quick test, but remember to replace `alexellis2` with your own Docker Hub username.\n\n```sh\n$ docker run -p 8081:8080 -ti alexellis2/ts-serve:latest\n```\n\nNow in another terminal:\n\n```sh\n$ curl -d '{\"instances\": [1.0, 2.0, 5.0]}' \\\n   -X POST http://127.0.0.1:8081/v1/models/half_plus_two:predict\n\n{\n    \"predictions\": [2.5, 3.0, 4.5\n    ]\n}\n```\n\nFrom here you can run `faas-cli up` and then invoke your function from the OpenFaaS UI, CLI, or REST API.\n\n```sh\n$ export OPENFAAS_URL=http://127.0.0.1:8080\n\n$ curl -d '{\"instances\": [1.0, 2.0, 5.0]}' $OPENFAAS_URL/function/ts-serve/v1/models/half_plus_two:predict\n\n{\n    \"predictions\": [2.5, 3.0, 4.5\n    ]\n}\n```\n\n\u003c!-- You can even run the inference asynchronously with a callback-URL or separate function set-up to receive the result. --\u003e\n\nFind out more information about your function's endpoints:\n\n```sh\n$ faas-cli describe ts-serve\nName:                ts-serve\nStatus:              Ready\nReplicas:            1\nAvailable replicas:  1\nInvocations:         5\nImage:               alexellis2/ts-serve:latest\nFunction process:    \nURL:                 http://127.0.0.1:8080/function/ts-serve\nAsync URL:           http://127.0.0.1:8080/async-function/ts-serve\nLabels:              com.openfaas.function : ts-serve\n                     function : true\n```\n\n\u003c!-- \nCreate a request bin: https://requestbin.fullcontact.com/ i.e. 
`http://requestbin.fullcontact.com/tgjgrrtg` then run:\n\n```\n$ export OPENFAAS_URL=http://127.0.0.1:8080\n$ export CALLBACK_URL=http://requestbin.fullcontact.com/tgjgrrtg\n\n$ curl -H \"X-Callback-Url: $CALLBACK_URL\" -d '{\"instances\": [1.0, 2.0, 5.0]}' $OPENFAAS_URL/async-function/ts-serve/v1/models/half_plus_two:predict\n``` --\u003e\n\n* Try your own model\n\nNow you can try your own model by editing ts-serve/Dockerfile.\n\nLet me know what you think via [Twitter](https://twitter.com/alexellisuk)\n\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falexellis%2Ftensorflow-serving-openfaas","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Falexellis%2Ftensorflow-serving-openfaas","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falexellis%2Ftensorflow-serving-openfaas/lists"}