# FastChat on Mac (Apple Silicon)

https://github.com/lm-sys/FastChat on Mac

## How to run an OpenAI-compatible FastChat server on Mac (Apple Silicon) with its Metal GPU

Download the `docker-compose.yaml` and run:

```
wget https://github.com/making/fastchat-on-mac/raw/main/docker-compose.yaml
docker-compose up
```

This launches the controller, the Gradio web UI (optional), and the OpenAI-compatible API server; these components do not use the GPU, so they are easy to run with Docker.
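The compose file wires these three services together. The exact definitions live in the repository's `docker-compose.yaml`; a minimal sketch of the likely shape, assuming the `ghcr.io/making/fastchat` image from the build section below and the standard FastChat entry points (service names and flags here are illustrative, not copied from the repo):

```
services:
  controller:
    image: ghcr.io/making/fastchat
    command: python3 -m fastchat.serve.controller --host 0.0.0.0
    ports:
      - "21001:21001"
  gradio:
    image: ghcr.io/making/fastchat
    command: python3 -m fastchat.serve.gradio_web_server --host 0.0.0.0 --controller-url http://controller:21001
    ports:
      - "7860:7860"
  openai-api:
    image: ghcr.io/making/fastchat
    command: python3 -m fastchat.serve.openai_api_server --host 0.0.0.0 --controller-address http://controller:21001
    ports:
      - "8000:8000"
```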
Next, start the model worker outside of Docker and pass the `--device mps` option to take advantage of the Metal GPU:

```
pip3 install "fschat[model_worker,webui]"
```

Then:

```
python3 -m fastchat.serve.model_worker --port 21002 --model-path lmsys/vicuna-7b-v1.3 --device mps --load-8bit --controller-address http://localhost:21001 --worker-address http://host.docker.internal:21002
```

* Web UI: http://localhost:7860
* OpenAI API: http://localhost:8000

## Access the OpenAI API

https://github.com/lm-sys/FastChat/blob/main/docs/openai_api.md

```
$ curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "vicuna-7b-v1.3",
    "messages": [{"role": "user", "content": "Tell me a joke"}]
  }'

{
  "id": "chatcmpl-fiy5p8FeTLjUNdfL2yS28q",
  "object": "chat.completion",
  "created": 1695969537,
  "model": "vicuna-7b-v1.3",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "Sure, here's a joke for you:\n\nWhy don't scientists trust atoms?\n\nBecause they make up everything!"
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 44,
    "total_tokens": 75,
    "completion_tokens": 31
  }
}
```

## Run the Spring AI OpenAI demo

```
git clone https://github.com/rd-1-2022/ai-openai-helloworld.git
cd ai-openai-helloworld
./mvnw clean package -DskipTests

java -jar target/ai-openai-helloworld-0.0.1-SNAPSHOT.jar --spring.ai.openai.base-url=http://localhost:8000 --spring.ai.openai.api-key=dummy --spring.ai.openai.model=vicuna-7b-v1.3
```

```
$ curl -s --get --data-urlencode 'message=Tell me a joke about a cow.' http://localhost:8080/ai/simple | jq .
{
  "completion": "Why couldn't the cow play in the band?\n\nBecause it was too moo-sical!"
}
```

## How to build the docker image

```
pack build ghcr.io/making/fastchat --builder paketobuildpacks/builder:base --path image --publish
```
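Beyond curl, the OpenAI-compatible responses above can be consumed from any language. A minimal Python sketch that pulls the assistant's reply out of the `chat.completion` shape shown earlier (the sample payload is abridged from the curl output above; no running server is needed to try the parsing itself):

```python
import json

# Abridged chat.completion response, shaped like the curl output above.
raw = """
{
  "id": "chatcmpl-fiy5p8FeTLjUNdfL2yS28q",
  "object": "chat.completion",
  "model": "vicuna-7b-v1.3",
  "choices": [
    {
      "index": 0,
      "message": {"role": "assistant", "content": "Why don't scientists trust atoms? Because they make up everything!"},
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": 44, "total_tokens": 75, "completion_tokens": 31}
}
"""

def extract_reply(response_json: str) -> str:
    """Return the assistant message from a chat.completion payload."""
    response = json.loads(response_json)
    # The reply lives at choices[0].message.content, per the OpenAI schema.
    return response["choices"][0]["message"]["content"]

print(extract_reply(raw))
```

Because the server speaks the standard OpenAI wire format, the same parsing works unchanged against real responses from `http://localhost:8000/v1/chat/completions`.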