# Step-by-step: run local models with GGML (~5 min + download time for model weights)

### Setup Python environment

1. Clone this repository: `git clone https://github.com/continuedev/ggml-server-example`
2. Move into the folder: `cd ggml-server-example`
3. Create a virtual environment: `python3 -m venv env`
4. Activate the virtual environment: `source env/bin/activate` on macOS/Linux, `env\Scripts\activate.bat` on Windows, or `source env/bin/activate.fish` if you use the fish shell
5. Install the required packages: `pip install -r requirements.txt`

### Download a model

6. Download a model to the `models/` folder
   - A convenient source of downloadable models is https://huggingface.co/TheBloke
   - For example, download the 4-bit quantized WizardLM-7B (the model we recommend) from https://huggingface.co/TheBloke/wizardLM-7B-GGML/blob/main/wizardLM-7B.ggmlv3.q4_0.bin

### Serve the model

7. Run the server: `python3 -m llama_cpp.server --model models/wizardLM-7B.ggmlv3.q4_0.bin`

### Use with Continue

8. To set this as your default model in Continue, open `~/.continue/config.json` either manually or with the `/config` slash command in Continue. Then import the `GGML` class (`from continuedev.src.continuedev.libs.llm.ggml import GGML`), set `"default_model": "default=GGML(max_context_length=2048)"`, reload your VS Code window, and you're good to go!

---

## Any questions?

Happy to help. Email us at hi@continue.dev.
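The server started in step 7 speaks an OpenAI-compatible HTTP API (by default on `localhost:8000`), so any HTTP client can query it. As a minimal sketch of what a request and response look like — the endpoint path and field names follow the OpenAI completions API, and the sample response below is illustrative, not real server output:

```python
import json

# Request body for llama_cpp.server's OpenAI-compatible /v1/completions
# endpoint (default address http://localhost:8000). With the server
# running, you would POST this as JSON, e.g. requests.post(url, json=payload).
payload = {
    "prompt": "Q: What is the capital of France? A:",
    "max_tokens": 32,     # cap on the number of generated tokens
    "temperature": 0.2,   # low temperature -> more deterministic output
}

# Illustrative response shape (example values, not captured server output):
sample_response = json.loads(
    '{"choices": [{"text": " Paris", "finish_reason": "stop"}]}'
)
completion = sample_response["choices"][0]["text"]
print(completion.strip())
```

With the server actually running, sending `payload` to `http://localhost:8000/v1/completions` should return a JSON document of this general shape, with the generated text in `choices[0].text`.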