{"id":47620352,"url":"https://github.com/harshaneel/localaik","last_synced_at":"2026-04-04T03:01:09.836Z","repository":{"id":346964722,"uuid":"1192117157","full_name":"harshaneel/localaik","owner":"harshaneel","description":"localaik is a local compatibility server for a subset of the Gemini and OpenAI APIs. Run one container locally, point your SDK to it, and the proxy serves both protocol shapes on the same port for tests and development.","archived":false,"fork":false,"pushed_at":"2026-03-26T07:29:46.000Z","size":51,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2026-03-27T01:55:50.595Z","etag":null,"topics":[],"latest_commit_sha":null,"homepage":"","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/harshaneel.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2026-03-25T22:56:37.000Z","updated_at":"2026-03-26T07:29:50.000Z","dependencies_parsed_at":null,"dependency_job_id":null,"html_url":"https://github.com/harshaneel/localaik","commit_stats":null,"previous_names":["harshaneel/localaik"],"tags_count":2,"template":false,"template_full_name":null,"purl":"pkg:github/harshaneel/localaik","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/harshaneel%2Flocalaik","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/harshaneel%2Flocalaik/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repo
sitories/harshaneel%2Flocalaik/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/harshaneel%2Flocalaik/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/harshaneel","download_url":"https://codeload.github.com/harshaneel/localaik/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/harshaneel%2Flocalaik/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31385935,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-04T01:22:39.193Z","status":"online","status_checked_at":"2026-04-04T02:00:07.569Z","response_time":60,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":[],"created_at":"2026-04-01T22:00:32.901Z","updated_at":"2026-04-04T03:01:09.811Z","avatar_url":"https://github.com/harshaneel.png","language":"Go","readme":"# localaik\n\n[![CI](https://github.com/harshaneel/localaik/actions/workflows/release.yml/badge.svg)](https://github.com/harshaneel/localaik/actions/workflows/release.yml)\n[![Docker Hub](https://img.shields.io/docker/v/gokhalh/localaik?sort=semver\u0026label=Docker%20Hub)](https://hub.docker.com/r/gokhalh/localaik)\n[![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](LICENSE)\n[![Go Report Card](https://goreportcard.com/badge/github.com/harshaneel/localaik)](https://goreportcard.com/report/github.com/harshaneel/localaik)\n[![Go 
Version](https://img.shields.io/github/go-mod/go-version/harshaneel/localaik)](https://github.com/harshaneel/localaik/blob/main/go.mod)\n[![Go Reference](https://pkg.go.dev/badge/github.com/harshaneel/localaik.svg)](https://pkg.go.dev/github.com/harshaneel/localaik)\n\nA local compatibility server for the Gemini and OpenAI APIs. Run one container, point your SDK at `http://localhost:8090`, and get both protocol shapes on the same port for tests and development.\n\n## Motivation\n\nTesting code that calls Gemini or OpenAI is painful: real API calls are slow, cost money, and need network access. localaik gives you a single Docker container that speaks both protocols backed by a local model — no API key, no internet, deterministic enough for CI.\n\n## Architecture\n\n```\n┌────────────────────────────────────────────────────────┐\n│  localaik container                                    │\n│                                                        │\n│  ┌──────────────────────────┐    ┌──────────────────┐  │\n│  │  localaik proxy (:8090)  │    │ llama.cpp (:8080)│  │\n│  │                          │    │                  │  │\n│  │  /v1beta/* (Gemini)  ────┼──▶ │  Gemma 3 model   │  │\n│  │  /v1/*     (OpenAI)  ────┼──▶ │                  │  │\n│  │                          │    └──────────────────┘  │\n│  │                          │                          │\n│  │                          │    ┌──────────────────┐  │\n│  │  PDF uploads ────────────┼──▶ │    pdftoppm      │  │\n│  │                          │    │  PDF ─▶ images   │  │\n│  └──────────────────────────┘    └──────────────────┘  │\n└────────────────────────────────────────────────────────┘\n```\n\nSDK requests hit the localaik proxy, which translates Gemini or OpenAI wire format and forwards to the local llama.cpp server running a Gemma 3 model.\n\n## Quick start\n\n```bash\ndocker run -d -p 8090:8090 gokhalh/localaik\n```\n\nOr with Docker Compose:\n\n```yaml\nservices:\n  localaik:\n    image: 
gokhalh/localaik\n    ports:\n      - \"8090:8090\"\n```\n\nlocalaik is a plain HTTP server, so any language or SDK that can set a base URL will work.\n\n### Gemini SDK\n\n**Go:**\n```go\nclient, err := genai.NewClient(ctx, \u0026genai.ClientConfig{\n    APIKey:      \"test\",\n    HTTPOptions: genai.HTTPOptions{BaseURL: \"http://localhost:8090\"},\n})\n```\n\n**Python:**\n```python\nfrom google import genai\n\nclient = genai.Client(\n    api_key=\"test\",\n    http_options=genai.types.HttpOptions(api_version=\"v1beta\", base_url=\"http://localhost:8090\"),\n)\n```\n\nOr set the environment variable for any language:\n\n```bash\nexport GOOGLE_GEMINI_BASE_URL=http://localhost:8090\n```\n\n### OpenAI SDK\n\n**Python:**\n```python\nfrom openai import OpenAI\n\nclient = OpenAI(api_key=\"test\", base_url=\"http://localhost:8090/v1\")\n```\n\n**Go:**\n```go\nclient := openai.NewClient(\n    option.WithAPIKey(\"test\"),\n    option.WithBaseURL(\"http://localhost:8090/v1\"),\n)\n```\n\n## Docker tags\n\n\n| Tag                   | Model              | Image size |\n| --------------------- | ------------------ | ---------- |\n| `latest`, `gemma3-4b` | Gemma 3 4B Q4_K_M  | ~3 GB      |\n| `gemma3-12b`          | Gemma 3 12B Q4_K_M | ~7 GB      |\n\n\nVersion-pinned tags follow the pattern `v0.1.1-gemma3-4b`, `v0.1.1-gemma3-12b`.\n\n## Implemented routes\n\n\n| Route                                               | Used by                        | Notes                                   |\n| --------------------------------------------------- | ------------------------------ | --------------------------------------- |\n| `POST /v1beta/models/{model}:generateContent`       | Gemini `GenerateContent`       | Translated to upstream chat completions |\n| `POST /v1beta/models/{model}:streamGenerateContent` | Gemini `GenerateContentStream` | Gemini-style SSE (typically `?alt=sse`) |\n| `POST /v1/chat/completions`                         | OpenAI chat completions        | Forwarded 
to upstream                   |\n| `GET /health`                                       | Health checks                  | Custom route                            |\n\n\nAll other API routes return `404`.\n\n## Tested SDKs\n\nAutomated contract tests validate against:\n\n- `google.golang.org/genai` v1.51.0\n- `github.com/openai/openai-go/v3` v3.30.0\n\nOther SDK versions and languages may work if they emit the same HTTP shapes.\n\n## Use in CI\n\nRun localaik as a GitHub Actions service container so your tests hit a real local model instead of mocks:\n\n```yaml\njobs:\n  test:\n    runs-on: ubuntu-latest\n    services:\n      localaik:\n        image: gokhalh/localaik\n        ports:\n          - 8090:8090\n        options: \u003e-\n          --health-cmd \"curl -f http://localhost:8090/health\"\n          --health-interval 10s\n          --health-timeout 5s\n          --health-retries 30\n    steps:\n      - uses: actions/checkout@v4\n      - run: go test ./...\n        env:\n          GOOGLE_GEMINI_BASE_URL: http://localhost:8090\n          OPENAI_BASE_URL: http://localhost:8090/v1\n```\n\n## Gemini compatibility\n\n**Supported features:**\n\n- Text, image (`inlineData`), and PDF input (auto-converted to page images)\n- `fileData` for image URLs and local/`data:`-URI PDF/text files\n- `systemInstruction`\n- `generationConfig`: temperature, topP, topK, candidateCount, maxOutputTokens, stopSequences, responseLogprobs, logprobs, presencePenalty, frequencyPenalty, seed\n- Structured output via `responseMimeType`, `responseSchema`, `responseJsonSchema`\n- Function declarations via `tools`, function calling config via `toolConfig`\n- `functionCall` and `functionResponse` parts\n- Streaming SSE responses\n- Usage metadata and finish reasons\n\n**Partial support:**\n\n- Behavior of `top_k`, `n`, logprobs, and tool choice depends on the upstream runtime\n- `executableCode`, `codeExecutionResult`, `toolCall`, and `toolResponse` parts are preserved as text context\n\n**Not supported:**\n\n- SDK methods outside `GenerateContent` / `GenerateContentStream`\n- Non-function tools (Google Search, Maps, URL context, code execution)\n- Embeddings, token counting, cached content, live/bidi sessions, uploads\n\n## OpenAI compatibility\n\n**Supported:** text chat completions, structured output, vision inputs, tool-related fields (all passed through to upstream).\n\n**Not supported:** Responses API, Assistants, Embeddings, Images, Audio, Files, Vector stores.\n\n## Development\n\n\u003e **Tip:** Run `make docker-up` to build and start the localaik container, which includes a local llama.cpp server with a bundled model. This is the easiest way to get a working upstream for development.\n\n```bash\n# Run the proxy locally (requires a running llama.cpp server)\ngo run ./cmd/localaik --port 8090 --upstream http://127.0.0.1:8080/v1\n\n# Common commands\nmake help              # Show all targets\nmake lint              # Format check + go vet\nmake test-unit         # Unit tests\nmake test-integration  # Integration tests (requires docker-up)\nmake test              # All of the above\nmake docker-up         # Build and start container\nmake docker-down       # Stop container\n```\n\n### Building the image\n\n```bash\n# Default (Gemma 3 4B)\ndocker build -t gokhalh/localaik .\n\n# Custom model\ndocker build \\\n  --build-arg MODEL_URL=... \\\n  --build-arg MODEL_SHA256=... \\\n  --build-arg MMPROJ_URL=... \\\n  --build-arg MMPROJ_SHA256=... 
\\\n  -t gokhalh/localaik:custom .\n```\n\n## Limitations\n\n- Intended for tests and development, not production\n- Image size is dominated by model weights\n- Cold starts can take tens of seconds while the model loads\n- PDF rendering adds latency per page\n\n","funding_links":[],"categories":["Artificial Intelligence"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fharshaneel%2Flocalaik","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fharshaneel%2Flocalaik","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fharshaneel%2Flocalaik/lists"}