{"id":19947344,"url":"https://github.com/firstbatchxyz/dkn-compute-node","last_synced_at":"2026-04-01T22:23:12.055Z","repository":{"id":238808444,"uuid":"787406064","full_name":"firstbatchxyz/dkn-compute-node","owner":"firstbatchxyz","description":"Compute Node of Dria Knowledge Network.","archived":false,"fork":false,"pushed_at":"2026-03-19T10:36:46.000Z","size":1719,"stargazers_count":228,"open_issues_count":6,"forks_count":57,"subscribers_count":3,"default_branch":"master","last_synced_at":"2026-03-20T02:49:11.930Z","etag":null,"topics":["dria","firstbatch","gossip","libp2p","ollama","rust"],"latest_commit_sha":null,"homepage":"https://dria.co/edge-ai","language":"Rust","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/firstbatchxyz.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2024-04-16T13:13:17.000Z","updated_at":"2026-03-19T10:36:49.000Z","dependencies_parsed_at":"2024-09-09T10:33:18.551Z","dependency_job_id":"4d61affd-84b6-45cd-954a-20459de5639b","html_url":"https://github.com/firstbatchxyz/dkn-compute-node","commit_stats":null,"previous_names":["firstbatchxyz/dkn-compute-node"],"tags_count":82,"template":false,"template_full_name":null,"purl":"pkg:github/firstbatchxyz/dkn-compute-node","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/firstbatchxyz%2Fdkn-compute-node","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/firstbatchxyz%2Fdkn-compute-node/tags","releases_url":
"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/firstbatchxyz%2Fdkn-compute-node/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/firstbatchxyz%2Fdkn-compute-node/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/firstbatchxyz","download_url":"https://codeload.github.com/firstbatchxyz/dkn-compute-node/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/firstbatchxyz%2Fdkn-compute-node/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":31292639,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-04-01T21:15:39.731Z","status":"ssl_error","status_checked_at":"2026-04-01T21:15:34.046Z","response_time":53,"last_error":"SSL_read: unexpected eof while reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["dria","firstbatch","gossip","libp2p","ollama","rust"],"created_at":"2024-11-13T00:35:40.827Z","updated_at":"2026-04-01T22:23:12.043Z","avatar_url":"https://github.com/firstbatchxyz.png","language":"Rust","readme":"\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://raw.githubusercontent.com/firstbatchxyz/.github/refs/heads/master/branding/dria-logo-square.svg\" alt=\"logo\" width=\"168\"\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n  \u003ch1 align=\"center\"\u003e\n    Dria Compute Node\n  \u003c/h1\u003e\n  \u003cp align=\"center\"\u003e\n    \u003ci\u003eRun AI 
inference on the Dria network. Earn rewards by serving models from your machine.\u003c/i\u003e\n  \u003c/p\u003e\n\u003c/p\u003e\n\n\u003cp align=\"center\"\u003e\n    \u003ca href=\"https://opensource.org/license/apache-2-0\" target=\"_blank\"\u003e\n        \u003cimg alt=\"License: Apache-2.0\" src=\"https://img.shields.io/badge/license-Apache%202.0-7CB9E8.svg\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"./.github/workflows/tests.yml\" target=\"_blank\"\u003e\n        \u003cimg alt=\"Workflow: Tests\" src=\"https://github.com/firstbatchxyz/dkn-compute-node/actions/workflows/tests.yml/badge.svg?branch=master\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://github.com/firstbatchxyz/dkn-compute-node/releases\" target=\"_blank\"\u003e\n        \u003cimg alt=\"Downloads\" src=\"https://img.shields.io/github/downloads/firstbatchxyz/dkn-compute-node/total?logo=github\u0026logoColor=%23F2FFEE\u0026color=%2332C754\"\u003e\n    \u003c/a\u003e\n    \u003ca href=\"https://dria.co/discord\" target=\"_blank\"\u003e\n        \u003cimg alt=\"Discord\" src=\"https://dcbadge.vercel.app/api/server/dria?style=flat\"\u003e\n    \u003c/a\u003e\n\u003c/p\u003e\n\n## Quick Start\n\n### Install\n\nChoose one installation method:\n\n**Homebrew (macOS / Linux):**\n\n```sh\nbrew install firstbatchxyz/dkn/dria-node\ndria-node --version\n```\n\nHomebrew will add the tap automatically.\n\n**Shell script (macOS / Linux):**\n\n```sh\ncurl -fsSL https://raw.githubusercontent.com/firstbatchxyz/dkn-compute-node/master/install.sh | sh\ndria-node --version\n```\n\n**AMD ROCm (Linux x86_64):**\n\n```sh\ncurl -fsSL https://raw.githubusercontent.com/firstbatchxyz/dkn-compute-node/master/install-rocm.sh | bash\ndria-node --version\n```\n\nRequires ROCm 6.x to already be installed on your machine.\n\n**PowerShell (Windows):**\n\n```powershell\nirm https://raw.githubusercontent.com/firstbatchxyz/dkn-compute-node/master/install.ps1 | iex\ndria-node --version\n```\n\n**From GitHub 
Releases:**\n\nDownload the latest file for your platform from [Releases](https://github.com/firstbatchxyz/dkn-compute-node/releases), then run `dria-node --version` to verify it.\n\n### Setup\n\nRun the interactive setup:\n\n```sh\ndria-node setup\n```\n\nThis will:\n\n1. Detect your system RAM and list models that fit\n2. Let you pick a model from the available options\n3. Download the GGUF model file from HuggingFace\n4. Run a test inference to verify everything works\n5. Print a benchmark (tokens per second)\n\nUse `--gpu-layers -1` to offload all layers to GPU (Metal on macOS, CUDA on NVIDIA builds, ROCm on AMD Linux builds):\n\n```sh\ndria-node setup --gpu-layers -1\n```\n\n### Start\n\nOnce setup is complete, start the node:\n\n```sh\ndria-node start --wallet \u003cYOUR_SECRET_KEY\u003e --model \u003cMODEL_NAME\u003e\n```\n\nThe node will connect to the Dria network, register your models, and start serving inference requests. You can increase throughput with `--max-concurrent`:\n\n```sh\ndria-node start --wallet \u003cKEY\u003e --model lfm2.5:1.2b --max-concurrent 4\n```\n\n## Available Models\n\n| Model | Type | Quant | ~Size |\n|-------|------|-------|-------|\n| `lfm2.5:1.2b` | Text | Q4_K_M | 0.8 GB |\n| `lfm2.5-audio:1.5b` | Audio | Q4_0 | 1.0 GB |\n| `lfm2.5-vl:1.6b` | Vision | Q4_0 | 1.2 GB |\n| `nanbeige:3b` | Text | Q4_K_M | 2.0 GB |\n| `locooperator:4b` | Text | Q4_K_M | 2.5 GB |\n| `qwen3.5:9b` | Vision | Q4_K_M | 6.0 GB |\n| `lfm2:24b-a2b` | Text | Q4_K_M | 14 GB |\n| `qwen3.5:27b` | Vision | Q4_K_M | 16 GB |\n| `qwen3.5:35b-a3b` | Vision | Q4_K_M | 20 GB |\n\nServe multiple models by comma-separating them: `--model \"qwen3.5:9b,lfm2.5:1.2b\"`\n\nOverride quantization with `--quant Q8_0` (applies to all models).\n\n## CLI Reference\n\n```\ndria-node \u003cCOMMAND\u003e\n\nCommands:\n  setup    Interactive setup: pick a model, download it, and run a test\n  start    Start the compute node\n\nsetup options:\n  --data-dir \u003cPATH\u003e        
Data directory [env: DRIA_DATA_DIR]\n  --gpu-layers \u003cN\u003e         GPU layers to offload (0 = CPU only) [default: 0]\n\nstart options:\n  --wallet \u003cKEY\u003e           Wallet secret key, hex-encoded [env: DRIA_WALLET]\n  --model \u003cMODELS\u003e         Model(s) to serve, comma-separated [env: DRIA_MODELS]\n  --router-url \u003cURL\u003e       Router URL [default: quic.dria.co:4001] [env: DRIA_ROUTER_URL]\n  --gpu-layers \u003cN\u003e         GPU layers to offload (-1 = all, 0 = CPU) [default: 0]\n  --max-concurrent \u003cN\u003e     Max concurrent inference requests [default: 1]\n  --data-dir \u003cPATH\u003e        Data directory [env: DRIA_DATA_DIR]\n  --quant \u003cQUANT\u003e          Override GGUF quantization [env: DRIA_QUANT]\n  --insecure               Skip TLS verification [env: DRIA_INSECURE]\n```\n\nAll flags can also be set via environment variables.\n\n## Building from Source\n\n```sh\ngit clone https://github.com/firstbatchxyz/dkn-compute-node.git\ncd dkn-compute-node\ncargo build --release\n```\n\n**Feature flags:**\n\n- `--features metal` — Apple Metal GPU acceleration (macOS)\n- `--features cuda` — NVIDIA CUDA GPU acceleration\n- `--features rocm` — AMD ROCm GPU acceleration (Linux x86_64)\n\n### Testing\n\n```sh\ncargo test\n```\n\n### Linting\n\n```sh\ncargo clippy\ncargo fmt --check\n```\n\n## License\n\nThis project is licensed under the [Apache License 2.0](https://opensource.org/license/Apache-2.0).\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffirstbatchxyz%2Fdkn-compute-node","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Ffirstbatchxyz%2Fdkn-compute-node","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Ffirstbatchxyz%2Fdkn-compute-node/lists"}