{"id":21296107,"url":"https://github.com/beam-cloud/beta9","last_synced_at":"2026-03-11T21:23:55.484Z","repository":{"id":239640744,"uuid":"718873539","full_name":"beam-cloud/beta9","owner":"beam-cloud","description":"Run serverless GPU workloads with fast cold starts on bare-metal servers, anywhere in the world","archived":false,"fork":false,"pushed_at":"2025-04-14T18:06:30.000Z","size":11677,"stargazers_count":751,"open_issues_count":12,"forks_count":43,"subscribers_count":4,"default_branch":"main","last_synced_at":"2025-04-14T18:16:42.379Z","etag":null,"topics":["autoscaler","cloudrun","cuda","developer-productivity","distributed-computing","faas","fine-tuning","functions-as-a-service","generative-ai","gpu","large-language-models","llm","llm-inference","ml-platform","paas","self-hosted","serverless","serverless-containers"],"latest_commit_sha":null,"homepage":"https://docs.beam.cloud","language":"Go","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"agpl-3.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/beam-cloud.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2023-11-15T00:53:21.000Z","updated_at":"2025-04-14T18:05:52.000Z","dependencies_parsed_at":"2024-12-23T15:29:15.171Z","dependency_job_id":"bc182385-811d-4f95-b33b-3e8bca743e20","html_url":"https://github.com/beam-cloud/beta9","commit_stats":null,"previous_names":["beam-cloud/beta9","beam-cloud/beam"],"tags_count":790,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/beam-cloud%2Fbeta9","tags_url":"https://repos.ecosyste.ms/ap
i/v1/hosts/GitHub/repositories/beam-cloud%2Fbeta9/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/beam-cloud%2Fbeta9/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/beam-cloud%2Fbeta9/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/beam-cloud","download_url":"https://codeload.github.com/beam-cloud/beta9/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248933345,"owners_count":21185460,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["autoscaler","cloudrun","cuda","developer-productivity","distributed-computing","faas","fine-tuning","functions-as-a-service","generative-ai","gpu","large-language-models","llm","llm-inference","ml-platform","paas","self-hosted","serverless","serverless-containers"],"created_at":"2024-11-21T14:19:48.881Z","updated_at":"2026-03-11T21:23:55.478Z","avatar_url":"https://github.com/beam-cloud.png","language":"Go","funding_links":[],"categories":["Inference","[Beam Cloud](https://beam.cloud)","Go"],"sub_categories":["Inference Platform"],"readme":"\u003cdiv align=\"center\"\u003e\n\u003cp align=\"center\"\u003e\n\u003cimg alt=\"Logo\" src=\"static/beam-logo-white.png#gh-dark-mode-only\" width=\"30%\"\u003e\n\u003cimg alt=\"Logo\" src=\"static/beam-logo-dark.png#gh-light-mode-only\" width=\"30%\"\u003e\n\u003c/p\u003e\n\n## Run AI Workloads at Scale\n\n\u003cp align=\"center\"\u003e\n    \u003ca 
href=\"https://colab.research.google.com/drive/1jSDyYY7FY3Y3jJlCzkmHlH8vTyF-TEmB?usp=sharing\"\u003e\n    \u003cimg alt=\"Colab\" src=\"https://colab.research.google.com/assets/colab-badge.svg\"\u003e\n  \u003c/a\u003e\n  \u003ca href=\"https://github.com/beam-cloud/beta9/stargazers\"\u003e\n    \u003cimg alt=\"⭐ Star the Repo\" src=\"https://img.shields.io/github/stars/beam-cloud/beta9\"\u003e\n  \u003c/a\u003e\n  \u003ca href=\"https://docs.beam.cloud\"\u003e\n    \u003cimg alt=\"Documentation\" src=\"https://img.shields.io/badge/docs-quickstart-purple\"\u003e\n  \u003c/a\u003e\n  \u003ca href=\"https://join.slack.com/t/beam-cloud/shared_invite/zt-39hbkt8ty-CTVv4NsgLoYArjWaVkwcFw\"\u003e\n    \u003cimg alt=\"Join Slack\" src=\"https://img.shields.io/badge/Beam-Join%20Slack-orange?logo=slack\"\u003e\n  \u003c/a\u003e\n    \u003ca href=\"https://twitter.com/beam_cloud\"\u003e\n    \u003cimg alt=\"Twitter\" src=\"https://img.shields.io/twitter/follow/beam_cloud.svg?style=social\u0026logo=twitter\"\u003e\n  \u003c/a\u003e\n    \u003ca href=\"https://github.com/beam-cloud/beta9?tab=AGPL-3.0-1-ov-file\"\u003e\n    \u003cimg alt=\"AGPL\" src=\"https://img.shields.io/badge/License-AGPL-green\"\u003e\n  \u003c/a\u003e\n\u003c/p\u003e\n\n\u003c/div\u003e\n\n**[Beam](https://beam.cloud?utm_source=github_readme)** is a fast, open-source runtime for serverless AI workloads. 
It gives you a Pythonic interface to deploy and scale AI applications with zero infrastructure overhead.\n\n![Watch the demo](static/readme.gif)\n\n## ✨ Features\n\n- **Fast Image Builds**: Launch containers in under a second using a custom container runtime\n- **Parallelization and Concurrency**: Fan out workloads to 100s of containers\n- **First-Class Developer Experience**: Hot-reloading, webhooks, and scheduled jobs\n- **Scale-to-Zero**: Workloads are serverless by default\n- **Volume Storage**: Mount distributed storage volumes\n- **GPU Support**: Run on our cloud (4090s, H100s, and more) or bring your own GPUs\n\n## 📦 Installation\n\n```shell\npip install beam-client\n```\n\n## ⚡️ Quickstart\n\n1. Create an account [here](https://beam.cloud?utm_source=github_readme)\n2. Follow our [Getting Started Guide](https://platform.beam.cloud/onboarding?utm_source=github_readme)\n\n## Creating a sandbox\n\nSpin up isolated containers to run LLM-generated code:\n\n```python\nfrom beam import Image, Sandbox\n\n\nsandbox = Sandbox(image=Image()).create()\nresponse = sandbox.process.run_code(\"print('I am running remotely')\")\n\nprint(response.result)\n```\n\n## Deploy a serverless inference endpoint\n\nCreate an autoscaling endpoint for your custom model:\n\n```python\nfrom beam import Image, endpoint\nfrom beam import QueueDepthAutoscaler\n\n@endpoint(\n    image=Image(python_version=\"python3.11\"),\n    gpu=\"A10G\",\n    cpu=2,\n    memory=\"16Gi\",\n    autoscaler=QueueDepthAutoscaler(max_containers=5, tasks_per_container=30)\n)\ndef handler():\n    return {\"label\": \"cat\", \"confidence\": 0.97}\n```\n\n## Run background tasks\n\nSchedule resilient background tasks (or replace your Celery queue) by adding a simple decorator:\n\n```python\nfrom beam import Image, TaskPolicy, schema, task_queue\n\n\nclass Input(schema.Schema):\n    image_url = schema.String()\n\n\n@task_queue(\n    name=\"image-processor\",\n    image=Image(python_version=\"python3.11\"),\n    
cpu=1,\n    memory=1024,\n    inputs=Input,\n    task_policy=TaskPolicy(max_retries=3),\n)\ndef my_background_task(input: Input, *, context):\n    image_url = input.image_url\n    print(f\"Processing image: {image_url}\")\n    return {\"image_url\": image_url}\n\n\nif __name__ == \"__main__\":\n    # Invoke a background task from your app (without deploying it)\n    my_background_task.put(image_url=\"https://example.com/image.jpg\")\n\n    # You can also deploy this behind a versioned endpoint with:\n    # beam deploy app.py:my_background_task --name image-processor\n```\n\n\u003e ## Self-Hosting vs Cloud\n\u003e\n\u003e Beta9 is the open-source engine powering [Beam](https://beam.cloud), our fully-managed cloud platform. You can self-host Beta9 for free or choose managed cloud hosting through Beam.\n\n## 👋 Contributing\n\nWe welcome contributions big or small. These are the most helpful things for us:\n\n- Submit a [feature request](https://github.com/beam-cloud/beta9/issues/new?assignees=\u0026labels=\u0026projects=\u0026template=feature-request.md\u0026title=) or [bug report](https://github.com/beam-cloud/beta9/issues/new?assignees=\u0026labels=\u0026projects=\u0026template=bug-report.md\u0026title=)\n- Open a PR with a new feature or improvement\n\n## ❤️ Thanks to Our Contributors\n\n\u003ca href=\"https://github.com/beam-cloud/beta9/graphs/contributors\"\u003e\n  \u003cimg src=\"https://contrib.rocks/image?repo=beam-cloud/beta9\" /\u003e\n\u003c/a\u003e\n","project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbeam-cloud%2Fbeta9","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fbeam-cloud%2Fbeta9","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fbeam-cloud%2Fbeta9/lists"}