# Brev Setup Scripts

Simple, practical setup scripts for common developer environments.

**What Brev already provides:** NVIDIA drivers, CUDA toolkit, Docker, NVIDIA Container Toolkit

## Available Scripts

### 🐍 Python Development
```bash
cd python-dev && bash setup.sh
```
**Installs:** pyenv, Python 3.11, Jupyter Lab, common packages (pandas, numpy, matplotlib)  
**Time:** ~3-5 minutes

### 📦 Node.js Development
```bash
cd nodejs-dev && bash setup.sh
```
**Installs:** nvm, Node LTS, pnpm, TypeScript, ESLint, Prettier  
**Time:** ~2-3 minutes

### 💻 Terminal Setup
```bash
cd terminal-setup && bash setup.sh
```
**Installs:** zsh, oh-my-zsh, fzf, ripgrep, bat, eza (modern CLI tools)  
**Time:** ~2-3 minutes  
**Note:** Automatically switches to zsh when complete

### ☸️ Local Kubernetes
```bash
cd k8s-local && bash setup.sh
```
**Installs:** microk8s, kubectl, helm, k9s, GPU operator  
**Time:** ~3-5 minutes  
**Note:** All tools work immediately, no group membership or logout needed

### 🤖 ML Quickstart
```bash
cd ml-quickstart && bash setup.sh
```
**Installs:** Miniconda, PyTorch with CUDA, Jupyter Lab, transformers  
**Time:** ~5-8 minutes (PyTorch is large)

### ⚡ RAPIDS
```bash
cd rapids && bash setup.sh
```
**Installs:** GPU-accelerated pandas (cuDF), scikit-learn (cuML), example notebooks  
**Time:** ~8-12 minutes  
**Note:** 10-50x faster data processing on GPU. Requires an NVIDIA GPU

### 🦙 Ollama
```bash
cd ollama && bash setup.sh
```
**Installs:** Ollama with GPU support, llama3.2 model (pre-downloaded)  
**Time:** ~3-5 minutes  
**Port:** 11434/tcp for API access

### 🚀 Unsloth
```bash
cd unsloth && bash setup.sh
```
**Installs:** Unsloth for fast fine-tuning, PyTorch with CUDA, LoRA/QLoRA support  
**Time:** ~5-8 minutes  
**Note:** Requires an NVIDIA GPU

### 🔄 LiteLLM
```bash
cd litellm && bash setup.sh
```
**Installs:** Universal LLM proxy (use any LLM with the OpenAI API format)  
**Time:** ~1-2 minutes  
**Port:** 4000/tcp for API access

### 🔍 Qdrant
```bash
cd qdrant && bash setup.sh
```
**Installs:** Vector database for RAG and semantic search  
**Time:** ~1-2 minutes  
**Port:** 6333/tcp for API + dashboard

### 🎨 ComfyUI
```bash
cd comfyui && bash setup.sh
```
**Installs:** Node-based UI for Stable Diffusion, SD 1.5 model  
**Time:** ~5-10 minutes  
**Port:** 8188/tcp for web interface  
**Note:** Requires an NVIDIA GPU

### 🗄️ Databases
```bash
cd databases && bash setup.sh
```
**Installs:** PostgreSQL 16, Redis 7 (in Docker containers)  
**Time:** ~1-2 minutes

### 📓 Marimo
```bash
cd marimo && bash setup.sh
```
**Installs:** Marimo reactive notebooks as a systemd service  
**Time:** ~2-3 minutes  
**Port:** 8080/tcp for web access

### 🛡️ earlyoom
```bash
cd earlyoom && bash setup.sh
```
**Installs:** Early OOM daemon to prevent system freezes  
**Time:** ~1-2 minutes  
**Note:** Monitors memory/swap and kills processes before an out-of-memory condition hangs the system

## Quick Start

**Pick what you need:**

```bash
# Python ML developer
cd ml-quickstart && bash setup.sh

# Web developer
cd nodejs-dev && bash setup.sh
cd databases && bash setup.sh

# Terminal power user
cd terminal-setup && bash setup.sh

# Kubernetes developer
cd k8s-local && bash setup.sh
```

## Design Philosophy

Each script is:
- ✅ **Simple** - One purpose, no complexity
- ✅ **Short** - Under 150 lines each
- ✅ **Fast** - Takes 2-8 minutes
- ✅ **Standalone** - No dependencies between scripts
- ✅ **Practical** - Installs what developers actually use

We don't:
- ❌ Install what Brev already provides (NVIDIA drivers, CUDA, Docker)
- ❌ Add complex GPU detection logic
- ❌ Support multi-node/HPC scenarios
- ❌ Over-engineer solutions

## Examples

**Python data science:**
```bash
cd python-dev && bash setup.sh
# Then:
ipython
jupyter lab --ip=0.0.0.0
```

**Machine learning with GPU:**
```bash
cd ml-quickstart && bash setup.sh
# Then:
conda activate ml
python gpu_check.py
```

**GPU-accelerated data science with RAPIDS:**
```bash
cd rapids && bash setup.sh
# Then:
conda activate rapids
python ~/rapids-examples/benchmark.py  # See 20x+ speedup!
```

**Local LLM with Ollama:**
```bash
cd ollama && bash setup.sh
# Then:
ollama run llama3.2
ollama list
```

**Fast LLM fine-tuning with Unsloth:**
```bash
cd unsloth && bash setup.sh
# Then:
conda activate unsloth
python ~/unsloth-examples/test_install.py
```

**Universal LLM proxy with LiteLLM:**
```bash
cd litellm && bash setup.sh
# Then use any LLM with the OpenAI SDK:
# openai.api_base = "http://localhost:4000"
```

**Vector database with Qdrant:**
```bash
cd qdrant && bash setup.sh
# Then:
pip install qdrant-client
python ~/qdrant_example.py
```

**Image generation with ComfyUI:**
```bash
cd comfyui && bash setup.sh
# Then open: http://localhost:8188
```

**Modern terminal:**
```bash
cd terminal-setup && bash setup.sh
# Automatically drops you into zsh, then:
ll    # Better ls
cat file.txt  # Syntax highlighting
fzf   # Fuzzy finder
```

**Local database:**
```bash
cd databases && bash setup.sh
# Then:
docker exec -it postgres psql -U postgres
docker exec -it redis redis-cli
```

**OOM protection with earlyoom:**
```bash
cd earlyoom && bash setup.sh
# Then:
sudo systemctl status earlyoom
sudo journalctl -u earlyoom -f  # Watch memory monitoring
```

## File Structure

```
brev-setup-scripts/
├── README.md                    # This file
├── python-dev/
│   ├── setup.sh                 # Python development environment
│   └── README.md
├── nodejs-dev/
│   ├── setup.sh                 # Node.js development environment
│   └── README.md
├── terminal-setup/
│   ├── setup.sh                 # Modern terminal with zsh
│   └── README.md
├── k8s-local/
│   ├── setup.sh                 # Local Kubernetes
│   └── README.md
├── ml-quickstart/
│   ├── setup.sh                 # PyTorch ML environment
│   └── README.md
├── ollama/
│   ├── setup.sh                 # Ollama LLM inference
│   └── README.md
├── unsloth/
│   ├── setup.sh                 # Unsloth fast fine-tuning
│   └── README.md
├── litellm/
│   ├── setup.sh                 # Universal LLM proxy
│   └── README.md
├── qdrant/
│   ├── setup.sh                 # Vector database
│   └── README.md
├── comfyui/
│   ├── setup.sh                 # ComfyUI for Stable Diffusion
│   └── README.md
├── databases/
│   ├── setup.sh                 # PostgreSQL + Redis
│   └── README.md
├── marimo/
│   ├── setup.sh                 # Marimo reactive notebooks
│   └── README.md
├── earlyoom/
│   ├── setup.sh                 # Early OOM daemon
│   └── README.md
└── rapids/
    ├── setup.sh                 # RAPIDS GPU-accelerated data science
    └── README.md
```

## Contributing

Want to add a script? Keep it simple:

1. **One purpose** - Install one thing well
2. **Short** - Under 150 lines
3. **Fast** - Completes in < 10 minutes
4. **Verify** - Include a verification step
5. **Document** - Show quick start commands

## License

Apache 2.0
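The Qdrant section above describes a vector database that ranks stored embeddings by similarity for RAG and semantic search. As a dependency-free sketch of the cosine-similarity scoring such a database performs (the document names and vectors below are illustrative, not taken from `~/qdrant_example.py`):

```python
import math

def cosine(a, b):
    """Cosine similarity: dot product of a and b divided by the product of their norms."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "collection": document name -> embedding vector (made-up values).
docs = {
    "gpu-setup-notes": [0.9, 0.1, 0.0],
    "pasta-recipe":    [0.0, 0.2, 0.9],
}
query = [1.0, 0.0, 0.1]  # embedding of a GPU-related query

# Rank documents by similarity to the query, best match first.
ranked = sorted(docs, key=lambda name: cosine(docs[name], query), reverse=True)
print(ranked[0])  # the GPU document ranks first
```

A real deployment would store the embeddings in Qdrant and call its search API instead; this only illustrates the scoring that drives the ranking.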