https://github.com/rob-bl8ke/llm-dev-container
A fully containerized, memory-efficient development environment for LLM and AI workflows using Conda, Mamba, and Jupyter Lab.
- Host: GitHub
- URL: https://github.com/rob-bl8ke/llm-dev-container
- Owner: rob-bl8ke
- Created: 2025-08-02T10:23:42.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2025-08-02T11:34:50.000Z (2 months ago)
- Last Synced: 2025-08-02T13:19:55.311Z (2 months ago)
- Topics: anaconda, docker, docker-compose, jupyterlab, llm, mamba
- Language: Dockerfile
- Size: 5.86 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# LLM Engineering (Dockerized Dev Environment)
This project sets up a **fully containerized** environment for developing with LLMs, AI libraries, and Jupyter Lab, with **no local Conda or Python installation** required.

Be sure to review the possible gotchas below before running this for the first time.
## What's Included
- **Miniconda** + Mamba for Python environment management
- **Tiered install**: splits dependencies to avoid memory issues
- **Jupyter Lab** as the main UI
- Preinstalled libraries for:
  - Data science: pandas, numpy, matplotlib, scikit-learn
  - AI/LLM: torch, transformers, langchain, sentence-transformers
  - Tools: dotenv, openai, pydub, psutil, twilio

## Getting Started
### 1. Clone & build
```bash
git clone https://github.com/your-org/llm_engineering.git
cd llm_engineering
docker compose up --build
```

### 2. Access Jupyter
Visit [http://localhost:8888](http://localhost:8888) in your browser. The token will appear in the console log.
## Daily Development
Running `docker compose down` is destructive: it removes the container, and the next `docker compose up` takes a long time to rebuild. For day-to-day work, it is better to stop and restart the environment, and only tear it down when you need to clean up resources.
```bash
docker compose stop
docker compose start
```
which is the equivalent of...
```bash
docker compose restart
```

## API Keys via `.env`
Create a `.env` file in the project root:
```
OPENAI_API_KEY=sk-xxx
GOOGLE_API_KEY=xxx
ANTHROPIC_API_KEY=xxx
```

These are accessible in your notebooks using `dotenv`.
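For illustration, here is a minimal stdlib sketch of what `load_dotenv()` does behind the scenes; in a notebook you would simply call `from dotenv import load_dotenv; load_dotenv()` and then read keys with `os.getenv`:

```python
import os
import pathlib
import tempfile

# Stdlib sketch of dotenv's behavior: read KEY=VALUE lines from a
# .env file and export them into os.environ (real code should use
# the python-dotenv package, which is preinstalled in the container).
def load_env(path):
    for line in pathlib.Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())

# Demo with a throwaway .env file (the real file lives in the project root).
with tempfile.TemporaryDirectory() as d:
    env_file = pathlib.Path(d) / ".env"
    env_file.write_text("OPENAI_API_KEY=sk-xxx\n")
    load_env(env_file)

print(os.environ["OPENAI_API_KEY"])  # sk-xxx
```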
## Notes
* All dependencies are installed inside the container.
* Environment is built in tiers to avoid memory crashes.
* Your project files are mounted into the container via a volume.

## Clean Up
```bash
docker compose down # Stop and remove the container
docker system prune -a # Remove all images/containers (careful!)
```

# Environment without spinning up ollama locally
If all you want to do is run the labs and connect to cloud LLM APIs, you will not need a local installation of `ollama`. In that case, use a simpler version of `docker-compose.yml` that does not depend on an `ollama` container.
```yaml
services:
  llm-env:
    build:
      context: .
    container_name: llm-engineering
    volumes:
      - .:/workspace
    ports:
      - "8888:8888"
    entrypoint: ["/workspace/bootstrap.sh"]
    tty: true
    stdin_open: true
```

# Gotchas
### `bootstrap.sh` no such file or directory
When running `docker compose up`, you may run into the following error:
```
=> resolving provenance for metadata file 0.0s
[+] Running 3/3
 ✔ llm-env Built 0.0s
 ✔ Network llm-dev-container_default Created 0.1s
 ✔ Container llm-dev-container Created 0.1s
Attaching to llm-dev-container
llm-dev-container | exec /workspace/bootstrap.sh: no such file or directory
llm-dev-container exited with code 255
```

What's really happening (if you're on a Windows machine) is the classic line-ending issue: when you open the repository in your editor (e.g. Visual Studio Code) after cloning, your `bootstrap.sh` file may be saved with CRLF (`\r\n`) line endings instead of LF (`\n`). When the file is copied into the container, the Linux system cannot execute it properly. The easiest fix is to change the file's line endings to LF, either from the selector on the bottom status bar in Visual Studio Code or by setting the default in your settings.
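To stop this from recurring for every fresh clone, a `.gitattributes` entry can force LF endings for shell scripts on any platform. This is a standard Git mechanism, shown here as a suggested addition rather than something the repo is confirmed to ship:

```
# .gitattributes: keep shell scripts LF so they run inside the Linux container
*.sh text eol=lf
```

After adding it, re-clone the repository or run `git add --renormalize .` so the attribute is applied to the already-checked-out files.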