Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/videmsky/aws-ts-ollama
A Pulumi project that creates Ollama resources in AWS.
- Host: GitHub
- URL: https://github.com/videmsky/aws-ts-ollama
- Owner: videmsky
- Created: 2024-04-07T23:25:54.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2024-04-19T14:32:06.000Z (9 months ago)
- Last Synced: 2024-04-19T15:45:29.346Z (9 months ago)
- Topics: aws, ollama, pulumi, typescript
- Language: TypeScript
- Homepage:
- Size: 52.7 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
This repository uses Pulumi to create an Ollama server on a GPU-backed AWS EC2 instance and an Open WebUI client on ECS. Trialing models locally can be extremely slow, so this repo aims to provide a simple way to run models on AWS GPU resources. Use with caution, as GPU instances can get expensive!

[Ollama](https://github.com/ollama/ollama)

[Open WebUI](https://github.com/open-webui/open-webui)
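For orientation, here is a minimal sketch of how a GPU-backed EC2 instance running Ollama might be declared with Pulumi's AWS provider in TypeScript. This is not the repo's actual code; the AMI lookup, user-data script, and resource names are illustrative assumptions.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";

const config = new pulumi.Config();
// Matches the `instanceType` stack config shown later (defaults to a GPU instance).
const instanceType = config.get("instanceType") ?? "g4dn.xlarge";

// Illustrative: pick a recent Amazon Deep Learning AMI so NVIDIA drivers come preinstalled.
const ami = aws.ec2.getAmiOutput({
    mostRecent: true,
    owners: ["amazon"],
    filters: [{ name: "name", values: ["Deep Learning*AMI*"] }],
});

// GPU-backed EC2 instance that installs Ollama and serves it on its default port (11434).
const ollamaServer = new aws.ec2.Instance("ollama-server", {
    instanceType,
    ami: ami.id,
    userData: `#!/bin/bash
curl -fsSL https://ollama.com/install.sh | sh
systemctl enable --now ollama`,
    tags: { Name: "ollama-server" },
});

export const ollamaPublicIp = ollamaServer.publicIp;
```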
### Prerequisites

- AWS account and credentials configured with Pulumi ESC
- Pulumi CLI installed
- SSH keys added to Pulumi ESC

### ESC Environment Definition Sample
```yaml
values:
  aws:
    creds:
      fn::open::aws-login:
        oidc:
          duration: 1h
          roleArn: arn:aws:iam::12345678910:role/laci-pulumi-corp
          sessionName: pulumi-environments-session
  environmentVariables:
    AWS_ACCESS_KEY_ID: ${aws.creds.accessKeyId}
    AWS_SECRET_ACCESS_KEY: ${aws.creds.secretAccessKey}
    AWS_SESSION_TOKEN: ${aws.creds.sessionToken}
  pulumiConfig:
    publicKey: ssh-ed25519 AAAAC...
    privateKey:
      fn::secret:
        ciphertext: ZXNjeAAAAA...
```
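Once the environment is attached to a stack, the `publicKey`/`privateKey` entries under `pulumiConfig` become ordinary stack configuration. Below is a hedged sketch of how a Pulumi TypeScript program could consume them; the resource names are illustrative, not necessarily what this repo uses.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";

const config = new pulumi.Config();
// Injected by the ESC environment's `pulumiConfig` block.
const publicKey = config.require("publicKey");
const privateKey = config.requireSecret("privateKey"); // stays encrypted in state and outputs

// Register the public key with EC2 so the Ollama instance can be reached over SSH.
const keyPair = new aws.ec2.KeyPair("ollama-keypair", { publicKey });

// The key pair name would then be passed to the EC2 instance via `keyName`.
export const keyName = keyPair.keyName;
```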
### Deploying and Running the Program

1. Create a new stack:

   `$ pulumi stack init dev`

2. Set local configuration variables:
   * `$ pulumi config set aws-ts-ollama:instanceType "g4dn.xlarge"`
   * `$ pulumi config set aws-ts-ollama:vpcNetworkCidr "10.9.0.0/16"`
   * `$ pulumi config set aws-ts-ollama:containerPort "8080"`
   * `$ pulumi config set aws-ts-ollama:cpu "256"`
   * `$ pulumi config set aws-ts-ollama:memory "512"`

3. `Pulumi.dev.yaml` should look like this:
```yaml
environment:
  - laci-dev
config:
  aws-ts-ollama:instanceType: g4dn.xlarge
  aws-ts-ollama:vpcNetworkCidr: "10.9.0.0/16"
  aws-ts-ollama:containerPort: "8080"
  aws-ts-ollama:cpu: "256"
  aws-ts-ollama:memory: "512"
```
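With the stack configured, running `pulumi up` deploys the resources. For context, here is a minimal sketch of how the `containerPort`, `cpu`, and `memory` values might drive an Open WebUI service on ECS Fargate using Pulumi Crosswalk (`@pulumi/awsx`); the container image tag, load balancer, and resource names are assumptions for illustration, not this repo's actual code.

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";
import * as awsx from "@pulumi/awsx";

const config = new pulumi.Config();
const containerPort = config.getNumber("containerPort") ?? 8080;
const cpu = config.getNumber("cpu") ?? 256;
const memory = config.getNumber("memory") ?? 512;

const cluster = new aws.ecs.Cluster("open-webui-cluster");

// Public load balancer fronting the Open WebUI container.
const lb = new awsx.lb.ApplicationLoadBalancer("open-webui-lb", {
    defaultTargetGroup: { port: containerPort },
});

const service = new awsx.ecs.FargateService("open-webui", {
    cluster: cluster.arn,
    assignPublicIp: true,
    taskDefinitionArgs: {
        container: {
            name: "open-webui",
            image: "ghcr.io/open-webui/open-webui:main", // illustrative image tag
            cpu,
            memory,
            portMappings: [{ containerPort, targetGroup: lb.defaultTargetGroup }],
        },
    },
});

export const webUiUrl = pulumi.interpolate`http://${lb.loadBalancer.dnsName}`;
```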
### Clean Up

1. To clean up resources, run:

   `$ pulumi destroy -s dev`