# Run DeepSeek Locally with Docker-Compose

You can run DeepSeek locally with Docker Compose on a Mac, though a lighter-weight variant of the model is recommended for most hardware.

This guide walks you through running DeepSeek on localhost with a web UI.

![DeepSeek chat running in the web UI](image.png)
![DeepSeek response in the web UI](image-1.png)


## Steps to run with the web interface

1. Install [Ollama](https://ollama.com) (this [walkthrough](https://martech.org/how-to-run-deepseek-locally-on-your-computer/) has more detail).
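If you use Homebrew, one way to install it and confirm the CLI is on your PATH (downloading the app from ollama.com works too):
```
brew install ollama
ollama --version
```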

2. Pick a model based on your hardware:
```
ollama pull deepseek-r1:8b       # Fast, lightweight
ollama pull deepseek-r1:14b      # Balanced performance
ollama pull deepseek-r1:32b      # Heavy processing
ollama pull deepseek-r1:70b      # Max reasoning, slowest
ollama pull deepseek-coder:1.3b  # Code completion assist
```
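You can confirm which models are installed locally with:
```
ollama list
```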
3. Test the model locally from the terminal:
```
ollama run deepseek-r1:8b
```
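`ollama run` also accepts a one-shot prompt, which makes for a quick smoke test:
```
ollama run deepseek-r1:8b "Explain what Docker Compose does in one sentence."
```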

4. Install [Docker](https://www.docker.com/get-started)

5. Install [Docker-Compose](https://formulae.brew.sh/formula/docker-compose)
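With Homebrew, for example:
```
brew install docker-compose
docker-compose --version
```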

6. Create a `docker-compose.yml` file like the one in this repo.
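For reference, here is a minimal sketch of such a file, assuming the standard `ghcr.io/open-webui/open-webui` image and Ollama running on the host at its default port 11434 (the service and volume names are illustrative):
```
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                # web UI served on http://localhost:3000
    environment:
      # point the UI at the Ollama server running on your Mac
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    volumes:
      - open-webui:/app/backend/data   # persist chats and settings
    restart: unless-stopped

volumes:
  open-webui:
```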

7. Open the Docker app and run `docker-compose up -d` from the directory containing your compose file (or `docker compose up -d` if you use the Compose v2 plugin).

8. Visit `http://localhost:3000` to open the chat interface.
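If the page doesn't load, a quick check that the container came up cleanly (using the service name from the sketch in step 6):
```
docker-compose ps                  # the open-webui container should be running
docker-compose logs -f open-webui  # follow the startup logs
```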


## Steps to run with VS Code

1. After completing steps 1-2 above, install the CodeGPT extension for VS Code.
![CodeGPT extension in the VS Code marketplace](image-2.png)
![CodeGPT extension installed](image-3.png)

2. Navigate to the Local LLMs section. This is typically reached from the model selection dropdown (pictured with Claude selected).
![Model selection dropdown](image-9.png)
![Local LLMs section](image-10.png)

3. From the available options, select 'Ollama' as the local LLM provider.
![Ollama selected as the local LLM provider](image-11.png)

4. Select your DeepSeek model and you're done.
![DeepSeek model selected in CodeGPT](image-12.png)


## Coming soon: Running Open WebUI locally without internet