Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Run Llama 2 using MLX on macOS
https://github.com/simonw/llm-mlx-llama
- Host: GitHub
- URL: https://github.com/simonw/llm-mlx-llama
- Owner: simonw
- License: MIT
- Created: 2023-12-06T17:25:56.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2023-12-18T12:30:24.000Z (about 1 year ago)
- Last Synced: 2024-10-06T20:54:32.357Z (5 months ago)
- Language: Python
- Size: 6.84 KB
- Stars: 33
- Watchers: 4
- Forks: 3
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# llm-mlx-llama
[PyPI](https://pypi.org/project/llm-mlx-llama/)
[Changelog](https://github.com/simonw/llm-mlx-llama/releases)
[Tests](https://github.com/simonw/llm-mlx-llama/actions?query=workflow%3ATest)
[License](https://github.com/simonw/llm-mlx-llama/blob/main/LICENSE)

Using MLX on macOS to run Llama 2. **Highly experimental**.
## Installation
Install this plugin in the same environment as [LLM](https://llm.datasette.io/).
```bash
llm install https://github.com/simonw/llm-mlx-llama/archive/refs/heads/main.zip
```
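To confirm the plugin installed correctly, you can check that LLM now knows about the model from Python (a minimal sketch; `mlx-llama` is the model ID used in the usage example below):
```python
import llm

# Raises llm.UnknownModelError if the plugin is not installed
model = llm.get_model("mlx-llama")
print(model.model_id)
```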
## Usage

Download `Llama-2-7b-chat.npz` and `tokenizer.model` from [mlx-llama/Llama-2-7b-chat-mlx](https://huggingface.co/mlx-llama/Llama-2-7b-chat-mlx/tree/main).
Pass paths to those files as options when you run a prompt:
```bash
llm -m mlx-llama \
'five great reasons to get a pet pelican:' \
-o model Llama-2-7b-chat.npz \
-o tokenizer tokenizer.model
```
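The same prompt can also be run through LLM's Python API. A minimal sketch, assuming the plugin accepts the `model` and `tokenizer` options shown above as keyword arguments:
```python
import llm

model = llm.get_model("mlx-llama")

# Options are passed as keyword arguments; the paths are assumed to point
# at the files downloaded from mlx-llama/Llama-2-7b-chat-mlx
response = model.prompt(
    "five great reasons to get a pet pelican:",
    model="Llama-2-7b-chat.npz",
    tokenizer="tokenizer.model",
)
print(response.text())
```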
Chat mode and continuing a conversation are not yet supported.

## Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-mlx-llama
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
llm install -e '.[test]'
```
To run the tests:
```bash
pytest
```
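As a point of reference, a minimal plugin test in the style of the LLM plugin tutorial might look like the following (the module name `llm_mlx_llama` is an assumption, not confirmed from the repo):
```python
# Illustrative sketch, not the repo's actual test suite.
from llm.plugins import pm


def test_plugin_is_installed():
    # pm is LLM's pluggy plugin manager; every registered plugin
    # module is returned by get_plugins()
    names = [mod.__name__ for mod in pm.get_plugins()]
    assert "llm_mlx_llama" in names
```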