Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Run Llama 2 using MLX on macOS
https://github.com/simonw/llm-mlx-llama
- Host: GitHub
- URL: https://github.com/simonw/llm-mlx-llama
- Owner: simonw
- License: mit
- Created: 2023-12-06T17:25:56.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2023-12-18T12:30:24.000Z (11 months ago)
- Last Synced: 2024-10-06T20:54:32.357Z (about 1 month ago)
- Language: Python
- Size: 6.84 KB
- Stars: 33
- Watchers: 4
- Forks: 3
- Open Issues: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# llm-mlx-llama
[![PyPI](https://img.shields.io/pypi/v/llm-mlx-llama.svg)](https://pypi.org/project/llm-mlx-llama/)
[![Changelog](https://img.shields.io/github/v/release/simonw/llm-mlx-llama?include_prereleases&label=changelog)](https://github.com/simonw/llm-mlx-llama/releases)
[![Tests](https://github.com/simonw/llm-mlx-llama/workflows/Test/badge.svg)](https://github.com/simonw/llm-mlx-llama/actions?query=workflow%3ATest)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/llm-mlx-llama/blob/main/LICENSE)
Using MLX on macOS to run Llama 2. **Highly experimental**.
## Installation
Install this plugin in the same environment as [LLM](https://llm.datasette.io/).
```bash
llm install https://github.com/simonw/llm-mlx-llama/archive/refs/heads/main.zip
```
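To confirm the plugin was picked up, you can list the plugins LLM knows about using LLM's built-in `llm plugins` command:
```bash
llm plugins
```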
## Usage
Download `Llama-2-7b-chat.npz` and `tokenizer.model` from [mlx-llama/Llama-2-7b-chat-mlx](https://huggingface.co/mlx-llama/Llama-2-7b-chat-mlx/tree/main).
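If you prefer the command line to the web UI, one way to fetch both files is the `huggingface-cli download` command from the `huggingface_hub` package (a sketch assuming `huggingface_hub` is installed; the README itself only points at the web page):
```bash
# Download both model files into the current directory
pip install huggingface_hub
huggingface-cli download mlx-llama/Llama-2-7b-chat-mlx \
  Llama-2-7b-chat.npz tokenizer.model --local-dir .
```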
Pass paths to those files as options when you run a prompt:
```bash
llm -m mlx-llama \
  'five great reasons to get a pet pelican:' \
  -o model Llama-2-7b-chat.npz \
  -o tokenizer tokenizer.model
```
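Prompts and responses are logged to LLM's SQLite database by default, so you can review the most recent exchange with LLM's `llm logs` command (standard LLM behaviour, not specific to this plugin):
```bash
llm logs -n 1
```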
Chat mode and continuing a conversation are not yet supported.
## Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-mlx-llama
python3 -m venv venv
source venv/bin/activate
```
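If the `llm` command is not already available inside that virtual environment, install it there first (a standard `pip install`, not a step from the original README):
```bash
pip install llm
```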
Now install the dependencies and test dependencies:
```bash
llm install -e '.[test]'
```
To run the tests:
```bash
pytest
```
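To run a single test while developing, the usual pytest selection flags work here too (`test_name_here` is a placeholder, not a test in this repo):
```bash
pytest -k test_name_here -v
```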