Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/dorukgezici/pydantic-ai-mlx
Add mlx-lm support to Pydantic AI, run MLX compatible HF models on Apple silicon.
- Host: GitHub
- URL: https://github.com/dorukgezici/pydantic-ai-mlx
- Owner: dorukgezici
- Created: 2025-01-20T17:09:09.000Z (19 days ago)
- Default Branch: main
- Last Pushed: 2025-01-28T15:36:06.000Z (11 days ago)
- Last Synced: 2025-01-28T16:31:31.268Z (11 days ago)
- Topics: mlx, mlx-lm, pydantic-ai
- Language: Python
- Homepage: https://pypi.org/project/pydantic-ai-mlx
- Size: 180 KB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
Run MLX compatible HuggingFace models on Apple silicon locally with Pydantic AI.
Two backends are provided:
- LM Studio backend (an OpenAI-compatible server that can also use mlx-lm; the model runs in a separate background process)
- mlx-lm backend (direct integration with Apple's library; the model runs within your application, *experimental support*)

*STILL IN DEVELOPMENT, NOT RECOMMENDED FOR PRODUCTION USE YET.*
Contributions are welcome!
### Features
- [x] LM Studio backend, should be fully supported
- [x] Streaming text support for mlx-lm backend
- [ ] Tool calling support for mlx-lm backend

_Apple's MLX seems more performant on Apple silicon than llama.cpp (Ollama), as of January 2025._
## Installation
```bash
uv add pydantic-ai-mlx
```

## Usage
### LM Studio backend
```python
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage
from pydantic_ai_lm_studio import LMStudioModel

model = LMStudioModel(model_name="mlx-community/Qwen2.5-7B-Instruct-4bit")  # supports tool calling
agent = Agent(model, system_prompt="You are a chatbot.")


async def stream_response(user_prompt: str, message_history: list[ModelMessage]):
    async with agent.run_stream(user_prompt, message_history=message_history) as result:
async for message in result.stream():
yield message
```

### mlx-lm backend
```python
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage
from pydantic_ai_mlx_lm import MLXModel

model = MLXModel(model_name="mlx-community/Llama-3.2-3B-Instruct-4bit")
# See https://github.com/ml-explore/mlx-examples/blob/main/llms/README.md#supported-models
# also https://huggingface.co/mlx-community

agent = Agent(model, system_prompt="You are a chatbot.")
async def stream_response(user_prompt: str, message_history: list[ModelMessage]):
    async with agent.run_stream(user_prompt, message_history=message_history) as result:
async for message in result.stream():
yield message
```
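The `stream_response` generators above are consumed with `async for`. Here is a minimal, self-contained sketch of that consumption pattern; it uses a stand-in async generator in place of a real agent (running the actual models requires LM Studio or downloaded weights), and it assumes `result.stream()` yields the accumulated response text so far, so the last yielded value is the final answer:

```python
import asyncio
from typing import AsyncIterator


async def stream_response(user_prompt: str) -> AsyncIterator[str]:
    # Stand-in for the agent-backed generators above: yields the
    # growing response text chunk by chunk.
    for partial in ("Hello", "Hello, ", "Hello, world!"):
        yield partial


async def main() -> str:
    final = ""
    async for message in stream_response("Say hello"):
        final = message  # each yield replaces the previous partial text
    return final


print(asyncio.run(main()))  # -> Hello, world!
```

In a real application you would render each intermediate `message` to the UI as it arrives instead of only keeping the last one.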