https://github.com/emgeee/mcp-ollama
Query models running with Ollama from within Claude Desktop or other MCP clients
- Host: GitHub
- URL: https://github.com/emgeee/mcp-ollama
- Owner: emgeee
- License: mit
- Created: 2025-02-04T22:37:21.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2025-02-05T02:41:38.000Z (8 months ago)
- Last Synced: 2025-04-05T08:35:28.025Z (6 months ago)
- Language: Python
- Size: 92.8 KB
- Stars: 17
- Watchers: 1
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- awesome-mcp-servers - **mcp-ollama** - Query model running with Ollama from within Claude Desktop or other MCP clients `python` `mcp` `pip install git+https://github.com/emgeee/mcp-ollama` (AI/ML)
- mcp-index - Ollama Integration Server - Integrates Ollama models with various MCP clients, enabling seamless interaction with AI models in applications like Claude Desktop. Requires Python and Ollama to operate, with models pulled to use effectively. (APIs and HTTP Requests)
README
# MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
## Requirements
- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., `ollama pull llama2`)

### Configure Claude Desktop
Add to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):
```json
{
"mcpServers": {
"ollama": {
"command": "uvx",
"args": [
"mcp-ollama"
]
}
}
}
```

### Development

Install in development mode:
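A common source of "server not found" problems is a syntax mistake in `claude_desktop_config.json`. As a quick sanity check, the file can be loaded with Python's stdlib `json` module; a minimal sketch (the inline string below stands in for the real file, which lives at the platform-specific path given above):

```python
import json

# Example content of claude_desktop_config.json; in practice,
# read the real file from its platform-specific path instead.
raw = """
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
"""

config = json.loads(raw)  # raises json.JSONDecodeError on a syntax mistake
entry = config["mcpServers"]["ollama"]
print(entry["command"], entry["args"])  # uvx ['mcp-ollama']
```

If `json.loads` succeeds and the `mcpServers.ollama` entry looks like the one above, the config is at least structurally valid.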
```bash
git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync
```

Test with MCP Inspector:
```bash
mcp dev src/mcp_ollama/server.py
```

## Features

The server provides the following tools:
- `list_models` - List all downloaded Ollama models
- `show_model` - Get detailed information about a specific model
- `ask_model` - Ask a question to a specified model

## License

MIT
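Beyond Claude Desktop, these tools can also be called programmatically through the official MCP Python SDK (`pip install mcp`), which spawns the server over stdio the same way the Claude Desktop config does. A minimal sketch; the tool argument names (`model`, `question`) are assumptions inferred from the tool descriptions above, not confirmed from the server source:

```python
import asyncio

# Mirrors the "command"/"args" pair from the Claude Desktop config above.
SERVER_COMMAND = "uvx"
SERVER_ARGS = ["mcp-ollama"]

async def ask_ollama(question: str, model: str = "llama2") -> str:
    """Spawn the mcp-ollama server over stdio and call its ask_model tool."""
    # Requires the official MCP Python SDK: pip install mcp
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    params = StdioServerParameters(command=SERVER_COMMAND, args=SERVER_ARGS)
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Argument names here are an assumption based on the tool list.
            result = await session.call_tool(
                "ask_model", {"model": model, "question": question}
            )
            # Tool results come back as a list of content items; the first
            # is expected to be text for this tool.
            return result.content[0].text

if __name__ == "__main__":
    # Requires Ollama running locally with the model already pulled.
    print(asyncio.run(ask_ollama("Why is the sky blue?")))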