https://github.com/ruixingshi/deepseek-thinker-mcp
- Host: GitHub
- URL: https://github.com/ruixingshi/deepseek-thinker-mcp
- Owner: ruixingshi
- Created: 2025-02-13T02:25:03.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2025-02-13T08:24:07.000Z (8 months ago)
- Last Synced: 2025-02-13T08:24:20.840Z (8 months ago)
- Language: JavaScript
- Size: 15.6 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- Awesome-MCP-Servers-directory - deepseek-thinker-mcp - An MCP server that provides Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. Supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server. (AI Services)
- awesome-mcp-servers - Deepseek Thinker MCP - An MCP server that provides Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. Supports access to Deepseek's CoT from the Deepseek API service or a local Ollama server. (Table of Contents / AI Services)
- mcp-index - Deepseek Thinker - Leverage Deepseek's reasoning capabilities to access structured outputs from its thought processes for enhanced AI interactions. Supports integration with AI clients through both OpenAI API and local Ollama server modes. (Developer Tools)
README
# Deepseek Thinker MCP Server
[Smithery](https://smithery.ai/server/@ruixingshi/deepseek-thinker-mcp)
An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, such as Claude Desktop. Supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server.
## Core Features
- 🤖 **Dual Mode Support** (sketched after this list)
  - OpenAI API mode support
  - Ollama local mode support
- 🎯 **Focused Reasoning**
  - Captures Deepseek's thinking process
  - Provides reasoning output
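A minimal sketch of how the dual-mode selection could look, assuming the `openai` and `ollama` npm clients; the model names and the exact way this project reads `USE_OLLAMA`, `API_KEY`, and `BASE_URL` are illustrative assumptions, not taken from its source:

```typescript
// Hypothetical mode selection based on the environment variables documented below.
// Model names ("deepseek-reasoner", "deepseek-r1") are assumptions for illustration.
import OpenAI from "openai";
import { Ollama } from "ollama";

const useOllama = process.env.USE_OLLAMA === "true";

// Ask the selected backend to reason over the prompt and return its reply text.
export async function reason(originPrompt: string): Promise<string> {
  if (useOllama) {
    // Ollama local mode: talks to a local Ollama server (default http://localhost:11434).
    const ollama = new Ollama();
    const res = await ollama.chat({
      model: "deepseek-r1",
      messages: [{ role: "user", content: originPrompt }],
    });
    return res.message.content;
  }
  // OpenAI API mode: any OpenAI-compatible endpoint, configured via API_KEY / BASE_URL.
  const openai = new OpenAI({
    apiKey: process.env.API_KEY,
    baseURL: process.env.BASE_URL,
  });
  const res = await openai.chat.completions.create({
    model: "deepseek-reasoner",
    messages: [{ role: "user", content: originPrompt }],
  });
  return res.choices[0].message.content ?? "";
}
```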
## Available Tools

### get-deepseek-thinker
- **Description**: Perform reasoning using the Deepseek model
- **Input Parameters**:
  - `originPrompt` (string): User's original prompt
- **Returns**: Structured text response containing the reasoning process
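For orientation, here is a sketch of how a tool with this shape can be registered using the TypeScript MCP SDK and Zod (the stack listed under Tech Stack below); it is an illustration, not this repo's actual implementation:

```typescript
// Illustrative only: registers a get-deepseek-thinker tool with the MCP SDK.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";
import { reason } from "./reason.js"; // hypothetical helper, e.g. the sketch above

const server = new McpServer({ name: "deepseek-thinker", version: "1.0.0" });

server.tool(
  "get-deepseek-thinker",
  { originPrompt: z.string().describe("User's original prompt") },
  async ({ originPrompt }) => ({
    // Return the reasoning as structured text content.
    content: [{ type: "text" as const, text: await reason(originPrompt) }],
  })
);

// MCP clients such as Claude Desktop launch the server over stdio.
await server.connect(new StdioServerTransport());
```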
## Environment Configuration

### OpenAI API Mode
Set the following environment variables:
```bash
API_KEY=
BASE_URL=
```

### Ollama Mode
Set the following environment variable:
```bash
USE_OLLAMA=true
```

## Usage
### Integration with an AI Client, like Claude Desktop
Add the following configuration to your `claude_desktop_config.json`:

```json
{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"API_KEY": "",
"BASE_URL": ""
}
}
}
}
```

### Using Ollama Mode
```json
{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"USE_OLLAMA": "true"
}
}
}
}
```
### Local Server Configuration

```json
{
"mcpServers": {
"deepseek-thinker": {
"command": "node",
"args": [
"/your-path/deepseek-thinker-mcp/build/index.js"
],
"env": {
"API_KEY": "",
"BASE_URL": ""
}
}
}
}
```

## Development Setup
```bash
# Install dependencies
npm install

# Build project
npm run build

# Run service
node build/index.js
```
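To exercise the locally built server without an AI client, you can drive it over stdio with the SDK's client. A rough sketch; the prompt, client name, and env handling are placeholders:

```typescript
// Hypothetical smoke test for the locally built server.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client(
  { name: "smoke-test", version: "1.0.0" },
  { capabilities: {} }
);

// Spawn the server exactly as an MCP client would, passing the API credentials through.
await client.connect(
  new StdioClientTransport({
    command: "node",
    args: ["build/index.js"],
    env: {
      API_KEY: process.env.API_KEY ?? "",
      BASE_URL: process.env.BASE_URL ?? "",
    },
  })
);

const result = await client.callTool({
  name: "get-deepseek-thinker",
  arguments: { originPrompt: "Why is the sky blue?" },
});
console.log(JSON.stringify(result, null, 2));

await client.close();
```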
## FAQ

### Response like "MCP error -32001: Request timed out"
This error occurs when the Deepseek API responds too slowly or when the reasoning output is very long, causing the MCP server to time out.
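If you drive the server with the SDK client yourself (as in the sketch above), the per-request timeout can usually be raised via the request options; treat the option name and its default as SDK details to verify against your SDK version:

```typescript
// Assumption: callTool accepts RequestOptions with a timeout in milliseconds
// (the SDK's default is on the order of 60 seconds).
import { CallToolResultSchema } from "@modelcontextprotocol/sdk/types.js";

const longResult = await client.callTool(
  { name: "get-deepseek-thinker", arguments: { originPrompt: "Prove that sqrt(2) is irrational." } },
  CallToolResultSchema,
  { timeout: 5 * 60 * 1000 } // allow up to five minutes of reasoning
);
console.log(JSON.stringify(longResult, null, 2));
```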
## Tech Stack

- TypeScript
- @modelcontextprotocol/sdk
- OpenAI API
- Ollama
- Zod (parameter validation)

## License
This project is licensed under the MIT License. See the LICENSE file for details.