https://github.com/66julienmartin/MCP-server-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
- Host: GitHub
- URL: https://github.com/66julienmartin/MCP-server-Deepseek_R1
- Owner: 66julienmartin
- License: MIT
- Created: 2025-02-05T13:13:00.000Z (9 months ago)
- Default Branch: main
- Last Pushed: 2025-03-20T01:40:56.000Z (8 months ago)
- Last Synced: 2025-03-20T02:38:32.497Z (8 months ago)
- Language: JavaScript
- Size: 27.3 KB
- Stars: 31
- Watchers: 2
- Forks: 4
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-MCP-Servers-directory - Deepseek_R1 - A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3) (AI Services)
- awesome-mcp-servers - Deepseek R1 MCP Server - A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3) (Table of Contents / AI Services)
- metorial-index - Deepseek R1 MCP Server - The Deepseek R1 MCP Server allows the Deepseek R1 language model to connect with various tools and data sources, enabling enhanced reasoning and conversation capabilities with a large context window of 8192 tokens. (Cloud Services)
- awesome-mcp-servers - **MCP-server-Deepseek_R1** - A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3) `javascript` `mcp` `server` `http` `git` `npm install 66julienmartin/MCP-server-Deepseek_R1` (🌐 Web Development)
README
# Deepseek R1 MCP Server
A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.
## Why Node.js?
This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.
## Quick Start
### Installing manually
```bash
# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd deepseek-r1-mcp
npm install

# Set up environment
cp .env.example .env  # Then add your API key

# Build
npm run build
```
## Prerequisites
- Node.js (v18 or higher)
- npm
- Claude Desktop
- Deepseek API key
## Model Selection
By default, this server uses the **DeepSeek-R1** model. If you want to use **DeepSeek-V3** instead, modify the model name in `src/index.ts`:
```typescript
// For DeepSeek-R1 (default)
model: "deepseek-reasoner"
// For DeepSeek-V3
model: "deepseek-chat"
```
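Instead of editing `src/index.ts` by hand, the choice could also be driven by an environment variable. The sketch below is a hypothetical extension, not part of this server: the `DEEPSEEK_MODEL` variable name is an assumption.

```typescript
// Hypothetical sketch: pick the model from an env var, defaulting to R1.
type DeepseekModel = "deepseek-reasoner" | "deepseek-chat";

function pickModel(env: Record<string, string | undefined>): DeepseekModel {
  // "deepseek-chat" selects DeepSeek-V3; anything else falls back to R1.
  if (env.DEEPSEEK_MODEL === "deepseek-chat") return "deepseek-chat";
  return "deepseek-reasoner"; // DeepSeek-R1 (default)
}
```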
## Project Structure
```
deepseek-r1-mcp/
├── src/
│   └── index.ts         # Main server implementation
├── build/               # Compiled files
│   └── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
```
## Configuration
1. Create a `.env` file:
```
DEEPSEEK_API_KEY=your-api-key-here
```
2. Update Claude Desktop configuration:
```json
{
  "mcpServers": {
    "deepseek_r1": {
      "command": "node",
      "args": ["/path/to/deepseek-r1-mcp/build/index.js"],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}
```
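Whether the key comes from `.env` or from Claude Desktop's `env` block, the server cannot start without it. A minimal startup check might look like this; `requireApiKey` is a hypothetical helper, not part of the actual codebase.

```typescript
// Sketch: fail fast with a clear message when DEEPSEEK_API_KEY is missing.
function requireApiKey(env: Record<string, string | undefined>): string {
  const key = env.DEEPSEEK_API_KEY;
  if (!key || key.trim() === "") {
    throw new Error(
      "DEEPSEEK_API_KEY is not set; add it to .env or to the env block in Claude Desktop's config"
    );
  }
  return key;
}
```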
## Development
```bash
npm run dev # Watch mode
npm run build # Build for production
```
## Features
- Advanced text generation with Deepseek R1 (8192 token context window)
- Configurable parameters (max_tokens, temperature)
- Robust error handling with detailed error messages
- Full MCP protocol support
- Claude Desktop integration
- Support for both DeepSeek-R1 and DeepSeek-V3 models
## API Usage
```typescript
{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,   // Maximum tokens to generate
    "temperature": 0.2    // Controls randomness
  }
}
```
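A server receiving this payload would typically validate it before calling the API. The sketch below is one plausible validator, assuming the 8192-token window above and a 0–2 temperature range; the function name and bounds are assumptions, not the server's actual code.

```typescript
// Hypothetical validator for the tool arguments shown above.
interface DeepseekArgs {
  prompt: string;
  max_tokens?: number;  // capped at the 8192-token window (assumption)
  temperature?: number; // 0.0–2.0, a common sampling range (assumption)
}

function validateArgs(args: DeepseekArgs): Required<DeepseekArgs> {
  if (!args.prompt) throw new Error("prompt is required");
  const max_tokens = args.max_tokens ?? 8192;
  if (max_tokens < 1 || max_tokens > 8192) throw new Error("max_tokens must be 1-8192");
  const temperature = args.temperature ?? 0.2;
  if (temperature < 0 || temperature > 2) throw new Error("temperature must be 0-2");
  return { prompt: args.prompt, max_tokens, temperature };
}
```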
## The Temperature Parameter
The default `temperature` is 0.2. DeepSeek recommends choosing a value suited to your specific use case:
| USE CASE | TEMPERATURE | EXAMPLE |
|----------|-------------|---------|
| Coding / Math | 0.0 | Code generation, mathematical calculations |
| Data Cleaning / Data Analysis | 1.0 | Data processing tasks |
| General Conversation | 1.3 | Chat and dialogue |
| Translation | 1.3 | Language translation |
| Creative Writing / Poetry | 1.5 | Story writing, poetry generation |
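The table above can be expressed as a small lookup that falls back to the server's 0.2 default. This is an illustrative sketch; the use-case keys are made up for the example.

```typescript
// The recommendations table as a lookup; keys are illustrative labels.
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  "coding/math": 0.0,
  "data-analysis": 1.0,
  "conversation": 1.3,
  "translation": 1.3,
  "creative-writing": 1.5,
};

function recommendedTemperature(useCase: string): number {
  return RECOMMENDED_TEMPERATURE[useCase] ?? 0.2; // fall back to the server default
}
```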
## Error Handling
The server provides detailed error messages for common issues:
- API authentication errors
- Invalid parameters
- Rate limiting
- Network issues
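One way to surface those four categories is to map failure modes to user-facing messages. The sketch below assumes HTTP-style status codes from the DeepSeek API; the status values and `describeError` helper are assumptions for illustration, not the server's actual error handler.

```typescript
// Sketch: classify common failure modes into the categories listed above.
function describeError(status: number | undefined): string {
  if (status === undefined) return "Network issue: could not reach the DeepSeek API";
  if (status === 401 || status === 403) return "API authentication error: check DEEPSEEK_API_KEY";
  if (status === 400 || status === 422) return "Invalid parameters: check prompt, max_tokens, temperature";
  if (status === 429) return "Rate limited: retry after a short delay";
  return `Unexpected API error (HTTP ${status})`;
}
```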
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License
MIT