https://github.com/sammcj/mcp-llm

An MCP server that provides LLMs access to other LLMs
- Host: GitHub
- URL: https://github.com/sammcj/mcp-llm
- Owner: sammcj
- License: MIT
- Created: 2025-03-06T06:45:01.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2025-03-15T05:37:54.000Z (8 months ago)
- Last Synced: 2025-03-15T06:27:53.832Z (8 months ago)
- Topics: anthropic, bedrock, claude, llama, llm, mcp, mcp-server, ollama, openai
- Language: JavaScript
- Homepage: https://smcleod.net
- Size: 396 KB
- Stars: 7
- Watchers: 1
- Forks: 3
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - Changelog: CHANGELOG.md
  - Funding: .github/FUNDING.yml
  - License: LICENSE
 
 
Awesome Lists containing this project
- awesome-mcp-servers - **mcp-llm** - An MCP server that provides LLMs access to other LLMs `javascript` `anthropic` `bedrock` `claude` `llama` `npm install sammcj/mcp-llm` (🤖 AI/ML)
- metorial-index - LLM Server - Generate code, write documentation, and answer questions using advanced language models. Utilize the LlamaIndexTS library to enhance coding tasks and improve documentation quality. (Developer Tools)
 
README

# MCP LLM
[Smithery](https://smithery.ai/server/@sammcj/mcp-llm)
An MCP server that provides access to LLMs using the LlamaIndexTS library.

## Features
This MCP server provides the following tools:
- `generate_code`: Generate code based on a description
- `generate_code_to_file`: Generate code and write it directly to a file at a specific line number
- `generate_documentation`: Generate documentation for code
- `ask_question`: Ask a question to the LLM


## Installation
### Installing via Smithery
To install LLM Server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/@sammcj/mcp-llm):
```bash
npx -y @smithery/cli install @sammcj/mcp-llm --client claude
```
### Manual Install From Source
1. Clone the repository
2. Install dependencies:
```bash
npm install
```
3. Build the project:
```bash
npm run build
```
4. Update your MCP configuration
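For step 4, an MCP client configuration entry (for example, in Claude Desktop's `mcpServers` settings) might look something like the sketch below. The server name `llm` and the build output path are assumptions; point `args` at wherever `npm run build` places the compiled entry point on your machine.
```json
{
  "mcpServers": {
    "llm": {
      "command": "node",
      "args": ["/path/to/mcp-llm/build/index.js"]
    }
  }
}
```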
### Using the Example Script
The repository includes an example script that demonstrates how to use the MCP server programmatically:
```bash
node examples/use-mcp-server.js
```
This script starts the MCP server and sends requests to it using curl commands.
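For illustration, an MCP `tools/call` request for the `ask_question` tool has roughly the following JSON-RPC shape; the exact payload and transport used by the example script may differ, so treat this as a sketch of the protocol rather than the script's literal output.
```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_question",
    "arguments": {
      "question": "What is the difference between var, let, and const in JavaScript?"
    }
  }
}
```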
## Examples
### Generate Code
```json
{
  "description": "Create a function that calculates the factorial of a number",
  "language": "JavaScript"
}
```
### Generate Code to File
```json
{
  "description": "Create a function that calculates the factorial of a number",
  "language": "JavaScript",
  "filePath": "/path/to/factorial.js",
  "lineNumber": 10,
  "replaceLines": 0
}
```
The `generate_code_to_file` tool supports both relative and absolute file paths. If a relative path is provided, it will be resolved relative to the current working directory of the MCP server.
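A minimal sketch of that resolution behaviour (illustrative only, not the server's actual implementation):
```javascript
const path = require('path');

// Absolute paths are used as-is; relative paths are resolved
// against the MCP server's current working directory.
function resolveTargetPath(filePath) {
  return path.isAbsolute(filePath)
    ? filePath
    : path.resolve(process.cwd(), filePath);
}
```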
### Generate Documentation
```json
{
  "code": "function factorial(n) {\n  if (n <= 1) return 1;\n  return n * factorial(n - 1);\n}",
  "language": "JavaScript",
  "format": "JSDoc"
}
```
### Ask Question
```json
{
  "question": "What is the difference between var, let, and const in JavaScript?",
  "context": "I'm a beginner learning JavaScript and confused about variable declarations."
}
```
## License
- [MIT LICENSE](LICENSE)