{"id":31956356,"url":"https://github.com/profullstack/mcp-server","last_synced_at":"2025-10-14T14:50:07.931Z","repository":{"id":292866453,"uuid":"982197253","full_name":"profullstack/mcp-server","owner":"profullstack","description":"A generic, modular server for implementing the Model Context Protocol (MCP). ","archived":false,"fork":false,"pushed_at":"2025-08-14T15:41:00.000Z","size":1185,"stargazers_count":41,"open_issues_count":1,"forks_count":1,"subscribers_count":2,"default_branch":"master","last_synced_at":"2025-09-16T11:54:53.473Z","etag":null,"topics":["mcp-server"],"latest_commit_sha":null,"homepage":"https://mcp.profullstack.com","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"isc","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/profullstack.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2025-05-12T14:13:21.000Z","updated_at":"2025-08-14T15:41:04.000Z","dependencies_parsed_at":"2025-05-12T15:49:40.899Z","dependency_job_id":"8ec546a6-aa52-4959-b79c-fcfc9125b2ae","html_url":"https://github.com/profullstack/mcp-server","commit_stats":null,"previous_names":["profullstack/mcp-server"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/profullstack/mcp-server","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/profullstack%2Fmcp-server","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/profullstack%2Fmcp-server/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/profullstack%2Fmcp-server/releases","manifests_url":"https://repos
.ecosyste.ms/api/v1/hosts/GitHub/repositories/profullstack%2Fmcp-server/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/profullstack","download_url":"https://codeload.github.com/profullstack/mcp-server/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/profullstack%2Fmcp-server/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":279019140,"owners_count":26086685,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-10-14T02:00:06.444Z","response_time":60,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["mcp-server"],"created_at":"2025-10-14T14:48:39.918Z","updated_at":"2025-10-14T14:50:07.924Z","avatar_url":"https://github.com/profullstack.png","language":"JavaScript","readme":"# MCP Server (Model Context Protocol)\n\n\u003c!-- readme-badges:start --\u003e\n\n[![Node](https://img.shields.io/badge/Node-6DA55F.svg?logo=node.js\u0026logoColor=fff\u0026style=for-the-badge)](https://github.com/profullstack/mcp-server)\n\n\u003c!-- readme-badges:end --\u003e\n\nA generic, modular server for implementing the Model Context Protocol (MCP). 
This server provides a framework for controlling and interacting with various models through a standardized API.\n\n[![Crypto Payment](https://paybadge.profullstack.com/badge.svg)](https://paybadge.profullstack.com/?tickers=btc%2Ceth%2Csol%2Cusdc)\n\n## Features\n\n- Modular architecture for easy extension\n- Dynamic module loading\n- Core model management functionality\n- Standardized API for model context\n- Simple configuration system\n- Logging utilities\n- Enhanced module structure with proper separation of concerns\n- Package.json support for modules with dependency management\n- Comprehensive testing infrastructure with Mocha and Chai\n- Powerful module search functionality\n- Module metadata display in API responses\n- Integration with real AI model providers (OpenAI, Stability AI, Anthropic, Hugging Face)\n- Support for text generation, image generation, and speech-to-text models\n- Streaming inference support for compatible models\n\n## Getting Started\n\n### Prerequisites\n\n- Node.js 18.x or higher\n- pnpm 10.x or higher\n\nThis project uses ES Modules (ESM) exclusively. 
All imports use the `import` syntax rather than `require()`.\n\n### Installation\n\n```bash\n# Clone the repository\ngit clone https://github.com/profullstack/mcp-server.git\ncd mcp-server\n\n# Install dependencies\npnpm install\n```\n\n### Running the Server\n\n```bash\n# Install dependencies\npnpm install\n\n# Start the server\npnpm start\n\n# Start the server in development mode (with auto-reload)\npnpm dev\n```\n\nThe server will start on http://localhost:3000 by default.\n\n### Configuration\n\nCopy the sample environment file and edit it with your API keys:\n\n```bash\n# Copy the sample environment file\ncp sample.env .env\n\n# Edit the file with your favorite editor\nnano .env\n```\n\nAt minimum, you'll need to add API keys for the model providers you want to use:\n\n```\n# OpenAI API (for GPT-4 and Whisper)\nOPENAI_API_KEY=your_openai_api_key_here\n\n# Stability AI API (for Stable Diffusion)\nSTABILITY_API_KEY=your_stability_api_key_here\n\n# Anthropic API (for Claude models)\nANTHROPIC_API_KEY=your_anthropic_api_key_here\n```\n\nYou can get these API keys from:\n\n- OpenAI: https://platform.openai.com/api-keys\n- Stability AI: https://platform.stability.ai/account/keys\n- Anthropic: https://console.anthropic.com/settings/keys\n\n### Testing the Server\n\nThe repository includes comprehensive testing using Mocha and Chai:\n\n```bash\n# Run all tests\npnpm test\n\n# Run only module tests\npnpm test:modules\n\n# Run all tests (both core and modules)\npnpm test:all\n```\n\nThe testing infrastructure includes:\n\n1. Core server tests for module loading, routing, and other core functionality\n2. Module-specific tests for each module's functionality\n3. Support for ES modules in tests\n4. 
Mocking and stubbing utilities with Sinon\n\nTests are organized in a structured way:\n\n- Core tests in `/test/core/`\n- Module tests in each module's `test/` directory\n\nThis comprehensive testing ensures code quality and makes it easier to detect regressions when making changes.\n\n### Pre-commit Hooks\n\nThe repository includes pre-commit hooks using Husky and lint-staged:\n\n```bash\n# The hooks are automatically installed when you run\npnpm install\n```\n\nThe pre-commit hooks:\n\n1. Run ESLint on JavaScript files\n2. Run Prettier on all staged files\n\nThis ensures that all code committed to the repository follows coding standards and maintains code quality. The test suite is continuously being improved to provide better coverage and reliability, and will be enabled in the pre-commit hook once it's more stable.\n\n### Docker Support\n\nThe repository includes Docker support for easy containerization and deployment:\n\n```bash\n# Build and run with Docker\ndocker build -t mcp-server .\ndocker run -p 3000:3000 mcp-server\n\n# Or use Docker Compose\ndocker-compose up\n```\n\nThe Docker configuration:\n\n- Uses Node.js 20 Alpine as the base image\n- Exposes port 3000\n- Mounts the modules directory as a volume for easy module management\n- Includes health checks\n\n## Standard MCP Methods\n\nThe MCP server implements a standardized set of methods that all MCP servers should provide:\n\n### Server Information\n\n- `GET /` - Basic server information\n- `GET /status` - Detailed server status\n- `GET /health` - Health check endpoint\n- `GET /metrics` - Server metrics\n\n### Model Management\n\n- `GET /models` - List available models\n- `GET /model/:modelId` - Get model information\n- `POST /model/:modelId/activate` - Activate a specific model\n- `POST /model/deactivate` - Deactivate the current model\n- `GET /model/active` - Get information about the active model\n\n### Inference\n\n- `POST /model/infer` - Perform inference with the active model\n- `POST 
/model/:modelId/infer` - Perform inference with a specific model\n\n#### Supported Models\n\nThe MCP server supports the following model types:\n\n| Model Type       | Provider     | Capabilities     | Example IDs                    |\n| ---------------- | ------------ | ---------------- | ------------------------------ |\n| GPT Models       | OpenAI       | Text generation  | gpt-4, gpt-3.5-turbo           |\n| Whisper          | OpenAI       | Speech-to-text   | whisper, whisper-1             |\n| Stable Diffusion | Stability AI | Image generation | stable-diffusion-xl-1024-v1-0  |\n| Claude Models    | Anthropic    | Text generation  | claude-3-opus, claude-3-sonnet |\n| Custom Models    | Hugging Face | Various          | (any Hugging Face model ID)    |\n\n#### Inference Examples\n\nText generation with GPT-4:\n\n```bash\n# Activate the model\ncurl -X POST http://localhost:3000/model/gpt-4/activate \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\"config\": {\"temperature\": 0.7}}'\n\n# Perform inference\ncurl -X POST http://localhost:3000/model/infer \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\n    \"prompt\": \"Explain quantum computing in simple terms\",\n    \"temperature\": 0.5,\n    \"max_tokens\": 200\n  }'\n```\n\nImage generation with Stable Diffusion:\n\n```bash\n# Activate the model\ncurl -X POST http://localhost:3000/model/stable-diffusion/activate \\\n  -H \"Content-Type: application/json\" \\\n  -d '{}'\n\n# Generate an image\ncurl -X POST http://localhost:3000/model/infer \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\n    \"prompt\": \"A beautiful sunset over mountains\",\n    \"height\": 1024,\n    \"width\": 1024,\n    \"steps\": 30\n  }'\n```\n\nStreaming text generation:\n\n```bash\n# Enable streaming\ncurl -X POST http://localhost:3000/model/infer \\\n  -H \"Content-Type: application/json\" \\\n  -d '{\n    \"prompt\": \"Write a short story about a robot\",\n    \"stream\": true\n  }'\n```\n\n### Module 
Management\n\n- `GET /modules` - List installed modules\n- `GET /modules/:moduleId` - Get module information\n- `GET /modules/search/:query` - Search modules by any field in their package.json or metadata\n\n### Tools and Resources\n\n- `GET /tools` - List available tools\n- `GET /resources` - List available resources\n\nFor detailed information about these methods, see [MCP Standard Methods](docs/mcp_standard_methods.md).\n\n## Configuration\n\nConfiguration is loaded from environment variables and stored in `src/core/config.js`. The easiest way to configure the server is to edit the `.env` file in the project root.\n\n### Environment Variables\n\nKey environment variables include:\n\n| Variable            | Description                          | Default                            |\n| ------------------- | ------------------------------------ | ---------------------------------- |\n| PORT                | Server port                          | 3000                               |\n| HOST                | Server host                          | localhost                          |\n| NODE_ENV            | Environment (development/production) | development                        |\n| OPENAI_API_KEY      | OpenAI API key                       | (required for OpenAI models)       |\n| STABILITY_API_KEY   | Stability AI API key                 | (required for Stable Diffusion)    |\n| ANTHROPIC_API_KEY   | Anthropic API key                    | (required for Claude models)       |\n| HUGGINGFACE_API_KEY | Hugging Face API key                 | (required for Hugging Face models) |\n\nSee `sample.env` for a complete list of configuration options.\n\n## Examples\n\nThe repository includes several examples to help you get started:\n\n- **Client Example**: `examples/client.js` demonstrates how to interact with the MCP server from a client application.\n- **Custom Module Example**: `examples/custom-module/` shows how to create a custom module that adds a calculator tool to 
the server.\n\nTo run the client example:\n\n```bash\nnode examples/client.js\n```\n\nTo use the custom module example, copy it to the modules directory:\n\n```bash\ncp -r examples/custom-module mcp_modules/calculator\n```\n\n## Creating Modules\n\nModules are the primary way to extend the MCP server. Each module is a self-contained package that can add new functionality to the server.\n\n### Module Structure\n\nModules now follow an enhanced structure with better organization:\n\n```\nmcp_modules/your-module/\n├── assets/          # Static assets (images, CSS, etc.)\n├── docs/            # Documentation files\n├── examples/        # Example usage\n├── src/             # Source code\n│   ├── controller.js  # HTTP route handlers\n│   ├── service.js     # Business logic\n│   └── utils.js       # Utility functions\n├── test/            # Test files\n│   ├── controller.test.js\n│   └── service.test.js\n├── index.js         # Main module file with register function\n├── package.json     # Module metadata, dependencies, and scripts\n└── README.md        # Module documentation\n```\n\nEach module should include a `package.json` file with:\n\n- Name, version, description\n- Author and license information\n- Dependencies and dev dependencies\n- Scripts (especially for testing)\n- Keywords and other metadata\n\nThis structure provides better separation of concerns, makes testing easier, and improves module discoverability.\n\n### Module Implementation\n\nThe main module file (`index.js`) must export a `register` function that will be called when the module is loaded:\n\n```javascript\n/**\n * Register this module with the Hono app\n * @param {import('hono').Hono} app - The Hono app instance\n */\nexport async function register(app) {\n  // Register routes, middleware, etc.\n  app.get('/your-module/endpoint', c =\u003e {\n    return c.json({ message: 'Your module is working!' 
});\n  });\n}\n\n// Optional: Export module metadata\nexport const metadata = {\n  name: 'Your Module',\n  version: '1.0.0',\n  description: 'Description of your module',\n  author: 'Your Name',\n};\n```\n\n### Example Modules\n\n- A simple example module is provided in `mcp_modules/example/` to demonstrate how to create a module.\n- A more complex example with a calculator tool is provided in `examples/custom-module/`.\n- A health check module is provided in `mcp_modules/health-check/` for system monitoring.\n- A template for creating new modules is available in `mcp_modules/template/`.\n\n### Creating New Modules\n\nYou can create a new module using the provided script:\n\n```bash\n# Create a new module\npnpm create-module\n\n# Or with a module name\npnpm create-module my-module\n```\n\nThe script will:\n\n1. Create a new module directory in `mcp_modules/`\n2. Copy the template files\n3. Replace placeholders with your module information\n4. Provide next steps for implementing your module\n\n## Module Search\n\nThe MCP server includes a powerful search functionality that allows you to find modules based on any information in their package.json or metadata.\n\n### Search Endpoints\n\n- `GET /modules/search/:query` - Search for modules containing the specified query string in any field\n\n### Search Examples\n\n```bash\n# Find modules by name or description\ncurl http://localhost:3000/modules/search/craigslist\n\n# Find modules by dependency\ncurl http://localhost:3000/modules/search/jsdom\n\n# Find modules by keyword\ncurl http://localhost:3000/modules/search/mcp\n\n# Find modules by author\ncurl http://localhost:3000/modules/search/\"MCP Server Team\"\n\n# Find modules by license\ncurl http://localhost:3000/modules/search/ISC\n```\n\n### JavaScript Example\n\n```javascript\n// Function to search modules by any field\nasync function searchModules(query) {\n  const response = await fetch(`http://localhost:3000/modules/search/${query}`);\n  const data = await 
response.json();\n\n  console.log(`Found ${data.count} modules matching \"${query}\":`);\n  data.results.forEach(module =\u003e {\n    console.log(`- ${module.name} (${module.directoryName}): ${module.description}`);\n  });\n\n  return data.results;\n}\n```\n\nThe search is comprehensive and will find matches in any field, including nested objects like dependencies, keywords, and other metadata.\n\n## Model Providers\n\nThe MCP server integrates with several AI model providers:\n\n### OpenAI\n\nOpenAI provides GPT models for text generation and Whisper for speech-to-text:\n\n```javascript\n// Text generation example\nconst response = await fetch('http://localhost:3000/model/infer', {\n  method: 'POST',\n  headers: { 'Content-Type': 'application/json' },\n  body: JSON.stringify({\n    prompt: 'Write a poem about artificial intelligence',\n    temperature: 0.7,\n    max_tokens: 200,\n  }),\n});\n\n// Speech-to-text example (requires multipart form data)\nconst formData = new FormData();\nformData.append('file', audioFile);\nformData.append('model', 'whisper-1');\nformData.append('language', 'en');\n\nconst transcriptionResponse = await fetch('http://localhost:3000/model/whisper/infer', {\n  method: 'POST',\n  body: formData,\n});\n```\n\n### Stability AI\n\nStability AI provides Stable Diffusion for image generation:\n\n```javascript\nconst response = await fetch('http://localhost:3000/model/infer', {\n  method: 'POST',\n  headers: { 'Content-Type': 'application/json' },\n  body: JSON.stringify({\n    prompt: 'A photorealistic image of a futuristic city',\n    height: 1024,\n    width: 1024,\n    steps: 30,\n    cfg_scale: 7,\n  }),\n});\n\n// The response includes base64-encoded images\nconst result = await response.json();\nconst imageBase64 = result.response[0].base64;\n```\n\n### Anthropic\n\nAnthropic provides Claude models for text generation:\n\n```javascript\nconst response = await fetch('http://localhost:3000/model/infer', {\n  method: 'POST',\n  headers: { 
'Content-Type': 'application/json' },\n  body: JSON.stringify({\n    prompt: 'Explain how neural networks work',\n    temperature: 0.5,\n    max_tokens: 300,\n  }),\n});\n```\n\n### Hugging Face\n\nHugging Face provides access to thousands of open-source models:\n\n```javascript\nconst response = await fetch('http://localhost:3000/model/custom-model-name/infer', {\n  method: 'POST',\n  headers: { 'Content-Type': 'application/json' },\n  body: JSON.stringify({\n    prompt: 'Input for the model',\n    parameters: {\n      // Model-specific parameters\n    },\n  }),\n});\n```\n\n## Documentation\n\n- [MCP Standard Methods](docs/mcp_standard_methods.md): Documentation of the standard methods that all MCP servers should implement.\n- [MCP Interface](docs/mcp_interface.ts): TypeScript interface definitions for the MCP protocol.\n- [Architecture](docs/architecture.md): Overview of the MCP server architecture.\n\n\u003ca href=\"https://glama.ai/mcp/servers/@profullstack/mcp-server\"\u003e\n  \u003cimg width=\"380\" height=\"200\" src=\"https://glama.ai/mcp/servers/@profullstack/mcp-server/badge\" /\u003e\n\u003c/a\u003e\n\n## License\n\nISC\n","funding_links":[],"categories":["📚 Projects (1974 total)"],"sub_categories":["MCP Servers"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fprofullstack%2Fmcp-server","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fprofullstack%2Fmcp-server","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fprofullstack%2Fmcp-server/lists"}