MCP Client Implementation Using LangChain / TypeScript
https://github.com/hideya/mcp-client-langchain-ts
- Host: GitHub
- URL: https://github.com/hideya/mcp-client-langchain-ts
- Owner: hideya
- License: mit
- Created: 2024-12-31T13:54:03.000Z (10 months ago)
- Default Branch: main
- Last Pushed: 2025-03-28T02:38:48.000Z (7 months ago)
- Last Synced: 2025-03-28T03:26:09.469Z (7 months ago)
- Topics: langchain, langchain-typescript, mcp, mcp-client, modelcontextprotocol, nodejs, tool-call, tool-calling, typescript
- Language: TypeScript
- Homepage:
- Size: 132 KB
- Stars: 7
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# MCP Client Using LangChain / TypeScript ([MIT license](https://github.com/hideya/mcp-langchain-client-ts/blob/main/LICENSE))
This simple [Model Context Protocol (MCP)](https://modelcontextprotocol.io/)
client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function `convertMcpToLangchainTools()` from
[`@h1deya/langchain-mcp-tools`](https://www.npmjs.com/package/@h1deya/langchain-mcp-tools).
This function handles parallel initialization of multiple specified MCP servers
and converts their available tools into an array of LangChain-compatible tools
([`StructuredTool[]`](https://api.js.langchain.com/classes/_langchain_core.tools.StructuredTool.html)).

LLMs from Anthropic, OpenAI, and Groq are currently supported.
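To illustrate how the pieces fit together, here is a minimal sketch of wiring the converted tools into a LangChain/LangGraph ReAct agent. The return shape of `convertMcpToLangchainTools()` (shown as a `tools` array plus a `cleanup` callback), the inline server config, and the model name are assumptions for illustration; see the `@h1deya/langchain-mcp-tools` package documentation and this repo's source for the authoritative usage.

```typescript
import { ChatAnthropic } from "@langchain/anthropic";
import { HumanMessage } from "@langchain/core/messages";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";

async function main(): Promise<void> {
  // MCP server definitions, in the same spirit as Claude Desktop's config;
  // in this app they come from llm_mcp_config.json5 rather than being inlined.
  const mcpServers = {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
  };

  // Assumption: the converter starts the listed servers in parallel and
  // returns their tools as LangChain StructuredTools plus a cleanup handle.
  const { tools, cleanup } = await convertMcpToLangchainTools(mcpServers);

  // Hand the converted tools to a ReAct agent backed by one of the supported LLMs.
  const agent = createReactAgent({
    llm: new ChatAnthropic({ model: "claude-3-5-haiku-latest" }),
    tools,
  });

  const result = await agent.invoke({
    messages: [new HumanMessage("List the files in the current directory.")],
  });
  console.log(result.messages.at(-1)?.content);

  await cleanup(); // shut down the spawned MCP server processes
}

main().catch(console.error);
```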
A Python version of this MCP client is available
[here](https://github.com/hideya/mcp-client-langchain-py).

## Prerequisites
- Node.js 16+
- npm 7+ (`npx`) to run Node.js-based MCP servers
- [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/)
installed to run Python-based MCP servers
- API keys from [Anthropic](https://console.anthropic.com/settings/keys),
[OpenAI](https://platform.openai.com/api-keys), and/or
[Groq](https://console.groq.com/keys)
as needed.

## Setup
1. Install dependencies:
```bash
npm install
```

2. Set up API keys:
```bash
cp .env.template .env
```
- Update `.env` as needed.
- `.gitignore` is configured to ignore `.env`
to prevent accidental commits of the credentials.

3. Configure the LLM and MCP server settings in `llm_mcp_config.json5` as needed
(a sample configuration is sketched after this list).
- [The configuration file format](https://github.com/hideya/mcp-client-langchain-ts/blob/main/llm_mcp_config.json5)
for MCP servers follows the same structure as
[Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
with one difference: the key name `mcpServers` has been changed
to `mcp_servers` to follow the snake_case convention
commonly used in JSON configuration files.
- The file format is [JSON5](https://json5.org/),
where comments and trailing commas are allowed.
- The format is further extended to replace `${...}` notations
with the values of corresponding environment variables.
- Keep all the credentials and private info in the `.env` file
and refer to them with `${...}` notation as needed.
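For orientation, the sketch below shows what such a configuration might look like. Everything except the `mcp_servers` key name is an illustrative assumption (the LLM section's keys, the server names and packages, and the `BRAVE_API_KEY` variable); refer to the `llm_mcp_config.json5` shipped with the repository for the actual format.

```json5
// Hypothetical llm_mcp_config.json5 — keys other than `mcp_servers` are illustrative.
{
  llm: {
    provider: "anthropic",            // or "openai", "groq"
    model: "claude-3-5-haiku-latest",
  },

  mcp_servers: {
    fetch: {
      command: "uvx",
      args: ["mcp-server-fetch"],
    },
    "brave-search": {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-brave-search"],
      env: {
        // `${...}` is replaced with the value of the corresponding
        // environment variable, which should be defined in `.env`.
        BRAVE_API_KEY: "${BRAVE_API_KEY}",
      },
    },
  },  // trailing commas and comments are fine — the file is JSON5
}
```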
## Usage

Run the app:
```bash
npm start
```

Run in verbose mode:
```bash
npm run start:v
```

See the command-line options:
```bash
npm run start:h
```

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.
Example queries can be configured in `llm_mcp_config.json5`.
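As a rough illustration, such queries might be listed under a key like the one below; the `example_queries` key name and the queries themselves are assumptions, so check the shipped config file for the actual structure.

```json5
// Hypothetical — key name and queries are illustrative only.
{
  example_queries: [
    "Read the file README.md and list its main sections.",
    "Fetch https://modelcontextprotocol.io/ and summarize it in two sentences.",
  ],
}
```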