https://github.com/PipedreamHQ/mcp-chat
Examples of using Pipedream's MCP server in your app or AI agent.
- Host: GitHub
- URL: https://github.com/PipedreamHQ/mcp-chat
- Owner: PipedreamHQ
- License: mit
- Created: 2025-06-16T11:03:04.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2025-08-27T03:33:19.000Z (5 months ago)
- Last Synced: 2025-08-27T11:56:50.026Z (5 months ago)
- Topics: ai, development, mcp, tools
- Homepage: https://pipedream.com/docs/connect/mcp/developers
- Size: 2.51 MB
- Stars: 130
- Watchers: 2
- Forks: 26
- Open Issues: 5
Metadata Files:
- Readme: README.md
- License: LICENSE
## README
MCP Chat is a free, open-source chat app built with the AI SDK and Pipedream MCP, which provides access to nearly 3,000 APIs and more than 10,000 tools. Use it as a reference for building powerful AI chat applications.
Features · Model Providers · Prerequisites · Deploy Your Own · Running Locally
> **Check out the app in production at [chat.pipedream.com](https://chat.pipedream.com) and refer to [Pipedream's developer docs](https://pipedream.com/docs/connect/mcp/developers) for the most up-to-date information.**
## Features
- **MCP integrations**: Connect to thousands of APIs through Pipedream's MCP server with built-in auth
- **Automatic tool discovery**: Execute tool calls across different APIs via chat
- **Uses the [AI SDK](https://sdk.vercel.ai/docs)**: Unified API for generating text, structured objects, and tool calls with LLMs
- **Flexible LLM and framework support**: Works with any LLM provider or framework
- **Data persistence**: Uses [Neon Serverless Postgres](https://vercel.com/marketplace/neon) for saving chat history and user data and [Auth.js](https://authjs.dev) for simple and secure sign-in
## Model Providers
The demo app currently uses models from Anthropic, OpenAI, and Gemini, but the AI SDK supports [many more](https://sdk.vercel.ai/providers/ai-sdk-providers).
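Under the hood, supporting multiple providers mostly comes down to mapping a user-facing model name to a provider-specific model. A minimal sketch of that lookup pattern (the names and model IDs below are illustrative, not the app's actual configuration; in the real app each entry would be an AI SDK model instance such as `openai('gpt-4o')` or `anthropic('claude-3-5-sonnet-latest')`):

```typescript
// Map user-facing model names to provider-specific model IDs.
// Illustrative only: in the real app each value would be an AI SDK
// model instance, e.g. openai('gpt-4o') from @ai-sdk/openai.
const MODELS: Record<string, string> = {
  "gpt-4o": "openai/gpt-4o",
  "claude-sonnet": "anthropic/claude-3-5-sonnet",
  "gemini-flash": "google/gemini-2.0-flash",
};

// Resolve a requested model, falling back to a default when the
// name is unknown.
function resolveModel(name: string, fallback = "gpt-4o"): string {
  return MODELS[name] ?? MODELS[fallback];
}

console.log(resolveModel("claude-sonnet")); // anthropic/claude-3-5-sonnet
console.log(resolveModel("some-unknown-model")); // openai/gpt-4o (fallback)
```

Because the AI SDK exposes every provider behind the same `generateText`/`streamText` interface, swapping providers is just a matter of changing which model instance this lookup returns.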
## Prerequisites
To run or deploy this app, you'll need:
1. A [Pipedream account](https://pipedream.com/auth/signup)
2. A [Pipedream project](https://pipedream.com/docs/projects/#creating-projects). Accounts connected via MCP will be stored here.
3. [Pipedream OAuth credentials](https://pipedream.com/docs/rest-api/auth/#oauth)
4. An [OpenAI API key](https://platform.openai.com/api-keys)
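Each of these prerequisites ends up as an environment variable when you run or deploy the app. A sketch of the corresponding `.env` entries, with placeholder values (the variable names are taken from the one-click deploy configuration in this README; the values shown are illustrative):

```bash
# Illustrative placeholders; replace with your own credentials.
PIPEDREAM_CLIENT_ID=your_pipedream_oauth_client_id
PIPEDREAM_CLIENT_SECRET=your_pipedream_oauth_client_secret
PIPEDREAM_PROJECT_ID=proj_xxxxxxx
PIPEDREAM_PROJECT_ENVIRONMENT=development
OPENAI_API_KEY=sk-your-openai-key
```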
## Deploy Your Own
One-click deploy this app to Vercel:
[Deploy with Vercel](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2FPipedreamHQ%2Fmcp-chat&env=PIPEDREAM_CLIENT_ID,PIPEDREAM_CLIENT_SECRET,PIPEDREAM_PROJECT_ID,PIPEDREAM_PROJECT_ENVIRONMENT,AUTH_SECRET,GOOGLE_CLIENT_ID,GOOGLE_CLIENT_SECRET,OPENAI_API_KEY,EXA_API_KEY,POSTGRES_URL&envDescription=API%20keys%20need%20to%20run%20the%20app)
## Running Locally
1. Copy the environment file and add your credentials:
```bash
cp .env.example .env # Edit with your values
```
Note that for easier development, chat persistence and application sign-in are disabled by default in the `.env.example` file:
```bash
# In your .env file
DISABLE_AUTH=true
DISABLE_PERSISTENCE=true
```
2. Install dependencies and start the app:
```bash
pnpm install
pnpm dev
```
Your local app should now be running on [http://localhost:3000](http://localhost:3000/) 🎉
### Enabling chat persistence
1. Run all required local services:
```bash
docker compose up -d
```
2. Run migrations:
```bash
POSTGRES_URL=postgresql://postgres@localhost:5432/postgres pnpm db:migrate
```