https://github.com/mastra-ai/mastra
The TypeScript AI agent framework. ⚡ Assistants, RAG, observability. Supports any LLM: GPT-4, Claude, Gemini, Llama.
- Host: GitHub
- URL: https://github.com/mastra-ai/mastra
- Owner: mastra-ai
- License: other
- Created: 2024-08-06T20:44:31.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2025-05-14T21:39:52.000Z (6 months ago)
- Last Synced: 2025-05-14T22:05:03.303Z (6 months ago)
- Topics: agents, ai, chatbots, evals, javascript, llm, mcp, nextjs, nodejs, reactjs, tts, typescript, workflows
- Language: TypeScript
- Homepage: https://mastra.ai
- Size: 171 MB
- Stars: 12,934
- Watchers: 53
- Forks: 711
- Open Issues: 177
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE.md
- Code of conduct: CODE_OF_CONDUCT.md
Awesome Lists containing this project
- awesome-mcp-devtools - Mastra AI - Mastra is a TypeScript-based AI agent framework that can be used to author MCP servers. (SDKs / JavaScript/TypeScript)
- awesome-agents - Mastra (Frameworks)
- awesome-ChatGPT-repositories - mastra - The TypeScript AI agent framework. ⚡ Assistants, RAG, observability. Supports any LLM: GPT-4, Claude, Gemini, Llama. (Chatbots)
- awesome-repositories - mastra-ai/mastra - The TypeScript AI agent framework. ⚡ Assistants, RAG, observability. Supports any LLM: GPT-4, Claude, Gemini, Llama. (TypeScript)
- Awesome-RAG - Mastra
- StarryDivineSky - mastra-ai/mastra - The TypeScript AI agent framework, supporting any LLM such as GPT-4, Claude, Gemini, and Llama. The framework aims to simplify AI agent development and provides powerful tools and infrastructure. With Mastra, developers can easily build intelligent assistants, use RAG to improve the quality of generated content, and monitor agent runtime behavior. It is a flexible, feature-rich platform suited to a wide range of AI application scenarios. Mastra leverages TypeScript to provide a type-safe, maintainable codebase and aims to be the framework of choice for AI agent development. (A01_Text Generation_Text Chat / Large Language Chat Models and Data)
- metorial-index - Mastra MCP Client - Integrate with MCP-compatible AI models and tools for enhanced functionality, allowing the management of multiple server connections and tool execution. It simplifies resource discovery, automatic error handling, and logging. (APIs and HTTP Requests)
- my-awesome-list - mastra - The TypeScript AI agent framework. Supports any LLM: GPT-4, Claude, Gemini, Llama. (TypeScript)
- AiTreasureBox - mastra-ai/mastra - The TypeScript AI agent framework. (Repos)
- Awesome-AI-Agents - Mastra - Mastra is an opinionated TypeScript framework that helps you build AI applications and features quickly.  (Frameworks / Advanced Components)
- awesome-hacking-lists - mastra-ai/mastra - The TypeScript AI agent framework. ⚡ Assistants, RAG, observability. Supports any LLM: GPT-4, Claude, Gemini, Llama. (TypeScript)
- awesome-llm-agents - Mastra - TypeScript AI agent framework. (Frameworks)
README
# Mastra
[npm](https://www.npmjs.com/package/@mastra/core)

[Discord](https://discord.gg/BTYqqHKUrf)
[X (Twitter)](https://x.com/mastra_ai)

Mastra is an opinionated TypeScript framework that helps you build AI applications and features quickly. It gives you the set of primitives you need: workflows, agents, RAG, integrations and evals. You can run Mastra on your local machine, or deploy to a serverless cloud.
The main Mastra features are:
| Features | Description |
| ------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| LLM Models | Mastra uses the [Vercel AI SDK](https://sdk.vercel.ai/docs/introduction) for model routing, providing a unified interface to interact with any LLM provider including OpenAI, Anthropic, and Google Gemini. You can choose the specific model and provider, and decide whether to stream the response. |
| [Agents](https://mastra.ai/docs/agents/overview) | Agents are systems where the language model chooses a sequence of actions. In Mastra, agents provide LLM models with tools, workflows, and synced data. Agents can call your own functions or APIs of third-party integrations and access knowledge bases you build (a minimal code sketch follows this table). |
| [Tools](https://mastra.ai/docs/agents/adding-tools) | Tools are typed functions that can be executed by agents or workflows, with built-in integration access and parameter validation. Each tool has a schema that defines its inputs, an executor function that implements its logic, and access to configured integrations. |
| [Workflows](https://mastra.ai/docs/workflows/overview) | Workflows are durable graph-based state machines. They support loops, branching, waiting for human input, embedding other workflows, error handling, retries, parsing, and more. They can be built in code or with a visual editor. Each step in a workflow has built-in OpenTelemetry tracing. |
| [RAG](https://mastra.ai/docs/rag/overview) | Retrieval-augmented generation (RAG) lets you construct a knowledge base for agents. RAG is an ETL pipeline with specific querying techniques, including chunking, embedding, and vector search. |
| [Integrations](https://mastra.ai/docs/integrations) | In Mastra, integrations are auto-generated, type-safe API clients for third-party services that can be used as tools for agents or steps in workflows. |
| [Evals](https://mastra.ai/docs/08-running-evals) | Evals are automated tests that evaluate LLM outputs using model-graded, rule-based, and statistical methods. Each eval returns a normalized score between 0 and 1 that can be logged and compared. Evals can be customized with your own prompts and scoring functions. |
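To make the Agents and Tools rows concrete, here is a minimal sketch of a tool and the agent that uses it. It assumes the `@mastra/core` and `@ai-sdk/openai` packages, and `getWeather` is a hypothetical helper; exact import paths and option names may differ between Mastra versions, so treat this as an illustration rather than the definitive API.

```typescript
import { Agent } from "@mastra/core/agent";
import { createTool } from "@mastra/core/tools";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// A tool is a typed function: a Zod schema describes its inputs,
// and an execute function implements its logic.
const weatherTool = createTool({
  id: "get-weather",
  description: "Fetch the current weather for a city",
  inputSchema: z.object({ city: z.string() }),
  execute: async ({ context }) => {
    // getWeather is a hypothetical helper standing in for a real API call.
    return { forecast: await getWeather(context.city) };
  },
});

// An agent pairs an LLM (routed through the Vercel AI SDK) with
// instructions and the tools it may call.
export const weatherAgent = new Agent({
  name: "weather-agent",
  instructions: "You answer questions about the weather using the provided tool.",
  model: openai("gpt-4o-mini"),
  tools: { weatherTool },
});

// Hypothetical helper so the sketch is self-contained.
async function getWeather(city: string): Promise<string> {
  return `Sunny in ${city}`;
}
```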
## Quick Start
### Prerequisites
- Node.js (v20.0+)
### Get an LLM provider API key
If you don't have an API key for an LLM provider, you can get one from the following services:
- [OpenAI](https://platform.openai.com/)
- [Anthropic](https://console.anthropic.com/settings/keys)
- [Google Gemini](https://ai.google.dev/gemini-api/docs)
- [Groq](https://console.groq.com/docs/overview)
- [Cerebras](https://inference-docs.cerebras.ai/introduction)
If you don't have an account with these providers, you can sign up and get an API key. Anthropic requires a credit card to get an API key; Gemini and some OpenAI models do not, and offer a generous free tier.
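As the feature table notes, Mastra routes models through the Vercel AI SDK, whose provider packages read these keys from environment variables (for example, the Anthropic provider looks for `ANTHROPIC_API_KEY`). The sketch below, which assumes the `ai` and `@ai-sdk/anthropic` packages, illustrates the kind of provider, model, and streaming choice that routing layer exposes:

```typescript
import { anthropic } from "@ai-sdk/anthropic"; // reads ANTHROPIC_API_KEY from the environment
import { streamText } from "ai";

// Choose a specific provider and model and stream the response;
// the model id here is only for illustration.
const result = streamText({
  model: anthropic("claude-3-5-sonnet-latest"),
  prompt: "Explain in one sentence what an AI agent framework does.",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```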
### Create a new project
The easiest way to get started with Mastra is by using `create-mastra`. This CLI tool enables you to quickly start building a new Mastra application, with everything set up for you.
```bash
npx create-mastra@latest
```
### Run the script
Finally, run the `dev` script, which starts `mastra dev` and opens the Mastra playground.
```bash
npm run dev
```
If you're using Anthropic, set the `ANTHROPIC_API_KEY` environment variable; if you're using Gemini, set `GOOGLE_GENERATIVE_AI_API_KEY`.
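For orientation, a scaffolded project typically exports a Mastra instance that registers your agents, and `mastra dev` serves whatever is registered there. Below is a minimal sketch, assuming the `weatherAgent` from the earlier example and an illustrative file layout; the generated project's paths and option names may differ.

```typescript
import { Mastra } from "@mastra/core";
import { weatherAgent } from "./agents/weather-agent"; // illustrative path, matching the earlier sketch

// The Mastra instance is the registry that `mastra dev` and the playground read from.
export const mastra = new Mastra({
  agents: { weatherAgent },
});

// Agents can also be called programmatically, outside the playground.
async function main() {
  const agent = mastra.getAgent("weatherAgent");
  const reply = await agent.generate("Will it rain in Oslo tomorrow?");
  console.log(reply.text);
}

main().catch(console.error);
```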
## Contributing
Looking to contribute? All types of help are appreciated, from coding to testing and feature specification.
If you are a developer and would like to contribute with code, please open an issue to discuss before opening a Pull Request.
Information about the project setup can be found in the [development documentation](./DEVELOPMENT.md).
## Support
We have an [open community Discord](https://discord.gg/BTYqqHKUrf). Come and say hello and let us know if you have any questions or need any help getting things running.
It's also super helpful if you leave the project a star here at the [top of the page](https://github.com/mastra-ai/mastra).